Sample records for computational challenges arising

  1. Cloud Implementation in Organizations: Critical Success Factors, Challenges, and Impacts on the IT Function

    ERIC Educational Resources Information Center

    Suo, Shuguang

    2013-01-01

    Organizations have been forced to rethink business models and restructure facilities through IT innovation as they have faced the challenges arising from globalization, mergers and acquisitions, big data, and the ever-changing demands of customers. Cloud computing has emerged as a new computing paradigm that has fundamentally shaped the business…

  2. Teaching Computer Science Courses in Distance Learning

    ERIC Educational Resources Information Center

    Huan, Xiaoli; Shehane, Ronald; Ali, Adel

    2011-01-01

    As the success of distance learning (DL) has driven universities to increase the courses offered online, certain challenges arise when teaching computer science (CS) courses to students who are not physically co-located and have individual learning schedules. Teaching CS courses involves high level demonstrations and interactivity between the…

  3. Specifying and Refining a Measurement Model for a Computer-Based Interactive Assessment

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    2004-01-01

    The challenges of modeling students' performance in computer-based interactive assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance. This article describes a Bayesian approach to modeling and estimating cognitive models…

  4. Numerical Relativity, Black Hole Mergers, and Gravitational Waves: Part I

    NASA Technical Reports Server (NTRS)

    Centrella, Joan

    2012-01-01

    This series of 3 lectures will present recent developments in numerical relativity, and their applications to simulating black hole mergers and computing the resulting gravitational waveforms. In this first lecture, we introduce the basic ideas of numerical relativity, highlighting the challenges that arise in simulating gravitational wave sources on a computer.

  5. A Comparison of Solver Performance for Complex Gastric Electrophysiology Models

    PubMed Central

    Sathar, Shameer; Cheng, Leo K.; Trew, Mark L.

    2016-01-01

    Computational techniques for solving the systems of equations arising in gastric electrophysiology have not previously been studied with a focus on an efficient solution process. We present a computationally challenging problem: simulating gastric electrophysiology in anatomically realistic stomach geometries with multiple intracellular and extracellular domains. The multiscale nature of the problem and the mesh resolution required to capture geometric and functional features necessitate efficient solution methods if the problem is to be tractable. In this study, we investigated and compared several parallel preconditioners for the linear systems arising from tetrahedral discretisation of electrically isotropic and anisotropic problems, with and without stimuli. The results showed that the isotropic problem was computationally less challenging than the anisotropic problem and that the application of extracellular stimuli increased the workload considerably. Preconditioners based on block Jacobi and algebraic multigrid were found to give the best overall solution times and the lowest iteration counts, respectively. The algebraic multigrid preconditioner would be expected to perform better on large problems. PMID:26736543
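
    As a minimal illustration of this kind of comparison, the sketch below counts conjugate-gradient iterations with and without a Jacobi (diagonal) preconditioner on a toy sparse SPD system. The matrix and sizes are invented stand-ins, not the paper's bidomain meshes or its block Jacobi/AMG setup.

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Toy SPD system with a widely varying diagonal (a hypothetical
    # stand-in, far smaller than a tetrahedral gastric mesh).
    n = 2000
    main = 2.0 + np.linspace(0.5, 500.0, n)
    off = -np.ones(n - 1)
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")
    b = np.ones(n)

    iters = {"none": 0, "jacobi": 0}

    def counter(key):
        def cb(xk):
            iters[key] += 1
        return cb

    # Unpreconditioned conjugate gradients.
    x0, info0 = spla.cg(A, b, callback=counter("none"))

    # Jacobi preconditioning: apply M^-1 v = v / diag(A).
    d = A.diagonal()
    M = spla.LinearOperator((n, n), matvec=lambda v: v / d)
    x1, info1 = spla.cg(A, b, M=M, callback=counter("jacobi"))

    print(iters)  # the preconditioned solve needs far fewer iterations
    ```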

  6. Network gateway security method for enterprise Grid: a literature review

    NASA Astrophysics Data System (ADS)

    Sujarwo, A.; Tan, J.

    2017-03-01

    The computational Grid has brought large computational resources closer to scientists. It enables people to run large computational jobs anytime and anywhere, without physical borders. However, the large number and wide distribution of participants, whether users or computational providers, raise security problems. The challenge is how the security system, especially the component that filters data at the gateway, can operate flexibly depending on the registered Grid participants. This paper surveys previous approaches to this challenge in order to find a better, new method for the enterprise Grid. The outcome of this survey is a dynamically controlled enterprise firewall that secures Grid resources from unwanted connections using a new firewall-controlling method and components.

  7. Computational Molecular Modeling for Evaluating the Toxicity of Environmental Chemicals: Prioritizing Bioassay Requirements

    EPA Science Inventory

    This commentary provides an overview of the challenges that arise from applying molecular modeling tools developed and commonly used for pharmaceutical discovery to the problem of predicting the potential toxicities of environmental chemicals.

  8. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  9. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  10. A survey on hair modeling: styling, simulation, and rendering.

    PubMed

    Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C

    2007-01-01

    Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.

  11. Statistical mechanics of complex neural systems and high dimensional data

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya

    2013-03-01

    Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.

  12. Brain-Computer Interfaces: A Neuroscience Paradigm of Social Interaction? A Matter of Perspective

    PubMed Central

    Mattout, Jérémie

    2012-01-01

    A number of recent studies have put human subjects in true social interactions, with the aim of better identifying the psychophysiological processes underlying social cognition. Interestingly, this emerging Neuroscience of Social Interactions (NSI) field brings up challenges which resemble important ones in the field of Brain-Computer Interfaces (BCI). Importantly, these challenges go beyond common objectives such as the eventual use of BCI and NSI protocols in the clinical domain or common interests pertaining to the use of online neurophysiological techniques and algorithms. Common fundamental challenges are now apparent and one can argue that a crucial one is to develop computational models of brain processes relevant to human interactions with an adaptive agent, whether human or artificial. Coupled with neuroimaging data, such models have proved promising in revealing the neural basis and mental processes behind social interactions. Similar models could help BCI to move from well-performing but offline static machines to reliable online adaptive agents. This emphasizes a social perspective to BCI, which is not limited to a computational challenge but extends to all questions that arise when studying the brain in interaction with its environment. PMID:22675291

  13. Analytical Cost Metrics : Days of Future Past

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prajapati, Nirmal; Rajopadhye, Sanjay; Djidjev, Hristo Nikolov

    As we move towards the exascale era, new architectures must be capable of running massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, and image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”

  14. Randomized Dynamic Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Erichson, N. Benjamin; Brunton, Steven L.; Kutz, J. Nathan

    2017-11-01

    The dynamic mode decomposition (DMD) is an equation-free, data-driven matrix decomposition that is capable of providing accurate reconstructions of spatio-temporal coherent structures arising in dynamical systems. We present randomized algorithms to compute the near-optimal low-rank dynamic mode decomposition for massive datasets. Randomized algorithms are simple, accurate and able to ease the computational challenges arising with `big data'. Moreover, randomized algorithms are amenable to modern parallel and distributed computing. The idea is to derive a smaller matrix from the high-dimensional input data matrix using randomness as a computational strategy. Then, the dynamic modes and eigenvalues are accurately learned from this smaller representation of the data, whereby the approximation quality can be controlled via oversampling and power iterations. Here, we present randomized DMD algorithms that are categorized by how many passes the algorithm takes through the data. Specifically, the single-pass randomized DMD does not require data to be stored for subsequent passes. Thus, it is possible to approximately decompose massive fluid flows (stored out of core memory, or not stored at all) using single-pass algorithms, which is infeasible with traditional DMD algorithms.
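
    A NumPy sketch of the idea just described, under our own illustrative choices (the helper rdmd, its rank and oversampling defaults, and the toy data are not the authors' reference implementation): sketch the snapshot matrix with a random test matrix, sharpen the basis with power iterations, run exact DMD on the small projection, and lift the modes back to full dimension.

    ```python
    import numpy as np

    def rdmd(D, rank, oversample=10, n_power=2, seed=0):
        """Randomized DMD sketch. D: (m, n) matrix whose columns are snapshots."""
        rng = np.random.default_rng(seed)
        m, n = D.shape
        k = rank + oversample

        # Randomized range finder: capture the column space of D.
        Z = D @ rng.standard_normal((n, k))
        for _ in range(n_power):            # power iterations sharpen the basis
            Z = D @ (D.T @ Z)
        Q, _ = np.linalg.qr(Z)

        # Project snapshots onto the low-dimensional basis, then do exact DMD.
        B = Q.T @ D
        X, Y = B[:, :-1], B[:, 1:]
        U, s, Vh = np.linalg.svd(X, full_matrices=False)
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
        Atilde = U.T @ Y @ Vh.T / s
        evals, W = np.linalg.eig(Atilde)
        modes = Q @ (Y @ Vh.T / s @ W)      # lift DMD modes back to full space
        return evals, modes

    # Toy usage: a rank-4 oscillatory signal recovered from the sketch.
    t = np.linspace(0, 4 * np.pi, 200)
    x = np.linspace(-5, 5, 400)
    D = np.outer(np.tanh(x), np.cos(t)) + np.outer(1 / np.cosh(x), np.sin(2 * t))
    evals, modes = rdmd(D, rank=4)
    print(np.abs(evals))  # near 1: purely oscillatory dynamics
    ```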

  15. Improved Flux Formulations for Unsteady Low Mach Number Flows

    DTIC Science & Technology

    2012-07-01

    challenging problem since it requires the resolution of disparate time scales. Unsteady effects may arise from a combination of hydrodynamic effects...Many practical applications including rotorcraft flows, jets and shear layers include a combination of both acoustic and hydrodynamic effects...are computed independently as scalar formulations thus making it possible to independently tailor the dissipation for hydrodynamic and acoustic

  16. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometer resolution and is quickly turning into a new technology for studying the petrophysical properties of materials. An important step is the upscaling of these properties, since micron or sub-micron resolution can only be achieved for samples on the scale of a millimeter or less. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples and biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and the subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.

  17. Technical challenges for big data in biomedicine and health: data sources, infrastructure, and analytics.

    PubMed

    Peek, N; Holmes, J H; Sun, J

    2014-08-15

    To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.

  18. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    PubMed

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the volume of data are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges also arise in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.

  19. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

    Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understands what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  20. Item Difficulty in the Evaluation of Computer-Based Instruction: An Example from Neuroanatomy

    PubMed Central

    Chariker, Julia H.; Naaz, Farah; Pani, John R.

    2012-01-01

    This article reports large item effects in a study of computer-based learning of neuroanatomy. Outcome measures of the efficiency of learning, transfer of learning, and generalization of knowledge diverged by a wide margin across test items, with certain sets of items emerging as particularly difficult to master. In addition, the outcomes of comparisons between instructional methods changed with the difficulty of the items to be learned. More challenging items better differentiated between instructional methods. This set of results is important for two reasons. First, it suggests that instruction may be more efficient if sets of consistently difficult items are the targets of instructional methods particularly suited to them. Second, there is wide variation in the published literature regarding the outcomes of empirical evaluations of computer-based instruction. As a consequence, many questions arise as to the factors that may affect such evaluations. The present paper demonstrates that the level of challenge in the material that is presented to learners is an important factor to consider in the evaluation of a computer-based instructional system. PMID:22231801

  21. Item difficulty in the evaluation of computer-based instruction: an example from neuroanatomy.

    PubMed

    Chariker, Julia H; Naaz, Farah; Pani, John R

    2012-01-01

    This article reports large item effects in a study of computer-based learning of neuroanatomy. Outcome measures of the efficiency of learning, transfer of learning, and generalization of knowledge diverged by a wide margin across test items, with certain sets of items emerging as particularly difficult to master. In addition, the outcomes of comparisons between instructional methods changed with the difficulty of the items to be learned. More challenging items better differentiated between instructional methods. This set of results is important for two reasons. First, it suggests that instruction may be more efficient if sets of consistently difficult items are the targets of instructional methods particularly suited to them. Second, there is wide variation in the published literature regarding the outcomes of empirical evaluations of computer-based instruction. As a consequence, many questions arise as to the factors that may affect such evaluations. The present article demonstrates that the level of challenge in the material that is presented to learners is an important factor to consider in the evaluation of a computer-based instructional system.
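
    As a small illustration of the measure underlying the two records above: in classical test theory, an item's difficulty index is simply the proportion of learners answering it correctly. The sketch below computes it on a synthetic response matrix; all numbers are invented.

    ```python
    import numpy as np

    # Synthetic 0/1 response matrix: rows = learners, columns = items,
    # with per-item success probabilities sloping from easy to hard.
    rng = np.random.default_rng(0)
    p_correct = np.linspace(0.95, 0.35, 10)
    responses = rng.random((120, 10)) < p_correct

    difficulty = responses.mean(axis=0)          # higher value = easier item
    hard_items = np.where(difficulty < 0.5)[0]   # a "consistently difficult" set
    print(difficulty.round(2), hard_items)
    ```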

  22. A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2010-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
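
    A minimal sketch of the response-surface idea described above: fit a quadratic surrogate to a small set of expensive simulation outputs by least squares, then interpolate cheaply at untested conditions. The function expensive_sim is a synthetic stand-in for an LS-DYNA run, and all inputs and coefficients are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_sim(v, theta):
        """Stand-in for an LS-DYNA run: a response as a function of
        impact speed v and attitude angle theta (purely synthetic)."""
        return 3.0 + 0.8 * v + 0.05 * v**2 - 0.4 * np.cos(theta) + 0.1 * v * theta

    # A small design of experiments over speed and attitude.
    v = rng.uniform(5.0, 15.0, 40)
    theta = rng.uniform(-0.5, 0.5, 40)
    y = expensive_sim(v, theta)

    # Quadratic response surface: y ~ c0 + c1*v + c2*th + c3*v^2 + c4*th^2 + c5*v*th
    X = np.column_stack([np.ones_like(v), v, theta, v**2, theta**2, v * theta])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Cheap interpolation at an untested condition.
    vq, tq = 12.0, 0.25
    xq = np.array([1.0, vq, tq, vq**2, tq**2, vq * tq])
    print(xq @ coef, expensive_sim(vq, tq))  # surrogate vs. "true" model
    ```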

  23. Nanotechnology: Opportunities and Challenges

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya

    2003-01-01

    Nanotechnology seeks to exploit novel physical, chemical, biological, mechanical, electrical, and other properties, which arise primarily due to the nanoscale nature of certain materials. A key example is carbon nanotubes (CNTs), which exhibit unique electrical and extraordinary mechanical properties and offer remarkable potential for revolutionary applications in electronic devices, computing and data storage technology, sensors, composites, nanoelectromechanical systems (NEMS), and as tips in scanning probe microscopy (SPM) for imaging and nanolithography. Thus CNT synthesis, characterization, and applications touch upon all disciplines of science and engineering. This presentation will provide an overview and progress report on this and other major research candidates in nanotechnology and address opportunities and challenges ahead.

  24. Editorial: Cognitive Architectures, Model Comparison and AGI

    NASA Astrophysics Data System (ADS)

    Lebiere, Christian; Gonzalez, Cleotilde; Warwick, Walter

    2010-12-01

    Cognitive Science and Artificial Intelligence share compatible goals of understanding and possibly generating broadly intelligent behavior. In order to determine if progress is made, it is essential to be able to evaluate the behavior of complex computational models, especially those built on general cognitive architectures, and compare it to benchmarks of intelligent behavior such as human performance. Significant methodological challenges arise, however, when trying to extend approaches used to compare model and human performance from tightly controlled laboratory tasks to complex tasks involving more open-ended behavior. This paper describes a model comparison challenge built around a dynamic control task, the Dynamic Stocks and Flows. We present and discuss distinct approaches to evaluating performance and comparing models. Lessons drawn from this challenge are discussed in light of the challenge of using cognitive architectures to achieve Artificial General Intelligence.

  25. Clinical decision-making and secondary findings in systems medicine.

    PubMed

    Fischer, T; Brothers, K B; Erdmann, P; Langanke, M

    2016-05-21

    Systems medicine is the name for an assemblage of scientific strategies and practices that include bioinformatics approaches to human biology (especially systems biology); "big data" statistical analysis; and medical informatics tools. Whereas personalized and precision medicine involve similar analytical methods applied to genomic and medical record data, systems medicine draws on these as well as other sources of data. Given this distinction, the clinical translation of systems medicine poses a number of important ethical and epistemological challenges for researchers working to generate systems medicine knowledge and clinicians working to apply it. This article focuses on three key challenges: First, we will discuss the conflicts in decision-making that can arise when healthcare providers committed to principles of experimental medicine or evidence-based medicine encounter individualized recommendations derived from computer algorithms. We will explore in particular whether controlled experiments, such as comparative effectiveness trials, should mediate the translation of systems medicine, or if instead individualized findings generated through "big data" approaches can be applied directly in clinical decision-making. Second, we will examine the case of the Riyadh Intensive Care Program Mortality Prediction Algorithm, pejoratively referred to as the "death computer," to demonstrate the ethical challenges that can arise when big-data-driven scoring systems are applied in clinical contexts. We argue that the uncritical use of predictive clinical algorithms, including those envisioned for systems medicine, challenges basic understandings of the doctor-patient relationship. Third, we will build on the recent discourse on secondary findings in genomics and imaging to draw attention to the important implications of secondary findings derived from the joint analysis of data from diverse sources, including data recorded by patients in an attempt to realize their "quantified self." This paper examines possible ethical challenges that are likely to be raised as systems medicine is translated into clinical medicine. These include epistemological challenges for clinical decision-making, the use of scoring systems optimized by big data techniques, and the risk that incidental and secondary findings will increase significantly. While some ethical implications remain hypothetical, we should use the opportunity to prospectively identify challenges so as to avoid foreseeable mistakes when systems medicine inevitably arrives in routine care.

  26. Computing disease incidence, prevalence and comorbidity from electronic medical records.

    PubMed

    Bagley, Steven C; Altman, Russ B

    2016-10-01

    Electronic medical records (EMR) represent a convenient source of coded medical data, but disease patterns found in EMRs may be biased when compared to surveys based on sampling. In this communication we draw attention to complications that arise when using EMR data to calculate disease prevalence, incidence, age of onset, and disease comorbidity. We review known solutions to these problems and identify challenges for future work.
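
    To make the quantities concrete, the sketch below computes a period prevalence and an incidence rate from toy coded records. It assumes a chronic, non-resolving condition and ignores exactly the enrollment and coding biases the paper warns about; all names and dates are invented.

    ```python
    from datetime import date

    # Toy EMR-style records: (patient_id, first diagnosis date or None).
    records = [
        ("p1", date(2014, 3, 1)),
        ("p2", None),
        ("p3", date(2015, 7, 15)),
        ("p4", None),
        ("p5", date(2015, 1, 10)),
    ]
    obs_start, obs_end = date(2015, 1, 1), date(2015, 12, 31)

    # Period prevalence: anyone diseased at any time in the window
    # (assumes the condition is chronic and never resolves).
    population = len(records)
    prevalent = sum(1 for _, dx in records if dx is not None and dx <= obs_end)
    print("period prevalence:", prevalent / population)

    # Incidence rate: new cases in the window per person-year at risk.
    new_cases = sum(1 for _, dx in records if dx and obs_start <= dx <= obs_end)
    person_days = 0
    for _, dx in records:
        if dx and dx < obs_start:
            continue  # already diseased at window start: not at risk
        end_at_risk = min(dx, obs_end) if dx else obs_end
        person_days += (end_at_risk - obs_start).days
    print("incidence per person-year:", new_cases / (person_days / 365.25))
    ```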

  27. Secure data aggregation in heterogeneous and disparate networks using stand off server architecture

    NASA Astrophysics Data System (ADS)

    Vimalathithan, S.; Sudarsan, S. D.; Seker, R.; Lenin, R. B.; Ramaswamy, S.

    2009-04-01

    The emerging global reach of technology presents myriad challenges and intricacies as information technology teams aim to provide anywhere, anytime, anyone access for service providers and customers alike. The world is fraught with stifling inequalities, from an economic as well as a socio-political perspective. The net result has been large capability gaps between the various organizational locations that need to work together, which has raised new challenges for information security teams. Similar issues arise when mergers and acquisitions among and between organizations take place. While integrating remote business locations with mainstream operations, one or more issues, including the lack of application-level support, limited computational capabilities, communication limitations, and legal requirements, cause serious impediments, complicating integration efforts that must not violate the organizations' security requirements. Commonly used techniques such as IPSec, tunneling, and secure sockets layer (SSL) may not always be techno-economically feasible. This paper addresses such security issues by introducing an intermediate server, called a stand-off server, between the corporate central server and remote sites. We present techniques such as break-before-make connection, breaking the connection after transfer, and multiple virtual machine instances with different operating systems, built around the stand-off-server concept. Our experiments show that the proposed solution provides sufficient isolation for the central server/site from attacks arising out of weak communication and/or computing links and is simple to implement.

  28. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    PubMed Central

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the volume of data are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively solve the bottleneck problems of data transmission and data storage. However, security and privacy challenges also arise in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  29. Reproducible Earth observation analytics: challenges, ideas, and a study case on containerized land use change detection

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Nüst, Daniel; Pebesma, Edzer

    2017-04-01

    Geoscientific analyses of Earth observation data typically involve a long path from data acquisition to scientific results and conclusions. Before starting the actual processing, scenes must be downloaded from the providers' platforms and the computing infrastructure needs to be prepared. The computing environment often requires specialized software, which in turn might have lots of dependencies. The software is often highly customized and provided without commercial support, which leads to rather ad-hoc systems and irreproducible results. To let other scientists reproduce the analyses, the full workspace including data, code, the computing environment, and documentation must be bundled and shared. Technologies such as virtualization or containerization allow for the creation of identical computing environments with relatively little effort. Challenges, however, arise when the volume of the data is too large, when computations are done in a cluster environment, or when complex software components such as databases are used. We discuss these challenges for the example of scalable Land use change detection on Landsat imagery. We present a reproducible implementation that runs R and the scalable data management and analytical system SciDB within a Docker container. Thanks to an explicit container recipe (the Dockerfile), this enables the all-in-one reproduction including the installation of software components, the ingestion of the data, and the execution of the analysis in a well-defined environment. We furthermore discuss possibilities how the implementation could be transferred to multi-container environments in order to support reproducibility on large cluster environments.

  30. Atmospheric Flux Computations in Complex Terrain

    NASA Technical Reports Server (NTRS)

    Smith, Paul L.; Kopp, Fred J.; Orville, Harold D.

    2000-01-01

    The greatest challenges in applying atmospheric water budget expressions are in determining the divergence and evapotranspiration terms. The evapotranspiration problem is ubiquitous, and critical issues of spatial and temporal resolution commonly arise in establishing the divergence term. In complex terrain, further difficulties crop up in using typical data on atmospheric profiles of water vapor and wind to estimate the divergence term. Those difficulties are the subject of this paper; considerations related to topographic variations both along and normal to the flow direction are treated.
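
    For reference, the divergence and evapotranspiration terms discussed above enter the standard vertically integrated atmospheric water-vapor budget; the notation below is the textbook form, not necessarily the paper's.

    ```latex
    % Vertically integrated atmospheric water-vapor budget:
    %   W : precipitable water (storage)
    %   Q : vertically integrated water-vapor flux
    %   E : evapotranspiration, P : precipitation
    \[
    \frac{\partial W}{\partial t} + \nabla \cdot \mathbf{Q} = E - P
    \]
    ```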

  31. Users matter: multi-agent systems model of high performance computing cluster users.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Hood, C. S.; Decision and Information Sciences

    2005-01-01

    High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.

  32. Squamous cell carcinoma of the larynx arising in multifocal pharyngolaryngeal oncocytic papillary cystadenoma: a case report and review of the literature.

    PubMed

    Stenner, Markus; Müller, Klaus-Michael; Koopmann, Mario; Rudack, Claudia

    2014-09-01

    We report on a rare case of a laryngeal carcinoma arising in a multifocal pharyngolaryngeal oncocytic papillary cystadenoma (OPC). The disease of a 63-year-old man is well documented by computed and positron emission tomography, histology, and electron microscopy. We could show that an OPC can even develop in the pharynx. The coexistence of both tumors makes this a challenging diagnosis for pathologists. Treated by surgery and radiotherapy, both lesions dissolved. Based on the literature available, we discuss the theory that the laryngeal carcinoma might be the result of a true metaplasia facilitated by chronic irritation and recommend a regular follow-up for OPC too. As in benign oncocytic lesions, we could show that the detection of numerous mitochondria is a diagnostic indicator for malignant variants as well.

  33. Squamous Cell Carcinoma of the Larynx Arising in Multifocal Pharyngolaryngeal Oncocytic Papillary Cystadenoma

    PubMed Central

    Stenner, Markus; Müller, Klaus-Michael; Koopmann, Mario; Rudack, Claudia

    2014-01-01

    We report on a rare case of a laryngeal carcinoma arising in a multifocal pharyngolaryngeal oncocytic papillary cystadenoma (OPC). The disease of a 63-year-old man is well documented by computed and positron emission tomography, histology, and electron microscopy. We could show that an OPC can even develop in the pharynx. The coexistence of both tumors makes this a challenging diagnosis for pathologists. Treated by surgery and radiotherapy, both lesions dissolved. Based on the literature available, we discuss the theory that the laryngeal carcinoma might be the result of a true metaplasia facilitated by chronic irritation and recommend a regular follow-up for OPC too. As in benign oncocytic lesions, we could show that the detection of numerous mitochondria is a diagnostic indicator for malignant variants as well. PMID:25211046

  34. Doing Qualitative Comparative Research on Teaching: Challenges and Benefits of Working with Grounded Theory

    ERIC Educational Resources Information Center

    Rupp, Claudia

    2016-01-01

    The last decades have seen the completion of an increasing number of qualitative comparative research projects on teaching. Challenges and benefits which might arise from a qualitative international comparative research design have been considered. However, very little has been published on challenges and benefits which may arise from using…

  35. Perturbation biology nominates upstream-downstream drug combinations in RAF inhibitor resistant melanoma cells.

    PubMed

    Korkut, Anil; Wang, Weiqing; Demir, Emek; Aksoy, Bülent Arman; Jing, Xiaohong; Molinelli, Evan J; Babur, Özgün; Bemis, Debra L; Onur Sumer, Selcuk; Solit, David B; Pratilas, Christine A; Sander, Chris

    2015-08-18

    Resistance to targeted cancer therapies is an important clinical problem. The discovery of anti-resistance drug combinations is challenging as resistance can arise by diverse escape mechanisms. To address this challenge, we improved and applied the experimental-computational perturbation biology method. Using statistical inference, we build network models from high-throughput measurements of molecular and phenotypic responses to combinatorial targeted perturbations. The models are computationally executed to predict the effects of thousands of untested perturbations. In RAF-inhibitor resistant melanoma cells, we measured 143 proteomic/phenotypic entities under 89 perturbation conditions and predicted c-Myc as an effective therapeutic co-target with BRAF or MEK. Experiments using the BET bromodomain inhibitor JQ1 affecting the level of c-Myc protein and protein kinase inhibitors targeting the ERK pathway confirmed the prediction. In conclusion, we propose an anti-cancer strategy of co-targeting a specific upstream alteration and a general downstream point of vulnerability to prevent or overcome resistance to targeted drugs.

  36. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity

    PubMed Central

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand “complex behavior” and complexity theory, and from which important biological insight can be gained. PMID:24999297

  37. Computational Cellular Dynamics Based on the Chemical Master Equation: A Challenge for Understanding Complexity.

    PubMed

    Liang, Jie; Qian, Hong

    2010-01-01

    Modern molecular biology has always been a great source of inspiration for computational science. Half a century ago, the challenge from understanding macromolecular dynamics has led the way for computations to be part of the tool set to study molecular biology. Twenty-five years ago, the demand from genome science has inspired an entire generation of computer scientists with an interest in discrete mathematics to join the field that is now called bioinformatics. In this paper, we shall lay out a new mathematical theory for dynamics of biochemical reaction systems in a small volume (i.e., mesoscopic) in terms of a stochastic, discrete-state continuous-time formulation, called the chemical master equation (CME). Similar to the wavefunction in quantum mechanics, the dynamically changing probability landscape associated with the state space provides a fundamental characterization of the biochemical reaction system. The stochastic trajectories of the dynamics are best known through the simulations using the Gillespie algorithm. In contrast to the Metropolis algorithm, this Monte Carlo sampling technique does not follow a process with detailed balance. We shall show several examples of how CMEs are used to model cellular biochemical systems. We shall also illustrate the computational challenges involved: multiscale phenomena, the interplay between stochasticity and nonlinearity, and how macroscopic determinism arises from mesoscopic dynamics. We point out recent advances in computing solutions to the CME, including exact solution of the steady state landscape and stochastic differential equations that offer alternatives to the Gillespie algorithm. We argue that the CME is an ideal system from which one can learn to understand "complex behavior" and complexity theory, and from which important biological insight can be gained.
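
    As a concrete instance of the trajectory simulation the two records above describe, here is a minimal Gillespie SSA for a birth-death process, the simplest system governed by a CME. The rates and the system itself are illustrative choices, not the paper's examples.

    ```python
    import numpy as np

    def gillespie_birth_death(k_prod, k_deg, x0, t_max, seed=0):
        """Gillespie SSA for 0 -> X (rate k_prod), X -> 0 (rate k_deg * x)."""
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, states = [t], [x]
        while t < t_max:
            a1, a2 = k_prod, k_deg * x      # reaction propensities
            a0 = a1 + a2
            t += rng.exponential(1.0 / a0)  # exponential waiting time
            if rng.random() < a1 / a0:      # choose which reaction fires
                x += 1
            else:
                x -= 1
            times.append(t)
            states.append(x)
        return np.array(times), np.array(states)

    times, states = gillespie_birth_death(10.0, 0.1, x0=0, t_max=200.0)
    # CME steady state is Poisson with mean k_prod / k_deg = 100.
    print("late-time mean copy number ~", states[len(states) // 2:].mean())
    ```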

  38. Big Computing in Astronomy: Perspectives and Challenges

    NASA Astrophysics Data System (ADS)

    Pankratius, Victor

    2014-06-01

    Hardware progress in recent years has led to astronomical instruments gathering large volumes of data. In radio astronomy for instance, the current generation of antenna arrays produces data at Tbits per second, and forthcoming instruments will expand these rates much further. As instruments are increasingly becoming software-based, astronomers will get more exposed to computer science. This talk therefore outlines key challenges that arise at the intersection of computer science and astronomy and presents perspectives on how both communities can collaborate to overcome these challenges.

    Major problems are emerging due to increases in data rates that are much larger than in storage and transmission capacity, as well as humans being cognitively overwhelmed when attempting to opportunistically scan through Big Data. As a consequence, the generation of scientific insight will become more dependent on automation and algorithmic instrument control. Intelligent data reduction will have to be considered across the entire acquisition pipeline. In this context, the presentation will outline the enabling role of machine learning and parallel computing.

    Bio: Victor Pankratius is a computer scientist who joined MIT Haystack Observatory following his passion for astronomy. He is currently leading efforts to advance astronomy through cutting-edge computer science and parallel computing. Victor is also involved in projects such as ALMA Phasing to enhance the ALMA Observatory with Very-Long Baseline Interferometry capabilities, the Event Horizon Telescope, as well as in the Radio Array of Portable Interferometric Detectors (RAPID) to create an analysis environment using parallel computing in the cloud. He has an extensive track record of research in parallel multicore systems and software engineering, with contributions to auto-tuning, debugging, and empirical experiments studying programmers. Victor has worked with major industry partners such as Intel, Sun Labs, and Oracle. He holds a distinguished doctorate and a Habilitation degree in Computer Science from the University of Karlsruhe. Contact him at pankrat@mit.edu, victorpankratius.com, or Twitter @vpankratius.

  39. A New Informatics Geography.

    PubMed

    Coiera, E

    2016-11-10

    Anyone with knowledge of information systems has experienced frustration when it comes to system implementation or use. Unanticipated challenges arise frequently and unanticipated consequences may follow. Working from first principles, we seek to understand why information technology (IT) is often challenging, to identify which IT endeavors are more likely to succeed, and to predict the best role that technology can play in different tasks and settings. The fundamental purpose of IT is to enhance our ability to undertake tasks, supplying new information that changes what we decide and ultimately what occurs in the world. The value of this information (VOI) can be calculated at different stages of the decision-making process and will vary depending on how technology is used. We can imagine a task space that describes the relative benefits of task completion by humans or computers and that contains specific areas where humans or computers are superior. There is a third area where neither is strong and a final joint workspace where humans and computers working in partnership produce the best results. By understanding that information has value and that VOI can be quantified, we can make decisions about how best to support the work we do. Evaluation of the expected utility of task completion by humans or computers should allow us to decide whether solutions should depend on technology, humans, or a partnership between the two.
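
    As a toy instance of the VOI calculation mentioned above, the sketch below computes the expected value of perfect information for a binary treat/wait decision. The prior and the utilities are invented; only the decision-theoretic framework is standard.

    ```python
    # Expected value of perfect information (EVPI) for a two-action decision.
    p_disease = 0.3
    utility = {                      # utility[(action, state)], invented numbers
        ("treat", "disease"): 0.9,
        ("treat", "healthy"): 0.7,   # treatment carries side effects
        ("wait", "disease"): 0.2,
        ("wait", "healthy"): 1.0,
    }
    actions = ("treat", "wait")

    def eu(action, p):
        """Expected utility of an action under disease probability p."""
        return p * utility[(action, "disease")] + (1 - p) * utility[(action, "healthy")]

    # Best action with no further information.
    eu_no_info = max(eu(a, p_disease) for a in actions)

    # With perfect information we pick the best action in each state.
    eu_perfect = (p_disease * max(utility[(a, "disease")] for a in actions)
                  + (1 - p_disease) * max(utility[(a, "healthy")] for a in actions))

    print("EVPI:", eu_perfect - eu_no_info)  # an upper bound on any test's value
    ```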

  40. Matched computed tomography segmentation and demographic data for oropharyngeal cancer radiomics challenges

    PubMed Central

    Elhalawani, Hesham; Mohamed, Abdallah S.R.; White, Aubrey L.; Zafereo, James; Wong, Andrew J.; Berends, Joel E.; AboHashem, Shady; Williams, Bowman; Aymard, Jeremy M.; Kanwar, Aasheesh; Perni, Subha; Rock, Crosby D.; Cooksey, Luke; Campbell, Shauna; Ding, Yao; Lai, Stephen Y.; Marai, Elisabeta G.; Vock, David; Canahuate, Guadalupe M.; Freymann, John; Farahani, Keyvan; Kalpathy-Cramer, Jayashree; Fuller, Clifton D.

    2017-01-01

    Cancers arising from the oropharynx have become increasingly more studied in the past few years, as they are now epidemic domestically. These tumors are treated with definitive (chemo)radiotherapy, and have local recurrence as a primary mode of clinical failure. Recent data suggest that ‘radiomics’, or extraction of image texture analysis to generate mineable quantitative data from medical images, can reflect phenotypes for various cancers. Several groups have shown that developed radiomic signatures, in head and neck cancers, can be correlated with survival outcomes. This data descriptor defines a repository for head and neck radiomic challenges, executed via a Kaggle in Class platform, in partnership with the MICCAI society 2016 annual meeting. These public challenges were designed to leverage radiomics and/or machine learning workflows to discriminate HPV phenotype in one challenge (HPV status challenge) and to identify patients who will develop a local recurrence in the primary tumor volume in the second one (Local recurrence prediction challenge) in a segmented, clinically curated anonymized oropharyngeal cancer (OPC) data set. PMID:28675381

  41. Physics in Screening Environments

    NASA Astrophysics Data System (ADS)

    Certik, Ondrej

    In the current study, we investigated atoms in screening environments like plasmas. It is common practice to extract physical data, such as temperature and electron densities, from plasma experiments. We present results that address inherent computational difficulties that arise when the screening approach is extended to include the interaction between the atomic electrons. We show that there may arise an ambiguity in the interpretation of physical properties, such as temperature and charge density, from experimental data due to the opposing effects of electron-nucleus screening and electron-electron screening. The focus of the work, however, is on the resolution of inherent computational challenges that appear in the computation of two-particle matrix elements. Those enter already at the Hartree-Fock level. Furthermore, as examples of post Hartree-Fock calculations, we show second-order Green's function results and many body perturbation theory results of second order. A self-contained derivation of all necessary equations has been included. The accuracy of the implementation of the method is established by comparing standard unscreened results for various atoms and molecules against literature for Hartree-Fock as well as Green's function and many body perturbation theory. The main results of the thesis are presented in the chapter called Screened Results, where the behavior of several atomic systems depending on electron-electron and electron-nucleus Debye screening was studied. The computer code that we have developed has been made available for anybody to use. Finally, we present and discuss results obtained for screened interactions. We also examine thoroughly the computational details of the calculations and particular implementations of the method.
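
    For context, the electron-nucleus and electron-electron screening discussed above is conventionally modeled with Debye-screened (Yukawa) interactions of the form below; the notation (Gaussian units, Debye length lambda_D) is ours, not the thesis's.

    ```latex
    % Debye-screened interactions in a screened-atom model
    % (\lambda_D is the plasma Debye length):
    \[
    V_{en}(r) = -\frac{Ze^{2}}{r}\,e^{-r/\lambda_D},
    \qquad
    V_{ee}(r_{12}) = +\frac{e^{2}}{r_{12}}\,e^{-r_{12}/\lambda_D}.
    \]
    ```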

  42. Simultaneous analysis of large INTEGRAL/SPI datasets: Optimizing the computation of the solution and its variance using sparse matrix algorithms

    NASA Astrophysics Data System (ADS)

    Bouchet, L.; Amestoy, P.; Buttari, A.; Rouet, F.-H.; Chauvin, M.

    2013-02-01

    Nowadays, analyzing and reducing the ever larger astronomical datasets is becoming a crucial challenge, especially for long cumulated observation times. The INTEGRAL/SPI X/γ-ray spectrometer is an instrument for which it is essential to process many exposures at the same time in order to increase the low signal-to-noise ratio of the weakest sources. In this context, the conventional methods for data reduction are inefficient and sometimes not feasible at all. Processing several years of data simultaneously requires computing not only the solution of a large system of equations, but also the associated uncertainties. We aim at reducing the computation time and the memory usage. Since the SPI transfer function is sparse, we have used some popular methods for the solution of large sparse linear systems; we briefly review these methods. We use the Multifrontal Massively Parallel Solver (MUMPS) to compute the solution of the system of equations. We also need to compute the variance of the solution, which amounts to computing selected entries of the inverse of the sparse matrix corresponding to our linear system. This can be achieved through one of the latest features of the MUMPS software that has been partly motivated by this work. In this paper we provide a brief presentation of this feature and evaluate its effectiveness on astrophysical problems requiring the processing of large datasets simultaneously, such as the study of the entire emission of the Galaxy. We used these algorithms to solve the large sparse systems arising from SPI data processing and to obtain both their solutions and the associated variances. In conclusion, thanks to these newly developed tools, processing large datasets arising from SPI is now feasible with both a reasonable execution time and a low memory usage.
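
    The sketch below mimics, at toy scale, the two computations the abstract describes: solving a sparse linear system from a single factorization, then recovering diagonal entries of the matrix inverse, which give the variances of the solution. The brute-force loop over unit vectors is only a stand-in; MUMPS's selected-inversion feature computes such entries directly and far more efficiently.

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Toy sparse system A x = b standing in for the SPI normal equations;
    # diag(A^-1) plays the role of the variances of the fitted solution.
    n = 500
    off = -np.ones(n - 1)
    A = sp.diags([off, 4.0 * np.ones(n), off], [-1, 0, 1], format="csc")
    b = np.ones(n)

    lu = spla.splu(A)   # factorize once
    x = lu.solve(b)     # solution of the linear system

    # Selected entries of the inverse: here the diagonal, via unit vectors.
    var = np.empty(n)
    e = np.zeros(n)
    for i in range(n):
        e[i] = 1.0
        var[i] = lu.solve(e)[i]   # (A^-1)_{ii}
        e[i] = 0.0

    print(x[:3], var[:3])
    ```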

  43. A knowledge-based approach to automated flow-field zoning for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vogel, Alison Andrews

    1989-01-01

    An automated three-dimensional zonal grid generation capability for computational fluid dynamics is shown through the development of a demonstration computer program capable of automatically zoning the flow field of representative two-dimensional (2-D) aerodynamic configurations. The applicability of a knowledge-based programming approach to the domain of flow-field zoning is examined. Several aspects of flow-field zoning make the application of knowledge-based techniques challenging: the need for perceptual information, the role of individual bias in the design and evaluation of zonings, and the fact that the zoning process is modeled as a constructive, design-type task (for which there are relatively few examples of successful knowledge-based systems in any domain). Engineering solutions to the problems arising from these aspects are developed, and a demonstration system is implemented which can design, generate, and output flow-field zonings for representative 2-D aerodynamic configurations.

  4. An algorithmic framework for multiobjective optimization.

    PubMed

    Ganesan, T; Elamvazuthi, I; Shaari, Ku Zilati Ku; Vasant, P

    2013-01-01

    Multiobjective (MO) optimization is an emerging area that is increasingly encountered in many fields globally. Various metaheuristic techniques such as differential evolution (DE), genetic algorithm (GA), gravitational search algorithm (GSA), and particle swarm optimization (PSO) have been used in conjunction with scalarization techniques such as the weighted sum approach and the normal-boundary intersection (NBI) method to solve MO problems. Nevertheless, many challenges still arise when dealing with problems with multiple objectives, especially more than two. In addition, extensive computational overhead emerges when dealing with hybrid algorithms. This paper discusses these issues and proposes an alternative framework that utilizes algorithmic concepts related to the problem structure to generate efficient and effective algorithms, with the aim of producing new high-performance algorithms with minimal computational overhead for MO optimization.
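
    To make the scalarization step concrete, here is a minimal weighted-sum sketch on a hypothetical two-objective problem (illustrative only; the paper's framework and the NBI method are more elaborate):

```python
import numpy as np
from scipy.optimize import minimize

# Two toy convex objectives over x in R^2 (hypothetical, for illustration).
f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1.0) ** 2

# Weighted-sum scalarization: sweep the weight to trace an approximate
# Pareto front, one single-objective solve per weight.
pareto = []
for w in np.linspace(0.0, 1.0, 11):
    res = minimize(lambda x: w * f1(x) + (1.0 - w) * f2(x), x0=np.zeros(2))
    pareto.append((f1(res.x), f2(res.x)))
print(pareto[:3])
```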

  5. An Algorithmic Framework for Multiobjective Optimization

    PubMed Central

    Ganesan, T.; Elamvazuthi, I.; Shaari, Ku Zilati Ku; Vasant, P.

    2013-01-01

    Multiobjective (MO) optimization is an emerging area that is increasingly encountered in many fields globally. Various metaheuristic techniques such as differential evolution (DE), genetic algorithm (GA), gravitational search algorithm (GSA), and particle swarm optimization (PSO) have been used in conjunction with scalarization techniques such as the weighted sum approach and the normal-boundary intersection (NBI) method to solve MO problems. Nevertheless, many challenges still arise when dealing with problems with multiple objectives, especially more than two. In addition, extensive computational overhead emerges when dealing with hybrid algorithms. This paper discusses these issues and proposes an alternative framework that utilizes algorithmic concepts related to the problem structure to generate efficient and effective algorithms, with the aim of producing new high-performance algorithms with minimal computational overhead for MO optimization. PMID:24470795

  6. Final Report: Correctness Tools for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools that help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report on the research and development work at Rice University as part of this project.

  7. Final Report for DOE Award ER25756

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kesselman, Carl

    2014-11-17

    The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.

  8. The big data challenges of connectomics.

    PubMed

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2014-11-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics before it, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  9. The menu-setting problem and subsidized prices: drug formulary illustration.

    PubMed

    Olmstead, T; Zeckhauser, R

    1999-10-01

    The menu-setting problem (MSP) determines the goods and services an institution offers and the prices charged. It appears widely in health care, from choosing the services an insurance arrangement offers, to selecting the health plans an employer proffers. The challenge arises because purchases are subsidized, and consumers (or their physician agents) may make cost-ineffective choices. The intuitively comprehensible MSP model--readily solved by computer using actual data--helps structure thinking and support decision making about such problems. The analysis uses drug formularies--lists of approved drugs in a plan or institution--to illustrate the framework.

  10. Perturbation biology nominates upstream–downstream drug combinations in RAF inhibitor resistant melanoma cells

    PubMed Central

    Korkut, Anil; Wang, Weiqing; Demir, Emek; Aksoy, Bülent Arman; Jing, Xiaohong; Molinelli, Evan J; Babur, Özgün; Bemis, Debra L; Onur Sumer, Selcuk; Solit, David B; Pratilas, Christine A; Sander, Chris

    2015-01-01

    Resistance to targeted cancer therapies is an important clinical problem. The discovery of anti-resistance drug combinations is challenging as resistance can arise by diverse escape mechanisms. To address this challenge, we improved and applied the experimental-computational perturbation biology method. Using statistical inference, we build network models from high-throughput measurements of molecular and phenotypic responses to combinatorial targeted perturbations. The models are computationally executed to predict the effects of thousands of untested perturbations. In RAF-inhibitor resistant melanoma cells, we measured 143 proteomic/phenotypic entities under 89 perturbation conditions and predicted c-Myc as an effective therapeutic co-target with BRAF or MEK. Experiments using the BET bromodomain inhibitor JQ1 affecting the level of c-Myc protein and protein kinase inhibitors targeting the ERK pathway confirmed the prediction. In conclusion, we propose an anti-cancer strategy of co-targeting a specific upstream alteration and a general downstream point of vulnerability to prevent or overcome resistance to targeted drugs. DOI: http://dx.doi.org/10.7554/eLife.04640.001 PMID:26284497

  11. Landing the Job: How Special Libraries Can Support Career Research Introduction

    ERIC Educational Resources Information Center

    Howard, Heather

    2017-01-01

    "Special Libraries, Special Challenges" is a column dedicated to exploring the unique public services challenges that arise in libraries that specialize in a particular subject, such as law, medicine, business, and so forth. In each column, the author will discuss public service dilemmas and opportunities that arise in special libraries.…

  12. A Computer Program for Solving a Set of Conditional Maximum Likelihood Equations Arising in the Rasch Model for Questionnaires.

    ERIC Educational Resources Information Center

    Andersen, Erling B.

    A computer program for solving the conditional likelihood equations arising in the Rasch model for questionnaires is described. The estimation method and the computational problems involved are described in a previous research report by Andersen, but a summary of those results is given in two sections of this paper. A working example is also…

  13. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays to this resolution (and formidable network architectures required to relay this information), or massive computer clusters are necessary to create an entirely computer-generated virtual reality with this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
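
    The megapixel figures follow from simple field-of-view arithmetic once 20/20 acuity is equated with one pixel per arcminute. A sketch under assumed fields of view (the abstract does not state the values used, so those below are illustrative guesses that happen to reproduce the quoted numbers):

```python
# Pixels needed for 1 pixel per arcminute over a given field of view.
def pixels(h_fov_deg, v_fov_deg, px_per_arcmin=1.0):
    h = h_fov_deg * 60 * px_per_arcmin   # horizontal pixel count
    v = v_fov_deg * 60 * px_per_arcmin   # vertical pixel count
    return h * v

# Illustrative fields of view (assumed, not from the paper):
print(pixels(120, 100) / 1e6, "MP")  # ~43 MP for an HMD-like field
print(pixels(240, 175) / 1e6, "MP")  # ~151 MP for a surround CAVE-like field
```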

  14. Inequalities, Assessment and Computer Algebra

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in contemporary…

  15. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtch, Nicholas C.; Heinen, Jurn; Bennett, Thomas D.

    We report that some of the most remarkable recent developments in metal–organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic–organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure–property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed.

  16. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications

    DOE PAGES

    Burtch, Nicholas C.; Heinen, Jurn; Bennett, Thomas D.; ...

    2017-11-17

    We report that some of the most remarkable recent developments in metal–organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic–organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure–property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed.

  17. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications.

    PubMed

    Burtch, Nicholas C; Heinen, Jurn; Bennett, Thomas D; Dubbeldam, David; Allendorf, Mark D

    2017-11-17

    Some of the most remarkable recent developments in metal-organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic-organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure-property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Computational modeling of electromechanical instabilities in dielectric elastomers (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Park, Harold

    2016-04-01

    Dielectric elastomers are a class of soft, active materials that have recently gained significant interest due to the fact that they can be electrostatically actuated into undergoing extremely large deformations. An ongoing challenge has been the development of robust and accurate computational models for elastomers, particularly models that can capture the electromechanical instabilities that limit elastomer performance, such as creasing, wrinkling, and snap-through. I discuss in this work a recently developed finite element model for elastomers that is dynamic, nonlinear, and fully electromechanically coupled. The model also significantly alleviates the volumetric locking that arises from the incompressible nature of the elastomers, and incorporates viscoelasticity within a finite deformation framework. Numerical examples are shown that demonstrate the performance of the proposed method in capturing electromechanical instabilities (snap-through, creasing, cratering, wrinkling) that have been observed experimentally.

  19. The application of CFD to the modelling of fires in complex geometries

    NASA Astrophysics Data System (ADS)

    Burns, A. D.; Clarke, D. S.; Guilbert, P.; Jones, I. P.; Simcox, S.; Wilkes, N. S.

    The application of Computational Fluid Dynamics (CFD) to industrial safety is a challenging activity. In particular it involves the interaction of several different physical processes, including turbulence, combustion, radiation, buoyancy, compressible flow and shock waves in complex three-dimensional geometries. In addition, there may be multi-phase effects arising, for example, from sprinkler systems for extinguishing fires. The FLOW3D software (1-3) from Computational Fluid Dynamics Services (CFDS) is in widespread use in industrial safety problems, both within AEA Technology, and also by CFDS's commercial customers, for example references (4-13). This paper discusses some other applications of FLOW3D to safety problems. These applications illustrate the coupling of the gas flows with radiation models and combustion models, particularly for complex geometries where simpler radiation models are not applicable.

  20. The big data challenges of connectomics

    PubMed Central

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2015-01-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics before it, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911

  1. The big data challenges of connectomics

    DOE PAGES

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    2014-10-28

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics before it, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  2. Letter regarding 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics' by Patrizi et al. and research reproducibility.

    PubMed

    2017-04-01

    The reporting of research in a manner that allows reproduction in subsequent investigations is important for scientific progress. Several details of the recent study by Patrizi et al., 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics', are absent from the published manuscript and make reproduction of findings impossible. As new and complex technologies with great promise for ergonomics develop, new but surmountable challenges for reporting investigations using these technologies in a reproducible manner arise. Practitioner Summary: As with traditional methods, scientific reporting of new and complex ergonomics technologies should be performed in a manner that allows reproduction in subsequent investigations and supports scientific advancement.

  3. The big data challenges of connectomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics before it, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  4. Structure preserving parallel algorithms for solving the Bethe–Salpeter eigenvalue problem

    DOE PAGES

    Shao, Meiyue; da Jornada, Felipe H.; Yang, Chao; ...

    2015-10-02

    The Bethe–Salpeter eigenvalue problem is a dense structured eigenvalue problem arising from the discretized Bethe–Salpeter equation in the context of computing exciton energies and states. A computational challenge is that at least half of the eigenvalues and the associated eigenvectors are desired in practice. In this paper, we establish the equivalence between Bethe–Salpeter eigenvalue problems and real Hamiltonian eigenvalue problems. Based on theoretical analysis, structure preserving algorithms for a class of Bethe–Salpeter eigenvalue problems are proposed. We also show that for this class of problems all eigenvalues obtained from the Tamm–Dancoff approximation are overestimated. In order to solve large-scale problems of practical interest, we discuss parallel implementations of our algorithms targeting distributed memory systems. Finally, several numerical examples are presented to demonstrate the efficiency and accuracy of our algorithms.
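
    For context, the structure being preserved is the standard block form of the discretized Bethe–Salpeter Hamiltonian, sketched below in common notation (the paper's precise assumptions on the blocks are not reproduced here):

```latex
% Block structure of the Bethe-Salpeter eigenvalue problem (common notation):
% A is Hermitian, B is complex symmetric.
H = \begin{pmatrix} A & B \\ -\bar{B} & -\bar{A} \end{pmatrix},
\qquad A = A^{H}, \quad B = B^{T}.
% The Tamm-Dancoff approximation discards B and diagonalizes A alone,
% which is the approximation shown above to overestimate the eigenvalues.
```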

  5. Integrative pipeline for profiling DNA copy number and inferring tumor phylogeny.

    PubMed

    Urrutia, Eugene; Chen, Hao; Zhou, Zilu; Zhang, Nancy R; Jiang, Yuchao

    2018-06-15

    Copy number variation is an important and abundant source of variation in the human genome, and it has been associated with a number of diseases, especially cancer. Massively parallel next-generation sequencing allows copy number profiling at fine resolution. Such efforts, however, have met with mixed success, with setbacks arising partly from the lack of reliable analytical methods to meet the diverse and unique challenges posed by the myriad experimental designs and study goals in genetic studies. In cancer genomics, detection of somatic copy number changes and profiling of allele-specific copy number (ASCN) are complicated by experimental biases and artifacts as well as normal cell contamination and cancer subclone admixture. Furthermore, careful statistical modeling is warranted to reconstruct tumor phylogeny from both somatic ASCN changes and single nucleotide variants. Here we describe a flexible computational pipeline, MARATHON, which integrates multiple related statistical software packages for copy number profiling and downstream analyses in disease genetic studies. MARATHON is publicly available at https://github.com/yuchaojiang/MARATHON. Supplementary data are available at Bioinformatics online.

  6. A Computational Framework for Bioimaging Simulation.

    PubMed

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  7. Institute for scientific computing research;fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.

  8. Spintronic Nanodevices for Bioinspired Computing

    PubMed Central

    Grollier, Julie; Querlioz, Damien; Stiles, Mark D.

    2016-01-01

    Bioinspired hardware holds the promise of low-energy, intelligent, and highly adaptable computing systems. Applications span from automatic classification for big data management, through unmanned vehicle control, to the control of biomedical prostheses. However, one of the major challenges of fabricating bioinspired hardware is building ultra-high-density networks out of complex processing units interlinked by tunable connections. Nanometer-scale devices exploiting spin electronics (or spintronics) can be a key technology in this context. In particular, magnetic tunnel junctions (MTJs) are well suited for this purpose because of their multiple tunable functionalities. One such functionality, non-volatile memory, can provide massive embedded memory in unconventional circuits, thus escaping the von-Neumann bottleneck that arises when memory and processors are located separately. Other features of spintronic devices that could be beneficial for bioinspired computing include tunable fast nonlinear dynamics, controlled stochasticity, and the ability of single devices to change functions in different operating conditions. Large networks of interacting spintronic nanodevices can have their interactions tuned to induce complex dynamics such as synchronization, chaos, soliton diffusion, phase transitions, criticality, and convergence to multiple metastable states. A number of groups have recently proposed bioinspired architectures that include one or several types of spintronic nanodevices. In this paper, we show how spintronics can be used for bioinspired computing. We review the different approaches that have been proposed, the recent advances in this direction, and the challenges toward fully integrated spintronics–complementary metal–oxide–semiconductor (CMOS) bioinspired hardware. PMID:27881881

  9. Detection and quantification of flow consistency in business process models.

    PubMed

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
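
    As a concrete illustration of how such a feature might be quantified (a hypothetical simplification of ours; the paper proposes and compares three more careful variants), one can measure the fraction of model edges aligned with the dominant compass direction:

```python
import math
from collections import Counter

# Edges of a process model as (dx, dy) direction vectors between node centers.
edges = [(1, 0), (1, 0), (0, -1), (1, 0), (-1, 0)]

def flow_consistency(edges):
    """Fraction of edges aligned with the dominant compass direction."""
    def compass(dx, dy):
        ang = math.degrees(math.atan2(dy, dx)) % 360
        return int(((ang + 45) % 360) // 90)  # 0=E, 1=N, 2=W, 3=S
    dirs = Counter(compass(dx, dy) for dx, dy in edges)
    return dirs.most_common(1)[0][1] / len(edges)

print(flow_consistency(edges))  # 0.6 -> 3 of 5 edges point east
```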

  10. Protein Dynamics from NMR and Computer Simulation

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Kravchenko, Olga; Kemple, Marvin; Likic, Vladimir; Klimtchuk, Elena; Prendergast, Franklyn

    2002-03-01

    Proteins exhibit internal motions from the millisecond to sub-nanosecond time scale. The challenge is to relate these internal motions to biological function. A strategy to address this aim is to apply a combination of several techniques including high-resolution NMR, computer simulation of molecular dynamics (MD), molecular graphics, and finally molecular biology, the latter to generate appropriate samples. Two difficulties that arise are: (1) the time scale which is most directly biologically relevant (ms to μs) is not readily accessible by these techniques and (2) the techniques focus on local and not collective motions. We will outline methods using ^13C-NMR to help alleviate the second problem, as applied to intestinal fatty acid binding protein, a relatively small intracellular protein believed to be involved in fatty acid transport and metabolism. This work is supported in part by PHS Grant GM34847 (FGP) and by a fellowship from the American Heart Association (QW).

  11. Every factor helps: Rapid Ptychographic Reconstruction

    NASA Astrophysics Data System (ADS)

    Nashed, Youssef

    2015-03-01

    Recent advances in microscopy, specifically higher spatial resolution and data acquisition rates, require faster and more robust phase retrieval reconstruction methods. Ptychography is a phase retrieval technique for reconstructing the complex transmission function of a specimen from a sequence of diffraction patterns in visible light, X-ray, and electron microscopes. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes. Waiting to postprocess datasets offline results in missed opportunities. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs). A final specimen reconstruction is then achieved by different techniques to merge sub-dataset results into a single complex phase and amplitude image. Results are shown on a simulated specimen and real datasets from X-ray experiments conducted at a synchrotron light source.
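
    As a reference point for what each GPU computes, here is a schematic single-position ptychographic update in NumPy (an ePIE-style kernel under simplified assumptions; the contribution described above is the multi-GPU decomposition and merging, not this kernel itself):

```python
import numpy as np

# Schematic single-position ePIE-style object update (our sketch).
y, x = np.indices((64, 64))
probe = np.exp(-((y - 32) ** 2 + (x - 32) ** 2) / 200.0)   # Gaussian probe
true_obj = np.exp(1j * 0.1 * x)                            # hidden phase ramp
measured = np.abs(np.fft.fft2(true_obj * probe))           # recorded amplitude

obj = np.ones((64, 64), dtype=complex)                     # initial guess
for _ in range(50):
    psi = obj * probe                                      # exit wave
    Psi = np.fft.fft2(psi)
    Psi = measured * np.exp(1j * np.angle(Psi))            # keep phase, fix amplitude
    psi_new = np.fft.ifft2(Psi)
    # ePIE object update at this scan position
    obj += np.conj(probe) / np.max(np.abs(probe) ** 2) * (psi_new - psi)
```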

  12. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.
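
    A toy sketch of the idea behind such a data structure (hypothetical Python standing in for the paper's C++ metaprogramming): each compute node stores only the synapse types actually present, in homogeneous per-type containers, so a node holding a single synapse does not pay for the full type-by-count table:

```python
# Per-compute-node synapse storage collapsing two dimensions of heterogeneity:
# the set of synapse types present, and the number of synapses per type.
class LocalSynapses:
    def __init__(self):
        self.by_type = {}  # only types actually present consume memory

    def add(self, syn_type, target, weight):
        # Homogeneous per-type lists: a single synapse costs one entry,
        # not a full table over all possible types.
        self.by_type.setdefault(syn_type, []).append((target, weight))

node = LocalSynapses()
node.add("static", target=17, weight=0.3)   # typical case: one synapse,
print(node.by_type)                          # one type, minimal footprint
```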

  13. Data Structures for Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahan, Simon

    As computing problems of national importance grow, the government meets the increased demand by funding the development of ever larger systems. The overarching goal of the work supported in part by this grant is to increase the efficiency of programming and performing computations on these large computing systems. In past work, we have demonstrated that some of these computations, once thought to require expensive hardware designs and/or complex, special-purpose programming, may be executed efficiently on low-cost commodity cluster computing systems using a general-purpose “latency-tolerant” programming framework. One important application of the ideas underlying this framework is graph database technology supporting social network pattern matching used by US intelligence agencies to more quickly identify potential terrorist threats. This database application has been spun out of the Pacific Northwest National Laboratory, a Department of Energy laboratory, into a commercial start-up, Trovares Inc. We explore an alternative application of the same underlying ideas to a well-studied challenge arising in engineering: solving unstructured sparse linear equations. Solving these equations is key to predicting the behavior of large electronic circuits before they are fabricated. Predicting that behavior ahead of fabrication means that designs can be optimized and errors corrected ahead of the expense of manufacture.

  14. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  15. [Activities of Research Institute for Advanced Computer Science]

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  16. GraphMeta: Managing HPC Rich Metadata in Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Chen, Yong; Carns, Philip

    High-performance computing (HPC) systems face increasingly critical metadata management challenges, especially in the approaching exascale era. These challenges arise not only from exploding metadata volumes, but also from increasingly diverse metadata, which contains data provenance and arbitrary user-defined attributes in addition to traditional POSIX metadata. This ‘rich’ metadata is becoming critical to supporting advanced data management functionality such as data auditing and validation. In our prior work, we identified a graph-based model as a promising solution to uniformly manage HPC rich metadata due to its flexibility and generality. However, at the same time, graph-based HPC rich metadata management also introduces significant challenges to the underlying infrastructure. In this study, we first identify the challenges for the underlying infrastructure to support scalable, high-performance rich metadata management. Based on that, we introduce GraphMeta, a graph-based engine designed for this use case. It achieves performance scalability by introducing a new graph partitioning algorithm and a write-optimal storage engine. We evaluate GraphMeta under both synthetic and real HPC metadata workloads, compare it with other approaches, and demonstrate its advantages in terms of efficiency and usability for rich metadata management in HPC systems.
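
    To illustrate the graph-based model of rich metadata (an illustrative schema of ours using networkx, not GraphMeta's own API or storage engine):

```python
import networkx as nx

# Rich HPC metadata as a property graph: entities are nodes with arbitrary
# user-defined attributes, provenance relations are labeled edges.
g = nx.DiGraph()
g.add_node("job:1042", kind="job", user="alice")          # hypothetical IDs
g.add_node("/scratch/out.h5", kind="file", size=2**30)
g.add_edge("job:1042", "/scratch/out.h5", relation="wrote")

# Graph traversal answers audit queries: which jobs produced this file?
print([u for u, v, d in g.in_edges("/scratch/out.h5", data=True)
       if d["relation"] == "wrote"])
```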

  17. Overcoming challenges integrating patient-generated data into the clinical EHR: lessons from the CONtrolling Disease Using Inexpensive IT--Hypertension in Diabetes (CONDUIT-HID) Project.

    PubMed

    Marquard, Jenna L; Garber, Lawrence; Saver, Barry; Amster, Brian; Kelleher, Michael; Preusse, Peggy

    2013-10-01

    The CONDUIT-HID intervention integrates patients' electronic blood pressure measurements directly into the clinical EHR using Microsoft HealthVault as an intermediary data store. The goal of this paper is to describe generalizable categories of patient and technical challenges encountered in the development and implementation of this inexpensive, commercial off-the-shelf consumer health informatics intervention, examples of challenges within each category, and how the example challenges were resolved prior to conducting an RCT of the intervention. The research team logged all challenges and mediation strategies during the technical development of the intervention, conducted home visits to observe patients using the intervention, and conducted telephone calls with patients to understand challenges they encountered. We then used these data to iteratively refine the intervention. The research team identified a variety of generalizable categories of challenges associated with patients uploading data from their homes, patients uploading data from clinics because they did not have or were not comfortable using home computers, and patients establishing the connection between HealthVault and the clinical EHR. Specific challenges within these categories arose because: (1) the research team had little control over the device and application design, (2) multiple vendors needed to coordinate their actions and design changes, (3) the intervention use cases were not anticipated by the device and application designers, (4) PHI accessed on clinic computers needed to be kept secure, (5) the research team wanted the data in the clinical EHR to be valid and reliable, (6) patients needed the ability to share only the data they wanted, and (7) the development of some EHR functionalities were new to the organization. While these challenges were varied and complex, the research team was able to successfully resolve each one prior to the start of the RCT. By identifying these generalizable categories of challenges, we aim to help others proactively search for and remedy potential challenges associated with their interventions, rather than reactively responding to problems as they arise. We posit that this approach will significantly increase the likelihood that these types of interventions will be successful. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Liquid Chromatography Mass Spectrometry-Based Proteomics: Biological and Technological Aspects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya V.; Polpitiya, Ashoka D.; Anderson, Gordon A.

    2010-12-01

    Mass spectrometry-based proteomics has become the tool of choice for identifying and quantifying the proteome of an organism. Though recent years have seen a tremendous improvement in instrument performance and the computational tools used, significant challenges remain, and there are many opportunities for statisticians to make important contributions. In the most widely used "bottom-up" approach to proteomics, complex mixtures of proteins are first subjected to enzymatic cleavage, and the resulting peptide products are separated based on chemical or physical properties and analyzed using a mass spectrometer. The two fundamental challenges in the analysis of bottom-up MS-based proteomics are: (1) identifying the proteins that are present in a sample, and (2) quantifying the abundance levels of the identified proteins. Both of these challenges require knowledge of the biological and technological context that gives rise to observed data, as well as the application of sound statistical principles for estimation and inference. We present an overview of bottom-up proteomics and outline the key statistical issues that arise in protein identification and quantification.

  19. Magnetic Resonance Microscopy of the Lung

    NASA Astrophysics Data System (ADS)

    Johnson, G. Allan

    1999-11-01

    The lung presents both challenges and opportunities for study by magnetic resonance imaging (MRI). The technical challenges arise from respiratory and cardiac motion, limited signal from the tissues, and unique physical structure of the lung. These challenges are heightened in magnetic resonance microscopy (MRM) where the spatial resolution may be up to a million times higher than that of conventional MRI. The development of successful techniques for MRM of the lung present enormous opportunities for basic studies of lung structure and function, toxicology, environmental stress, and drug discovery by permitting investigators to study this most essential organ nondestructively in the live animal. Over the last 15 years, scientists at the Duke Center for In Vivo Microscopy have developed techniques for MRM in the live animal through an interdisciplinary program of biology, physics, chemistry, electrical engineering, and computer science. This talk will focus on the development of specialized radiofrequency coils for lung imaging, projection encoding methods to limit susceptibility losses, specialized support structures to control and monitor physiologic motion, and the most recent development of hyperpolarized gas imaging with ^3He and ^129Xe.

  20. Cerebral cartography and connectomics

    PubMed Central

    Sporns, Olaf

    2015-01-01

    Cerebral cartography and connectomics pursue similar goals in attempting to create maps that can inform our understanding of the structural and functional organization of the cortex. Connectome maps explicitly aim at representing the brain as a complex network, a collection of nodes and their interconnecting edges. This article reflects on some of the challenges that currently arise in the intersection of cerebral cartography and connectomics. Principal challenges concern the temporal dynamics of functional brain connectivity, the definition of areal parcellations and their hierarchical organization into large-scale networks, the extension of whole-brain connectivity to cellular-scale networks, and the mapping of structure/function relations in empirical recordings and computational models. Successfully addressing these challenges will require extensions of methods and tools from network science to the mapping and analysis of human brain connectivity data. The emerging view that the brain is more than a collection of areas, but is fundamentally operating as a complex networked system, will continue to drive the creation of ever more detailed and multi-modal network maps as tools for on-going exploration and discovery in human connectomics. PMID:25823870

  1. Stochastic Gravitational-Wave Background due to Primordial Binary Black Hole Mergers.

    PubMed

    Mandic, Vuk; Bird, Simeon; Cholis, Ilias

    2016-11-11

    Recent Advanced LIGO detections of binary black hole mergers have prompted multiple studies investigating the possibility that the heavy GW150914 binary system was of primordial origin, and hence could be evidence for dark matter in the form of black holes. We compute the stochastic background arising from the incoherent superposition of such primordial binary black hole systems in the Universe and compare it to the similar background spectrum due to binary black hole systems of stellar origin. We investigate the possibility of detecting this background with future gravitational-wave detectors, and conclude that constraining the dark matter component in the form of black holes using stochastic gravitational-wave background measurements will be very challenging.
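
    The computed quantity is the energy-density spectrum of the background; in the commonly used form (our notation, not necessarily the paper's):

```latex
% Energy-density spectrum of a merger background (common form):
%   R_m(z): comoving merger rate density,  dE_GW/df_s: source-frame energy
%   spectrum of one binary,  rho_c: critical density,  f_s = f(1+z).
\Omega_{\mathrm{GW}}(f) = \frac{f}{\rho_c}
  \int_0^{z_{\max}} \frac{R_m(z)}{(1+z)\,H(z)}
  \left. \frac{dE_{\mathrm{GW}}}{df_s} \right|_{f_s = f(1+z)} dz
```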

  2. The role of the host in a cooperating mainframe and workstation environment, volumes 1 and 2

    NASA Technical Reports Server (NTRS)

    Kusmanoff, Antone; Martin, Nancy L.

    1989-01-01

    In recent years, advancements made in computer systems have prompted a move from centralized computing based on timesharing a large mainframe computer to distributed computing based on a connected set of engineering workstations. A major factor in this advancement is the increased performance and lower cost of engineering workstations. The shift from centralized to distributed computing has led to challenges associated with the residency of application programs within the system. In a combined system of multiple engineering workstations attached to a mainframe host, the question arises as to how a system designer should assign applications between the larger mainframe host and the smaller, yet powerful, workstation. The concepts related to real-time data processing are analyzed, and systems are presented which use a host mainframe and a number of engineering workstations interconnected by a local area network. In most cases, distributed systems can be classified as having a single function or multiple functions and as executing programs in real time or nonreal time. In a system of multiple computers, the degree of autonomy of the computers is important; a system with one master control computer generally differs in reliability, performance, and complexity from a system in which all computers share the control. This research is concerned with generating general criteria for software residency decisions (host or workstation) for a diverse yet coupled group of users (the clustered workstations) which may need the use of a shared resource (the mainframe) to perform their functions.

  3. Comparison of bias analysis strategies applied to a large data set.

    PubMed

    Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M

    2014-07-01

    Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
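
    The third strategy, weighting records by their frequency, can be sketched as follows (hypothetical pandas code of ours; the study's actual bias-simulation loop is far more involved). Collapsing identical covariate patterns into one weighted row is what shrinks each simulation replication:

```python
import pandas as pd

# Full-cohort records (toy stand-in for 773,625 birth records).
cohort = pd.DataFrame({
    "bmi_cat": ["under", "normal", "normal", "over", "under", "normal"],
    "preterm": [1, 0, 0, 1, 0, 0],
})

# Collapse identical covariate patterns into one row plus a frequency
# weight; each probabilistic-bias replication then touches far fewer rows.
weighted = (cohort.groupby(["bmi_cat", "preterm"])
                  .size()
                  .reset_index(name="freq"))
print(weighted)
```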

  4. On improving the algorithm efficiency in the particle-particle force calculations

    NASA Astrophysics Data System (ADS)

    Kozynchenko, Alexander I.; Kozynchenko, Sergey A.

    2016-09-01

    The problem of calculating inter-particle forces in particle-particle (PP) simulation models takes an important place in scientific computing. Such simulation models are used in diverse scientific applications arising in astrophysics, plasma physics, particle accelerators, etc., where long-range forces are considered. Inverse-square laws such as Coulomb's law of electrostatic force and Newton's law of universal gravitation are examples of laws pertaining to long-range forces. The standard naïve PP method outlined, for example, by Hockney and Eastwood [1] is straightforward, processing all pairs of particles in a double nested loop. The PP algorithm provides the best accuracy of all possible methods, but its computational complexity is O(Np^2), where Np is the total number of particles involved. The low efficiency of the PP algorithm becomes a challenging issue in cases where high accuracy is required. An example can be taken from charged particle beam dynamics where, in computing the beam's own space charge, so-called macro-particles are used (see, e.g., Humphries Jr. [2], Kozynchenko and Svistunov [3]).
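
    For reference, the naive PP scheme is the double loop below (a minimal sketch for Coulomb-like forces; halving the work via Newton's third law is the obvious first optimization, though the O(Np^2) scaling remains):

```python
import numpy as np

def pp_forces(pos, charge, k=1.0, eps=1e-9):
    """Naive O(Np^2) particle-particle forces for an inverse-square law."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):          # each pair visited once
            r = pos[j] - pos[i]
            d = np.sqrt(r @ r) + eps       # softening avoids the singularity
            fij = k * charge[i] * charge[j] * r / d**3
            f[i] -= fij                    # Newton's third law: reuse the
            f[j] += fij                    # pair force with opposite sign
    return f

pos = np.random.rand(100, 3)
charge = np.ones(100)
print(pp_forces(pos, charge)[0])
```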

  5. An Improved Lattice Boltzmann Model for Non-Newtonian Flows with Applications to Solid-Fluid Interactions in External Flows

    NASA Astrophysics Data System (ADS)

    Adam, Saad; Premnath, Kannan

    2016-11-01

    Fluid mechanics of non-Newtonian fluids, which arise in numerous settings, is characterized by non-linear constitutive models that pose certain unique challenges for computational methods. Here, we consider the lattice Boltzmann method (LBM), which offers some computational advantages due to its kinetic basis and its simpler stream-and-collide procedure enabling efficient simulations. However, further development is necessary to improve its numerical stability and accuracy for computations involving broader parameter ranges. Hence, in this study, we extend the cascaded LBM formulation by modifying its moment equilibria and relaxation parameters to handle a variety of non-Newtonian constitutive equations, including power-law and Bingham fluids, with improved stability. In addition, we include corrections to the moment equilibria to obtain an inertial-frame-invariant scheme without cubic-velocity defects. After performing a validation study on various benchmark flows, we study the physics of non-Newtonian flow over pairs of circular and square cylinders in a tandem arrangement, especially the wake structure interactions and their effects on the resulting forces on each cylinder, and elucidate the effect of the various characteristic parameters.
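
    The non-Newtonian coupling enters through a strain-rate-dependent viscosity. A minimal sketch of the power-law case mapped to a lattice relaxation time (the standard BGK-style relation in lattice units with c_s^2 = 1/3; the cascaded-moment formulation described above is more involved):

```python
import numpy as np

def power_law_tau(shear_rate, k=0.01, n=0.7):
    """Apparent viscosity nu = k * |gamma|^(n-1), mapped to the BGK-style
    relaxation time tau = 3*nu + 0.5 (lattice units, c_s^2 = 1/3)."""
    gamma = np.maximum(np.abs(shear_rate), 1e-12)  # avoid 0**(negative)
    nu = k * gamma ** (n - 1.0)
    return 3.0 * nu + 0.5

print(power_law_tau(np.array([0.01, 0.1, 1.0])))  # shear-thinning: tau drops
```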

  6. NASA's Integrated Instrument Simulator Suite for Atmospheric Remote Sensing from Spaceborne Platforms (ISSARS) and Its Role for the ACE and GPM Missions

    NASA Technical Reports Server (NTRS)

    Tanelli, Simone; Tao, Wei-Kuo; Hostetler, Chris; Kuo, Kwo-Sen; Matsui, Toshihisa; Jacob, Joseph C.; Niamsuwam, Noppasin; Johnson, Michael P.; Hair, John; Butler, Carolyn; hide

    2011-01-01

    Forward simulation is an indispensable tool for evaluating precipitation retrieval algorithms as well as for studying snow/ice microphysics and their radiative properties. The main challenge of the implementation arises from the size of the problem domain. To overcome this hurdle, assumptions must be made to simplify complex cloud microphysics, and it is important that these assumptions are applied consistently throughout the simulation process. ISSARS addresses this issue by providing a computationally efficient and modular framework that can integrate currently existing models and is also capable of expanding for future development. ISSARS is designed to accommodate the simulation needs of the Aerosol/Clouds/Ecosystems (ACE) mission and the Global Precipitation Measurement (GPM) mission: radars, microwave radiometers, and optical instruments such as lidars and polarimeters. ISSARS's computation is performed in three stages: input reconditioning (IRM), electromagnetic properties (scattering/emission/absorption) calculation (SEAM), and instrument simulation (ISM). The computation is implemented as a web service, and its configuration can be accessed through a web-based interface.

  7. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are present in all bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
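
    The final step such a framework performs can be illustrated in miniature: a noiseless simulated fluorescence distribution is blurred by the optics and converted to per-pixel photon counts with Poisson shot noise. The Gaussian PSF stand-in and all parameters are illustrative, not the framework's actual optics model.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)

        def to_photon_counts(intensity, psf_sigma=2.0, photons_per_unit=50.0):
            """Blur by an idealized Gaussian PSF, scale to expected photon
            numbers, then draw per-pixel Poisson counts (shot noise)."""
            expected = gaussian_filter(intensity, psf_sigma) * photons_per_unit
            return rng.poisson(expected)

        cell = np.zeros((64, 64))
        cell[30:34, 30:34] = 1.0        # toy fluorophore cluster
        image = to_photon_counts(cell)  # integer counts, photon-counting units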

  8. Accelerated failure time models for semi-competing risks data in the presence of complex censoring.

    PubMed

    Lee, Kyu Ha; Rondeau, Virginie; Haneuse, Sebastien

    2017-12-01

    Statistical analyses that investigate risk factors for Alzheimer's disease (AD) are often subject to a number of challenges. Some of these challenges arise due to practical considerations regarding data collection such that the observation of AD events is subject to complex censoring including left-truncation and either interval or right-censoring. Additional challenges arise due to the fact that study participants under investigation are often subject to competing forces, most notably death, that may not be independent of AD. Towards resolving the latter, researchers may choose to embed the study of AD within the "semi-competing risks" framework for which the recent statistical literature has seen a number of advances including for the so-called illness-death model. To the best of our knowledge, however, the semi-competing risks literature has not fully considered analyses in contexts with complex censoring, as in studies of AD. This is particularly the case when interest lies with the accelerated failure time (AFT) model, an alternative to the traditional multiplicative Cox model that places emphasis away from the hazard function. In this article, we outline a new Bayesian framework for estimation/inference of an AFT illness-death model for semi-competing risks data subject to complex censoring. An efficient computational algorithm that gives researchers the flexibility to adopt either a fully parametric or a semi-parametric model specification is developed and implemented. The proposed methods are motivated by and illustrated with an analysis of data from the Adult Changes in Thought study, an on-going community-based prospective study of incident AD in western Washington State. © 2017, The International Biometric Society.
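
    For orientation, the AFT structure underlying such models is the standard log-linear form (the generic AFT model, not the authors' full illness-death specification):

        \log T_i = x_i^\top \beta + \epsilon_i,
        \qquad
        T_i = \exp(x_i^\top \beta)\, \exp(\epsilon_i),

    so that covariates act multiplicatively on the time scale itself, accelerating or decelerating progression to the event, rather than on the hazard function as in the Cox model.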

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Germain, Shawn

    Nuclear Power Plant (NPP) refueling outages create some of the most challenging work the utilities face, tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  10. Program Aids Visualization Of Data

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1995-01-01

    Living Color Frame System (LCFS) computer program developed to solve some problems that arise in connection with generation of real-time graphical displays of numerical data and of statuses of systems. Need for program like LCFS arises because computer graphics often applied for better understanding and interpretation of data under observation and these graphics become more complicated when animation required during run time. Eliminates need for custom graphical-display software for application programs. Written in Turbo C++.

  11. epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.

    PubMed

    Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa

    2016-12-01

    Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  12. Transforming Big Data into cancer-relevant insight: An initial, multi-tier approach to assess reproducibility and relevance

    PubMed Central

    2016-01-01

    The Cancer Target Discovery and Development (CTD2) Network was established to accelerate the transformation of “Big Data” into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding. This manuscript represents a first attempt to delineate the challenges of supporting and confirming discoveries arising from the systematic analysis of large-scale data resources in a collaborative work environment and to provide a framework that would begin a community discussion to resolve these challenges. The Network implemented a multi-Tier framework designed to substantiate the biological and biomedical relevance as well as the reproducibility of data and insights resulting from its collaborative activities. The same approach can be used by the broad scientific community to drive development of novel therapeutic and biomarker strategies for cancer. PMID:27401613

  13. Role of data warehousing in healthcare epidemiology.

    PubMed

    Wyllie, D; Davies, J

    2015-04-01

    Electronic storage of healthcare data, including individual-level risk factors for both infectious and other diseases, is increasing. These data can be integrated at hospital, regional and national levels. Data sources that contain risk factor and outcome information for a wide range of conditions offer the potential for efficient epidemiological analysis of multiple diseases. Opportunities may also arise for monitoring healthcare processes. Integrating diverse data sources presents epidemiological, practical, and ethical challenges. For example, diagnostic criteria, outcome definitions, and ascertainment methods may differ across the data sources. Data volumes may be very large, requiring sophisticated computing technology. Given the large populations involved, perhaps the most challenging aspect is how informed consent can be obtained for the development of integrated databases, particularly when it is not easy to demonstrate their potential. In this article, we discuss some of the ups and downs of recent projects as well as the potential of data warehousing for antimicrobial resistance monitoring. Copyright © 2015. Published by Elsevier Ltd.

  14. Using Linked Electronic Health Records to Estimate Healthcare Costs: Key Challenges and Opportunities.

    PubMed

    Asaria, Miqdad; Grasic, Katja; Walker, Simon

    2016-02-01

    This paper discusses key challenges and opportunities that arise when using linked electronic health records (EHR) in health economics and outcomes research (HEOR), with a particular focus on estimating healthcare costs. These challenges and opportunities are framed in the context of a case study modelling the costs of stable coronary artery disease in England. The challenges and opportunities discussed fall broadly into the categories of (1) handling and organising data of this size and sensitivity; (2) extracting clinical endpoints from datasets that have not been designed and collected with such endpoints in mind; and (3) the principles and practice of costing resource use from routinely collected data. We find that there are a number of new challenges and opportunities that arise when working with EHR compared with more traditional sources of data for HEOR. These call for greater clinician involvement and intelligent use of sensitivity analysis.

  15. InfoSymbiotics/DDDAS - The power of Dynamic Data Driven Applications Systems for New Capabilities in Environmental -, Geo-, and Space- Sciences

    NASA Astrophysics Data System (ADS)

    Darema, F.

    2016-12-01

    InfoSymbiotics/DDDAS embodies the power of Dynamic Data Driven Applications Systems (DDDAS), a concept whereby an executing application model is dynamically integrated, in a feedback loop, with the real-time data-acquisition and control components, as well as other data sources of the application system. Advanced capabilities can be created through such new computational approaches in modeling and simulations and in instrumentation methods, including: enhancing the accuracy of the application model; speeding up the computation to allow faster and more comprehensive models of a system, and creating decision support systems with the accuracy of full-scale simulations; in addition, controlling instrumentation processes by the executing application results in more efficient management of application data and addresses the challenge of how to architect and dynamically manage large sets of heterogeneous sensors and controllers, an advance over the static and ad-hoc ways of today - with DDDAS these sets of resources can be managed adaptively and in optimized ways. Large-Scale-Dynamic-Data encompasses the next wave of Big Data, namely dynamic data arising from ubiquitous sensing and control in engineered, natural, and societal systems, through multitudes of heterogeneous sensors and controllers instrumenting these systems, where the opportunities and challenges at these "large scales" relate not only to data size but also to the heterogeneity in data, data collection modalities, fidelities, and timescales, ranging from real-time to archival data. In tandem with this dimension of dynamic data, there is an extended view of Big Computing, which includes the collective computing performed by networked assemblies of multitudes of sensors and controllers, ranging from the high-end to the real-time, seamlessly integrated and unified, and comprising Large-Scale-Big-Computing. InfoSymbiotics/DDDAS engenders transformative impact in many application domains, ranging from the nano-scale to the terra-scale and to the extra-terra-scale. The talk will address opportunities for new capabilities together with corresponding research challenges, with illustrative examples from several application areas including environmental sciences, geosciences, and space sciences.

  16. Distributed computation: the new wave of synthetic biology devices.

    PubMed

    Macía, Javier; Posas, Francesc; Solé, Ricard V

    2012-06-01

    Synthetic biology (SB) offers a unique opportunity for designing complex molecular circuits able to perform predefined functions. But the goal of achieving a flexible toolbox of reusable molecular components has been shown to be limited due to circuit unpredictability, incompatible parts or random fluctuations. Many of these problems arise from the challenges posed by engineering the molecular circuitry: multiple wires are usually difficult to implement reliably within one cell and the resulting systems cannot be reused in other modules. These problems are solved by means of a nonstandard approach to single cell devices, using cell consortia and allowing the output signal to be distributed among different cell types, which can be combined in multiple, reusable and scalable ways. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Non-stoquastic Hamiltonians in quantum annealing via geometric phases

    NASA Astrophysics Data System (ADS)

    Vinci, Walter; Lidar, Daniel A.

    2017-09-01

    We argue that a complete description of quantum annealing implemented with continuous variables must take into account the non-adiabatic Aharonov-Anandan geometric phase that arises when the system Hamiltonian changes during the anneal. We show that this geometric effect leads to the appearance of non-stoquasticity in the effective quantum Ising Hamiltonians that are typically used to describe quantum annealing with flux qubits. We explicitly demonstrate the effect of this geometric non-stoquasticity when quantum annealing is performed with a system of one and two coupled flux qubits. The realization of non-stoquastic Hamiltonians has important implications from a computational complexity perspective, since it is believed that in many cases quantum annealing with stoquastic Hamiltonians can be efficiently simulated via classical algorithms such as Quantum Monte Carlo. It is well known that the direct implementation of non-stoquastic Hamiltonians with flux qubits is particularly challenging. Our results suggest an alternative path for the implementation of non-stoquasticity via geometric phases that can be exploited for computational purposes.

  18. Haplotype Reconstruction in Large Pedigrees with Many Untyped Individuals

    NASA Astrophysics Data System (ADS)

    Li, Xin; Li, Jing

    Haplotypes, as they specify the linkage patterns between dispersed genetic variations, provide important information for understanding the genetics of human traits. However haplotypes are not directly available from current genotyping platforms, and hence there are extensive investigations of computational methods to recover such information. Two major computational challenges arising in current family-based disease studies are large family sizes and many ungenotyped family members. Traditional haplotyping methods can neither handle large families nor families with missing members. In this paper, we propose a method which addresses these issues by integrating multiple novel techniques. The method consists of three major components: pairwise identical-bydescent (IBD) inference, global IBD reconstruction and haplotype restoring. By reconstructing the global IBD of a family from pairwise IBD and then restoring the haplotypes based on the inferred IBD, this method can scale to large pedigrees, and more importantly it can handle families with missing members. Compared with existing methods, this method demonstrates much higher power to recover haplotype information, especially in families with many untyped individuals.
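
    The inference machinery is beyond the scope of an abstract, but the raw signal the pairwise stage starts from is easy to illustrate: the per-marker count of alleles two individuals share identical-by-state (IBS), from which IBD states are then inferred probabilistically. A toy sketch with made-up genotypes:

        import numpy as np

        def ibs_profile(g1, g2):
            """Genotypes coded as minor-allele counts (0/1/2) per marker;
            alleles shared identical-by-state = 2 - |g1 - g2|."""
            return 2 - np.abs(np.asarray(g1) - np.asarray(g2))

        # Two relatives typed at 8 markers (toy data)
        ibs = ibs_profile([0, 1, 2, 1, 0, 2, 1, 1],
                          [0, 1, 1, 1, 1, 2, 2, 1])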

  19. New Python-based methods for data processing

    PubMed Central

    Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel

    2013-01-01

    Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units. PMID:23793153
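
    The cctbx internals are beyond the scope of an abstract, but the multiprocessing pattern described can be sketched generically: independent diffraction images are farmed out to worker processes as they arrive. The file names and the analysis body below are hypothetical placeholders.

        from multiprocessing import Pool

        def analyze(path):
            """Placeholder per-image analysis (e.g. Bragg-spot counting)."""
            n_spots = 0  # hypothetical result of real spotfinding
            return path, n_spots

        if __name__ == "__main__":
            paths = [f"image_{i:05d}.cbf" for i in range(1000)]  # hypothetical files
            with Pool() as pool:  # one worker per core by default
                for path, n_spots in pool.imap_unordered(analyze, paths):
                    print(path, n_spots)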

  20. Parallel ptychographic reconstruction

    DOE PAGES

    Nashed, Youssef S. G.; Vine, David J.; Peterka, Tom; ...

    2014-12-19

    Ptychography is an imaging method whereby a coherent beam is scanned across an object, and an image is obtained by iterative phasing of the set of diffraction patterns. It can image extended objects at a resolution limited by the scattering strength of the object and detector geometry, rather than at an optics-imposed limit. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes, yet at the same time there is also a need to deliver reconstructed images immediately so that one can evaluate the next steps to take in an experiment. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs) and then employs novel techniques to merge sub-datasets into a single complex phase and amplitude image. Results are shown on a simulated specimen and a real dataset from an X-ray experiment conducted at a synchrotron light source.
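
    The merge step can be caricatured as follows (this is not the authors' algorithm, which must also reconcile the arbitrary global phase each sub-reconstruction carries): sub-images reconstructed by different GPUs are accumulated onto one canvas and averaged where they overlap.

        import numpy as np

        def merge_tiles(tiles, offsets, shape):
            """Average complex sub-images onto a full canvas; `offsets` are
            the top-left corners of each tile in full-image coordinates."""
            acc = np.zeros(shape, dtype=complex)
            wgt = np.zeros(shape)
            for tile, (r, c) in zip(tiles, offsets):
                h, w = tile.shape
                acc[r:r + h, c:c + w] += tile
                wgt[r:r + h, c:c + w] += 1.0
            return acc / np.maximum(wgt, 1.0)  # avoid divide-by-zero off-tile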

  1. Predictive models of lyophilization process for development, scale-up/tech transfer and manufacturing.

    PubMed

    Zhu, Tong; Moussa, Ehab M; Witting, Madeleine; Zhou, Deliang; Sinha, Kushal; Hirth, Mario; Gastens, Martin; Shang, Sherwin; Nere, Nandkishor; Somashekar, Shubha Chetan; Alexeenko, Alina; Jameel, Feroz

    2018-07-01

    Scale-up and technology transfer of lyophilization processes remain a challenge that requires thorough characterization of the laboratory- and larger-scale lyophilizers. In this study, computational fluid dynamics (CFD) was employed to develop computer-based models of both laboratory- and manufacturing-scale lyophilizers in order to understand the differences in equipment performance arising from distinct designs. CFD coupled with steady-state heat and mass transfer modeling of the vial was then utilized to study and predict independent variables such as shelf temperature and chamber pressure, and response variables such as product resistance, product temperature and primary drying time for a given formulation. The models were then verified experimentally for the different lyophilizers. Additionally, the models were applied to create and evaluate a design space for a lyophilized product in order to provide justification for the flexibility to operate within a certain range of process parameters without the need for validation. Published by Elsevier B.V.
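
    For orientation, the steady-state vial model referred to here is typically built on the standard primary-drying balance, equating heat delivered through the vial with heat consumed by sublimation:

        \frac{dQ}{dt} = A_v K_v \left(T_{\mathrm{shelf}} - T_{\mathrm{product}}\right),
        \qquad
        \frac{dm}{dt} = \frac{A_p \left(P_{\mathrm{ice}} - P_{\mathrm{chamber}}\right)}{R_p},
        \qquad
        \frac{dQ}{dt} = \Delta H_s \frac{dm}{dt},

    where K_v is the vial heat transfer coefficient (the quantity that differs between laboratory and manufacturing dryers), R_p the product resistance, and \Delta H_s the heat of sublimation.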

  2. An Implicit Solver on A Parallel Block-Structured Adaptive Mesh Grid for FLASH

    NASA Astrophysics Data System (ADS)

    Lee, D.; Gopal, S.; Mohapatra, P.

    2012-07-01

    We introduce a fully implicit solver for FLASH based on a Jacobian-Free Newton-Krylov (JFNK) approach with an appropriate preconditioner. The main goal of developing this JFNK-type implicit solver is to provide efficient high-order numerical algorithms and methodology for simulating stiff systems of differential equations on large-scale parallel computer architectures. A large number of natural problems in nonlinear physics involve a wide range of spatial and time scales of interest. A system that encompasses such a wide range of scales is described as "stiff." A stiff system can arise in many different fields of physics, including fluid dynamics/aerodynamics, laboratory/space plasma physics, low Mach number flows, reactive flows, radiation hydrodynamics, and geophysical flows. One of the big challenges in solving such a stiff system with current-day computational resources lies in resolving time and length scales that vary by several orders of magnitude. We introduce a preliminary implementation of a time-accurate JFNK-based implicit solver within the framework of FLASH's unsplit hydro solver.
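
    The core JFNK idea fits in a few lines: the Jacobian is never assembled, because Krylov methods such as GMRES only need Jacobian-vector products, which can be approximated by a finite difference of the residual. A minimal sketch on a toy nonlinear system (not the FLASH solver itself):

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def residual(u):
            """Toy nonlinear system F(u) = 0."""
            return np.array([u[0]**2 + u[1] - 3.0,
                             u[0] - u[1]**2 + 1.0])

        def jfnk_solve(u, tol=1e-10, max_newton=20):
            for _ in range(max_newton):
                F = residual(u)
                if np.linalg.norm(F) < tol:
                    break
                eps = 1e-7 * (1.0 + np.linalg.norm(u))
                # Jacobian-free matvec: J v ~ (F(u + eps*v) - F(u)) / eps
                J = LinearOperator((u.size, u.size),
                                   matvec=lambda v: (residual(u + eps * v) - F) / eps)
                du, _ = gmres(J, -F)        # Krylov solve for the Newton step
                u = u + du
            return u

        print(jfnk_solve(np.array([1.0, 1.0])))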

  3. The growth of language: Universal Grammar, experience, and principles of computation.

    PubMed

    Yang, Charles; Crain, Stephen; Berwick, Robert C; Chomsky, Noam; Bolhuis, Johan J

    2017-10-01

    Human infants develop language remarkably rapidly and without overt instruction. We argue that the distinctive ontogenesis of child language arises from the interplay of three factors: domain-specific principles of language (Universal Grammar), external experience, and properties of non-linguistic domains of cognition including general learning mechanisms and principles of efficient computation. We review developmental evidence that children make use of hierarchically composed structures ('Merge') from the earliest stages and at all levels of linguistic organization. At the same time, longitudinal trajectories of development show sensitivity to the quantity of specific patterns in the input, which suggests the use of probabilistic processes as well as inductive learning mechanisms that are suitable for the psychological constraints on language acquisition. By considering the place of language in human biology and evolution, we propose an approach that integrates principles from Universal Grammar and constraints from other domains of cognition. We outline some initial results of this approach as well as challenges for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

    Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc., to the analyst. We present sample simulations on a national-level network with millions of computers.

  5. Strategies to Address Common Challenges When Teaching in an Active Learning Classroom

    ERIC Educational Resources Information Center

    Petersen, Christina I.; Gorman, Kristen S.

    2014-01-01

    This chapter provides practical strategies for addressing common challenges that arise for teachers in active learning classrooms. Our strategies come from instructors with experience teaching in these environments.

  6. Solitary Fibrous Tumor Arising from Stomach: CT Findings

    PubMed Central

    Park, Sung Hee; Kwon, Jieun; Park, Jong-pil; Park, Mi-Suk; Lim, Joon Seok; Kim, Joo Hee; Kim, Ki Whang

    2007-01-01

    Solitary fibrous tumors are spindle-cell neoplasms that usually develop in the pleura and peritoneum, and rarely arise in the stomach. To our knowledge, there is only one case in the English literature reporting a solitary fibrous tumor arising from the stomach. Here we report the case of a 26-year-old man with a large solitary fibrous tumor arising from the stomach which involved the submucosa and muscular layer and, on abdominal computed tomography, resembled a gastrointestinal stromal tumor. A solitary fibrous tumor arising from the stomach, although rare, could be considered as a diagnostic possibility for gastric submucosal tumors. PMID:18159603

  7. Cerebral cartography and connectomics.

    PubMed

    Sporns, Olaf

    2015-05-19

    Cerebral cartography and connectomics pursue similar goals in attempting to create maps that can inform our understanding of the structural and functional organization of the cortex. Connectome maps explicitly aim at representing the brain as a complex network, a collection of nodes and their interconnecting edges. This article reflects on some of the challenges that currently arise in the intersection of cerebral cartography and connectomics. Principal challenges concern the temporal dynamics of functional brain connectivity, the definition of areal parcellations and their hierarchical organization into large-scale networks, the extension of whole-brain connectivity to cellular-scale networks, and the mapping of structure/function relations in empirical recordings and computational models. Successfully addressing these challenges will require extensions of methods and tools from network science to the mapping and analysis of human brain connectivity data. The emerging view that the brain is more than a collection of areas, but is fundamentally operating as a complex networked system, will continue to drive the creation of ever more detailed and multi-modal network maps as tools for on-going exploration and discovery in human connectomics. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  8. Laboratory Exercises to Examine Recombination & Aneuploidy in "Drosophila"

    ERIC Educational Resources Information Center

    Venema, Dennis R.

    2009-01-01

    Chromosomal aneuploidy, a deviation from an exact multiple of an organism's haploid chromosome number, is a difficult concept for students to master. Aneuploidy arising from chromosomal non-disjunction (NDJ) is particularly problematic for students, since it arises in the context of meiosis, itself a challenging subject. Students learning NDJ are…

  9. Understanding Leadership in Schools Facing Challenging Circumstances: A Chilean Case Study

    ERIC Educational Resources Information Center

    Ahumada, Luis; Galdames, Sergio; Clarke, Simon

    2016-01-01

    During the last 10 years, research into schools facing challenging circumstances has attracted the attention of researchers around the world. The aim of this study was to understand the challenges that school leaders face as they perform their work, the nature of the context in which these challenges arise, the strategies school leaders adopt to…

  10. Factors Affecting Recruitment of Participants for Studies of Diabetes Technology in Newly Diagnosed Youth with Type 1 Diabetes: A Qualitative Focus Group Study with Parents and Children.

    PubMed

    Farrington, Conor; Allen, Janet; Tauschmann, Martin; Randell, Tabitha; Trevelyan, Nicola; Hovorka, Roman

    2016-09-01

    Relatively little is known about parents' or children's attitudes toward recruitment for, and participation in, studies of new diabetes technologies immediately after diagnosis. This study investigated factors affecting recruitment of participants for studies in newly diagnosed youth with type 1 diabetes. Qualitative focus group study incorporating four recorded focus groups, conducted in four outpatient pediatric diabetes clinics in large regional hospitals in England. Participants comprised four groups of parents (n = 22) and youth (n = 17) with type 1 diabetes, purposively sampled on the basis of past involvement (either participation or nonparticipation) in an ongoing two-arm randomized trial comparing multiple daily injection with conventional continuous subcutaneous insulin infusion regimens from the onset of type 1 diabetes. Stress associated with diagnosis presents significant challenges in terms of study recruitment, with parents demonstrating varied levels of willingness to be approached soon after diagnosis. Additional challenges arise regarding the following: randomization when study arms are perceived as sharply differentiated in terms of therapy effectiveness; burdens arising from study participation; and the need to surrender new technologies following the end of the study. However, these challenges were mostly insufficient to rule out study participation. Participants emphasized the benefits and reassurance arising from support provided by staff and fellow study participants. Recruitment to studies of new diabetes technologies immediately after diagnosis in youth presents significant challenges, but these are not insurmountable. The stress and uncertainty arising from potential participation may be alleviated by personalized discussion with staff and peer support from fellow study participants.

  11. Human-Interaction Challenges in UAV-Based Autonomous Surveillance

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Harris, Robert; Shafto, Michael G.

    2004-01-01

    Autonomous UAVs provide a platform for intelligent surveillance in application domains ranging from security and military operations to scientific information gathering and land management. Surveillance tasks are often long duration, requiring that any approach be adaptive to changes in the environment or user needs. We describe a decision-theoretic model of surveillance, appropriate for use on our autonomous helicopter, that provides a basis for optimizing the value of information returned by the UAV. From this approach arises a range of challenges in making the framework practical for use by human operators lacking specialized knowledge of autonomy and mathematics. This paper describes our platform and approach, then describes human-interaction challenges arising from this approach that we have identified and begun to address.

  12. Some current themes in physical hydrology of the land-atmosphere interface

    USGS Publications Warehouse

    Milly, P.C.D.

    1991-01-01

    Certain themes arise repeatedly in current literature dealing with the physical hydrology of the interface between the atmosphere and the continents. Papers contributed to the 1991 International Association of Hydrological Sciences Symposium on Hydrological Interactions between Atmosphere, Soil and Vegetation echo these themes, which are discussed in this paper. The land-atmosphere interface is the region where atmosphere, soil, and vegetation have mutual physical contact, and a description of exchanges of matter or energy among these domains must often consider the physical properties and states of the entire system. A difficult family of problems is associated with the reconciliation of the wide range of spatial scales that arise in the course of observational, theoretical, and modeling activities. These scales are determined by some of the physical elements of the interface, by patterns of natural variability of the physical composition of the interface, by the dynamics of the processes at the interface, and by methods of measurement and computation. Global environmental problems are seen by many hydrologists as a major driving force for development of the science. The challenge for hydrologists will be to respond to this force as scientists rather than problem-solvers.

  13. Navigating the Challenges Arising from University-School Collaborative Action Research

    ERIC Educational Resources Information Center

    Yuan, Rui; Mak, Pauline

    2016-01-01

    Despite increasing evidence showing the benefits language teachers can reap from university-school collaborative action research (CAR), scant attention has been given to how university researchers collaborate with language teachers, what challenges they might encounter, and how they navigate such challenges in CAR. To fill the gap, this study…

  14. Development of a Mental Health Nursing Simulation: Challenges and Solutions

    ERIC Educational Resources Information Center

    Kidd, Lori I.; Morgan, Karyn I.; Savery, John R.

    2012-01-01

    Nursing education programs are proliferating rapidly in the United States in an effort to meet demand for nurse professionals. Multiple challenges arise from this rapid expansion. One challenge is finding sufficient clinical sites to accommodate students. Increased competition for scarce resources requires creativity in clinical contracting. This…

  15. Diversifying Academic and Professional Identities in Higher Education: Some Management Challenges

    ERIC Educational Resources Information Center

    Whitchurch, Celia; Gordon, George

    2010-01-01

    This paper draws on an international study of the management challenges arising from diversifying academic and professional identities in higher education. These challenges include, for instance, the introduction of practice-based disciplines with different traditions such as health and social care, the changing aspirations and expectations of…

  16. Finite Dimensional Approximations for Continuum Multiscale Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berlyand, Leonid

    2017-01-24

    The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogeneous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI’s research. The focus of the completed research was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse-grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatment of such complex materials by developing a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.

  17. GPU acceleration of Dock6's Amber scoring computation.

    PubMed

    Yang, Hailong; Zhou, Qiongqiong; Li, Bo; Wang, Yongjian; Luan, Zhongzhi; Qian, Depei; Li, Hanlu

    2010-01-01

    Addressing the problem of virtual screening is a long-term goal in the drug discovery field, which, if properly solved, can significantly shorten new drugs' R&D cycle. The scoring functionality that evaluates the fitness of the docking result is one of the major challenges in virtual screening. In general, scoring functionality in docking requires a large amount of floating-point calculations, which usually takes several weeks or even months to finish. This time-consuming procedure is unacceptable, especially when a highly fatal and infectious virus such as SARS or H1N1 arises, which forces the scoring task to be done in a limited time. This paper presents how to leverage the computational power of the GPU to accelerate Dock6's (http://dock.compbio.ucsf.edu/DOCK_6/) Amber (J. Comput. Chem. 25: 1157-1174, 2004) scoring with the NVIDIA CUDA (NVIDIA Corporation Technical Staff, Compute Unified Device Architecture - Programming Guide, NVIDIA Corporation, 2008) (Compute Unified Device Architecture) platform. We also discuss many factors that greatly influence the performance after porting the Amber scoring to the GPU, including thread management, data transfer, and divergence hiding. Our experiments show that the GPU-accelerated Amber scoring achieves a 6.5× speedup with respect to the original version running on an AMD dual-core CPU for the same problem size. This acceleration makes the Amber scoring more competitive and efficient for large-scale virtual screening problems.

  18. ASME V&V challenge problem: Surrogate-based V&V

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beghini, Lauren L.; Hough, Patricia D.

    2015-12-18

    The process of verification and validation can be resource intensive. From the computational model perspective, the resource demand typically arises from long simulation run times on multiple cores coupled with the need to characterize and propagate uncertainties. In addition, predictive computations performed for safety and reliability analyses have similar resource requirements. For this reason, there is a tradeoff between the time required to complete the requisite studies and the fidelity or accuracy of the results that can be obtained. At a high level, our approach is cast within a validation hierarchy that provides a framework in which we perform sensitivity analysis, model calibration, model validation, and prediction. The evidence gathered as part of these activities is mapped into the Predictive Capability Maturity Model to assess credibility of the model used for the reliability predictions. With regard to specific technical aspects of our analysis, we employ surrogate-based methods, primarily based on polynomial chaos expansions and Gaussian processes, for model calibration, sensitivity analysis, and uncertainty quantification in order to reduce the number of simulations that must be done. The goal is to tip the tradeoff balance to improving accuracy without increasing the computational demands.
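
    The surrogate idea can be illustrated with a Gaussian process in a few lines (scikit-learn; the "simulator" is a stand-in function, not the challenge-problem model): fit on a handful of expensive runs, then query the surrogate, with uncertainty, everywhere else.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_simulation(x):
            """Stand-in for a long-running physics code."""
            return np.sin(3 * x) + 0.5 * x

        X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)   # a few costly runs
        y_train = expensive_simulation(X_train).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        gp.fit(X_train, y_train)

        # Cheap predictions with uncertainty, e.g. for sensitivity analysis or UQ
        X_new = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
        mean, std = gp.predict(X_new, return_std=True)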

  19. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    ERIC Educational Resources Information Center

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  20. Computer ethics and tertiary level education in Hong Kong

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, E.Y.W.; Davison, R.M.; Wade, P.W.

    1994-12-31

    This paper seeks to highlight some ethical issues relating to the increasing proliferation of Information Technology into our everyday lives. The authors explain their understanding of computer ethics, and give some reasons why the study of computer ethics is becoming increasingly pertinent. The paper looks at some of the problems that arise in attempting to develop appropriate ethical concepts in a constantly changing environment, and explores some of the ethical dilemmas arising from the increasing use of computers. Some initial research undertaken to explore the ideas and understanding of tertiary level students in Hong Kong on a number of ethical issues of interest is described, and our findings are discussed. We hope that presenting this paper and eliciting subsequent discussion will enable us to draw up more comprehensive guidelines for the teaching of computer-related ethics to tertiary level students, as well as reveal some directions for future research.

  1. ARISE to the Challenge: Partnering with Urban Youth to Improve Educational Research and Learning

    ERIC Educational Resources Information Center

    Brown, Tara M.

    2010-01-01

    This paper examines Action Research into School Exclusion (Project ARISE), a two-year research partnership between K-12 students and university researchers. Based on the principles of participatory action research (PAR), the project intentionally brought together university researchers, K-12 students, and pre-service teachers to bridge research…

  2. Physically based modeling in catchment hydrology at 50: Survey and outlook

    NASA Astrophysics Data System (ADS)

    Paniconi, Claudio; Putti, Mario

    2015-09-01

    Integrated, process-based numerical models in hydrology are rapidly evolving, spurred by novel theories in mathematical physics, advances in computational methods, insights from laboratory and field experiments, and the need to better understand and predict the potential impacts of population, land use, and climate change on our water resources. At the catchment scale, these simulation models are commonly based on conservation principles for surface and subsurface water flow and solute transport (e.g., the Richards, shallow water, and advection-dispersion equations), and they require robust numerical techniques for their resolution. Traditional (and still open) challenges in developing reliable and efficient models are associated with heterogeneity and variability in parameters and state variables; nonlinearities and scale effects in process dynamics; and complex or poorly known boundary conditions and initial system states. As catchment modeling enters a highly interdisciplinary era, new challenges arise from the need to maintain physical and numerical consistency in the description of multiple processes that interact over a range of scales and across different compartments of an overall system. This paper first gives an historical overview (past 50 years) of some of the key developments in physically based hydrological modeling, emphasizing how the interplay between theory, experiments, and modeling has contributed to advancing the state of the art. The second part of the paper examines some outstanding problems in integrated catchment modeling from the perspective of recent developments in mathematical and computational science.
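
    For orientation, the subsurface member of the conservation laws mentioned here is usually written as the mixed form of the Richards equation:

        \frac{\partial \theta(h)}{\partial t} = \nabla \cdot \left[ K(h)\, \nabla (h + z) \right] + q,

    where \theta is volumetric water content, h pressure head, K(h) the unsaturated hydraulic conductivity, z elevation, and q a source/sink term; the strong nonlinearity of K(h) and \theta(h) is one root of the numerical challenges surveyed.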

  3. Whither systems medicine?

    PubMed Central

    Apweiler, Rolf; Beissbarth, Tim; Berthold, Michael R; Blüthgen, Nils; Burmeister, Yvonne; Dammann, Olaf; Deutsch, Andreas; Feuerhake, Friedrich; Franke, Andre; Hasenauer, Jan; Hoffmann, Steve; Höfer, Thomas; Jansen, Peter LM; Kaderali, Lars; Klingmüller, Ursula; Koch, Ina; Kohlbacher, Oliver; Kuepfer, Lars; Lammert, Frank; Maier, Dieter; Pfeifer, Nico; Radde, Nicole; Rehm, Markus; Roeder, Ingo; Saez-Rodriguez, Julio; Sax, Ulrich; Schmeck, Bernd; Schuppert, Andreas; Seilheimer, Bernd; Theis, Fabian J; Vera, Julio; Wolkenhauer, Olaf

    2018-01-01

    New technologies to generate, store and retrieve medical and research data are inducing a rapid change in clinical and translational research and health care. Systems medicine is the interdisciplinary approach wherein physicians and clinical investigators team up with experts from biology, biostatistics, informatics, mathematics and computational modeling to develop methods to use new and stored data to the benefit of the patient. We here provide a critical assessment of the opportunities and challenges arising out of systems approaches in medicine and from this provide a definition of what systems medicine entails. Based on our analysis of current developments in medicine and healthcare and associated research needs, we emphasize the role of systems medicine as a multilevel and multidisciplinary methodological framework for informed data acquisition and interdisciplinary data analysis to extract previously inaccessible knowledge for the benefit of patients. PMID:29497170

  4. Solitary fibrous tumor of the prostate: case report and review of the literature.

    PubMed

    Moureau-Zabotto, Laurence; Chetaille, Bruno; Bladou, Franck; Dauvergne, Pierre-Yves; Marcy, Myriam; Perrot, Delphine; Guiramand, Jérôme; Sarran, Anthony; Bertucci, François

    2012-01-01

    Solitary fibrous tumor (SFT), usually described in the pleura, is exceedingly rare in the prostate. We report a 60-year-old man with prostatic SFT revealed by obstructive urinary symptoms, and detected by ultrasonography. Computed tomography (CT) and magnetic resonance imaging suggested a prostatic origin. CT-guided tumor biopsy diagnosed a SFT. A cystoprostatectomy was performed. Pathologic examination showed a 15-cm tumor arising from the prostate and showing histological criteria suggestive of aggressiveness. The surgical resection margins were tumor-free. The patient was then regularly monitored and is still alive in complete remission, 28 months after surgery. In conclusion, we report a new exceptional case of prostatic SFT. We review the literature and discuss the challenging issues of misdiagnosis, prognosis and treatment.

  5. Exploring corrections to the Optomechanical Hamiltonian.

    PubMed

    Sala, Kamila; Tufarelli, Tommaso

    2018-06-14

    We compare two approaches for deriving corrections to the "linear model" of cavity optomechanics, in order to describe effects that are beyond first order in the radiation pressure coupling. In the regime where the mechanical frequency is much lower than the cavity one, we compare: (I) a widely used phenomenological Hamiltonian conserving the photon number; (II) a two-mode truncation of C. K. Law's microscopic model, which we take as the "true" system Hamiltonian. While these approaches agree at first order, the latter model does not conserve the photon number, resulting in challenging computations. We find that approach (I) allows for several analytical predictions, and significantly outperforms the linear model in our numerical examples. Yet, we also find that the phenomenological Hamiltonian cannot fully capture all high-order corrections arising from the C. K. Law model.
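
    For context, both approaches start from the standard radiation-pressure Hamiltonian of cavity optomechanics, whose treatment to first order in the coupling g_0 constitutes the "linear model":

        H = \hbar \omega_c\, a^\dagger a + \hbar \omega_m\, b^\dagger b - \hbar g_0\, a^\dagger a \left( b + b^\dagger \right),

    and the comparison in the paper concerns which terms beyond that first order each approach retains.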

  6. Software and knowledge engineering aspects of smart homes applied to health.

    PubMed

    Augusto, Juan Carlos; Nugent, Chris; Martin, Suzanne; Olphert, Colin

    2005-01-01

    Smart Home technology offers a viable solution to the increasing needs of the elderly, special-needs and home-based-healthcare populations. The research to date has largely focused on the development of communication technologies, sensor technologies and intelligent user interfaces. We claim that this technological evolution has not been matched by progress of a similar scale on the software side. We focus in particular on the software that provides the intelligent aspects of a Smart Home and on the difficulties that arise in the computational analysis of the information collected from a Smart Home. The process of translating information into accurate diagnosis when using non-invasive technology is full of challenges, some of which have been considered in the literature to some extent, but as yet without clear landmarks.

  7. Computer Literacy and Social Stratification. Interactive Technology Laboratory Report #9.

    ERIC Educational Resources Information Center

    Mehan, Hugh

    As schools acquire and use computers for educational purposes, two major questions arise: (1) whether students from different strata of society will obtain equal access to computers, and (2) whether students from different strata of society will be taught similar or different uses of the computer. To explore the relationship between the…

  8. Enrolling Advisers in Governing Privatised Agricultural Extension in Australia: Challenges and Opportunities for the Research, Development and Extension System

    ERIC Educational Resources Information Center

    Paschen, Jana-Axinja; Reichelt, Nicole; King, Barbara; Ayre, Margaret; Nettle, Ruth

    2017-01-01

    Purpose: Current developments in the Australian agricultural research, development and extension (RD&E) system exemplify the complex governance challenges arising from the international privatisation of agricultural extension. Presenting early challenges emerging from a multi-stakeholder project aimed at stimulating the role of the private…

  9. Is Particle Physics Ready for the LHC

    ScienceCinema

    Lykken, Joseph

    2017-12-09

    The advent of the Large Hadron Collider in 2007 entails daunting challenges to particle physicists. The first set of challenges will arise from trying to separate new physics from old. The second set of challenges will come in trying to interpret the new discoveries. I will describe a few of the scariest examples.

  10. Ethical Challenges of Military Social Workers Serving in a Combat Zone

    ERIC Educational Resources Information Center

    Simmons, Catherine A.; Rycraft, Joan R.

    2010-01-01

    Often faced with ethical challenges that may appear extraordinary, military social workers comprise a distinctive subgroup of the social work profession. From the unique paradigms in which they practice their craft, obvious questions about how military social workers address the ethical challenges inherent to their wartime mission arise. Using a…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Zhaojun; Yang, Chao

    What is common among electronic structure calculation, design of MEMS devices, vibrational analysis of high-speed railways, and simulation of the electromagnetic field of a particle accelerator? The answer: they all require solving large-scale nonlinear eigenvalue problems. In fact, these are just a handful of examples in which solving nonlinear eigenvalue problems accurately and efficiently is becoming increasingly important. Recognizing the importance of this class of problems, an invited minisymposium dedicated to nonlinear eigenvalue problems was held at the 2005 SIAM Annual Meeting. The purpose of the minisymposium was to bring together numerical analysts and application scientists to showcase some of the cutting-edge results from both communities and to discuss the challenges they are still facing. The minisymposium consisted of eight talks divided into two sessions. The first three talks focused on a type of nonlinear eigenvalue problem arising from electronic structure calculations. In this type of problem, the matrix Hamiltonian H depends, in a non-trivial way, on the set of eigenvectors X to be computed. The invariant subspace spanned by these eigenvectors also minimizes a total energy function that is highly nonlinear with respect to X on a manifold defined by a set of orthonormality constraints. In other applications, the nonlinearity of the matrix eigenvalue problem is restricted to the dependency of the matrix on the eigenvalues to be computed. These problems are often called polynomial or rational eigenvalue problems. In the second session, Christian Mehl from Technical University of Berlin described numerical techniques for solving a special type of polynomial eigenvalue problem arising from vibration analysis of rail tracks excited by high-speed trains.
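
    One concrete member of the class discussed in the second session is the quadratic eigenvalue problem (lambda^2 M + lambda C + K) x = 0 of damped vibration analysis, which the standard companion linearization reduces to a generalized linear eigenproblem. A sketch with toy matrices:

        import numpy as np
        from scipy.linalg import eig

        # (lambda^2 M + lambda C + K) x = 0: toy mass/damping/stiffness matrices
        M = np.eye(2)
        C = np.array([[0.4, -0.1], [-0.1, 0.4]])
        K = np.array([[2.0, -1.0], [-1.0, 2.0]])

        n = M.shape[0]
        Z, I = np.zeros((n, n)), np.eye(n)
        # Companion form: A y = lambda B y with y = [x, lambda x]
        A = np.block([[Z, I], [-K, -C]])
        B = np.block([[I, Z], [Z, M]])

        eigvals = eig(A, B, right=False)   # 2n eigenvalues of the quadratic pencil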

  13. Evidence-Based Ethics for Neurology and Psychiatry Research

    PubMed Central

    Kim, Scott Y. H.

    2004-01-01

    Summary: American bioethics, historically arising out of theology and philosophy, has been dominated by the method of normative analysis. Ethics as policy, however, requires in addition a solid evidence base. This paper discusses the background conditions that make neurotherapeutics research particularly challenging. Three key ethical issues are discussed within an evidence-based ethics framework: the ethical challenges arising from changes in the financial incentive structures for academic researchers and their institutions, the challenges of risk-benefit analysis for neurotherapeutics protocols testing innovative interventions, and the evolving issues surrounding impaired decision-making capacity and surrogate consent for research. For each of these issues, selected empirical data are reviewed, areas for further inquiry are noted, and the need for development of novel methods for bioethics policy research is discussed. PMID:15717040

  14. Cluster randomization and political philosophy.

    PubMed

    Chwang, Eric

    2012-11-01

    In this paper, I will argue that, while the ethical issues raised by cluster randomization can be challenging, they are not new. My thesis divides neatly into two parts. In the first, easier part I argue that many of the ethical challenges posed by cluster randomized human subjects research are clearly present in other types of human subjects research, and so are not novel. In the second, more difficult part I discuss the thorniest ethical challenge for cluster randomized research--cases where consent is genuinely impractical to obtain. I argue that once again these cases require no new analytic insight; instead, we should look to political philosophy for guidance. In other words, the most serious ethical problem that arises in cluster randomized research also arises in political philosophy. © 2011 Blackwell Publishing Ltd.

  15. Relative Binding Free Energy Calculations in Drug Discovery: Recent Advances and Practical Considerations.

    PubMed

    Cournia, Zoe; Allen, Bryce; Sherman, Woody

    2017-12-26

    Accurate in silico prediction of protein-ligand binding affinities has been a primary objective of structure-based drug design for decades due to the putative value it would bring to the drug discovery process. However, computational methods have historically failed to deliver value in real-world drug discovery applications due to a variety of scientific, technical, and practical challenges. Recently, a family of approaches commonly referred to as relative binding free energy (RBFE) calculations, which rely on physics-based molecular simulations and statistical mechanics, have shown promise in reliably generating accurate predictions in the context of drug discovery projects. This advance arises from accumulating developments in the underlying scientific methods (decades of research on force fields and sampling algorithms) coupled with vast increases in computational resources (graphics processing units and cloud infrastructures). Mounting evidence from retrospective validation studies, blind challenge predictions, and prospective applications suggests that RBFE simulations can now predict the affinity differences for congeneric ligands with sufficient accuracy and throughput to deliver considerable value in hit-to-lead and lead optimization efforts. Here, we present an overview of current RBFE implementations, highlighting recent advances and remaining challenges, along with examples that emphasize practical considerations for obtaining reliable RBFE results. We focus specifically on relative binding free energies because the calculations are less computationally intensive than absolute binding free energy (ABFE) calculations and map directly onto the hit-to-lead and lead optimization processes, where the prediction of relative binding energies between a reference molecule and new ideas (virtual molecules) can be used to prioritize molecules for synthesis. We describe the critical aspects of running RBFE calculations, from both theoretical and applied perspectives, using a combination of retrospective literature examples and prospective studies from drug discovery projects. This work is intended to provide a contemporary overview of the scientific, technical, and practical issues associated with running relative binding free energy simulations, with a focus on real-world drug discovery applications. We offer guidelines for improving the accuracy of RBFE simulations, especially for challenging cases, and emphasize unresolved issues that could be improved by further research in the field.
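
    The bookkeeping behind an RBFE prediction is the thermodynamic cycle ΔΔG_bind = ΔG_complex − ΔG_solvent. A minimal sketch with hypothetical per-leg free energies (real values would come from the alchemical simulations described above):

    ```python
    # Hypothetical alchemical results (kcal/mol) for mutating a reference
    # ligand R into candidates A and B, in the complex and in solvent.
    legs = {
        "A": {"complex": -3.1, "solvent": -1.4},
        "B": {"complex": -0.6, "solvent": -1.9},
    }

    def ddg_bind(leg):
        # Thermodynamic cycle: ddG_bind = dG_complex - dG_solvent
        return leg["complex"] - leg["solvent"]

    # Rank candidates for synthesis: more negative = tighter than reference
    for mol in sorted(legs, key=lambda m: ddg_bind(legs[m])):
        print(f"{mol}: ddG_bind = {ddg_bind(legs[mol]):+.2f} kcal/mol")
    ```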

  16. New design for interfacing computers to the Octopus network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sloan, L.J.

    1977-03-14

    The Lawrence Livermore Laboratory has several large-scale computers which are connected to the Octopus network. Several difficulties arise in providing adequate resources along with reliable performance. To alleviate some of these problems a new method of bringing large computers into the Octopus environment is proposed.

  17. Parallel block schemes for large scale least squares computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golub, G.H.; Plemmons, R.J.; Sameh, A.

    1986-04-01

    Large scale least squares computations arise in a variety of scientific and engineering problems, including geodetic adjustments and surveys, medical image analysis, molecular structures, partial differential equations and substructuring methods in structural engineering. In each of these problems, matrices often arise which possess a block structure which reflects the local connection nature of the underlying physical problem. For example, such super-large nonlinear least squares computations arise in geodesy. Here the coordinates of positions are calculated by iteratively solving overdetermined systems of nonlinear equations by the Gauss-Newton method. The US National Geodetic Survey will complete this year (1986) the readjustment of the North American Datum, a problem which involves over 540 thousand unknowns and over 6.5 million observations (equations). The observation matrix for these least squares computations has a block angular form with 161 diagonal blocks, each containing 3 to 4 thousand unknowns. In this paper parallel schemes are suggested for the orthogonal factorization of matrices in block angular form and for the associated backsubstitution phase of the least squares computations. In addition, a parallel scheme for the calculation of certain elements of the covariance matrix for such problems is described. It is shown that these algorithms are ideally suited for multiprocessors with three levels of parallelism such as the Cedar system at the University of Illinois. 20 refs., 7 figs.
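
    A minimal serial sketch of the factorization scheme described above (illustrative, not the paper's code): each diagonal block is QR-factorized independently, which is the step that parallelizes across blocks; a reduced least squares problem is then solved for the coupling unknowns, and the block unknowns follow by back-substitution.

    ```python
    import numpy as np

    def block_angular_lstsq(blocks, coupling, rhs):
        """Least squares for min || [diag(B_1..B_k) | C] [x; z] - b ||_2."""
        reduced_T, reduced_d, factors = [], [], []
        for B, C, b in zip(blocks, coupling, rhs):
            n = B.shape[1]
            Q, R = np.linalg.qr(B, mode="complete")  # independent per block
            S, c = Q.T @ C, Q.T @ b
            factors.append((R[:n], S[:n], c[:n]))    # kept for back-substitution
            reduced_T.append(S[n:])                  # rows constraining z only
            reduced_d.append(c[n:])
        z, *_ = np.linalg.lstsq(np.vstack(reduced_T),
                                np.concatenate(reduced_d), rcond=None)
        xs = [np.linalg.solve(R, c - S @ z) for R, S, c in factors]
        return xs, z

    rng = np.random.default_rng(0)
    blocks = [rng.normal(size=(6, 3)) for _ in range(4)]    # diagonal blocks
    coupling = [rng.normal(size=(6, 2)) for _ in range(4)]  # shared columns
    rhs = [rng.normal(size=6) for _ in range(4)]
    xs, z = block_angular_lstsq(blocks, coupling, rhs)
    ```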

  18. Computational models for predicting interactions with membrane transporters.

    PubMed

    Xu, Y; Shen, Q; Liu, X; Lu, J; Li, S; Luo, C; Gong, L; Luo, X; Zheng, M; Jiang, H

    2013-01-01

    Membrane transporters, including the two major families of ATP-binding cassette (ABC) transporters and solute carrier (SLC) transporters, are proteins that play important roles in moving molecules into and out of cells. Consequently, these transporters can be major determinants of the therapeutic efficacy, toxicity and pharmacokinetics of a variety of drugs. Given the time and expense that biological experiments require, and since research should be driven by evaluation of efficacy and safety, computational methods have arisen as a complementary choice. In this article, we provide an overview of the contributions that computational methods have made to the transporter field over the past decades. We begin with a brief introduction to the structure and function of the major members of the two transporter families. In the second part, we focus on widely used computational methods in different aspects of transporter research. In the absence of a high-resolution structure for most transporters, homology modeling is a useful tool for interpreting experimental data and potentially guiding experimental studies; we summarize reported homology models in this review. Computational studies cover the major members of the transporters and a variety of topics, including the classification of substrates and/or inhibitors, prediction of protein-ligand interactions, constitution of the binding pocket, phenotypes of non-synonymous single-nucleotide polymorphisms, and conformational analyses that try to explain the mechanism of action. As an example, one of the most important transporters, P-gp, is elaborated on to explain the differences and advantages of various computational models. In the third part, we discuss the challenges of developing computational methods that yield reliable predictions, as well as potential future directions in transporter-related modeling.
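
    As a hedged illustration of the substrate/inhibitor classification task mentioned above, the sketch below trains a generic classifier on placeholder fingerprint vectors; a real study would use curated molecular descriptors and experimentally labeled compounds.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(42)
    X = rng.integers(0, 2, size=(200, 128))  # stand-in binary fingerprints
    y = rng.integers(0, 2, size=200)         # 1 = substrate, 0 = non-substrate

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```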

  19. Detecting brain tumor in computed tomography images using Markov random fields and fuzzy C-means clustering techniques

    NASA Astrophysics Data System (ADS)

    Abdulbaqi, Hayder Saad; Jafri, Mohd Zubir Mat; Omar, Ahmad Fairuz; Mustafa, Iskandar Shahrim Bin; Abood, Loay Kadom

    2015-04-01

    Brain tumors are abnormal growths of tissue in the brain. They may arise in people of any age. They must be detected early, diagnosed accurately, monitored carefully, and treated effectively in order to optimize patient outcomes regarding both survival and quality of life. Manual segmentation of brain tumors from CT scan images is a challenging and time-consuming task. Accurate detection of the size and location of a brain tumor plays a vital role in successful diagnosis and treatment, and tumor detection is considered a challenging mission in medical image processing. The aim of this paper is to introduce a scheme for tumor detection in CT scan images using two different techniques: Hidden Markov Random Fields (HMRF) and Fuzzy C-means (FCM). The method proposed in this research constructs a hybrid of HMRF and thresholding. These methods have been applied to four different patient data sets. A comparison among these methods shows that the proposed method gives good results for brain tissue detection and is more robust and effective than the FCM technique.
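
    For reference, the FCM baseline above alternates two updates: cluster centers as membership-weighted means, and memberships from inverse relative distances. A minimal NumPy sketch (illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
        """Return cluster centers and a (c, N) membership matrix."""
        rng = np.random.default_rng(seed)
        u = rng.random((c, len(X)))
        u /= u.sum(axis=0)                       # memberships sum to 1 per point
        for _ in range(iters):
            um = u ** m
            centers = (um @ X) / um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None] - centers[:, None], axis=2) + 1e-12
            w = d ** (-2.0 / (m - 1.0))          # inverse relative distances
            u = w / w.sum(axis=0)
        return centers, u

    # Toy use: separate bright "tumor" pixels from darker background
    vals = np.concatenate([np.random.normal(0.2, 0.05, 500),
                           np.random.normal(0.8, 0.05, 50)]).reshape(-1, 1)
    centers, u = fuzzy_c_means(vals, c=2)
    labels = u.argmax(axis=0)                    # hard segmentation
    ```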

  20. Random Walk Particle Tracking For Multiphase Heat Transfer

    NASA Astrophysics Data System (ADS)

    Lattanzi, Aaron; Yin, Xiaolong; Hrenya, Christine

    2017-11-01

    As computing capabilities have advanced, direct numerical simulation (DNS) has become a highly effective tool for quantitatively predicting the heat transfer within multiphase flows. Here we utilize a hybrid DNS framework that couples the lattice Boltzmann method (LBM) to the random walk particle tracking (RWPT) algorithm. The main difficulty of such a hybrid is that discontinuous fields are hard for the RWPT framework to handle, so special attention must be given to interfaces. We derive a method for addressing discontinuities in the diffusivity field, arising at the interface between two phases. Analytical means are utilized to develop an interfacial tracer balance and modify the RWPT algorithm. By expanding the modulus of the stochastic (diffusive) step and only allowing a subset of the tracers within the high diffusivity medium to undergo a diffusive step, the correct equilibrium state can be restored (globally homogeneous tracer distribution). The new RWPT algorithm is implemented within the SUSP3D code and verified against a variety of systems: effective diffusivity of a static gas-solids mixture, hot sphere in unbounded diffusion, cooling sphere in unbounded diffusion, and uniform flow past a hot sphere.

  1. 20 CFR 410.510 - Computation of benefits.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., TITLE IV-BLACK LUNG BENEFITS (1969- ) Payment of Benefits § 410.510 Computation of benefits. (a) Basic... benefits) based on the disability or death due to pneumoconiosis arising out of the coal mine employment of...

  2. Peer-Assessment in Higher Education--Twenty-First Century Practices, Challenges and the Way Forward

    ERIC Educational Resources Information Center

    Ashenafi, Michael Mogessie

    2017-01-01

    Peer assessment in higher education has been studied for decades. Despite the substantial amount of research carried out, peer assessment has yet to make significant advances. This review identifies themes of recent research and highlights the challenges that have hampered its advance. Most of these challenges arise from the manual nature of peer…

  3. Computer memory: the LLL experience. [Octopus computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1976-02-01

    Those aspects of Octopus computer network design are reviewed that relate to memory and storage. Emphasis is placed on the difficulties and problems that arise because of the limitations of present storage devices, and indications are made of the directions in which technological advance could be of most value. (auth)

  4. Polyomino Problems to Confuse Computers

    ERIC Educational Resources Information Center

    Coffin, Stewart

    2009-01-01

    Computers are very good at solving certain types of combinatorial problems, such as fitting sets of polyomino pieces into square or rectangular trays of a given size. However, most puzzle-solving programs now in use assume orthogonal arrangements. When one departs from the usual square grid layout, complications arise. The author--using a computer,…

  5. Digital Maps, Matrices and Computer Algebra

    ERIC Educational Resources Information Center

    Knight, D. G.

    2005-01-01

    The way in which computer algebra systems, such as Maple, have made the study of complex problems accessible to undergraduate mathematicians with modest computational skills is illustrated by some large matrix calculations, which arise from representing the Earth's surface by digital elevation models. Such problems are often considered to lie in…

  6. An overview of current approaches and future challenges in physiological monitoring

    NASA Technical Reports Server (NTRS)

    Horst, Richard L.

    1988-01-01

    Sufficient evidence exists from laboratory studies to suggest that physiological measures can be useful as an adjunct to behavioral and subjective measures of human performance and capabilities. Thus it is reasonable to address the conceptual and engineering challenges that arise in applying this technology in operational settings. Issues reviewed include the advantages and disadvantages of constructs such as mental states, the need for physiological measures of performance, areas of application for physiological measures in operational settings, which measures appear to be most useful, problem areas that arise in the use of these measures in operational settings, and directions for future development.

  7. The Challenges of Digital Leadership

    ERIC Educational Resources Information Center

    McLeod, Scott

    2015-01-01

    Because digital devices and online environments can simultaneously be transformatively empowering and maddeningly disruptive, the work of integrating digital learning tools into schools is usually difficult and complex. Common challenges arise, however, and can be thoughtfully addressed by proactive leadership. In the end, technology change in…

  8. An Algorithm for Pedestrian Detection in Multispectral Image Sequences

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Fedorenko, V. V.

    2017-05-01

    The growing interest in self-driving cars creates a demand for scene understanding and obstacle detection algorithms. One of the most challenging problems in this field is pedestrian detection. The main difficulties arise from the diverse appearance of pedestrians. Poor visibility conditions, such as fog and low light, also significantly decrease the quality of pedestrian detection. This paper presents a new optical-flow-based algorithm, BipedDetect, that provides robust pedestrian detection on a single-board computer. The algorithm is based on the idea of simplified Kalman filtering suitable for realization on modern single-board computers. To detect a pedestrian, a synthetic optical flow of the scene without pedestrians is generated using a slanted-plane model. The estimate of the real optical flow is generated using a multispectral image sequence. The difference between the synthetic optical flow and the real optical flow yields the optical flow induced by pedestrians. The final detection of pedestrians is done by segmenting the difference of the optical flows. To evaluate the BipedDetect algorithm, a multispectral dataset was collected using a mobile robot.
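
    The detection principle described above, segmenting the difference between a synthetic ego-motion flow and the measured flow, can be sketched as follows. This is an editorial illustration using OpenCV's dense Farneback flow, not the BipedDetect code; `synthetic_flow` is assumed to be precomputed from a slanted-plane ego-motion model.

    ```python
    import cv2
    import numpy as np

    def pedestrian_mask(prev_gray, next_gray, synthetic_flow, thresh=2.0):
        """Binary mask of pixels whose motion deviates from the ego-motion model."""
        real_flow = cv2.calcOpticalFlowFarneback(
            prev_gray, next_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        residual = real_flow - synthetic_flow      # flow induced by pedestrians
        magnitude = np.linalg.norm(residual, axis=2)
        return (magnitude > thresh).astype(np.uint8)
    ```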

  9. Autogrid-based clustering of kinases: selection of representative conformations for docking purposes.

    PubMed

    Marzaro, Giovanni; Ferrarese, Alessandro; Chilin, Adriana

    2014-08-01

    The selection of the most appropriate protein conformation is a crucial aspect of molecular docking experiments. In order to reduce the errors arising from the use of a single protein conformation, several authors suggest using several three-dimensional structures for the target. However, selecting the most appropriate protein conformations remains a challenging goal. The selection of protein 3D structures is mainly performed by computing pairwise root-mean-square deviation (RMSD) values, followed by hierarchical clustering. Herein we report an alternative strategy, based on the computation of only two atom affinity maps for each protein conformation, followed by multivariate analysis and hierarchical clustering. This methodology was applied to seven different kinases of pharmaceutical interest. The comparison with the classical RMSD-based strategy was based on cross-docking of co-crystallized ligands. In the case of the epidermal growth factor receptor kinase, the docking performance on 220 known ligands was also evaluated, followed by 3D-QSAR studies. In all cases, the methodology proposed herein outperformed the RMSD-based one.
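
    The pipeline described above (affinity-map features, multivariate analysis, hierarchical clustering) can be sketched as follows; the array sizes and random data are placeholders, not the authors' grid maps.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.decomposition import PCA

    maps = np.random.rand(12, 2 * 40**3)   # 12 conformations x two flattened grids
    features = PCA(n_components=5).fit_transform(maps)   # multivariate analysis
    Z = linkage(features, method="average")              # hierarchical clustering
    clusters = fcluster(Z, t=4, criterion="maxclust")
    representatives = [np.where(clusters == k)[0][0] for k in np.unique(clusters)]
    ```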

  10. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell’s Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907
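
    Both solver families discussed above are available off the shelf, so the comparison can be reproduced in miniature. The sketch below contrasts SciPy's Powell-hybrid ("hybr") and Levenberg-Marquardt ("lm") backends on a toy nonlinear system; it is illustrative only and has no connection to HVACSIM+ itself.

    ```python
    import numpy as np
    from scipy.optimize import root

    def residuals(x):
        # A small nonlinear system standing in for a building-model residual
        return [x[0]**2 + x[1]**2 - 4.0,
                np.exp(x[0]) + x[1] - 1.0]

    for method in ("hybr", "lm"):   # Powell hybrid vs Levenberg-Marquardt
        sol = root(residuals, x0=[1.0, 1.0], method=method)
        print(method, sol.success, sol.x)
    ```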

  11. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds that considerable computational benefits result from replacing the Powell's Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+.

  12. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    2017-06-01

    Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including greater-than-2-body interactions and model transferability to different state points, and we discuss approaches to ensure a smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
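
    A minimal sketch of the force-matching idea, reduced to a single two-body term with synthetic data standing in for the DFT forces:

    ```python
    import numpy as np

    r = np.linspace(0.8, 4.0, 200)    # pair distances sampled by the trajectory
    F = 12 / r**7 - 6 / r**4 + np.random.normal(0, 0.01, r.size)  # fake forces

    # Map the sampled forces onto a Chebyshev polynomial series in r
    fit = np.polynomial.chebyshev.Chebyshev.fit(r, F, deg=12)
    print(fit(1.5))                   # fitted two-body force at r = 1.5
    ```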

  13. Financing Academic Departments of Psychiatry

    ERIC Educational Resources Information Center

    Liptzin, Benjamin; Meyer, Roger E.

    2011-01-01

    Objective: The authors describe the many financial challenges facing academic departments of psychiatry and the resulting opportunities that may arise. Method: The authors review the history of financial challenges, the current economic situation, and what may lie ahead for academic departments of psychiatry. Results: The current environment has…

  14. Software Carpentry and the Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

    Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: (1) write programs for people, not computers; (2) automate repetitive tasks; (3) use the computer to record history; (4) make incremental changes; (5) use version control; (6) don't repeat yourself (or others); (7) plan for mistakes; (8) optimize software only after it works; (9) document design and purpose, not mechanics; and (10) collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.

  15. LLNL Mercury Project Trinity Open Science Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brantley, Patrick; Dawson, Shawn; McKinley, Scott

    2016-04-20

    The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate the convergence of the relevant simulation quantities with Monte Carlo particle count in order to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.

  16. Integrative Analysis of Many RNA-Seq Datasets to Study Alternative Splicing

    PubMed Central

    Li, Wenyuan; Dai, Chao; Kang, Shuli; Zhou, Xianghong Jasmine

    2014-01-01

    Alternative splicing is an important gene regulatory mechanism that dramatically increases the complexity of the proteome. However, how alternative splicing is regulated and how transcription and splicing are coordinated are still poorly understood, and functions of transcript isoforms have been studied only in a few limited cases. Nowadays, RNA-seq technology provides an exceptional opportunity to study alternative splicing on genome-wide scales and in an unbiased manner. With the rapid accumulation of data in public repositories, new challenges arise from the urgent need to effectively integrate many different RNA-seq datasets for studying alternative splicing. This paper discusses a set of advanced computational methods that can integrate and analyze many RNA-seq datasets to systematically identify splicing modules, unravel the coupling of transcription and splicing, and predict the functions of splicing isoforms on a genome-wide scale. PMID:24583115

  17. Exploiting Publication Contents and Collaboration Networks for Collaborator Recommendation

    PubMed Central

    Kong, Xiangjie; Jiang, Huizhen; Yang, Zhuo; Xu, Zhenzhen; Xia, Feng; Tolba, Amr

    2016-01-01

    Thanks to the proliferation of online social networks, it has become conventional for researchers to communicate and collaborate with each other. Meanwhile, one critical challenge arises: how to find the most relevant potential collaborators for each researcher. In this work, we propose a novel collaborator recommendation model called CCRec, which combines information on researchers’ publications with the collaboration network to generate better recommendations. In order to effectively identify the most promising collaborators for researchers, we adopt a topic clustering model to identify academic domains, as well as a random walk model to compute researchers’ feature vectors. Using DBLP datasets, we conduct benchmarking experiments to examine the performance of CCRec. The experimental results show that CCRec outperforms other state-of-the-art methods in terms of precision, recall and F1 score. PMID:26849682
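
    The random walk component mentioned above can be sketched as a random walk with restart over a toy collaboration network (an editorial illustration; CCRec's actual feature construction is richer):

    ```python
    import numpy as np

    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)   # toy co-author adjacency
    W = A / A.sum(axis=1, keepdims=True)        # row-stochastic transitions

    def random_walk_with_restart(W, seed, r=0.15, iters=100):
        e = np.eye(len(W))[seed]                # restart distribution
        p = e.copy()
        for _ in range(iters):
            p = (1 - r) * W.T @ p + r * e
        return p                                # proximity scores to `seed`

    print(random_walk_with_restart(W, seed=0))  # candidate collaborator scores
    ```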

  18. Ubiquitous human computing.

    PubMed

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  19. First-Principles Monte Carlo Simulations of Reaction Equilibria in Compressed Vapors

    PubMed Central

    2016-01-01

    Predictive modeling of reaction equilibria presents one of the grand challenges in the field of molecular simulation. Difficulties in the study of such systems arise from the need (i) to accurately model both strong, short-ranged interactions leading to the formation of chemical bonds and weak interactions arising from the environment, and (ii) to sample the range of time scales involving frequent molecular collisions, slow diffusion, and infrequent reactive events. Here we present a novel reactive first-principles Monte Carlo (RxFPMC) approach that allows for investigation of reaction equilibria without the need to prespecify a set of chemical reactions and their ideal-gas equilibrium constants. We apply RxFPMC to investigate a nitrogen/oxygen mixture at T = 3000 K and p = 30 GPa, i.e., conditions that are present in atmospheric lightning strikes and explosions. The RxFPMC simulations show that the solvation environment leads to a significantly enhanced NO concentration that reaches a maximum when oxygen is present in slight excess. In addition, the RxFPMC simulations indicate the formation of NO2 and N2O in mole fractions approaching 1%, whereas N3 and O3 are not observed. The equilibrium distributions obtained from the RxFPMC simulations agree well with those from a thermochemical computer code parametrized to experimental data. PMID:27413785

  20. The Internet and the menopause consultation: menopause management in the third millennium.

    PubMed

    Cumming, Grant P; Currie, Heather

    2005-09-01

    The Internet was born in 1969; it was originally developed so that computers could share information on research and development in the scientific and military fields. The original Internet consisted of four university computers networked in the United States. Email became available two years later. The infant Internet initially required complex computing knowledge to be used. However, this was all to change with the development of the World Wide Web in the early 1990s, which made the Internet much more widely accessible. The Internet has since grown at a phenomenal rate and has evolved into a global communications tool. It is by nature anarchic, in that it is an unrestricted broadcast medium. Although this lack of censorship is a strength, it is also a weakness. The quality of information available on the Web is variable and discernment is required. With the growth of e-health, medicine and its allied specialties are faced with the challenges of providing their services in a novel way while maintaining the first principle of medicine, primum non nocere (first, do no harm). This provision of e-health care is in its infancy and this review explores issues arising from the use of the Internet as a medium for organizing menopausal health care in the third millennium.

  1. High-field Transport in Low Symmetry β-Ga2O3 Crystal

    NASA Astrophysics Data System (ADS)

    Ghosh, Krishnendu; Singisetti, Uttam

    High-field carrier transport plays an important role in many disciplines of electronics. Conventional transport theories work well for high-symmetry materials but lack insight as the crystal symmetry goes down. Newly emerging materials, many of which possess low symmetry, demand a more rigorous treatment of charge transport. We will present a comprehensive study of high-field transport using ab initio electron-phonon interaction (EPI) elements in a full-band Monte Carlo (FBMC) algorithm. We use monoclinic β-Ga2O3 as a benchmark low-symmetry material, which is also an emerging wide-bandgap semiconductor. β-Ga2O3 has the C2/m space group and a 10-atom primitive cell. In this work the EPIs are calculated within the density-functional perturbation theory framework. We will focus on the computational challenges arising from the many phonon modes and low crystal symmetry. Significant insights will be presented on the details of energy relaxation by hot electrons mediated by different phonon modes. We will also show the velocity-field curves of electrons in different crystal directions. The authors acknowledge the support from the National Science Foundation Grant (ECCS 1607833). The authors also acknowledge the computing support provided by the Center for Computational Research at the University at Buffalo.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.

    The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.

  3. Ethical Challenges in Biomarker-Driven Drug Development.

    PubMed

    Hey, Spencer Phillips

    2018-01-01

    The increasing importance of biomarkers-as drivers of research and drug development activity, surrogate outcomes in clinical trials, and the centerpiece of precision medicine-raises many new ethical challenges. In what follows, I briefly review some of the major ethical challenges and debates already identified in the literature, and then describe a new ethical challenge that arises from the abstract nature of biomarker hypotheses. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  4. Towards addressing transient learning challenges in undergraduate physics: an example from electrostatics

    NASA Astrophysics Data System (ADS)

    Fredlund, T.; Linder, C.; Airey, J.

    2015-09-01

    In this article we characterize transient learning challenges as learning challenges that arise out of teaching situations rather than conflicts with prior knowledge. We propose that these learning challenges can be identified by paying careful attention to the representations that students produce. Once a transient learning challenge has been identified, teachers can create interventions to address it. By illustration, we argue that an appropriate way to design such interventions is to create variation around the disciplinary-relevant aspects associated with the transient learning challenge.

  5. Mucosal melanoma: a clinically and biologically unique disease entity.

    PubMed

    Carvajal, Richard D; Spencer, Sharon A; Lydiatt, William

    2012-03-01

    Mucosal melanoma (MM) is an aggressive and clinically complex malignancy made more challenging by its relative rarity. Because of the rarity of MM as a whole, and because of the unique biology and clinical challenges of MM arising from each anatomic location, understanding of this disease and its optimal management remains limited. The impact of various treatment strategies on disease control and survival has been difficult to assess because of the small size of most reported series of MM arising from any one particular site, the retrospective nature of most series, and the lack of a uniform comprehensive staging system for this disease. This article summarizes the clinical, pathologic, and molecular features, and the diagnostic and therapeutic considerations for the management of MM, underscoring the similarities and differences from cutaneous melanoma. Furthermore, the distinct clinical features and management implications unique to melanoma arising from the mucosal surfaces of the head and neck, the anorectal region, and the female genital tract are highlighted.

  6. Identifying Unique Ethical Challenges of Indigenous Field-Workers: A Commentary on Alexander and Richman's "Ethical Dilemmas in Evaluations Using Indigenous Research Workers"

    ERIC Educational Resources Information Center

    Smith, Nick L.

    2008-01-01

    In contrast with nonindigenous workers, to what extent do unique ethical problems arise when indigenous field-workers participate in field studies? Three aspects of study design and operation are considered: data integrity issues, risk issues, and protection issues. Although many of the data quality issues that arise with the use of indigenous…

  7. Closing the Guantanamo Detention Center: Legal Issues

    DTIC Science & Technology

    2009-07-20

    This report provides an overview of major legal issues that are likely to arise as a result of executive and legislative action to close the Guantanamo detention facility. It discusses legal issues related to the...combatants could pursue legal challenges regarding their detention or other wartime actions taken by the Executive. The Bush Administration initially...

  8. Closing the Guantanamo Detention Center: Legal Issues

    DTIC Science & Technology

    2009-11-17

    immigration consequences. This report provides an overview of major legal issues likely to arise as a result of executive and legislative action to...legal challenges regarding their detention or other wartime actions taken by the Executive. The Bush Administration initially believed that Guantanamo...

  9. Overview of the SAMPL5 host–guest challenge: Are we doing better?

    PubMed Central

    Yin, Jian; Henriksen, Niel M.; Slochower, David R.; Shirts, Michael R.; Chiu, Michael W.; Mobley, David L.; Gilson, Michael K.

    2016-01-01

    The ability to computationally predict protein-small molecule binding affinities with high accuracy would accelerate drug discovery and reduce its cost by eliminating rounds of trial-and-error synthesis and experimental evaluation of candidate ligands. As academic and industrial groups work toward this capability, there is an ongoing need for datasets that can be used to rigorously test new computational methods. Although protein–ligand data are clearly important for this purpose, their size and complexity make it difficult to obtain well-converged results and to troubleshoot computational methods. Host–guest systems offer a valuable alternative class of test cases, as they exemplify noncovalent molecular recognition but are far smaller and simpler. As a consequence, host–guest systems have been part of the prior two rounds of SAMPL prediction exercises, and they also figure in the present SAMPL5 round. In addition to being blinded, and thus avoiding biases that may arise in retrospective studies, the SAMPL challenges have the merit of focusing multiple researchers on a common set of molecular systems, so that methods may be compared and ideas exchanged. The present paper provides an overview of the host–guest component of SAMPL5, which centers on three different hosts, two octa-acids and a glycoluril-based molecular clip, and two different sets of guest molecules, in aqueous solution. A range of methods were applied, including electronic structure calculations with implicit solvent models; methods that combine empirical force fields with implicit solvent models; and explicit solvent free energy simulations. The most reliable methods tend to fall in the latter class, consistent with results in prior SAMPL rounds, but the level of accuracy is still below that sought for reliable computer-aided drug design. Advances in force field accuracy, modeling of protonation equilibria, electronic structure methods, and solvent models, hold promise for future improvements. PMID:27658802

  10. Overview of the SAMPL5 host-guest challenge: Are we doing better?

    PubMed

    Yin, Jian; Henriksen, Niel M; Slochower, David R; Shirts, Michael R; Chiu, Michael W; Mobley, David L; Gilson, Michael K

    2017-01-01

    The ability to computationally predict protein-small molecule binding affinities with high accuracy would accelerate drug discovery and reduce its cost by eliminating rounds of trial-and-error synthesis and experimental evaluation of candidate ligands. As academic and industrial groups work toward this capability, there is an ongoing need for datasets that can be used to rigorously test new computational methods. Although protein-ligand data are clearly important for this purpose, their size and complexity make it difficult to obtain well-converged results and to troubleshoot computational methods. Host-guest systems offer a valuable alternative class of test cases, as they exemplify noncovalent molecular recognition but are far smaller and simpler. As a consequence, host-guest systems have been part of the prior two rounds of SAMPL prediction exercises, and they also figure in the present SAMPL5 round. In addition to being blinded, and thus avoiding biases that may arise in retrospective studies, the SAMPL challenges have the merit of focusing multiple researchers on a common set of molecular systems, so that methods may be compared and ideas exchanged. The present paper provides an overview of the host-guest component of SAMPL5, which centers on three different hosts, two octa-acids and a glycoluril-based molecular clip, and two different sets of guest molecules, in aqueous solution. A range of methods were applied, including electronic structure calculations with implicit solvent models; methods that combine empirical force fields with implicit solvent models; and explicit solvent free energy simulations. The most reliable methods tend to fall in the latter class, consistent with results in prior SAMPL rounds, but the level of accuracy is still below that sought for reliable computer-aided drug design. Advances in force field accuracy, modeling of protonation equilibria, electronic structure methods, and solvent models, hold promise for future improvements.

  11. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory

    NASA Astrophysics Data System (ADS)

    Bozkaya, Uǧur

    2013-09-01

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011); doi:10.1063/1.3665134] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations; however, for OMP3 one does not, since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.

  12. Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory.

    PubMed

    Bozkaya, Uğur

    2013-09-14

    Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations; however, for OMP3 one does not, since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.

  13. Ethical dilemmas in pediatric and adolescent psychogenic nonepileptic seizures.

    PubMed

    Cole, Cristie M; Falcone, Tatiana; Caplan, Rochelle; Timmons-Mitchell, Jane; Jares, Kristine; Ford, Paul J

    2014-08-01

    To date, only a very narrow window of ethical dilemmas in psychogenic nonepileptic seizures (PNES) has been explored. Numerous distinct ethical dilemmas arise in diagnosing and treating pediatric and adolescent patients with PNESs. Important ethical values at stake include trust, transparency, confidentiality, professionalism, autonomy of all stakeholders, and justice. In order to further elucidate the ethical challenges in caring for this population, an ethical analysis of the special challenges faced in four specific domains is undertaken: (1) conducting and communicating a diagnosis of PNESs, (2) advising patients about full transparency and disclosure to community including patients' peers, (3) responding to requests to continue antiepileptic drugs, and (4) managing challenges arising from school policy and procedure. An analysis of these ethical issues is essential for the advancement of best care practices that promote the overall well-being of patients and their families. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Ethical Dilemmas in Pediatric and Adolescent Psychogenic Non-Epileptic Seizures

    PubMed Central

    Cole, Cristie M.; Falcone, Tatiana; Caplan, Rochelle; Timmons-Mitchell, Jane; Jares, Kristine; Ford, Paul J.

    2014-01-01

    To date, only a very narrow window of ethical dilemmas in psychogenic non-epileptic seizures (PNES) has been explored. Numerous distinct ethical dilemmas arise in diagnosing and treating pediatric and adolescent patients with PNES. Important ethical values at stake include trust, transparency, confidentiality, professionalism, autonomy of all stakeholders and justice. In order to further elucidate the ethical challenges in caring for this population, an ethical analysis of the special challenges faced in four specific domains is undertaken: (1) conducting and communicating a diagnosis of PNES; (2) advising patients about full transparency and disclosure to community including patients’ peers; (3) responding to requests to continue anti-epileptic drugs; and (4) managing challenges arising from school policy and procedure. An analysis of these ethical issues is essential for the advancement of best care practices that promote the overall well-being of patients and their families. PMID:25022823

  15. Classroom Management Challenges in the Dance Class

    ERIC Educational Resources Information Center

    Clark, Dawn

    2007-01-01

    Teaching dance can be a challenge because of the unique classroom-management situations that arise from the dynamic nature of the class content. Management is a delicate navigation of advance planning; rule setting; the establishment and implementation of daily protocols, routines, and interventions; and the teacher's own presentation. This…

  16. An Ethics Challenge for School Counselors

    ERIC Educational Resources Information Center

    Froeschle, Janet G.; Crews, Charles

    2010-01-01

    Ethical issues arise more often for school counselors than for those who work in other settings (Remley, 2002). The challenge of working not only with minors but also with other stakeholders including parents, teachers, school administrators, and community members sets the stage for potential legal and ethical dilemmas. Awareness and adherence to…

  17. Technology and Higher Education: Challenges in the Halls of Academe

    ERIC Educational Resources Information Center

    Duhaney, Devon C.

    2005-01-01

    As new technologies become a part of the 'landscape' of universities and colleges, many questions arise concerning the transformation resulting from their utilization in these institutions. This paper discusses the increasing use of technology for education, training, and development in higher education, and challenges that higher education faces…

  18. Action Research Facilitated by University-School Collaboration

    ERIC Educational Resources Information Center

    Yuan, Rui; Lee, Icy

    2015-01-01

    While Action Research (AR) is promoted as a powerful route for teachers' professional development, different contextual challenges may arise during the process; teachers may be helped to overcome these challenges with the guidance of external facilitators. Drawing on data from interviews and the teachers' AR reports, this article explores how two…

  19. Access and benefit sharing (ABS) under the convention on biological diversity (CBD): implications for microbial biological control

    USDA-ARS?s Scientific Manuscript database

    Researchers and implementers of biological control are confronted with a variety of scientific, regulatory and administrative challenges to their biological control programs. One developing challenge will arise from the implementation of provisions of the Convention on Biological Diversity (CBD) co...

  20. Tensions and Challenges in China's Education Policy Borrowing

    ERIC Educational Resources Information Center

    Tan, Charlene

    2016-01-01

    Background: This article critically discusses the key tensions and challenges arising from the educational policy borrowing in China, through its current education reform. Focussing on the new curriculum reform (NCR), the paper highlights the interactions and conflicts between foreign and local ideologies and practices. Sources of evidence: The…

  1. Researching across Boundaries and Borders: The Challenges for Research

    ERIC Educational Resources Information Center

    Bowl, Marion; Cooke, Sandra; Hockings, Christine

    2008-01-01

    This article explores some of the challenges of conducting action research in higher education. It arises from an ongoing research project funded by the Economic and Social Research Council's Teaching and Learning Research Programme (ESRC/TLRP), "Learning and Teaching for Social Diversity and Difference", which examines the dynamics of…

  2. Implementation Challenges for a Constructivist Physical Education Curriculum

    ERIC Educational Resources Information Center

    Zhu, Xihe; Ennis, Catherine D.; Chen, Ang

    2011-01-01

    Background: Curriculum fidelity describes the extent to which a curriculum is implemented faithfully as planned. Curriculum fidelity issues may arise when teachers implement the curriculum inconsistently due to differences in philosophy, barriers in the setting, or other local concerns. Purpose: The study examined challenges that a teacher faced…

  3. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was carried out on the computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design of the system architecture was completed for software to analyze nonlinear control systems using database computing.

  4. Families and Home Computer Use: Exploring Parent Perceptions of the Importance of Current Technology

    ERIC Educational Resources Information Center

    Ortiz, Robert W.; Green, Tim; Lim, HeeJeong

    2011-01-01

    Many families today have access to computers that help them with their daily living activities, such as finding employment and helping children with schoolwork. With more families owning personal computers, questions arise as to the role they play in these households. An exploratory study was conducted looking at parents whose children were…

  5. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    ERIC Educational Resources Information Center

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  6. Assessment of background hydrogen by the Monte Carlo computer code MCNP-4A during measurements of total body nitrogen.

    PubMed

    Ryde, S J; al-Agel, F A; Evans, C J; Hancock, D A

    2000-05-01

    The use of a hydrogen internal standard to enable the estimation of absolute mass during measurement of total body nitrogen by in vivo neutron activation is an established technique. Central to the technique is a determination of the H prompt gamma ray counts arising from the subject. In practice, interference counts from other sources--e.g., neutron shielding--are included. This study reports use of the Monte Carlo computer code, MCNP-4A, to investigate the interference counts arising from shielding both with and without a phantom containing a urea solution. Over a range of phantom size (depth 5 to 30 cm, width 20 to 40 cm), the counts arising from shielding increased by between 4% and 32% compared with the counts without a phantom. For any given depth, the counts increased approximately linearly with width. For any given width, there was little increase for depths exceeding 15 cm. The shielding counts comprised between 15% and 26% of those arising from the urea phantom. These results, although specific to the Swansea apparatus, suggest that extraneous hydrogen counts can be considerable and depend strongly on the subject's size.

  7. Computer algebra and operators

    NASA Technical Reports Server (NTRS)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
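
    As a minimal illustration of the first of these capabilities -- manipulating expressions from the algebra generated by operators -- the following Python/SymPy sketch (our example, not software from the original work) verifies the canonical commutation relation [d/dx, x] = 1 by letting both operator orderings act on a symbolic test function:

      import sympy as sp

      x = sp.symbols('x')
      f = sp.Function('f')

      # Act with the composed operators (d/dx) x and x (d/dx) on a test function f(x).
      Dx_x = sp.diff(x * f(x), x)   # (d/dx)(x f) = f + x f'
      x_Dx = x * sp.diff(f(x), x)   # x (d/dx) f  = x f'

      print(sp.simplify(Dx_x - x_Dx))   # f(x): the commutator [d/dx, x] acts as the identity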

  8. The maps problem and the mapping problem: Two challenges for a cognitive neuroscience of speech and language

    PubMed Central

    Poeppel, David

    2012-01-01

    Research on the brain basis of speech and language faces theoretical and empirical challenges. The majority of current research, dominated by imaging, deficit-lesion, and electrophysiological techniques, seeks to identify regions that underpin aspects of language processing such as phonology, syntax, or semantics. The emphasis lies on localization and spatial characterization of function. The first part of the paper deals with a practical challenge that arises in the context of such a research program. This maps problem concerns the extent to which spatial information and localization can satisfy the explanatory needs for perception and cognition. Several areas of investigation exemplify how the neural basis of speech and language is discussed in those terms (regions, streams, hemispheres, networks). The second part of the paper turns to a more troublesome challenge, namely how to formulate the formal links between neurobiology and cognition. This principled problem thus addresses the relation between the primitives of cognition (here speech, language) and neurobiology. Dealing with this mapping problem invites the development of linking hypotheses between the domains. The cognitive sciences provide granular, theoretically motivated claims about the structure of various domains (the ‘cognome’); neurobiology, similarly, provides a list of the available neural structures. However, explanatory connections will require crafting computationally explicit linking hypotheses at the right level of abstraction. For both the practical maps problem and the principled mapping problem, developmental approaches and evidence can play a central role in the resolution. PMID:23017085

  9. Bio-inspired computational heuristics to study Lane-Emden systems arising in astrophysics model.

    PubMed

    Ahmad, Iftikhar; Raja, Muhammad Asif Zahoor; Bilal, Muhammad; Ashraf, Farooq

    2016-01-01

    This study reports novel hybrid computational methods for the solution of the nonlinear singular Lane-Emden type differential equations arising in astrophysics models, exploiting the strength of unsupervised neural network models and stochastic optimization techniques. In the scheme, the neural network, a sub-field of soft computing, is exploited for modelling of the equation in an unsupervised manner. The proposed approximate solutions of the higher order ordinary differential equation are calculated with the weights of neural networks trained with a genetic algorithm, and pattern search hybridized with sequential quadratic programming for rapid local convergence. The results of the proposed solvers for the nonlinear singular systems are in good agreement with the standard solutions. Accuracy and convergence of the designed schemes are demonstrated by statistical performance measures based on a sufficiently large number of independent runs.
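
    A conventional reference solution of the kind such stochastic solvers are validated against can be generated with a standard ODE integrator. The sketch below (our illustration, not the authors' code) integrates the Lane-Emden equation y'' + (2/x)y' + y^n = 0, y(0) = 1, y'(0) = 0, for n = 5, where the closed-form solution y = (1 + x^2/3)^(-1/2) is available for comparison:

      import numpy as np
      from scipy.integrate import solve_ivp

      def lane_emden(x, z, n):
          # y'' + (2/x) y' + y**n = 0 rewritten as a first-order system
          y, yp = z
          return [yp, -2.0 * yp / x - y**n]

      n = 5
      x0 = 1e-6                              # start just off the singular point x = 0
      z0 = [1.0 - x0**2 / 6.0, -x0 / 3.0]    # series expansion near the centre
      sol = solve_ivp(lane_emden, [x0, 10.0], z0, args=(n,),
                      rtol=1e-9, atol=1e-12, dense_output=True)

      x = 4.0
      print(sol.sol(x)[0], (1.0 + x**2 / 3.0) ** -0.5)   # numerical vs. exact value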

  10. Final Report, DE-FG01-06ER25718 Domain Decomposition and Parallel Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widlund, Olof B.

    2015-06-09

    The goal of this project is to develop and improve domain decomposition algorithms for a variety of partial differential equations such as those of linear elasticity and electromagnetics. These iterative methods are designed for massively parallel computing systems and allow the fast solution of the very large systems of algebraic equations that arise in large-scale and complicated simulations. A special emphasis is placed on problems arising from Maxwell's equations. The approximate solvers, the preconditioners, are combined with the conjugate gradient method and must always include a solver for a coarse model in order to have performance that is independent of the number of processors used in the computer simulation. A recent development allows for an adaptive construction of this coarse component of the preconditioner.
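
    The report's preconditioners involve coarse solves and are far more sophisticated, but the basic mechanics of pairing the conjugate gradient method with a preconditioner can be sketched in a few lines of Python (our illustration with a simple Jacobi preconditioner, not project code):

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import cg, LinearOperator

      # 1-D Poisson matrix as a stand-in for a discretized PDE system.
      n = 1000
      A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format='csr')
      b = np.ones(n)

      # Jacobi (diagonal) preconditioner: a crude one-level substitute for the
      # coarse-space-enhanced domain decomposition preconditioners discussed above.
      d_inv = 1.0 / A.diagonal()
      M = LinearOperator((n, n), matvec=lambda v: d_inv * v)

      x, info = cg(A, b, M=M)
      print(info, np.linalg.norm(A @ x - b))   # info == 0 signals convergence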

  11. Eigenmode computation of cavities with perturbed geometry using matrix perturbation methods applied on generalized eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gorgizadeh, Shahnam; Flisgen, Thomas; van Rienen, Ursula

    2018-07-01

    Generalized eigenvalue problems are standard problems in the computational sciences. In electromagnetics they arise, for example, from the discretization of the Helmholtz equation by the finite element method (FEM). Geometrical perturbations of the structure under concern lead to new generalized eigenvalue problems with different system matrices. Such perturbations may arise from manufacturing tolerances, harsh operating conditions or shape optimization. Directly solving the eigenvalue problem for each perturbation is computationally costly. The perturbed eigenpairs can instead be approximated using eigenpair derivatives. Two common approaches to calculating eigenpair derivatives, namely the modal superposition method and direct algebraic methods, are discussed in this paper. Based on the direct algebraic methods, an iterative algorithm is developed for efficiently calculating the eigenvalues and eigenvectors of the perturbed geometry from those of the unperturbed geometry.
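
    For symmetric-definite problems A x = lambda B x with B-orthonormal eigenvectors, the first-order eigenvalue update underlying such derivative-based approximations is dlambda ≈ x^T (dA - lambda dB) x. The Python sketch below (our numerical check, not the paper's iterative algorithm) compares this estimate against a direct re-solve:

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(0)
      n = 50

      def spd():
          # random symmetric positive definite matrix
          Q = rng.standard_normal((n, n))
          return Q @ Q.T + n * np.eye(n)

      def sym(M):
          return (M + M.T) / 2

      A, B = spd(), spd()
      w, V = eigh(A, B)                  # eigenvectors returned B-orthonormal

      dA = 1e-4 * sym(rng.standard_normal((n, n)))   # small symmetric perturbations
      dB = 1e-4 * sym(rng.standard_normal((n, n)))

      x = V[:, 0]                                    # eigenvector of the lowest eigenvalue
      w_pred = w[0] + x @ (dA - w[0] * dB) @ x       # first-order estimate
      w_true = eigh(A + dA, B + dB, eigvals_only=True)[0]
      print(w_pred, w_true)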

  12. Computer Graphics Simulations of Sampling Distributions.

    ERIC Educational Resources Information Center

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
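
    The same kind of simulation is straightforward to reproduce today; a minimal sketch (ours, in Python rather than the graphics environment of the article, with plotting omitted) for the distribution of sample proportions:

      import numpy as np

      rng = np.random.default_rng(1)
      p, n, reps = 0.3, 50, 10_000

      # Draw 10,000 samples of size n and record each sample proportion.
      p_hat = rng.binomial(n, p, size=reps) / n

      # The simulated sampling distribution is approximately normal with
      # mean p and standard deviation sqrt(p (1 - p) / n).
      print(p_hat.mean(), p_hat.std(), np.sqrt(p * (1 - p) / n))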

  13. Computational approaches to schizophrenia: A perspective on negative symptoms.

    PubMed

    Deserno, Lorenz; Heinz, Andreas; Schlagenhauf, Florian

    2017-08-01

    Schizophrenia is a heterogeneous spectrum disorder often associated with detrimental negative symptoms. In recent years, computational approaches to psychiatry have attracted growing attention. Negative symptoms have shown some overlap with general cognitive impairments and were also linked to impaired motivational processing in brain circuits implementing reward prediction. In this review, we outline how computational approaches may help to provide a better understanding of negative symptoms in terms of the potentially underlying behavioural and biological mechanisms. First, we describe the idea that negative symptoms could arise from a failure to represent reward expectations to enable flexible behavioural adaptation. It has been proposed that these impairments arise from a failure to use prediction errors to update expectations. Important previous studies focused on processing of so-called model-free prediction errors where learning is determined by past rewards only. However, learning and decision-making arise from multiple cognitive mechanisms functioning simultaneously, and dissecting them via well-designed tasks in conjunction with computational modelling is a promising avenue. Second, we move on to a proof-of-concept example on how generative models of functional imaging data from a cognitive task enable the identification of subgroups of patients mapping on different levels of negative symptoms. Combining the latter approach with behavioural studies regarding learning and decision-making may allow the identification of key behavioural and biological parameters distinctive for different dimensions of negative symptoms versus a general cognitive impairment. We conclude with an outlook on how this computational framework could, at some point, enrich future clinical studies. Copyright © 2016. Published by Elsevier B.V.
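
    The model-free account referenced here reduces to a delta-rule update in which an expectation is revised by a reward prediction error. A toy sketch (ours, for orientation only):

      import numpy as np

      rng = np.random.default_rng(2)
      alpha, V = 0.1, 0.0            # learning rate; initial reward expectation

      for t in range(200):
          r = rng.binomial(1, 0.8)   # reward delivered with probability 0.8
          delta = r - V              # model-free prediction error
          V += alpha * delta         # expectation updated from past rewards only

      print(V)                       # approaches the true reward rate, 0.8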

  14. NoRMCorre: An online algorithm for piecewise rigid motion correction of calcium imaging data.

    PubMed

    Pnevmatikakis, Eftychios A; Giovannucci, Andrea

    2017-11-01

    Motion correction is a challenging pre-processing problem that arises early in the analysis pipeline of calcium imaging data sequences. The motion artifacts in two-photon microscopy recordings can be non-rigid, arising from the finite time of raster scanning and non-uniform deformations of the brain medium. We introduce an algorithm for fast Non-Rigid Motion Correction (NoRMCorre) based on template matching. NoRMCorre operates by splitting the field of view (FOV) into overlapping spatial patches along all directions. The patches are registered at a sub-pixel resolution for rigid translation against a regularly updated template. The estimated alignments are subsequently up-sampled to create a smooth motion field for each frame that can efficiently approximate non-rigid artifacts in a piecewise-rigid manner. Existing approaches either do not scale well in terms of computational performance or are targeted to non-rigid artifacts arising just from the finite speed of raster scanning, and thus cannot correct for non-rigid motion observable in datasets from a large FOV. NoRMCorre can be run in an online mode, registering streaming data at speeds comparable to, or even faster than, real time. We evaluate its performance with simple yet intuitive metrics and compare against other non-rigid registration methods on simulated data and in vivo two-photon calcium imaging datasets. Open source Matlab and Python code is also made available. The proposed method and accompanying code can be useful for solving large scale image registration problems in calcium imaging, especially in the presence of non-rigid deformations. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
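
    At the level of a single patch, rigid registration against a template can be illustrated with phase correlation; the sketch below is a simplified stand-in of ours -- NoRMCorre itself performs sub-pixel template matching and smooth up-sampling of the motion field -- that recovers an integer shift:

      import numpy as np

      def rigid_shift(frame, template):
          # Estimate the integer (dy, dx) translation aligning frame to template.
          cp = np.fft.fft2(frame) * np.conj(np.fft.fft2(template))
          corr = np.fft.ifft2(cp / (np.abs(cp) + 1e-12)).real
          dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
          # Map indices in [0, N) to signed shifts centred on zero.
          dy -= corr.shape[0] if dy > corr.shape[0] // 2 else 0
          dx -= corr.shape[1] if dx > corr.shape[1] // 2 else 0
          return int(dy), int(dx)

      template = np.random.default_rng(3).random((64, 64))
      frame = np.roll(template, (5, -3), axis=(0, 1))
      print(rigid_shift(frame, template))   # -> (5, -3)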

  15. Transnational Approaches to Teaching and Learning in Higher Education: Challenges and Possible Guiding Principles

    ERIC Educational Resources Information Center

    Bovill, C.; Jordan, L.; Watters, N.

    2015-01-01

    The higher education sector has become increasingly internationalised over recent decades. This paper examines a range of challenges that can arise where teaching staff in one context support and implement learning and teaching initiatives in another international context--transnational teaching. We use examples and experiences from our own…

  16. 77 FR 48162 - Announcement of Requirements and Registration for the Challenge To Identify Audacious Goals in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ... science and describe current specific research needs and opportunities. The current NEI strategic planning... worldwide. The creativity arising from a variety of new perspectives is expected to generate new research... and Registration for the Challenge To Identify Audacious Goals in Vision Research and Blindness...

  17. Small-Scale Design Experiments as Working Space for Larger Mobile Communication Challenges

    ERIC Educational Resources Information Center

    Lowe, Sarah; Stuedahl, Dagny

    2014-01-01

    In this paper, a design experiment using Instagram as a cultural probe is submitted as a method for analyzing the challenges that arise when considering the implementation of social media within a distributed communication space. It outlines how small, iterative investigations can reveal deeper research questions relevant to the education of…

  18. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  19. Helping Students Navigate Faith Challenges in the Biblical Studies Classroom

    ERIC Educational Resources Information Center

    Sharp, Carolyn J.; Clark-Soles, Jaime

    2012-01-01

    What happens when students encounter the academic study of the Bible in the seminary or undergraduate classroom? Does a teacher have a responsibility to help students navigate challenges to Christian faith that might arise? What pedagogical problems and opportunities does this encounter present? How does this issue manifest differently in…

  20. The Psychological Study of Video Game Players: Methodological Challenges and Practical Advice

    ERIC Educational Resources Information Center

    King, Daniel; Delfabbro, Paul; Griffiths, Mark

    2009-01-01

    Video game playing has received increased academic interest over the last few decades, particularly with regard to the psychological understanding of addiction. Based on the many studies carried out by the authors, this paper summarises some of the methodological challenges which may arise when studying video game players, including obstacles…

  1. Adaptive Teaching for English Language Arts: Following the Pathway of Classroom Data in Preservice Teacher Inquiry

    ERIC Educational Resources Information Center

    Athanases, Steven Z.; Bennett, Lisa H.; Wahleithner, Juliet Michelsen

    2015-01-01

    Consensus exists that effective teaching includes capacity to adapt instruction to respond to student learning challenges as they arise. Adaptive teachers may keep pace with rapidly evolving youth literacies and students' increasing cultural and linguistic diversity. Teachers are challenged to critically examine pedagogy when some contexts expect…

  2. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing, and specifically automated testing, arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  3. Integrated Sensing and Information Processing Theme-Based Redesign of the Undergraduate Electrical and Computer Engineering Curriculum at Duke University

    ERIC Educational Resources Information Center

    Ybarra, Gary A.; Collins, Leslie M.; Huettel, Lisa G.; Brown, April S.; Coonley, Kip D.; Massoud, Hisham Z.; Board, John A.; Cummer, Steven A.; Choudhury, Romit Roy; Gustafson, Michael R.; Jokerst, Nan M.; Brooke, Martin A.; Willett, Rebecca M.; Kim, Jungsang; Absher, Martha S.

    2011-01-01

    The field of electrical and computer engineering has evolved significantly in the past two decades. This evolution has broadened the field of ECE, and subfields have seen deep penetration into very specialized areas. Remarkable devices and systems arising from innovative processes, exotic materials, high speed computer simulations, and complex…

  4. A general method for computing the total solar radiation force on complex spacecraft structures

    NASA Technical Reports Server (NTRS)

    Chan, F. K.

    1981-01-01

    The method circumvents many of the difficulties in computational logic encountered in the direct analytical or numerical evaluation of the appropriate surface integral. It may be applied to complex spacecraft structures to compute the total force arising from specular or diffuse reflection, or even from non-Lambertian reflection and re-radiation.
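
    For orientation, the standard flat-plate model that such a surface integral accumulates can be written (in our notation, which is an assumption and may differ from Chan's conventions) as

      \mathrm{d}\mathbf{F} = -\frac{\Phi}{c}\cos\theta
        \left[ (1-\rho_s)\,\hat{\mathbf{s}}
        + 2\left(\rho_s\cos\theta + \frac{\rho_d}{3}\right)\hat{\mathbf{n}} \right] \mathrm{d}A,
      \qquad \cos\theta = \hat{\mathbf{s}}\cdot\hat{\mathbf{n}},

    where \Phi is the solar flux, c the speed of light, \rho_s and \rho_d the specular and diffuse reflectance coefficients of the surface element \mathrm{d}A, \hat{\mathbf{s}} the unit vector toward the Sun and \hat{\mathbf{n}} the outward surface normal. The total force follows by integrating \mathrm{d}\mathbf{F} over the illuminated, unshadowed portion of the structure, which is where the computational-logic difficulties mentioned above arise.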

  5. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    PubMed

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence to support that some human skeletal remains belong or not to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage just focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error prone, and time consuming part of the whole process. Though the numerical assessment of the method quality has not been achieved yet, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can be thus considered as a tool to aid forensic anthropologists to develop the skull-face overlay, automating and avoiding subjectivity of the most tedious task within craniofacial superimposition. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Developing patient rapport, trust and therapeutic relationships.

    PubMed

    Price, Bob

    2017-08-09

    Rapport is established at the first meeting between the patient and nurse, and is developed throughout the therapeutic relationship. However, challenges can arise during this process. Initially, nurses can establish trust with the patient through the questions they ask; however, as care progresses, the nurse will be required to demonstrate a commitment to maintaining the patient's psychological well-being. When the therapeutic relationship ends, the nurse should assist the patient to assess progress and plan the next stage of recovery. This article provides three reflective exercises using case study examples to demonstrate how rapport is developed and sustained. Evidence is provided to identify why challenges arise in the therapeutic relationship and how the nurse can ensure they provide care that the patient regards as genuine.

  7. Approximate Bayesian Computation in the estimation of the parameters of the Forbush decrease model

    NASA Astrophysics Data System (ADS)

    Wawrzynczak, A.; Kopka, P.

    2017-12-01

    Realistic modeling of complicated phenomena such as the Forbush decrease of the galactic cosmic ray intensity is quite a challenging task. One aspect is the numerical solution of the Fokker-Planck equation in five-dimensional space (three spatial variables, the time and the particle energy). The second difficulty arises from a lack of detailed knowledge about the spatial and time profiles of the parameters responsible for the creation of the Forbush decrease. Among these parameters, the diffusion coefficient plays the central role. Assessment of the correctness of the proposed model can only be done by comparing the model output with experimental observations of the galactic cosmic ray intensity. We apply the Approximate Bayesian Computation (ABC) methodology to match the Forbush decrease model to experimental data. The ABC method is becoming increasingly exploited for complex dynamic problems in which the likelihood function is costly to compute. The main idea of all ABC methods is to accept a sample as an approximate posterior draw if its associated modeled data are close enough to the observed data. In this paper, we present an application of the Sequential Monte Carlo Approximate Bayesian Computation algorithm scanning the space of the diffusion coefficient parameters. The algorithm is applied to model the Forbush decrease observed by neutron monitors at the Earth in March 2002. The model of the Forbush decrease is based on the stochastic approach to the solution of the Fokker-Planck equation.
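
    The accept/reject idea shared by all ABC variants fits in a few lines. The sketch below (ours) is plain rejection ABC with a made-up one-parameter forward model standing in for the Fokker-Planck-based Forbush decrease simulation; the paper's Sequential Monte Carlo scheme refines this by iteratively shrinking the tolerance:

      import numpy as np

      rng = np.random.default_rng(4)
      observed = 0.7                      # e.g., an observed summary of the intensity drop

      def simulate(kappa):
          # Hypothetical forward model mapping a diffusion-coefficient
          # parameter to a summary statistic, with simulation noise.
          return np.exp(-1.0 / kappa) + 0.01 * rng.standard_normal()

      eps, accepted = 0.01, []
      while len(accepted) < 1000:
          kappa = rng.uniform(0.5, 10.0)  # draw from the prior
          if abs(simulate(kappa) - observed) < eps:
              accepted.append(kappa)      # keep draws whose output is close enough

      print(np.mean(accepted))            # approximate posterior mean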

  8. Learning to Predict Combinatorial Structures

    NASA Astrophysics Data System (ADS)

    Vembu, Shankar

    2009-12-01

    The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.

  9. Semi-Supervised Marginal Fisher Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Huang, H.; Liu, J.; Pan, Y.

    2012-07-01

    The problem of learning with both labeled and unlabeled examples arises frequently in hyperspectral image (HSI) classification. Marginal Fisher analysis, however, is a supervised method and cannot be directly applied to semi-supervised classification. In this paper, we propose a novel method, called semi-supervised marginal Fisher analysis (SSMFA), to process HSI of natural scenes; it combines semi-supervised learning and manifold learning. In SSMFA, a new difference-based optimization objective function incorporating unlabeled samples has been designed. SSMFA preserves the manifold structure of labeled and unlabeled samples in addition to separating labeled samples of different classes from each other. The semi-supervised method has an analytic, globally optimal solution that can be computed by eigendecomposition. Classification experiments with a challenging HSI task demonstrate that this method outperforms current state-of-the-art HSI-classification methods.

  10. Fluid Dynamics of Bottle Filling

    NASA Astrophysics Data System (ADS)

    McGough, Patrick; Gao, Haijing; Appathurai, Santosh; Basaran, Osman

    2011-11-01

    Filling of bottles is a widely practiced operation in a large number of industries. Well known examples include filling of ``large'' bottles with shampoos and cleaners in the household products and beauty care industries and filling of ``small'' bottles in the pharmaceutical industry. Some bottle filling operations have recently drawn much attention from the fluid mechanics community because of the occurrence of a multitude of complex flow regimes, transitions, and instabilities such as mounding and coiling that occur as a bottle is filled with a fluid. In this talk, we present a primarily computational study of the fluid dynamical challenges that can arise during the rapid filling of bottles. Given the diversity of fluids used in filling applications, we consider four representative classes of fluids that exhibit Newtonian, shear-thinning, viscoelastic, and yield-stress rheologies. The equations governing the dynamics of bottle filling are solved either in their full 3D but axisymmetric form or using the slender-jet approximation.

  11. Digital watermarking opportunities enabled by mobile media proliferation

    NASA Astrophysics Data System (ADS)

    Modro, Sierra; Sharma, Ravi K.

    2009-02-01

    Consumer usages of mobile devices and electronic media are changing. Mobile devices now include increased computational capabilities, mobile broadband access, better integrated sensors, and higher resolution screens. These enhanced features are driving increased consumption of media such as images, maps, e-books, audio, video, and games. As users become more accustomed to using mobile devices for media, opportunities arise for new digital watermarking usage models. For example, transient media, like images being displayed on screens, could be watermarked to provide a link between mobile devices. Applications based on these emerging usage models utilizing watermarking can provide richer user experiences and drive increased media consumption. We describe the enabling factors and highlight a few of the usage models and new opportunities. We also outline how the new opportunities are driving further innovation in watermarking technologies. We discuss challenges in market adoption of applications based on these usage models.

  12. Using directed information for influence discovery in interconnected dynamical systems

    NASA Astrophysics Data System (ADS)

    Rao, Arvind; Hero, Alfred O.; States, David J.; Engel, James Douglas

    2008-08-01

    Structure discovery in non-linear dynamical systems is an important and challenging problem that arises in various applications such as computational neuroscience, econometrics, and biological network discovery. Each of these systems has multiple interacting variables, and the key problem is the inference of the underlying structure of the system (which variables are connected to which others) based on output observations (such as multiple time trajectories of the variables). Since such applications demand the inference of directed relationships among variables in these non-linear systems, current methods that assume a linear structure or yield undirected variable dependencies are insufficient. Hence, in this work, we present a methodology for structure discovery using an information-theoretic metric called directed time information (DTI). Using both synthetic dynamical systems and true biological datasets (kidney development and T-cell data), we demonstrate the utility of DTI in such problems.

  13. Determining Individual Particle Magnetizations in Assemblages of Micrograins

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart V.; Fabian, Karl; Béguin, Annemarieke; Reith, Pim; Barnhoorn, Auke; Hilgenkamp, Hans

    2018-04-01

    Obtaining reliable information from even the most challenging paleomagnetic recorders, such as the oldest igneous rocks and meteorites, is paramount to open new windows into Earth's history. Currently, such information is acquired by simultaneously sensing millions of particles in small samples or single crystals using superconducting quantum interference device magnetometers. The obtained rock-magnetic signal is a statistical ensemble of grains potentially differing in reliability as paleomagnetic recorder due to variations in physical dimensions, chemistry, and magnetic behavior. Here we go beyond bulk magnetic measurements and combine computed tomography and scanning magnetometry to uniquely invert for the magnetic moments of individual grains. This enables us to select and consider contributions of subsets of grains as a function of particle-specific selection criteria and avoid contributions that arise from particles that are altered or contain unreliable magnetic carriers. This new, nondestructive, method unlocks information from complex paleomagnetic recorders that until now goes obscured.

  14. Toward performance portability of the Albany finite element analysis code using the Kokkos library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demeshko, Irina; Watkins, Jerry; Tezaur, Irina K.

    Performance portability on heterogeneous high-performance computing (HPC) systems is a major challenge faced today by code developers: parallel code needs to be executed correctly as well as with high performance on machines with different architectures, operating systems, and software libraries. The finite element method (FEM) is a popular and flexible method for discretizing partial differential equations arising in a wide variety of scientific, engineering, and industrial applications that require HPC. This paper presents some preliminary results pertaining to our development of a performance portable implementation of the FEM-based Albany code. Performance portability is achieved using the Kokkos library. We present performance results for the Aeras global atmosphere dynamical core module in Albany. Finally, numerical experiments show that our single code implementation gives reasonable performance across three multicore/many-core architectures: NVIDIA Graphics Processing Units (GPUs), Intel Xeon Phis, and multicore CPUs.

  15. An optical Fourier transform coprocessor with direct phase determination.

    PubMed

    Macfaden, Alexander J; Gordon, George S D; Wilkinson, Timothy D

    2017-10-20

    The Fourier transform is a ubiquitous mathematical operation which arises naturally in optics. We propose and demonstrate a practical method to optically evaluate a complex-to-complex discrete Fourier transform. By implementing the Fourier transform optically we can overcome the limiting O(n log n) complexity of fast Fourier transform algorithms. Efficiently extracting the phase from the well-known optical Fourier transform is challenging. By appropriately decomposing the input and exploiting symmetries of the Fourier transform we are able to determine the phase directly from straightforward intensity measurements, creating an optical Fourier transform with O(n) apparent complexity. Performing larger optical Fourier transforms requires higher resolution spatial light modulators, but the execution time remains unchanged. This method could unlock the potential of the optical Fourier transform to permit 2D complex-to-complex discrete Fourier transforms with a performance that is currently untenable, with applications across information processing and computational physics.

  17. Learning mechanisms in multidisciplinary teamwork with real customers and open-ended problems

    NASA Astrophysics Data System (ADS)

    Heikkinen, Juho; Isomöttönen, Ville

    2015-11-01

    Recently, there has been a trend towards adding a multidisciplinary or multicultural element to traditional monodisciplinary project courses in computing and engineering. In this article, we examine the implications of multidisciplinarity for students' learning experiences during a one-semester project course for real customers. We use a qualitative research approach and base our analysis on students' learning reports on three instances of a project course titled Multidisciplinary working life project. The main contribution of this article is the unified theoretical picture of the learning mechanisms stemming from multidisciplinarity. Our main conclusions are that (1) students generally have a positive view of multidisciplinarity; (2) multidisciplinary teams enable students to better identify their own expertise, which leads to increased occupational identity; and (3) learning experiences are not fixed, as team spirit and student attitude play an important role in how students react to challenging situations arising from introduction of the multidisciplinarity.

  18. HEALPix: A Framework for High-Resolution Discretization and Fast Analysis of Data Distributed on the Sphere

    NASA Technical Reports Server (NTRS)

    Gorski, K. M.; Hivon, Eric; Banday, A. J.; Wandelt, Benjamin D.; Hansen, Frode K.; Reinecke, Martin; Bartelmann, Matthias

    2005-01-01

    HEALPix, the Hierarchical Equal Area isoLatitude Pixelization, is a versatile structure for the pixelization of data on the sphere. An associated library of computational algorithms and visualization software supports fast scientific applications executable directly on discretized spherical maps generated from very large volumes of astronomical data. Originally developed to address the data processing and analysis needs of the present generation of cosmic microwave background experiments (e.g., BOOMERANG, WMAP), HEALPix can be expanded to meet many of the profound challenges that will arise in confrontation with the observational output of future missions and experiments, including, e.g., Planck, Herschel, SAFIR, and the Beyond Einstein inflation probe. In this paper we consider the requirements and implementation constraints on a framework that simultaneously enables an efficient discretization with associated hierarchical indexation and fast analysis/synthesis of functions defined on the sphere. We demonstrate how these are explicitly satisfied by HEALPix.
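
    Assuming the healpy package (the maintained Python bindings for HEALPix), basic use of the pixelization and its fast spherical harmonic analysis looks like this:

      import numpy as np
      import healpy as hp

      nside = 64                          # resolution parameter (a power of 2)
      npix = hp.nside2npix(nside)         # 12 * nside**2 equal-area pixels
      print(npix)                         # -> 49152

      # Map a sky direction (colatitude theta, longitude phi, in radians) to a pixel.
      pix = hp.ang2pix(nside, np.pi / 2, 0.0)

      # Analyse a map: anafast returns the angular power spectrum C_l.
      m = np.random.default_rng(5).standard_normal(npix)
      cl = hp.anafast(m)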

  19. Dynamic Resource Allocation in Disaster Response: Tradeoffs in Wildfire Suppression

    PubMed Central

    Petrovic, Nada; Alderson, David L.; Carlson, Jean M.

    2012-01-01

    Challenges associated with the allocation of limited resources to mitigate the impact of natural disasters inspire fundamentally new theoretical questions for dynamic decision making in coupled human and natural systems. Wildfires are one of several types of disaster phenomena, including oil spills and disease epidemics, where (1) the disaster evolves on the same timescale as the response effort, and (2) delays in response can lead to increased disaster severity and thus greater demand for resources. We introduce a minimal stochastic process to represent wildfire progression that nonetheless accurately captures the heavy tailed statistical distribution of fire sizes observed in nature. We then couple this model for fire spread to a series of response models that isolate fundamental tradeoffs both in the strength and timing of response and also in division of limited resources across multiple competing suppression efforts. Using this framework, we compute optimal strategies for decision making scenarios that arise in fire response policy. PMID:22514605
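
    The authors' fire-spread process is coupled to response models and calibrated against observed fire statistics; as a generic illustration of how a minimal stochastic process yields heavy-tailed event sizes, a critical branching sketch (ours, not the paper's model) suffices:

      import numpy as np

      rng = np.random.default_rng(6)

      def fire_size(p=0.5, cap=10**6):
          # Each burning site ignites Binomial(2, p) new sites; at p = 0.5 the
          # mean offspring is 1 (critical) and total sizes are heavy tailed.
          burning, size = 1, 1
          while burning and size < cap:
              new = rng.binomial(2 * burning, p)
              size += new
              burning = new
          return size

      sizes = np.array([fire_size() for _ in range(20_000)])
      print(np.percentile(sizes, [50, 90, 99, 99.9]))   # note the stretched tail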

  20. Security Attacks and Solutions in Electronic Health (E-health) Systems.

    PubMed

    Zeadally, Sherali; Isaac, Jesús Téllez; Baig, Zubair

    2016-12-01

    For centuries, healthcare has been a basic service provided by many governments to their citizens. Over the past few decades, we have witnessed a significant transformation in the quality of healthcare services provided by healthcare organizations and professionals. Recent advances have led to the emergence of Electronic Health (E-health), largely made possible by the massive deployment and adoption of information and communication technologies (ICTs). However, cybercriminals and attackers are exploiting vulnerabilities associated primarily with ICTs, causing data breaches of patients' confidential digital health information records. Here, we review recent security attacks reported for E-healthcare and discuss the solutions proposed to mitigate them. We also identify security challenges that must be addressed by E-health system designers and implementers in the future, to respond to threats that could arise as E-health systems become integrated with technologies such as cloud computing, the Internet of Things, and smart cities.

  1. Robust quantum control using smooth pulses and topological winding

    NASA Astrophysics Data System (ADS)

    Barnes, Edwin; Wang, Xin

    2015-03-01

    Perhaps the greatest challenge in achieving control of microscopic quantum systems is the decoherence induced by the environment, a problem which pervades experimental quantum physics and is particularly severe in the context of solid state quantum computing and nanoscale quantum devices because of the inherently strong coupling to the surrounding material. We present an analytical approach to constructing intrinsically robust driving fields which automatically cancel the leading-order noise-induced errors in a qubit's evolution exactly. We address two of the most common types of non-Markovian noise that arise in qubits: slow fluctuations of the qubit energy splitting and fluctuations in the driving field itself. We demonstrate our method by constructing robust quantum gates for several types of spin qubits, including phosphorous donors in silicon and nitrogen-vacancy centers in diamond. Our results constitute an important step toward achieving robust generic control of quantum systems, bringing their novel applications closer to realization. Work supported by LPS-CMTC.

  2. Receiving social support online: implications for health education.

    PubMed

    White, M; Dorman, S M

    2001-12-01

    Online support groups are expanding as the general public becomes more comfortable using computer-mediated communication technology. These support groups have certain benefits for users who may not be able to or do not have the desire to attend face-to-face sessions. Online support groups also present challenges when compared to traditional face-to-face group communication. Communication difficulties may arise resulting from lack of visual and aural cues found in traditional face-to-face communication. Online support groups have emerged within health care as a result of the need individuals have to know more about health conditions they are confronting. The proliferation of these online communities may provide an opportunity for health educators to reach target populations with specific messages. This paper reviews the development of health-related online support groups, examines research conducted within these communities, compares their utility with traditional support groups and discusses the implications of these groups for health education.

  3. Investigation of Nitride Morphology After Self-Aligned Contact Etch

    NASA Technical Reports Server (NTRS)

    Hwang, Helen H.; Keil, J.; Helmer, B. A.; Chien, T.; Gopaladasu, P.; Kim, J.; Shon, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    Self-Aligned Contact (SAC) etch has emerged as a key enabling technology for the fabrication of very large-scale memory devices. However, it is also a very challenging technology to implement from an etch viewpoint. The issues that arise range from poor oxide-to-nitride etch selectivity to problems with post-etch nitride surface morphology. Unfortunately, the mechanisms that drive nitride loss and surface behavior remain poorly understood. Using a simple Langmuir site-balance model, SAC nitride etch simulations have been performed and compared to actual etched results. This approach permits the study of various etch mechanisms that may play a role in determining nitride loss and surface morphology. Particle trajectories and fluxes are computed using Monte Carlo techniques and initial data obtained from double Langmuir probe measurements. Etched surface advancement is implemented using a shock-tracking algorithm. Sticking coefficients and etch yields are adjusted to obtain the best agreement between actual etched results and simulated profiles.

  4. Catecholaminergic challenge uncovers distinct Pavlovian and instrumental mechanisms of motivated (in)action.

    PubMed

    Swart, Jennifer C; Froböse, Monja I; Cook, Jennifer L; Geurts, Dirk Em; Frank, Michael J; Cools, Roshan; den Ouden, Hanneke Em

    2017-05-15

    Catecholamines modulate the impact of motivational cues on action. Such motivational biases have been proposed to reflect cue-based, 'Pavlovian' effects. Here, we assess whether motivational biases may also arise from asymmetrical instrumental learning of active and passive responses following reward and punishment outcomes. We present a novel paradigm, allowing us to disentangle the impact of reward and punishment on instrumental learning from Pavlovian response biasing. Computational analyses showed that motivational biases reflect both Pavlovian and instrumental effects: reward and punishment cues promoted generalized (in)action in a Pavlovian manner, whereas outcomes enhanced instrumental (un)learning of chosen actions. These cue- and outcome-based biases were altered independently by the catecholamine enhancer methylphenidate. Methylphenidate's effect varied across individuals as a function of working memory span, a putative proxy of baseline dopamine synthesis capacity. Our study uncovers two distinct mechanisms by which motivation impacts behaviour, and helps refine current models of catecholaminergic modulation of motivated action.

  5. Evaluation of the scientific underpinnings for identifying ...

    EPA Pesticide Factsheets

    A major challenge in chemical risk assessment is extrapolation of toxicity data from tested to untested species. Successful cross-species extrapolation involves understanding similarities and differences in toxicokinetic and toxicodynamic processes among species. Herein we consider the toxicodynamic challenge, and propose a hierarchical framework, based on the adverse outcome pathway (AOP) concept, to transparently and systematically assess cross-species conservation of biological pathways that could be perturbed by toxic chemicals. The approach features consideration of computational, in vitro and in vivo evidence to assess molecular initiating and intermediate key events of an AOP in a systematic, comparative manner. To demonstrate practical application of the framework, we consider an assessment question arising from the legislatively-mandated USEPA endocrine disruptor screening program, which involves the degree to which data generated using mammalian systems can be translated to non-mammalian species. Specifically, there is a need to define cross-species conservation of pathways controlled by activation of estrogen receptor-α (ERα), as a basis for using mammalian (primarily human) high-throughput (HTP) in vitro data to prioritize subsequent testing to assess human health and ecological risks of estrogenic chemicals. The initial phase of our analysis revealed good structural conservation of the ERα across vertebrate species in terms of amino acid sequence

  6. Profiling undergraduates' generic learning skills on entry to medical school; an international study.

    PubMed

    Murdoch-Eaton, D; Manning, D; Kwizera, E; Burch, V; Pell, G; Whittle, S

    2012-01-01

    Medical education faces challenges posed by widening access to training, a demand for globally competent healthcare workers and progress towards harmonisation of standards. To explore potential challenges arising from variation in diversity and educational background of medical school entrants. This study investigated the reported experience and confidence, in a range of 31 generic skills underpinning learning, of 2606 medical undergraduates entering 14 medical schools in England and South Africa, using a validated questionnaire. Responses suggest that there is considerable similarity in prior educational experience and confidence skills profiles on entry to South African and English medical schools. South African entrants reported significantly more experience in 'Technical skills', 'Managing their own Learning', and 'Presentation', while English students reported increased experience in 'IT' skills. South African undergraduates reported more confidence in 'Information Handling', while English students were more confident in 'IT' skills. The most noticeable difference, in 'IT' skills, is probably due to documented differences in access to computer facilities at high school level. Differences between individual schools within each country are noticeable. Educators need to acquire a good understanding of their incoming cohorts, and ensure necessary tailored support for skills development.

  7. Detecting brain tumor in computed tomography images using Markov random fields and fuzzy C-means clustering techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdulbaqi, Hayder Saad; Jafri, Mohd Zubir Mat

    Brain tumors are an abnormal growth of tissues in the brain. They may arise in people of any age. They must be detected early, diagnosed accurately, monitored carefully, and treated effectively in order to optimize patient outcomes regarding both survival and quality of life. Manual segmentation of brain tumors from CT scan images is a challenging and time-consuming task, and brain tumor detection is considered a challenging mission in medical image processing. Accurate detection of brain tumor size and location plays a vital role in successful diagnosis and treatment. The aim of this paper is to introduce a scheme for tumor detection in CT scan images using two different techniques, Hidden Markov Random Fields (HMRF) and Fuzzy C-means (FCM). The proposed method developed in this research constructs a hybrid of HMRF and thresholding. These methods have been applied on 4 different patient data sets. The comparison among these methods shows that the proposed method gives good results for brain tissue detection, and is more robust and effective compared with the FCM technique.
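
    For reference, the FCM half of such a pipeline is compact enough to sketch; the following minimal implementation (ours, on synthetic 1-D intensities rather than CT data, and without the HMRF/threshold hybrid) alternates the standard membership and centre updates:

      import numpy as np

      def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
          # X: (n_samples, n_features); m > 1 controls cluster fuzziness.
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)       # memberships sum to 1 per sample
          for _ in range(iters):
              Um = U ** m
              centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
              inv = d ** (-2.0 / (m - 1.0))       # standard FCM membership update
              U = inv / inv.sum(axis=1, keepdims=True)
          return centres, U

      # For a CT slice, X would hold flattened intensities; high-membership pixels
      # of the brightest cluster would form a candidate tumor mask.
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(mu, 0.5, (100, 1)) for mu in (0, 4, 8)])
      centres, U = fuzzy_c_means(X)
      print(np.sort(centres.ravel()))             # close to 0, 4, 8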

  8. Thermalnet: a Deep Convolutional Network for Synthetic Thermal Image Generation

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Gorbatsevich, V. S.; Mizginov, V. A.

    2017-05-01

    Deep convolutional neural networks have dramatically changed the landscape of modern computer vision. Nowadays methods based on deep neural networks show the best performance among image recognition and object detection algorithms. While the refinement of network architectures has received a lot of scholarly attention, from the practical point of view the preparation of a large image dataset for successful training of a neural network has become one of the major challenges. This challenge is particularly profound for image recognition in wavelengths lying outside the visible spectrum. For example, no infrared or radar image datasets large enough for successful training of a deep neural network are available to date in the public domain. Recent advances prove that deep neural networks are also capable of arbitrary image transformations such as super-resolution image generation, grayscale image colorisation and imitation of the style of a given artist. Thus a natural question arises: how can deep neural networks be used to augment existing large image datasets? This paper is focused on the development of the Thermalnet deep convolutional neural network for augmenting existing large visible image datasets with synthetic thermal images. The Thermalnet network architecture is inspired by colorisation deep neural networks.

  9. The Scientific Image in Behavior Analysis.

    PubMed

    Keenan, Mickey

    2016-05-01

    Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press.

  10. Implications of pleiotropy: challenges and opportunities for mining Big Data in biomedicine.

    PubMed

    Yang, Can; Li, Cong; Wang, Qian; Chung, Dongjun; Zhao, Hongyu

    2015-01-01

    Pleiotropy arises when a locus influences multiple traits. Rich GWAS findings of various traits in the past decade reveal many examples of this phenomenon, suggesting the wide existence of pleiotropic effects. What underlies this phenomenon is the biological connection among seemingly unrelated traits/diseases. Characterizing the molecular mechanisms of pleiotropy not only helps to explain the relationship between diseases, but may also contribute to novel insights concerning the pathological mechanism of each specific disease, leading to better disease prevention, diagnosis and treatment. However, most pleiotropic effects remain elusive because their functional roles have not been systematically examined. A systematic investigation requires availability of qualified measurements at multilayered biological processes (e.g., transcription and translation). The rise of Big Data in biomedicine, such as high-quality multi-omics data, biomedical imaging data and electronic medical records of patients, offers us an unprecedented opportunity to investigate pleiotropy. There will be a great need of computationally efficient and statistically rigorous methods for integrative analysis of these Big Data in biomedicine. In this review, we outline many opportunities and challenges in methodology developments for systematic analysis of pleiotropy, and highlight its implications on disease prevention, diagnosis and treatment.

  11. Druggable orthosteric and allosteric hot spots to target protein-protein interactions.

    PubMed

    Ma, Buyong; Nussinov, Ruth

    2014-01-01

    Drug design targeting protein-protein interactions is challenging. Because structural elucidation and computational analysis have revealed the importance of hot spot residues in stabilizing these interactions, there have been ongoing efforts to develop drugs which bind the hot spots and out-compete the native protein partners. The question arises as to what the key 'druggable' properties of hot spots in protein-protein interactions are, and whether these mimic the general hot spot definition. For example, are there any other significant features beyond their location in pockets in the interface? Identification of orthosteric (at the protein-protein interaction site) and allosteric (elsewhere) druggable hot spots is expected to help in discovering compounds that can more effectively modulate protein-protein interactions. The interactions of protein-protein hot spots are coupled with the conformational dynamics of protein complexes. Increasing efforts currently focus on allosteric drug discovery. Allosteric drugs bind away from the native binding site and can modulate the native interactions. We propose that identification of allosteric hot spots could similarly help in more effective allosteric drug discovery. While detection of allosteric hot spots is challenging, targeting drugs to these residues has the potential of greatly increasing hot spot and protein druggability.

  12. Emerging Biometric Modalities: Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Gafurov, Davrondzhon

    Recent advances in sensor technology and the widespread use of various electronics (computers, PDAs, mobile phones, etc.) provide new opportunities for capturing and analysing novel physiological and behavioural traits of human beings for biometric authentication. This paper presents an overview of several such types of human characteristics that have been proposed as alternatives to traditional types of biometrics. We refer to these characteristics as emerging biometrics. We survey various types of emerging modalities and techniques, and discuss their pros and cons. Emerging biometrics face several limitations and challenges, which include limited subject population coverage (focusing mostly on adults); unavailability of benchmark databases; little research with respect to vulnerability/robustness against attacks; and some privacy concerns they may raise. In addition, the recognition performance of emerging modalities is generally less accurate than that of traditional biometrics. Despite all of this, emerging biometrics possess benefits and advantages of their own compared to traditional biometrics, which keep them attractive for research. First of all, emerging biometrics can always serve as a complementary source of identity information; they can also be suitable in applications where traditional biometrics are difficult or impossible to apply, such as continuous or periodic re-verification of the user's identity.

  13. Advances in understanding tumour evolution through single-cell sequencing.

    PubMed

    Kuipers, Jack; Jahn, Katharina; Beerenwinkel, Niko

    2017-04-01

    The mutational heterogeneity observed within tumours poses additional challenges to the development of effective cancer treatments. A thorough understanding of a tumour's subclonal composition and its mutational history is essential to open up the design of treatments tailored to individual patients. Comparative studies on a large number of tumours permit the identification of mutational patterns which may refine forecasts of cancer progression, response to treatment and metastatic potential. The composition of tumours is shaped by evolutionary processes. Recent advances in next-generation sequencing offer the possibility to analyse the evolutionary history and accompanying heterogeneity of tumours at an unprecedented resolution, by sequencing single cells. New computational challenges arise when moving from bulk to single-cell sequencing data, leading to the development of novel modelling frameworks. In this review, we present the state of the art methods for understanding the phylogeny encoded in bulk or single-cell sequencing data, and highlight future directions for developing more comprehensive and informative pictures of tumour evolution. This article is part of a Special Issue entitled: Evolutionary principles - heterogeneity in cancer?, edited by Dr. Robert A. Gatenby. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W. (Editor); Hardin, J. C. (Editor)

    1997-01-01

    The proceedings of the Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems held at Florida State University are the subject of this report. For this workshop, problems arising in typical industrial applications of CAA were chosen. Comparisons between numerical solutions and exact solutions are presented where possible.

  15. Addressing the computational cost of large EIT solutions.

    PubMed

    Boyle, Alistair; Borsic, Andrea; Adler, Andy

    2012-05-01

    Electrical impedance tomography (EIT) is a soft field tomography modality based on the application of electric current to a body and measurement of voltages through electrodes at the boundary. The interior conductivity is reconstructed on a discrete representation of the domain using a finite-element method (FEM) mesh and a parametrization of that domain. The reconstruction requires a sequence of numerically intensive calculations. There is strong interest in reducing the cost of these calculations. An improvement in the compute time for current problems would encourage further exploration of computationally challenging problems such as the incorporation of time series data, widespread adoption of three-dimensional simulations and correlation with other modalities such as CT and ultrasound. Multicore processors offer an opportunity to reduce EIT computation times but may require some restructuring of the underlying algorithms to maximize the use of available resources. This work profiles two EIT software packages (EIDORS and NDRM) to experimentally determine where the computational costs arise in EIT as problems scale. Sparse matrix solvers, a key component for the FEM forward problem and sensitivity estimates in the inverse problem, are shown to take a considerable portion of the total compute time in these packages. A sparse matrix solver performance measurement tool, Meagre-Crowd, is developed to interface with a variety of solvers and compare their performance over a range of two- and three-dimensional problems of increasing node density. Results show that distributed sparse matrix solvers that operate on multiple cores are advantageous up to a limit that increases as the node density increases. We recommend a selection procedure to find a solver and hardware arrangement matched to the problem and provide guidance and tools to perform that selection.
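
    The paper's central observation, that sparse solver choice dominates runtime as problems scale, is easy to reproduce in miniature. A hedged sketch with SciPy stand-ins (not EIDORS, NDRM, or Meagre-Crowd, and a synthetic matrix rather than a real EIT forward model):

    ```python
    import time
    import numpy as np
    from scipy.sparse import identity, random as sprandom
    from scipy.sparse.linalg import cg, spsolve

    # Build a sparse, (almost surely) diagonally dominant symmetric system
    # standing in for an FEM stiffness matrix; illustrative only.
    n = 20000
    A = sprandom(n, n, density=1e-4, format="csr", random_state=0)
    A = A + A.T + 10.0 * identity(n, format="csr")
    b = np.ones(n)

    t0 = time.perf_counter()
    x_direct = spsolve(A.tocsc(), b)      # direct sparse factorisation
    t1 = time.perf_counter()
    x_iter, info = cg(A, b)               # iterative conjugate gradient
    t2 = time.perf_counter()

    print(f"direct: {t1 - t0:.2f} s, cg: {t2 - t1:.2f} s, converged: {info == 0}")
    ```

    Repeating this over increasing n mimics the node-density sweep the authors perform with Meagre-Crowd.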

  16. Visual Object Recognition with 3D-Aware Features in KITTI Urban Scenes

    PubMed Central

    Yebes, J. Javier; Bergasa, Luis M.; García-Garrido, Miguel Ángel

    2015-01-01

    Driver assistance systems and autonomous robotics rely on the deployment of several sensors for environment perception. Compared to LiDAR systems, the inexpensive vision sensors can capture the 3D scene as perceived by a driver in terms of appearance and depth cues. Indeed, providing 3D image understanding capabilities to vehicles is an essential target in order to infer scene semantics in urban environments. One of the challenges that arises from the navigation task in naturalistic urban scenarios is the detection of road participants (e.g., cyclists, pedestrians and vehicles). In this regard, this paper tackles the detection and orientation estimation of cars, pedestrians and cyclists, employing the challenging and naturalistic KITTI images. This work proposes 3D-aware features computed from stereo color images in order to capture the appearance and depth peculiarities of the objects in road scenes. The successful part-based object detector, known as DPM, is extended to learn richer models from the 2.5D data (color and disparity), while also carrying out a detailed analysis of the training pipeline. A large set of experiments evaluate the proposals, and the best performing approach is ranked on the KITTI website. Indeed, this is the first work that reports results with stereo data for the KITTI object challenge, achieving increased detection ratios for the classes car and cyclist compared to a baseline DPM. PMID:25903553
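
    For readers unfamiliar with 2.5D inputs, the sketch below shows one common way to derive a disparity channel from a stereo pair and stack it with appearance, using OpenCV's semi-global matcher; the file paths and parameters are placeholders, not the paper's actual pipeline:

    ```python
    import cv2
    import numpy as np

    # Hypothetical KITTI stereo pair; the file paths are placeholders.
    left = cv2.imread("kitti/image_2/000000.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("kitti/image_3/000000.png", cv2.IMREAD_GRAYSCALE)

    # Semi-global block matching; parameter choices are illustrative.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                 blockSize=5, P1=8 * 5 * 5, P2=32 * 5 * 5)
    disp = sgbm.compute(left, right).astype(np.float32) / 16.0  # fixed-point

    # Stack appearance and depth cues into one 2.5D feature image.
    features = np.dstack([left.astype(np.float32), disp])
    print(features.shape)   # (H, W, 2): intensity + disparity per pixel
    ```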

  17. Visual Object Recognition with 3D-Aware Features in KITTI Urban Scenes.

    PubMed

    Yebes, J Javier; Bergasa, Luis M; García-Garrido, Miguel Ángel

    2015-04-20

    Driver assistance systems and autonomous robotics rely on the deployment of several sensors for environment perception. Compared to LiDAR systems, the inexpensive vision sensors can capture the 3D scene as perceived by a driver in terms of appearance and depth cues. Indeed, providing 3D image understanding capabilities to vehicles is an essential target in order to infer scene semantics in urban environments. One of the challenges that arises from the navigation task in naturalistic urban scenarios is the detection of road participants (e.g., cyclists, pedestrians and vehicles). In this regard, this paper tackles the detection and orientation estimation of cars, pedestrians and cyclists, employing the challenging and naturalistic KITTI images. This work proposes 3D-aware features computed from stereo color images in order to capture the appearance and depth peculiarities of the objects in road scenes. The successful part-based object detector, known as DPM, is extended to learn richer models from the 2.5D data (color and disparity), while also carrying out a detailed analysis of the training pipeline. A large set of experiments evaluate the proposals, and the best performing approach is ranked on the KITTI website. Indeed, this is the first work that reports results with stereo data for the KITTI object challenge, achieving increased detection ratios for the classes car and cyclist compared to a baseline DPM.

  18. Design issues for grid-connected photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ropp, Michael Eugene

    1998-08-01

    Photovoltaics (PV) is the direct conversion of sunlight to electrical energy. In areas without centralized utility grids, the benefits of PV easily overshadow the present shortcomings of the technology. However, in locations with centralized utility systems, significant technical challenges remain before utility-interactive PV (UIPV) systems can be integrated into the mix of electricity sources. One challenge is that the computer design tools needed for optimal design of PV systems with curved PV arrays are not available, and even those that are available do not facilitate monitoring of the system once it is built. Another arises from the issue of islanding. Islanding occurs when a UIPV system continues to energize a section of a utility system after that section has been isolated from the utility voltage source. Islanding, which is potentially dangerous to both personnel and equipment, is difficult to prevent completely. The work contained within this thesis targets both of these technical challenges. In Task 1, a method for modeling a PV system with a curved PV array using only existing computer software is developed. This methodology also facilitates comparison of measured and modeled data for use in system monitoring. The procedure is applied to the Georgia Tech Aquatic Center (GTAC) PV system. In the work contained under Task 2, islanding prevention is considered. The existing state-of-the-art is thoroughly reviewed. In Subtask 2.1, an analysis is performed which suggests that standard protective relays are in fact insufficient to guarantee protection against islanding. In Subtask 2.2, several existing islanding prevention methods are compared in a novel way, and the superiority of this new comparison over those used previously is demonstrated. A new islanding prevention method is the subject of Subtask 2.3; it is shown that it does not compare favorably with other existing techniques. However, in Subtask 2.4, a novel method for dramatically improving this new islanding prevention method is described. It is shown, both by computer modeling and experiment, that this improved method is one of the most effective available today. Finally, under Subtask 2.5, the effects of certain types of loads on the effectiveness of islanding prevention methods are discussed.
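
    To make the islanding discussion concrete, here is a minimal sketch of the passive over/under-voltage and over/under-frequency relay logic that the thesis argues is insufficient on its own (the thresholds are illustrative, not the thesis's settings):

    ```python
    # Passive islanding detection via over/under-voltage and
    # over/under-frequency trip windows (a simplified sketch).
    def should_trip(v_rms_pu, freq_hz,
                    v_min=0.88, v_max=1.10, f_min=59.3, f_max=60.5):
        # Disconnect the inverter when voltage or frequency leaves its window
        return not (v_min <= v_rms_pu <= v_max and f_min <= freq_hz <= f_max)

    # In a well-matched island, voltage and frequency can remain in-window,
    # leaving a non-detection zone that active methods try to close.
    print(should_trip(1.00, 60.0))   # False: nominal conditions, no trip
    print(should_trip(0.80, 60.0))   # True: undervoltage trip
    ```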

  19. Implementing Parquet equations using HPX

    NASA Astrophysics Data System (ADS)

    Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark

    A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevance of the parquet equations depends upon the ability to solve systems which require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend towards the exascale and the limitations arising from computational resources vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.

  20. Modeling synthetic lethality

    PubMed Central

    Le Meur, Nolwenn; Gentleman, Robert

    2008-01-01

    Background Synthetic lethality defines a genetic interaction where the combination of mutations in two or more genes leads to cell death. The implications of synthetic lethal screens have been discussed in the context of drug development, as synthetic lethal pairs could be used to selectively kill cancer cells while leaving normal cells relatively unharmed. A challenge is to assess genome-wide experimental data and integrate the results to better understand the underlying biological processes. We propose statistical and computational tools that can be used to find relationships between synthetic lethality and cellular organizational units. Results In Saccharomyces cerevisiae, we identified multi-protein complexes and pairs of multi-protein complexes that share an unusually high number of synthetic genetic interactions. As previously predicted, we found that synthetic lethality can arise from subunits of an essential multi-protein complex or between pairs of multi-protein complexes. Finally, using multi-protein complexes allowed us to take into account the pleiotropic nature of the gene products. Conclusions Modeling synthetic lethality using current estimates of the yeast interactome is an efficient approach to disentangle some of the complex molecular interactions that drive a cell. Our model, in conjunction with the applied statistical and computational methods, provides new tools to better characterize synthetic genetic interactions. PMID:18789146
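
    One natural statistic for "an unusually high number of synthetic genetic interactions" between two complexes is a hypergeometric enrichment test. The sketch below uses invented counts and illustrates the idea, not the authors' exact procedure:

    ```python
    from scipy.stats import hypergeom

    # Do two protein complexes share unusually many synthetic lethal (SL)
    # interactions? All counts below are made up for illustration.
    n_possible = 15 * 12      # gene pairs between a 15- and a 12-subunit complex
    n_sl_between = 30         # observed SL pairs between the two complexes
    total_pairs = 500_000     # all tested gene pairs in the screen
    total_sl = 10_000         # all SL pairs found in the screen

    # P(X >= n_sl_between) if SL pairs were assigned to gene pairs at random
    p = hypergeom.sf(n_sl_between - 1, total_pairs, total_sl, n_possible)
    print(f"enrichment p-value: {p:.3g}")
    ```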

  1. Physiological Signal Analysis for Evaluating Flow during Playing of Computer Games of Varying Difficulty.

    PubMed

    Tian, Yu; Bian, Yulong; Han, Piguo; Wang, Peng; Gao, Fengqiang; Chen, Yingmin

    2017-01-01

    Flow is the experience of effortless attention, reduced self-consciousness, and a deep sense of control that typically occurs during the optimal performance of challenging tasks. On the basis of the person-artifact-task model, we selected computer games (tasks) with varying levels of difficulty (difficult, medium, and easy) and shyness (personality) as flow precursors to study the physiological activity of users in a flow state. Cardiac and respiratory activity and mean changes in skin conductance (SC) were measured continuously while the participants (n = 40) played the games. Moreover, the associations between self-reported psychological flow and physiological measures were investigated through a series of repeated-measures analyses. The results showed that the flow experience is related to a faster respiratory rate, deeper respiration, moderate heart rate (HR), moderate HR variability, and moderate SC. The main effect of shyness was non-significant, whereas the interaction of shyness and difficulty influenced the flow experience. These findings are discussed in relation to current models of arousal and valence. The results indicate that the flow state is a state of moderate mental effort that arises through the increased parasympathetic modulation of sympathetic activity.

  2. The application of quaternions and other spatial representations to the reconstruction of re-entry vehicle motion.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Sapio, Vincent

    2010-09-01

    The analysis of spacecraft kinematics and dynamics requires an efficient scheme for spatial representation. While the representation of displacement in three dimensional Euclidean space is straightforward, orientation in three dimensions poses particular challenges. The unit quaternion provides an approach that mitigates many of the problems intrinsic in other representation approaches, including the ill-conditioning that arises from computing many successive rotations. This report focuses on the computational utility of unit quaternions and their application to the reconstruction of re-entry vehicle (RV) motion history from sensor data. To this end they will be used in conjunction with other kinematic and data processing techniques. We will present a numerical implementation for the reconstruction of RV motion solely from gyroscope and accelerometer data. This will make use of unit quaternions due to their numerical efficacy in dealing with the composition of many incremental rotations over a time series. In addition to signal processing and data conditioning procedures, algorithms for numerical quaternion-based integration of gyroscope data will be addressed, as well as accelerometer triangulation and integration to yield RV trajectory. Actual processed flight data will be presented to demonstrate the implementation of these methods.
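
    A minimal sketch of the core idea, composing many incremental rotations from gyroscope samples by quaternion multiplication with per-step renormalisation (our own toy implementation, not the report's flight code):

    ```python
    import numpy as np

    def quat_mul(q, r):
        # Hamilton product of quaternions in (w, x, y, z) convention
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def integrate_gyro(omega, dt, q0=np.array([1.0, 0.0, 0.0, 0.0])):
        # Compose incremental rotations from body-rate samples omega (rad/s);
        # renormalising each step avoids the drift and ill-conditioning that
        # plague other representations over long time series.
        q = q0.copy()
        for w in omega:
            angle = np.linalg.norm(w) * dt
            if angle > 0:
                axis = w / np.linalg.norm(w)
                dq = np.concatenate([[np.cos(angle / 2)],
                                     np.sin(angle / 2) * axis])
                q = quat_mul(q, dq)
                q /= np.linalg.norm(q)   # keep it a unit quaternion
        return q

    # 1000 samples of a constant 10 deg/s roll rate at 100 Hz -> ~100 deg roll
    omega = np.tile([np.radians(10), 0.0, 0.0], (1000, 1))
    print(integrate_gyro(omega, 0.01))
    ```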

  3. Physiological Signal Analysis for Evaluating Flow during Playing of Computer Games of Varying Difficulty

    PubMed Central

    Tian, Yu; Bian, Yulong; Han, Piguo; Wang, Peng; Gao, Fengqiang; Chen, Yingmin

    2017-01-01

    Flow is the experience of effortless attention, reduced self-consciousness, and a deep sense of control that typically occurs during the optimal performance of challenging tasks. On the basis of the person–artifact–task model, we selected computer games (tasks) with varying levels of difficulty (difficult, medium, and easy) and shyness (personality) as flow precursors to study the physiological activity of users in a flow state. Cardiac and respiratory activity and mean changes in skin conductance (SC) were measured continuously while the participants (n = 40) played the games. Moreover, the associations between self-reported psychological flow and physiological measures were investigated through a series of repeated-measures analyses. The results showed that the flow experience is related to a faster respiratory rate, deeper respiration, moderate heart rate (HR), moderate HR variability, and moderate SC. The main effect of shyness was non-significant, whereas the interaction of shyness and difficulty influenced the flow experience. These findings are discussed in relation to current models of arousal and valence. The results indicate that the flow state is a state of moderate mental effort that arises through the increased parasympathetic modulation of sympathetic activity. PMID:28725206

  4. Expanding the scope of health information systems. Challenges and developments.

    PubMed

    Kuhn, K A; Wurst, S H R; Bott, O J; Giuse, D A

    2006-01-01

    To identify current challenges and developments in health information systems. Reports on HIS, eHealth and process support were analyzed; core problems and challenges were identified. Health information systems are extending their scope towards regional networks and health IT infrastructures. Integration, interoperability and interaction design are still today's core problems. Additional problems arise through the integration of genetic information into the health care process. There are noticeable trends towards solutions for these problems.

  5. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that, ''high performance computing is the backbone of the nation's science and technology enterprise''. LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's ''eyes and ears'' in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the ''feet and hands'' that carry those advances into the Laboratory and incorporates them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  6. Human Resource Management, Computers, and Organization Theory.

    ERIC Educational Resources Information Center

    Garson, G. David

    In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were applied to hypotheses arising from a recent model by Bozeman and Bretschneider, attempting to relate organization theory to management…

  7. Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions.

    ERIC Educational Resources Information Center

    Holland, Paul W.; Thayer, Dorothy T.

    2000-01-01

    Applied the theory of exponential families of distributions to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. Considers efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and computationally efficient…
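
    A hedged sketch of the general technique: fitting a loglinear (exponential-family) model to a discrete score histogram by Newton's method. The cubic design and toy counts below are our own illustration, not the article's data:

    ```python
    import numpy as np

    # Fit p proportional to exp(B @ beta) to a test-score histogram by
    # Newton's method under a multinomial likelihood.
    scores = np.arange(21)                      # possible scores 0..20
    counts = np.array([1, 2, 4, 7, 12, 18, 26, 34, 41, 46,
                       48, 46, 41, 34, 26, 18, 12, 7, 4, 2, 1])
    N = counts.sum()

    x = (scores - scores.mean()) / scores.std()
    B = np.vander(x, 4, increasing=True)[:, 1:]   # drop the intercept: it is
                                                  # absorbed by normalisation
    beta = np.zeros(B.shape[1])
    for _ in range(25):
        mu = N * np.exp(B @ beta) / np.exp(B @ beta).sum()  # fitted frequencies
        grad = B.T @ (counts - mu)                          # score vector
        W = np.diag(mu) - np.outer(mu, mu) / N              # multinomial covariance
        step = np.linalg.solve(B.T @ W @ B, grad)           # Newton direction
        beta += step
        if np.abs(step).max() < 1e-10:
            break
    print(np.round(mu, 1))   # smoothed frequencies preserving low-order moments
    ```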

  8. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  9. Developing Simulations in Multi-User Virtual Environments to Enhance Healthcare Education

    ERIC Educational Resources Information Center

    Rogers, Luke

    2011-01-01

    Computer-based clinical simulations are a powerful teaching and learning tool because of their ability to expand healthcare students' clinical experience by providing practice-based learning. Despite the benefits of traditional computer-based clinical simulations, there are significant issues that arise when incorporating them into a flexible,…

  10. Facing Up to the Learning Organisation Challenge: Key Issues from a European Perspective. Volume I. CEDEFOP Reference Series.

    ERIC Educational Resources Information Center

    Nyhan, Barry; Cressey, Peter; Tomassini, Massimo; Kelleher, Michael; Poell, Rob

    This first volume of a two-volume publication provides an analytical overview of main questions emerging from recent European research and development projects related to the learning organization. Chapter 1 provides context for the European learning organization challenge and presents four main messages arising from the learning organization…

  11. Resettlement Outcomes for People with Severe Challenging Behaviour Moving from Institutional to Community Living

    ERIC Educational Resources Information Center

    Perry, Jonathan; Felce, David; Allen, David; Meek, Andrea

    2011-01-01

    Background: The purpose of this study was to evaluate the quality of life consequences arising from the resettlement of adults with challenging behaviour severe enough to be deemed to require continuing healthcare from a traditional learning disability hospital to new purpose-built bungalows. The new accommodation was provided by a specialist NHS…

  12. Challenges in the Use of Social Networking Sites to Trace Potential Research Participants

    ERIC Educational Resources Information Center

    Marsh, Jackie; Bishop, Julia C.

    2014-01-01

    This paper reports on a number of challenges faced in tracing contributors to research projects that were originally conducted many decades previously. The need to trace contributors in this way arises in projects which focus on involving research participants in previous studies who have not been maintained on a database, or with whom the…

  13. Towards Integrative Religious Education in Belgium and Flanders: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Loobuyck, Patrick; Franken, Leni

    2011-01-01

    This article describes the way in which religious education (RE) has been organised in Flanders and Belgium, and gives attention to the problems and challenges that arise these days. We argue that the "Schoolpact" of 1958 which implies separate RE in different religions in public schools needs a revision. Therefore, we propose an…

  14. Using Personnel and Financial Data for Reporting Purposes: What Are the Challenges to Using Such Data Accurately?

    ERIC Educational Resources Information Center

    Valcik, Nicolas A.; Stigdon, Andrea D.

    2008-01-01

    Although institutional researchers devote a great deal of time mining and using student data to fulfill mandatory federal and state reports and analyze institutional effectiveness, financial and personnel information is also necessary for such endeavors. In this article, the authors discuss the challenges that arise from extracting data from…

  15. Competing Priorities and Challenges: Principal Leadership for Social Justice along the U.S.-Mexico Border

    ERIC Educational Resources Information Center

    DeMatthews, David Edward

    2016-01-01

    Background/Context: Previous research has focused on the importance of a social justice leadership approach to improve schools that serve marginalized students, but less attention has been focused on potential dilemmas associated with social justice leadership and the ways in which principals prioritize when dilemmas or challenges arise.…

  16. First-principles Monte Carlo simulations of reaction equilibria in compressed vapors

    DOE PAGES

    Fetisov, Evgenii O.; Kuo, I-Feng William; Knight, Chris; ...

    2016-06-13

    Predictive modeling of reaction equilibria presents one of the grand challenges in the field of molecular simulation. Difficulties in the study of such systems arise from the need (i) to accurately model both strong, short-ranged interactions leading to the formation of chemical bonds and weak interactions arising from the environment, and (ii) to sample the range of time scales involving frequent molecular collisions, slow diffusion, and infrequent reactive events. Here we present a novel reactive first-principles Monte Carlo (RxFPMC) approach that allows for investigation of reaction equilibria without the need to prespecify a set of chemical reactions and their ideal-gas equilibrium constants. We apply RxFPMC to investigate a nitrogen/oxygen mixture at T = 3000 K and p = 30 GPa, i.e., conditions that are present in atmospheric lightning strikes and explosions. The RxFPMC simulations show that the solvation environment leads to a significantly enhanced NO concentration that reaches a maximum when oxygen is present in slight excess. In addition, the RxFPMC simulations indicate the formation of NO2 and N2O in mole fractions approaching 1%, whereas N3 and O3 are not observed. Lastly, the equilibrium distributions obtained from the RxFPMC simulations agree well with those from a thermochemical computer code parametrized to experimental data.

  17. First-principles Monte Carlo simulations of reaction equilibria in compressed vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fetisov, Evgenii O.; Kuo, I-Feng William; Knight, Chris

    Predictive modeling of reaction equilibria presents one of the grand challenges in the field of molecular simulation. Difficulties in the study of such systems arise from the need (i) to accurately model both strong, short-ranged interactions leading to the formation of chemical bonds and weak interactions arising from the environment, and (ii) to sample the range of time scales involving frequent molecular collisions, slow diffusion, and infrequent reactive events. Here we present a novel reactive first-principles Monte Carlo (RxFPMC) approach that allows for investigation of reaction equilibria without the need to prespecify a set of chemical reactions and their ideal-gas equilibrium constants. We apply RxFPMC to investigate a nitrogen/oxygen mixture at T = 3000 K and p = 30 GPa, i.e., conditions that are present in atmospheric lightning strikes and explosions. The RxFPMC simulations show that the solvation environment leads to a significantly enhanced NO concentration that reaches a maximum when oxygen is present in slight excess. In addition, the RxFPMC simulations indicate the formation of NO2 and N2O in mole fractions approaching 1%, whereas N3 and O3 are not observed. Lastly, the equilibrium distributions obtained from the RxFPMC simulations agree well with those from a thermochemical computer code parametrized to experimental data.

  18. Efficient Computation of Info-Gap Robustness for Finite Element Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.

    2012-07-05

    A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
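
    For orientation, info-gap robustness is the largest uncertainty horizon alpha such that the worst-case response of the uncertain system still meets a performance tolerance. The brute-force sketch below (sampling plus bisection, with an illustrative envelope uncertainty model) is exactly the kind of costly evaluation the report's adjoint methodology is designed to avoid:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A0 = np.array([[4.0, 1.0], [1.0, 3.0]])   # nominal model, Ax = b
    b = np.array([1.0, 2.0])
    x_nom = np.linalg.solve(A0, b)

    def worst_error(alpha, n_samples=500):
        # Worst sampled response deviation within the fractional envelope
        # |A_ij - A0_ij| <= alpha * |A0_ij| (an illustrative uncertainty model)
        worst = 0.0
        for _ in range(n_samples):
            A = A0 * (1 + alpha * rng.uniform(-1, 1, A0.shape))
            worst = max(worst, np.linalg.norm(np.linalg.solve(A, b) - x_nom))
        return worst

    def robustness(tol, lo=0.0, hi=1.0, iters=20):
        # Largest horizon alpha whose worst-case error stays within tol
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if worst_error(mid) <= tol else (lo, mid)
        return lo

    tol = 0.10 * np.linalg.norm(x_nom)    # accept 10% response deviation
    print(f"info-gap robustness: alpha = {robustness(tol):.3f}")
    ```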

  19. Unperturbed Schelling Segregation in Two or Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2016-09-01

    Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969) are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012); Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014, J Stat Phys 158:806-852, 2015), has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
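
    A minimal sketch of one unperturbed variant (threshold happiness on a torus, with swaps between unhappy agents accepted only when both become happy; the paper's precise protocol and asymptotic framework differ):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, tau = 64, 0.5                        # torus size and intolerance threshold
    grid = rng.integers(0, 2, (n, n))       # two agent types, no vacancies

    def frac_of_type(grid, g):
        # Fraction of each cell's 8 torus neighbours that hold type g
        return sum((np.roll(grid, (dy, dx), (0, 1)) == g)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                   if (dy, dx) != (0, 0)) / 8.0

    for _ in range(3000):                   # unperturbed (noise-free) dynamics
        f0, f1 = frac_of_type(grid, 0), frac_of_type(grid, 1)
        happy = np.where(grid == 0, f0, f1) >= tau
        ys, xs = np.nonzero(~happy)         # all currently unhappy agents
        if len(ys) < 2:
            break
        i, j = rng.choice(len(ys), 2, replace=False)
        (ya, xa), (yb, xb) = (ys[i], xs[i]), (ys[j], xs[j])
        ga, gb = grid[ya, xa], grid[yb, xb]
        # Swap two unhappy agents only if both are happy at their new sites
        if ga != gb and (f0, f1)[ga][yb, xb] >= tau and (f0, f1)[gb][ya, xa] >= tau:
            grid[ya, xa], grid[yb, xb] = gb, ga

    same = np.where(grid == 0, frac_of_type(grid, 0), frac_of_type(grid, 1))
    print(f"mean same-type neighbour fraction: {same.mean():.2f}")
    ```

    Rigorous analysis of exactly such noise-free threshold dynamics, which standard statistical-mechanics techniques do not handle, is what the paper provides in two and three dimensions.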

  20. Computer Security Incident Response Team Effectiveness: A Needs Assessment

    PubMed Central

    Van der Kleij, Rick; Kleinhuis, Geert; Young, Heather

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time-constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to what extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured in-depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response. PMID:29312051

  1. Computer Security Incident Response Team Effectiveness: A Needs Assessment.

    PubMed

    Van der Kleij, Rick; Kleinhuis, Geert; Young, Heather

    2017-01-01

    Computer security incident response teams (CSIRTs) respond to a computer security incident when the need arises. Failure of these teams can have far-reaching effects for the economy and national security. CSIRTs often have to work on an ad hoc basis, in close cooperation with other teams, and in time-constrained environments. It could be argued that under these working conditions CSIRTs would be likely to encounter problems. A needs assessment was done to see to what extent this argument holds true. We constructed an incident response needs model to assist in identifying areas that require improvement. We envisioned a model consisting of four assessment categories: Organization, Team, Individual and Instrumental. Central to this is the idea that both problems and needs can have an organizational, team, individual, or technical origin or a combination of these levels. To gather data we conducted a literature review. This resulted in a comprehensive list of challenges and needs that could hinder or improve, respectively, the performance of CSIRTs. Then, semi-structured in-depth interviews were held with team coordinators and team members of five public and private sector Dutch CSIRTs to ground these findings in practice and to identify gaps between current and desired incident handling practices. This paper presents the findings of our needs assessment and ends with a discussion of potential solutions to problems with performance in incident response.

  2. High-resolution detection of 13C multiplets from the conscious mouse brain by ex vivo NMR spectroscopy

    PubMed Central

    Marin-Valencia, Isaac; Good, Levi B.; Ma, Qian; Jeffrey, F. Mark; Malloy, Craig R.; Pascual, Juan M.

    2011-01-01

    Glucose readily supplies the brain with the majority of carbon needed to sustain neurotransmitter production and utilization. The rate of brain glucose metabolism can be computed using 13C nuclear magnetic resonance (NMR) spectroscopy by detecting changes in 13C contents of products generated by cerebral metabolism. As previously observed, scalar coupling between adjacent 13C carbons (multiplets) can provide information additional to 13C contents for the computation of metabolic rates. Most NMR studies have been conducted in large animals (often under anesthesia) because the mass of the target organ is a limiting factor for NMR. Yet, despite the challengingly small size of the mouse brain, NMR studies are highly desirable because the mouse constitutes a common animal model for human neurological disorders. We have developed a method for the ex vivo resolution of NMR multiplets arising from the brain of an awake mouse after the infusion of [1,6-13C2]glucose. NMR spectra obtained by this method display favorable signal-to-noise ratios. With this protocol, the 13C multiplets of glutamate, glutamine, GABA and aspartate achieved steady state after 150 min. The method enables the accurate resolution of multiplets over time in the awake mouse brain. We anticipate that this method can be broadly applicable to compute brain fluxes in normal and transgenic mouse models of neurological disorders. PMID:21946227

  3. The "invisible caregiver": multicaregiving among diabetic African-American grandmothers.

    PubMed

    Carthron, Dana L; Bailey, Donald E; Anderson, Ruth A

    2014-01-01

    To explore the multicaregiving roles African-American grandmothers assume while self-managing their diabetes. This longitudinal, qualitative pilot study explored the challenges of self-managing diabetes among six African-American caregiving grandmothers. Data were collected at five time points across 18 months. Content analysis, guided by the Adaptive Leadership framework, was conducted using data matrices to facilitate within-case and cross-case analyses. Although participants initially stated they cared only for grandchildren, all had additional caregiving responsibilities. Four themes emerged that illustrated how African-American caregiving grandmothers put the care of dependent children, extended family and community before themselves. Using the Adaptive Leadership framework, technical and adaptive challenges arising from multicaregiving were described as barriers to diabetes self-management. When assisting these women to self-manage their diabetes, clinicians must assess challenges arising from multicaregiving. This might require developing a collaborative work relationship with the client to develop meaningful and attainable goals. Copyright © 2014 Mosby, Inc. All rights reserved.

  4. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    PubMed

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method such that there is no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficient entry instead of n entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
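
    The dependency-free coefficients matrix is the key ingredient. As a stand-in for MATIN's actual generator, the sketch below uses a Vandermonde construction over a prime field, which is invertible by construction, so decoding never encounters linearly dependent coefficient vectors:

    ```python
    import numpy as np

    p = 257  # prime field size (illustrative; MATIN's construction differs)

    def vandermonde_coeffs(n):
        # Row i is (x_i^0, ..., x_i^(n-1)) mod p with distinct x_i, so the
        # matrix is invertible by construction: no linear dependency can
        # arise among coefficient vectors, unlike purely random coefficients.
        x = np.arange(1, n + 1, dtype=np.int64)
        return np.vander(x, n, increasing=True) % p

    def decode(C, encoded):
        # Gauss-Jordan elimination mod p on the augmented matrix [C | encoded];
        # guaranteed to succeed because C is Vandermonde.
        n = len(C)
        M = np.concatenate([C, encoded], axis=1).astype(np.int64) % p
        for col in range(n):
            piv = next(r for r in range(col, n) if M[r, col])
            M[[col, piv]] = M[[piv, col]]
            M[col] = M[col] * pow(int(M[col, col]), -1, p) % p
            for r in range(n):
                if r != col:
                    M[r] = (M[r] - M[r, col] * M[col]) % p
        return M[:, n:]

    blocks = np.array([[10, 20], [30, 40], [50, 60]])  # three source blocks
    C = vandermonde_coeffs(3)
    encoded = C @ blocks % p                           # one coded block per row
    assert (decode(C, encoded) == blocks).all()
    ```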

  5. Understanding Challenges, Strategies, and the Role of Support Networks in Medication Self-management Among Patients With Type 2 Diabetes.

    PubMed

    Bernhard, Gerda; Ose, Dominik; Baudendistel, Ines; Seidling, Hanna M; Stützle, Marion; Szecsenyi, Joachim; Wensing, Michel; Mahler, Cornelia

    2017-04-01

    Purpose The purpose of this qualitative study was to investigate the challenges and strategies of patients with type 2 diabetes mellitus (T2DM) regarding daily management of their medication regimen focusing on the role of their support networks. Methods A purposeful sample of 25 patients with T2DM was recruited from local self-help groups, general practitioner practices, and a university hospital in southwestern Germany. Four semi-structured focus groups were conducted to identify the challenges patients experienced, the strategies they used, and their collaboration with support networks to assist them in self-managing their medication regimen. Sessions were audio- and video-recorded, fully transcribed, and subjected to computer-aided qualitative content analysis, guided by the Self- and Family Management Framework (SFMF). Results Patients with T2DM experienced numerous challenges affecting medication self-management arising from their personal situation, health status and resources, characteristics of their regimen, and how health care is currently organized. Patients' self-initiated strategies included activating health care, community, social, and online resources; taking ownership of medication-related needs; and integrating medication-taking into daily life. Patients drew on self-help groups, family, and friends to discuss concerns regarding medication safety and receive experience-based information and advice for navigating within the health care system as well as practical hands-on support with daily medication self-management. Conclusions Understanding the challenges and building on strategies patients with T2DM devised help diabetes educators to better address patients' needs and priorities and guide patient-centered interventions to support patients' self-management activities. Community and social support networks operating in patients' lives need to be engaged in the self-management support.

  6. Anterior mandibular ameloblastoma

    PubMed Central

    Bhandarwar, Ajay H.; Bakhshi, Girish D.; Borisa, Ashok D.; Wagh, Amol; Kapoor, Rajat; Kori, Channabasappa G.

    2012-01-01

    Ameloblastoma is a benign odontogenic tumor. These tumors are usually asymptomatic until a large size is attained. Ameloblastoma has a tendency to spread locally and has a high recurrence rate. The majority of ameloblastomas (80%) arise from the mandible. Ameloblastoma arising from the anterior mandibular region (symphysis menti) is rare, and very few cases of midline anterior ameloblastomas are reported in the literature. They often require wide local excision, and reconstruction of the mandible in these cases is challenging. We present a case of mandibular ameloblastoma arising from the symphysis menti. The patient underwent wide surgical excision of the tumor followed by immediate reconstruction using a free fibular vascular flap, stabilized with titanium reconstructive plates. A brief case report and review of the literature are presented. PMID:24765429

  7. Integrating high levels of variable renewable energy into electric power systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, Benjamin

    As more variable renewable energy (VRE), such as wind and solar, is integrated into electric power systems, technical challenges arise from the need to maintain the balance between load and generation at all timescales. This paper examines the challenges of integrating ultra-high levels of VRE into electric power systems, reviews a range of solutions to these challenges, and provides a description of several examples of ultra-high VRE systems that are in operation today.

  8. Integrating high levels of variable renewable energy into electric power systems

    DOE PAGES

    Kroposki, Benjamin

    2017-11-17

    As more variable renewable energy (VRE), such as wind and solar, is integrated into electric power systems, technical challenges arise from the need to maintain the balance between load and generation at all timescales. This paper examines the challenges of integrating ultra-high levels of VRE into electric power systems, reviews a range of solutions to these challenges, and provides a description of several examples of ultra-high VRE systems that are in operation today.

  9. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  10. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  11. Model-Averaged ℓ1 Regularization using Markov Chain Monte Carlo Model Composition

    PubMed Central

    Fraley, Chris; Percival, Daniel

    2014-01-01

    Bayesian Model Averaging (BMA) is an effective technique for addressing model uncertainty in variable selection problems. However, current BMA approaches have computational difficulty dealing with data in which there are many more measurements (variables) than samples. This paper presents a method for combining ℓ1 regularization and Markov chain Monte Carlo model composition techniques for BMA. By treating the ℓ1 regularization path as a model space, we propose a method to resolve the model uncertainty issues arising in model averaging from solution path point selection. We show that this method is computationally and empirically effective for regression and classification in high-dimensional datasets. We apply our technique in simulations, as well as to some applications that arise in genomics. PMID:25642001
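
    A simplified stand-in for the method (BIC weights over the distinct support sets of the lasso path, rather than the paper's MC3 sampler) conveys the "regularization path as model space" idea:

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LinearRegression, lasso_path

    # Treat each distinct support set along the l1 path as a model, weight
    # models by BIC, and average coefficients across models.
    X, y = make_regression(n_samples=50, n_features=200, n_informative=5,
                           noise=1.0, random_state=0)
    alphas, coefs, _ = lasso_path(X, y, n_alphas=30)

    seen, log_w, betas = set(), [], []
    for k in range(coefs.shape[1]):
        support = tuple(np.flatnonzero(coefs[:, k]))
        if not support or support in seen or len(support) >= len(y) - 2:
            continue                  # skip empty, duplicate, saturated models
        seen.add(support)
        cols = list(support)
        ols = LinearRegression().fit(X[:, cols], y)    # refit model by OLS
        rss = ((y - ols.predict(X[:, cols])) ** 2).sum()
        bic = len(y) * np.log(rss / len(y)) + len(cols) * np.log(len(y))
        beta = np.zeros(X.shape[1])
        beta[cols] = ols.coef_
        log_w.append(-0.5 * bic)                       # model weight (log scale)
        betas.append(beta)

    w = np.exp(np.array(log_w) - max(log_w))
    w /= w.sum()
    beta_bma = w @ np.array(betas)                     # model-averaged coefficients
    print(np.flatnonzero(np.abs(beta_bma) > 0.5))      # recovered predictors
    ```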

  12. Unresolved Issues and New Challenges in Teaching English to Young Learners: The Case of South Korea

    ERIC Educational Resources Information Center

    Garton, Sue

    2014-01-01

    The introduction of languages, especially English, into the primary curriculum around the world has been one of the major language-in-education policy developments in recent years. In countries where English has been compulsory for a number of years, the question arises as to what extent the numerous and well-documented challenges faced by the…

  13. History Educators and the Challenge of Immersive Pasts: A Critical Review of Virtual Reality "Tools" and History Pedagogy

    ERIC Educational Resources Information Center

    Allison, John

    2008-01-01

    This paper will undertake a critical review of the impact of virtual reality tools on the teaching of history. Virtual reality is useful in several different ways. History educators, elementary and secondary school teachers and professors, can all profit from the digital environment. Challenges arise quickly, however. Virtual reality technologies…

  14. The Perceived Benefits and Difficulties in Introducing and Maintaining Supervision Groups in a SEMH Special School

    ERIC Educational Resources Information Center

    Willis, Jonathan; Baines, Ed

    2018-01-01

    Supervision groups are often used in professional settings and are introduced to address and provide support in relation to the challenges that arise in everyday practice. Although group supervision is common amongst a range of helping professions, its use in schools is rare. Little research exists as to the merits and challenges of providing…

  15. The Perils of a Lack of Student Engagement: Reflections of a "Lonely, Brave, and Rather Exposed" Online Instructor

    ERIC Educational Resources Information Center

    Stott, Philip

    2016-01-01

    Wholly online presentation of courses is becoming increasingly common, but poor levels of student engagement pose challenges to institutions, instructors and students. In this paper, I explore the risks arising from those challenges using an analysis of the presentation of a wildlife management course as a model, comparing data about levels of…

  16. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    PubMed

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
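
    A hedged sketch of the collaborative-filtering ingredient: a few user-specific samples are combined with down-weighted auxiliary samples in a weighted SVM. The data are synthetic and the similarity weight is a placeholder for the paper's mean-squared-difference heuristic; active class selection is omitted:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    def trials(n, shift):
        # Hypothetical feature vectors for two task-difficulty classes; the
        # auxiliary subject's responses are shifted to mimic individual
        # differences. Purely synthetic stand-ins for physiological data.
        y = rng.integers(0, 2, n)
        X = rng.normal(0, 1, (n, 8)) + y[:, None] * 1.5 + shift
        return X, y

    X_user, y_user = trials(10, shift=0.0)     # few user-specific samples
    X_aux, y_aux = trials(200, shift=0.4)      # many samples, similar subject

    similarity = 0.4                           # placeholder similarity weight
    X = np.vstack([X_user, X_aux])
    y = np.concatenate([y_user, y_aux])
    w = np.concatenate([np.ones(len(y_user)), np.full(len(y_aux), similarity)])

    clf = SVC(kernel="rbf").fit(X, y, sample_weight=w)   # weighted training
    X_test, y_test = trials(100, shift=0.0)    # unseen trials from the user
    print(f"accuracy with auxiliary data: {clf.score(X_test, y_test):.2f}")
    ```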

  17. Collaborative Filtering for Brain-Computer Interaction Using Transfer Learning and Active Class Selection

    PubMed Central

    Wu, Dongrui; Lance, Brent J.; Parsons, Thomas D.

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing. PMID:23437188

  18. ENVIRONMENTAL IMMUNOCHEMISTRY

    EPA Science Inventory

    Environmental immunochemical methods are responding to the changing needs of regulatory and monitoring programs and are meeting new analytical challenges as they arise. Immunoassays are being developed for screening multiple organophosphorous (OP) pesticides (O,O-diethyl thionate...

  19. High Voltage Hybrid Electric Propulsion - Multilayered Functional Insulation System (MFIS) NASA-GRC

    NASA Technical Reports Server (NTRS)

    Lizcano, M.

    2017-01-01

    High-power transmission cables pose a key challenge in future Hybrid Electric Propulsion Aircraft. The challenge arises in developing safe transmission lines that can withstand the unique environment found in aircraft while providing megawatts of power. High-voltage AC, variable-frequency cables do not currently exist and present particular electrical insulation challenges, since electrical arcing and high heating are more prevalent at higher voltages and frequencies. Identifying and developing materials that maintain their dielectric properties at high voltages and frequencies is crucial.

  20. Psychoactive Substance Dependence: A Dentist's Challenge.

    PubMed

    Millar, Lynsey

    2015-05-01

    Given the number of individuals who are dependent on alcohol and/or drugs, it is inevitable that they will present for dental treatment. They are at an increased risk of dental disease for multiple reasons. This paper aims to provide an overview for general dental practitioners (GDPs) of the challenges that can arise in treating such patients, alongside some suggestions for meeting these challenges. General issues are considered first, followed by a focus on each of the most common substances, together with their implications in dentistry.

  1. Markerless laser registration in image-guided oral and maxillofacial surgery.

    PubMed

    Marmulla, Rüdiger; Lüth, Tim; Mühling, Joachim; Hassfeld, Stefan

    2004-07-01

    The use of registration markers in computer-assisted surgery is associated with high logistic costs and effort. Markerless patient registration using laser scan surface registration techniques is a new and challenging method. The present study was performed to evaluate the clinical accuracy in finding defined target points within the surgical site after markerless patient registration in image-guided oral and maxillofacial surgery. Twenty consecutive patients with different cranial diseases were scheduled for computer-assisted surgery. Data set alignment between the surgical site and the computed tomography (CT) data set was performed by markerless laser scan surface registration of the patient's face. Intraoral rigidly attached registration markers were used as target points, which had to be detected by an infrared pointer. The Surgical Segment Navigator SSN++ was used for all procedures. SSN++ is an investigative product based on the SSN system that had previously been developed by the presenting authors with the support of Carl Zeiss (Oberkochen, Germany). SSN++ is connected to a Polaris infrared camera (Northern Digital, Waterloo, Ontario, Canada) and to a Minolta VI 900 3D digitizer (Tokyo, Japan) for high-resolution laser scanning. Only minimal differences in shape between the laser scan surface and the surface generated from the CT data set could be detected. Nevertheless, a high-resolution laser scan of the skin surface allows for precise patient registration (mean deviation 1.1 mm, maximum deviation 1.8 mm). Radiation load, logistic costs, and effort arising from the planning of computer-assisted surgery of the head can be reduced because native (markerless) CT data sets can be used for laser scan-based surface registration.
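    The rigid alignment at the core of surface registration can be illustrated with the classical Kabsch/Procrustes solution. Real markerless systems align unmatched surfaces (e.g., with ICP); the known point-to-point correspondence assumed below is a simplification, and all data are synthetic.

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rotation R and translation t with R @ P_i + t ~ Q_i."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, Q.mean(axis=0) - R @ P.mean(axis=0)

rng = np.random.default_rng(12)
laser = rng.random((200, 3))                 # scanned face surface points
angle = np.deg2rad(20)                       # hypothetical true pose
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
ct_surface = laser @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = kabsch(laser, ct_surface)
print("max alignment error:", np.abs((laser @ R.T + t) - ct_surface).max())
```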

  2. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, keep up with the large daily SAR data volumes, and handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise from processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also present findings from applying machine learning and data analytics to the processed SAR data streams, as well as lessons learned on how to ease the SAR community onto interfacing with these cloud-based SAR science data systems.

  3. Multimodality Imaging Approach towards Primary Aortic Sarcomas Arising after Endovascular Abdominal Aortic Aneurysm Repair: Case Series Report.

    PubMed

    Kamran, Mudassar; Fowler, Kathryn J; Mellnick, Vincent M; Sicard, Gregorio A; Narra, Vamsi R

    2016-06-01

    Primary aortic neoplasms are rare. Aortic sarcoma arising after endovascular aneurysm repair (EVAR) is a scarce subset of primary aortic malignancies, reports of which are infrequent in the published literature. The diagnosis of aortic sarcoma is challenging due to its non-specific clinical presentation, and the prognosis is poor due to delayed diagnosis, rapid proliferation, and propensity for metastasis. Post-EVAR, aortic sarcomas may mimic other more common aortic processes on surveillance imaging. Radiologists are rarely familiar with this rare entity, for which awareness and multimodality imaging are invaluable in early diagnosis. A series of three pathologically confirmed cases is presented to display the multimodality imaging features and clinical presentations of aortic sarcoma arising after EVAR.

  4. Development and Validation of a Novel Fusion Algorithm for Continuous, Accurate and Automated R-wave Detection and Calculation of Signal-Derived Metrics

    DTIC Science & Technology

    2013-01-01

    Predicting the onset of atrial fibrillation: the Computers in Cardiology Challenge 2001. Comput Cardiol 2001;28:113-6. [22] Moody GB, Mark RG, Goldberger AL... Computers in Cardiology Challenge 2006: QT interval measurement. Comput Cardiol 2006;33:313-6. [18] Moody GB. Spontaneous termination of atrial fibrillation: a challenge from PhysioNet and Computers in Cardiology 2004. Comput Cardiol 2004;31:101-4. [19] Moody GB, Jager F. Distinguishing ischemic from non

  5. Technologies for Army Knowledge Fusion

    DTIC Science & Technology

    2004-09-01

    interpret it in context and understand the implications (Alberts et al., 2002). Note that the knowledge/information fusion issue arises immediately here... Army Knowledge Fusion. Richard Scherl, Department of Computer Science, Monmouth University; Dana L. Ulery, Computational and Information Sciences... civilian and military sources. Knowledge fusion, also called information fusion and multisensor data fusion, names the body of techniques needed to

  6. Pilots of the future - Human or computer?

    NASA Technical Reports Server (NTRS)

    Chambers, A. B.; Nagel, D. C.

    1985-01-01

    In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

  7. Neurons compute internal models of the physical laws of motion.

    PubMed

    Angelaki, Dora E; Shaikh, Aasef G; Green, Andrea M; Dickman, J David

    2004-07-29

    A critical step in self-motion perception and spatial awareness is the integration of motion cues from multiple sensory organs that individually do not provide an accurate representation of the physical world. One of the best-studied sensory ambiguities is found in visual processing, and arises because of the inherent uncertainty in detecting the motion direction of an untextured contour moving within a small aperture. A similar sensory ambiguity arises in identifying the actual motion associated with linear accelerations sensed by the otolith organs in the inner ear. These internal linear accelerometers respond identically during translational motion (for example, running forward) and gravitational accelerations experienced as we reorient the head relative to gravity (that is, head tilt). Using new stimulus combinations, we identify here cerebellar and brainstem motion-sensitive neurons that compute a solution to the inertial motion detection problem. We show that the firing rates of these populations of neurons reflect the computations necessary to construct an internal model representation of the physical equations of motion.
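    The ambiguity and the internal-model computation that resolves it can be stated compactly. The following is a standard textbook formulation in our notation, not the paper's:

```latex
% Otolith afferents encode only the gravito-inertial acceleration f,
% confounding translational acceleration a with gravity g (head tilt):
f = a - g.
% An internal estimate of gravity, updated by rotational cues \omega from
% the semicircular canals,
\dot{\hat{g}} = -\,\omega \times \hat{g},
% allows translation to be recovered as
\hat{a} = f + \hat{g}.
```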

  8. Challenges in the Assessment and Treatment of Health Anxiety: The Case of Mrs. A.

    ERIC Educational Resources Information Center

    McCabe, Randi E.; Antony, Martin M.

    2004-01-01

    Health anxiety can present a challenge for clinicians, both from the perspective of assigning a "DSM-IV" diagnosis and in developing an appropriate treatment plan. The case of Mrs. A. illustrates some of the complexities that arise in the diagnosis and treatment of health anxiety. Mrs. A. is a 60-year-old retired teacher who presented to a…

  9. Early Childhood Education and Care in Austria: Challenges and Education Policies

    ERIC Educational Resources Information Center

    Smidt, Wilfried

    2018-01-01

    After a first peak in the late 1960s and early 1970s, early childhood education and care (ECEC) again plays an important role in the educational system in Austria. Over 90% of 3-5-year-old children attend non-familial institutions such as preschools. A consequence of this development is that new challenges arise, which have become the subject of…

  10. Challenges of Partnership Research: Insights from a Collaborative Partnership in Evidence-Informed Public Health Decision Making

    ERIC Educational Resources Information Center

    Traynor, Robyn; Dobbins, Maureen; DeCorby, Kara

    2015-01-01

    The investment of decision makers in research can increase the likelihood that relevant and timely practice-based research questions are asked and that these findings are readily taken up into policy and practice. While many positive benefits may be gained from this type of research, various challenges may also arise along the way. These include:…

  11. Education, Gender and Islam in China: The Place of Religious Education in Challenging and Sustaining "Undisputed Traditions" among Chinese Muslim Women

    ERIC Educational Resources Information Center

    Jaschok, Maria; Chan, Hau Ming Vicky

    2009-01-01

    The essay investigates the place of religious and secular education in the lives of Chinese Muslim women. Education is treated as a site where state and society are reproduced and/or challenged, where tensions arise over control of minds and bodies, and over interpretations and uses of religion and culture. Specifically, the essay compares…

  12. Issues of Shared Parenting of LGBTQ Children and Youth in Foster Care: Preparing Foster Parents for New Roles

    ERIC Educational Resources Information Center

    Craig-Oldsen, Heather; Craig, J. Ann; Morton, Thomas

    2006-01-01

    Foster parents have increasingly assumed new and challenging roles during the past decade. Meeting the developmental, attachment, and grieving needs of children and youth in out of home care is challenging by itself, but can become even more difficult with the issues that arise when the child is lesbian, gay, bisexual, transgender, or questioning…

  13. Non-commuting two-local Hamiltonians for quantum error suppression

    NASA Astrophysics Data System (ADS)

    Jiang, Zhang; Rieffel, Eleanor G.

    2017-04-01

    Physical constraints make it challenging to implement and control many-body interactions. For this reason, designing quantum information processes with Hamiltonians consisting of only one- and two-local terms is a worthwhile challenge. Enabling error suppression with two-local Hamiltonians is particularly challenging. A no-go theorem of Marvian and Lidar (Phys Rev Lett 113(26):260504, 2014) demonstrates that, even allowing particles with high Hilbert space dimension, it is impossible to protect quantum information from single-site errors by encoding in the ground subspace of any Hamiltonian containing only commuting two-local terms. Here, we get around this no-go result by encoding in the ground subspace of a Hamiltonian consisting of non-commuting two-local terms arising from the gauge operators of a subsystem code. Specifically, we show how to protect stored quantum information against single-qubit errors using a Hamiltonian consisting of sums of the gauge generators from Bacon-Shor codes (Bacon in Phys Rev A 73(1):012340, 2006) and generalized-Bacon-Shor code (Bravyi in Phys Rev A 83(1):012320, 2011). Our results imply that non-commuting two-local Hamiltonians have more error-suppressing power than commuting two-local Hamiltonians. While far from providing full fault tolerance, this approach improves the robustness achievable in near-term implementable quantum storage and adiabatic quantum computations, reducing the number of higher-order terms required to encode commonly used adiabatic Hamiltonians such as the Ising Hamiltonians common in adiabatic quantum optimization and quantum annealing.
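    For concreteness, the flavor of such a protecting Hamiltonian can be sketched using the standard two-local gauge structure of the Bacon-Shor code (the quantum compass model); the notation and couplings below are schematic, not taken from the paper:

```latex
% Sum of non-commuting two-local gauge generators on an n x n qubit lattice:
H = -J \sum_{i,j} X_{i,j}\, X_{i,j+1} \;-\; J \sum_{i,j} Z_{i,j}\, Z_{i+1,j}.
% XX terms couple horizontal neighbors and ZZ terms vertical neighbors;
% terms sharing a qubit anticommute, and the stored quantum information is
% encoded in the ground subspace.
```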

  14. Preclinical magnetic resonance imaging and systems biology in cancer research: current applications and challenges.

    PubMed

    Albanese, Chris; Rodriguez, Olga C; VanMeter, John; Fricke, Stanley T; Rood, Brian R; Lee, YiChien; Wang, Sean S; Madhavan, Subha; Gusev, Yuriy; Petricoin, Emanuel F; Wang, Yue

    2013-02-01

    Biologically accurate mouse models of human cancer have become important tools for the study of human disease. The anatomical location of various target organs, such as brain, pancreas, and prostate, makes determination of disease status difficult. Imaging modalities, such as magnetic resonance imaging, can greatly enhance diagnosis, and longitudinal imaging of tumor progression is an important source of experimental data. Even in models where the tumors arise in areas that permit visual determination of tumorigenesis, longitudinal anatomical and functional imaging can enhance the scope of studies by facilitating the assessment of biological alterations (such as changes in angiogenesis, metabolism, and cellular invasion) as well as tissue perfusion and diffusion. One of the challenges in preclinical imaging is the development of infrastructural platforms required for integrating in vivo imaging and therapeutic response data with ex vivo pathological and molecular data using a more systems-based multiscale modeling approach. Further challenges exist in integrating these data for computational modeling to better understand the pathobiology of cancer and to better effect its cure. We review the current applications of preclinical imaging and discuss the implications of applying functional imaging to visualize cancer progression and treatment. Finally, we provide new data from an ongoing preclinical drug study demonstrating how multiscale modeling can lead to a more comprehensive understanding of cancer biology and therapy. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  15. BAYESIAN PROTEIN STRUCTURE ALIGNMENT.

    PubMed

    Rodriguez, Abel; Schmidler, Scott C

    The analysis of the three-dimensional structure of proteins is an important topic in molecular biochemistry. Structure plays a critical role in defining the function of proteins and is more strongly conserved than amino acid sequence over evolutionary timescales. A key challenge is the identification and evaluation of structural similarity between proteins; such analysis can aid in understanding the role of newly discovered proteins and help elucidate evolutionary relationships between organisms. Computational biologists have developed many clever algorithmic techniques for comparing protein structures; however, all are based on heuristic optimization criteria, making statistical interpretation somewhat difficult. Here we present a fully probabilistic framework for pairwise structural alignment of proteins. Our approach has several advantages, including the ability to capture alignment uncertainty and to estimate key "gap" parameters which critically affect the quality of the alignment. We show that several existing alignment methods arise as maximum a posteriori estimates under specific choices of prior distributions and error models. Our probabilistic framework is also easily extended to incorporate additional information, which we demonstrate by including primary sequence information to generate simultaneous sequence-structure alignments that can resolve ambiguities obtained using structure alone. This combined model also provides a natural approach for the difficult task of estimating evolutionary distance based on structural alignments. The model is illustrated by comparison with well-established methods on several challenging protein alignment examples.
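    The general shape of such a fully probabilistic alignment model can be sketched as follows; this is our paraphrase of the framework's structure, not the paper's exact notation:

```latex
% With structures X and Y, alignment matrix A, rigid motion (R, t), and
% gap/error parameters \theta, the joint posterior factorizes as
p(A, R, t, \theta \mid X, Y) \;\propto\;
    p(X \mid Y, A, R, t, \theta)\; p(A \mid \theta)\; p(R, t)\; p(\theta),
% with, e.g., a Gaussian error model on matched coordinates:
x_i \mid (a_{ij} = 1) \;\sim\; \mathcal{N}\!\left(R\, y_j + t,\ \sigma^2 I\right).
% Score-maximizing heuristics then correspond to MAP estimates
% \hat{A} = \arg\max_A p(A \mid X, Y) under particular priors and error models.
```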

  16. Caregiving and Sibling Relationships: Challenges and Opportunities

    MedlinePlus

    ... can also lead to strained connections and painful conflict. One major source of sibling friction is the ... they vie for mom’s attention and affection. Another conflict can arise when one sibling is in denial ...

  17. Women holding hands.

    PubMed

    Jacobson, J

    1995-01-01

    It is estimated that 80% of the people involved in grassroots environmental protection advocacy in the US are women. One such self-described "average" woman became an activist upon learning that her drinking water was contaminated with uranium leaking from a US Department of Energy (DOE) facility. When DOE officials tried to brush off her concerns and those of her neighbors at a hearing, she presented them with a jar of water from her kitchen tap and challenged them to drink it. They refused. Thus began a long, but ultimately successful, struggle to shut down the offending facility. The efforts of these US women are mirrored all over the world as women have embraced environmental justice as one of their causes. At recent UN conferences, activists have challenged conventional strategies of economic development as being incompatible with equity and environmental sustainability. They have also established that "women's rights are human rights" and added domestic violence and rape to the human rights agenda. The recent International Conference on Population and Development revolved around women's health and rights issues. Throughout the world, women activists have challenged and changed the social dynamics of families, households, communities, and societies in general. One reason for the increased success of women's groups is that they have adopted the tactics of mass communication, including the use of computers, radio, and film. Although the various efforts are arising from diverse circumstances, they have some things in common such as finding personal experience to be a major impetus for action, realizing the self-reinforcing empowering nature of advocacy work, breaking the silence surrounding culturally taboo topics, and challenging the status quo. Such challenges often lead to political backlash or to counter measures taken by fundamentalist religious groups who link improvements in women's status with societal ills. Despite these challenges, the global women's movement continues to grow and to seek democracy and social justice.

  18. A homotopy analysis method for the nonlinear partial differential equations arising in engineering

    NASA Astrophysics Data System (ADS)

    Hariharan, G.

    2017-05-01

    In this article, we have established the homotopy analysis method (HAM) for solving a few partial differential equations arising in engineering. This technique provides the solutions as rapidly convergent series with computable terms for problems with a high degree of nonlinearity in the governing differential equations. The convergence analysis of the proposed method is also discussed. Finally, we give some illustrative examples to demonstrate the validity and applicability of the proposed method.
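    For reference, the standard HAM construction (following Liao) that such work builds on can be summarized by the zeroth-order deformation equation and the resulting solution series:

```latex
% Zeroth-order deformation equation (L: auxiliary linear operator, N: the
% nonlinear operator of the PDE, u_0: initial guess, \hbar: convergence-
% control parameter, q: embedding parameter):
(1 - q)\,\mathcal{L}\!\left[\phi(x,t;q) - u_0(x,t)\right]
    = q\,\hbar\,\mathcal{N}\!\left[\phi(x,t;q)\right], \qquad q \in [0, 1].
% As q runs from 0 to 1, \phi deforms from u_0 to the solution u, giving
% the series with computable terms:
u(x,t) = u_0(x,t) + \sum_{m=1}^{\infty} u_m(x,t), \qquad
u_m = \frac{1}{m!}\,\left.\frac{\partial^m \phi}{\partial q^m}\right|_{q=0}.
```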

  19. Is realistic neuronal modeling realistic?

    PubMed Central

    Almog, Mara

    2016-01-01

    Scientific models are abstractions that aim to explain natural phenomena. A successful model shows how a complex phenomenon arises from relatively simple principles while preserving major physical or biological rules and predicting novel experiments. A model should not be a facsimile of reality; it is an aid for understanding it. Contrary to this basic premise, with the 21st century has come a surge in computational efforts to model biological processes in great detail. Here we discuss the oxymoronic, realistic modeling of single neurons. This rapidly advancing field is driven by the discovery that some neurons don't merely sum their inputs and fire if the sum exceeds some threshold. Thus, researchers have asked what the computational abilities of single neurons are and attempted to give answers using realistic models. We briefly review the state of the art of compartmental modeling, highlighting recent progress and intrinsic flaws. We then attempt to address two fundamental questions. Practically, can we realistically model single neurons? Philosophically, should we realistically model single neurons? We use layer 5 neocortical pyramidal neurons as a test case to examine these issues. We subject three publicly available models of layer 5 pyramidal neurons to three simple computational challenges. Based on their performance and a partial survey of published models, we conclude that current compartmental models are ad hoc, unrealistic models functioning poorly once they are stretched beyond the specific problems for which they were designed. We then attempt to plot possible paths for generating realistic single neuron models. PMID:27535372

  20. An improved anonymous authentication scheme for roaming in ubiquitous networks.

    PubMed

    Lee, Hakjun; Lee, Donghoon; Moon, Jongho; Jung, Jaewook; Kang, Dongwoo; Kim, Hyoungshick; Won, Dongho

    2018-01-01

    With the evolution of communication technology and the exponential increase of mobile devices, the ubiquitous networking allows people to use our data and computing resources anytime and everywhere. However, numerous security concerns and complicated requirements arise as these ubiquitous networks are deployed throughout people's lives. To meet the challenge, the user authentication schemes in ubiquitous networks should ensure the essential security properties for the preservation of the privacy with low computational cost. In 2017, Chaudhry et al. proposed a password-based authentication scheme for the roaming in ubiquitous networks to enhance the security. Unfortunately, we found that their scheme remains insecure in its protection of the user privacy. In this paper, we prove that Chaudhry et al.'s scheme is vulnerable to the stolen-mobile device and user impersonation attacks, and its drawbacks comprise the absence of the incorrect login-input detection, the incorrectness of the password change phase, and the absence of the revocation provision. Moreover, we suggest a possible way to fix the security flaw in Chaudhry et al.'s scheme by using the biometric-based authentication for which the bio-hash is applied in the implementation of a three-factor authentication. We prove the security of the proposed scheme with the random oracle model and formally verify its security properties using a tool named ProVerif, and analyze it in terms of the computational and communication cost. The analysis result shows that the proposed scheme is suitable for resource-constrained ubiquitous environments.

  1. An improved anonymous authentication scheme for roaming in ubiquitous networks

    PubMed Central

    Lee, Hakjun; Lee, Donghoon; Moon, Jongho; Jung, Jaewook; Kang, Dongwoo; Kim, Hyoungshick

    2018-01-01

    With the evolution of communication technology and the exponential increase of mobile devices, the ubiquitous networking allows people to use our data and computing resources anytime and everywhere. However, numerous security concerns and complicated requirements arise as these ubiquitous networks are deployed throughout people’s lives. To meet the challenge, the user authentication schemes in ubiquitous networks should ensure the essential security properties for the preservation of the privacy with low computational cost. In 2017, Chaudhry et al. proposed a password-based authentication scheme for the roaming in ubiquitous networks to enhance the security. Unfortunately, we found that their scheme remains insecure in its protection of the user privacy. In this paper, we prove that Chaudhry et al.’s scheme is vulnerable to the stolen-mobile device and user impersonation attacks, and its drawbacks comprise the absence of the incorrect login-input detection, the incorrectness of the password change phase, and the absence of the revocation provision. Moreover, we suggest a possible way to fix the security flaw in Chaudhry et al.’s scheme by using the biometric-based authentication for which the bio-hash is applied in the implementation of a three-factor authentication. We prove the security of the proposed scheme with the random oracle model and formally verify its security properties using a tool named ProVerif, and analyze it in terms of the computational and communication cost. The analysis result shows that the proposed scheme is suitable for resource-constrained ubiquitous environments. PMID:29505575
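    The bio-hash primitive mentioned above can be illustrated schematically: project a biometric feature vector onto a user-specific random basis and binarize, yielding a cancellable binary template. This toy sketch is not the paper's scheme; the dimensions, seeds, and acceptance threshold are invented.

```python
import numpy as np

def bio_hash(features, seed, n_bits=32):
    # The seed plays the role of the user-specific token/secret.
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(n_bits, features.size))   # random projection basis
    return (R @ features > 0).astype(np.uint8)     # binarized template

rng = np.random.default_rng(10)
enrolled = rng.normal(size=64)                     # template feature vector
probe = enrolled + rng.normal(0.0, 0.1, 64)        # noisy fresh scan

h1, h2 = bio_hash(enrolled, seed=1234), bio_hash(probe, seed=1234)
dist = float(np.mean(h1 != h2))                    # normalized Hamming distance
print("accept" if dist < 0.2 else "reject", f"(distance {dist:.3f})")
```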

  2. Snowflake: A Lightweight Portable Stencil DSL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Nathan; Driscoll, Michael; Markley, Charles

    Stencil computations are not well optimized by general-purpose production compilers and the increased use of multicore, manycore, and accelerator-based systems makes the optimization problem even more challenging. In this paper we present Snowflake, a Domain Specific Language (DSL) for stencils that uses a 'micro-compiler' approach, i.e., small, focused, domain-specific code generators. The approach is similar to that used in image processing stencils, but Snowflake handles the much more complex stencils that arise in scientific computing, including complex boundary conditions, higher-order operators (larger stencils), higher dimensions, variable coefficients, non-unit-stride iteration spaces, and multiple input or output meshes. Snowflake is embedded in the Python language, allowing it to interoperate with popular scientific tools like SciPy and iPython; it also takes advantage of built-in Python libraries for powerful dependence analysis as part of a just-in-time compiler. We demonstrate the power of the Snowflake language and the micro-compiler approach with a complex scientific benchmark, HPGMG, that exercises the generality of stencil support in Snowflake. By generating OpenMP comparable to, and OpenCL within a factor of 2x of, hand-optimized HPGMG, Snowflake demonstrates that a micro-compiler can support diverse processor architectures and is performance-competitive whilst preserving a high-level Python implementation.

  3. Snowflake: A Lightweight Portable Stencil DSL

    DOE PAGES

    Zhang, Nathan; Driscoll, Michael; Markley, Charles; ...

    2017-05-01

    Stencil computations are not well optimized by general-purpose production compilers and the increased use of multicore, manycore, and accelerator-based systems makes the optimization problem even more challenging. In this paper we present Snowflake, a Domain Specific Language (DSL) for stencils that uses a 'micro-compiler' approach, i.e., small, focused, domain-specific code generators. The approach is similar to that used in image processing stencils, but Snowflake handles the much more complex stencils that arise in scientific computing, including complex boundary conditions, higher-order operators (larger stencils), higher dimensions, variable coefficients, non-unit-stride iteration spaces, and multiple input or output meshes. Snowflake is embedded in the Python language, allowing it to interoperate with popular scientific tools like SciPy and iPython; it also takes advantage of built-in Python libraries for powerful dependence analysis as part of a just-in-time compiler. We demonstrate the power of the Snowflake language and the micro-compiler approach with a complex scientific benchmark, HPGMG, that exercises the generality of stencil support in Snowflake. By generating OpenMP comparable to, and OpenCL within a factor of 2x of, hand-optimized HPGMG, Snowflake demonstrates that a micro-compiler can support diverse processor architectures and is performance-competitive whilst preserving a high-level Python implementation.
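    To make concrete what a stencil DSL abstracts, here is the kind of kernel involved, written in plain numpy rather than Snowflake's API (which is not reproduced here); coefficients and boundary handling are illustrative:

```python
import numpy as np

def laplacian_2d(u, beta, h):
    """Apply beta * (5-point Laplacian of u) on interior points only."""
    out = u.copy()                                 # boundary left unchanged
    c, n, s = u[1:-1, 1:-1], u[:-2, 1:-1], u[2:, 1:-1]
    w, e = u[1:-1, :-2], u[1:-1, 2:]
    out[1:-1, 1:-1] = beta[1:-1, 1:-1] * (n + s + w + e - 4.0 * c) / h**2
    return out

u = np.random.default_rng(7).random((66, 66))
beta = np.ones_like(u)                             # constant coefficients here
print(laplacian_2d(u, beta, h=1.0 / 65).shape)
```

    A micro-compiler generates the equivalent of this loop nest, plus OpenMP or OpenCL variants, from a declarative stencil description.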

  4. Python for Information Theoretic Analysis of Neural Data

    PubMed Central

    Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano

    2008-01-01

    Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
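    A minimal sketch of the kind of computation discussed, assuming a plug-in estimator with a first-order limited-sampling bias term (the correction shown is the standard leading-order term in the Panzeri-Treves style; this is not the authors' toolbox API, and the data are synthetic):

```python
import numpy as np

def plugin_info_bits(stim, resp):
    """Plug-in I(S;R) in bits, plus a leading-order bias estimate."""
    joint, _, _ = np.histogram2d(stim, resp,
                                 bins=(len(set(stim)), len(set(resp))))
    n = len(stim)
    p = joint / joint.sum()
    ps, pr = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    info = float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz])))
    # First-order bias ~ (R_joint - R_s - R_r + 1) / (2 N ln 2), with R_*
    # counting occupied bins of the joint and marginal histograms.
    bias = (np.count_nonzero(joint) - np.count_nonzero(joint.sum(axis=1))
            - np.count_nonzero(joint.sum(axis=0)) + 1) / (2 * n * np.log(2))
    return info, bias

stim = np.repeat(np.arange(4), 50)                          # 4 stimuli x 50 trials
resp = stim + np.random.default_rng(1).integers(0, 3, 200)  # noisy "spike counts"
info, bias = plugin_info_bits(stim, resp)
print(f"I = {info:.3f} bits, bias-corrected ~ {info - bias:.3f} bits")
```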

  5. A computational modeling approach for the characterization of mechanical properties of 3D alginate tissue scaffolds.

    PubMed

    Nair, K; Yan, K C; Sun, W

    2008-01-01

    Scaffold-guided tissue engineering is an innovative approach wherein cells are seeded onto biocompatible and biodegradable materials to form 3-dimensional (3D) constructs that, when implanted in the body, facilitate the regeneration of tissue. Tissue scaffolds act as an artificial extracellular matrix providing an environment conducive to tissue growth. Characterization of scaffold properties is necessary to better understand the underlying processes involved in controlling cell behavior and the formation of functional tissue. We report a computational modeling approach to characterize the mechanical properties of a 3D gel-like biomaterial, specifically, a 3D alginate scaffold encapsulated with cells. Alginate's inherent nonlinearity and the variations arising from minute changes in its concentration and viscosity make experimental evaluation of its mechanical properties a challenging and time-consuming task. We developed an in silico model to determine the stress-strain relationship of alginate-based scaffolds from experimental data. In particular, we compared the Ogden hyperelastic model to other hyperelastic material models and determined that this model was the most suitable to characterize the nonlinear behavior of alginate. We further propose a mathematical model that represents the alginate material constants in the Ogden model as a function of concentration and viscosity. This study demonstrates the model's capability to predict the mechanical properties of 3D alginate scaffolds.
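    A hedged illustration of the central fitting step: a one-term incompressible Ogden model fit to uniaxial stress-stretch data. The data, parameter values, and units below are synthetic, and the paper additionally models the Ogden constants as functions of alginate concentration and viscosity:

```python
import numpy as np
from scipy.optimize import curve_fit

def ogden_uniaxial(stretch, mu, alpha):
    """Nominal uniaxial stress of a one-term incompressible Ogden solid."""
    return mu * (stretch ** (alpha - 1.0) - stretch ** (-(1.0 + alpha / 2.0)))

# Hypothetical compression data for an alginate gel (stretch < 1).
stretch = np.linspace(0.7, 1.0, 15)
stress = ogden_uniaxial(stretch, 0.012, 8.0)       # MPa, synthetic "truth"
stress += np.random.default_rng(2).normal(0, 2e-4, stretch.size)

(mu, alpha), _ = curve_fit(ogden_uniaxial, stretch, stress, p0=(0.01, 5.0))
print(f"mu = {mu:.4f} MPa, alpha = {alpha:.2f}")
```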

  6. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state feedback controllers, and the use of these requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows feedback controllers to be learned using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method, or PICE. We illustrate this method on some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
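    The basic Monte Carlo path-integral estimate that the review builds on can be sketched for a toy one-dimensional problem; the dynamics, cost, constants, and the open-loop estimate at t=0 are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T, lam, n_rollouts = 0.01, 100, 1.0, 5000

def terminal_cost(path):
    """Drive x(0) = 1 towards 0; cost only on the terminal state."""
    x = 1.0 + np.sqrt(dt) * path.sum()   # uncontrolled rollout, noise only
    return 5.0 * x ** 2

noise = rng.normal(size=(n_rollouts, T))
S = np.array([terminal_cost(path) for path in noise])
w = np.exp(-(S - S.min()) / lam)         # shift by min for numerical stability
w /= w.sum()

# Path-integral estimate of the optimal open-loop control at t = 0:
# u*(0) dt ~ E_w[dW_0], i.e., u*(0) ~ E_w[eps_0] / sqrt(dt).
u0 = (w @ noise[:, 0]) / np.sqrt(dt)
print(f"estimated u(0) ~ {u0:.3f} (negative: push x from 1 towards 0)")
```

    PICE goes further by using such weighted samples to fit a parametrised state-feedback controller, rather than an open-loop sequence as above.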

  7. Functional analysis of rare variants in mismatch repair proteins augments results from computation-based predictive methods

    PubMed Central

    Arora, Sanjeevani; Huwe, Peter J.; Sikder, Rahmat; Shah, Manali; Browne, Amanda J.; Lesh, Randy; Nicolas, Emmanuelle; Deshpande, Sanat; Hall, Michael J.; Dunbrack, Roland L.; Golemis, Erica A.

    2017-01-01

    ABSTRACT The cancer-predisposing Lynch Syndrome (LS) arises from germline mutations in DNA mismatch repair (MMR) genes, predominantly MLH1, MSH2, MSH6, and PMS2. A major challenge for clinical diagnosis of LS is the frequent identification of variants of uncertain significance (VUS) in these genes, as it is often difficult to determine variant pathogenicity, particularly for missense variants. Generic programs such as SIFT and PolyPhen-2, and MMR gene-specific programs such as PON-MMR and MAPP-MMR, are often used to predict deleterious or neutral effects of VUS in MMR genes. We evaluated the performance of multiple predictive programs in the context of functional biologic data for 15 VUS in MLH1, MSH2, and PMS2. Using cell line models, we characterized VUS predicted to range from neutral to pathogenic on mRNA and protein expression, basal cellular viability, viability following treatment with a panel of DNA-damaging agents, and functionality in DNA damage response (DDR) signaling, benchmarking to wild-type MMR proteins. Our results suggest that the MMR gene-specific classifiers do not always align with the experimental phenotypes related to DDR. Our study highlights the importance of complementary experimental and computational assessment to develop future predictors for the assessment of VUS. PMID:28494185

  8. Reconstructing the hidden states in time course data of stochastic models.

    PubMed

    Zimmer, Christoph

    2015-11-01

    Parameter estimation is central to analyzing models in Systems Biology. The relevance of stochastic modeling in the field is increasing, and therefore the need for tailored parameter estimation techniques is increasing as well. Challenges for parameter estimation are partial observability, measurement noise, and the computational complexity arising from the dimension of the parameter space. This article extends the 'multiple shooting for stochastic systems' method, developed for inference in intrinsically stochastic systems. The treatment of extrinsic noise and the estimation of the unobserved states are improved by taking into account the correlation between unobserved and observed species. This article demonstrates the power of the method on different scenarios of a Lotka-Volterra model, including cases in which the prey population dies out or explodes, and a calcium oscillation system. Besides showing how the new extension improves the accuracy of the parameter estimates, this article analyzes the accuracy of the state estimates. In contrast to previous approaches, the new approach is well able to estimate states and parameters for all the scenarios. As it does not need stochastic simulations, it is of the same order of speed as conventional least squares parameter estimation methods with respect to computational time. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
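    A deterministic caricature of the estimation problem (single shooting, prey observed, predator hidden); the article's method goes further, using multiple shooting and exploiting the correlation between observed and unobserved species. Parameter values, noise level, and initial conditions below are invented:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def lotka_volterra(t, z, a, b, c, d):
    x, y = z
    return [a * x - b * x * y, c * x * y - d * y]

t_obs = np.linspace(0, 15, 40)
theta_true = (1.0, 0.2, 0.1, 1.0)
sol = solve_ivp(lotka_volterra, (0, 15), [10.0, 5.0],
                t_eval=t_obs, args=theta_true)
prey_obs = sol.y[0] + np.random.default_rng(4).normal(0, 0.3, t_obs.size)

def residuals(theta):
    s = solve_ivp(lotka_volterra, (0, 15), [10.0, 5.0],
                  t_eval=t_obs, args=tuple(theta))
    return s.y[0] - prey_obs               # the predator stays unobserved

fit = least_squares(residuals, x0=(0.8, 0.3, 0.15, 0.8), bounds=(0.0, 5.0))
print("estimated (a, b, c, d):", np.round(fit.x, 3))
```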

  9. MATIN: A Random Network Coding Based Framework for High Quality Peer-to-Peer Live Video Streaming

    PubMed Central

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet, largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, due to employing the Gauss-Jordan elimination method, considerable computational complexity can be imposed on peers in decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one instead of n coefficients entries into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations, so peers incur very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC with Gauss-Jordan elimination by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay. PMID:23940530
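    A toy sketch of random linear network coding over a prime field, to make the decoding cost concrete; MATIN's contribution is a coefficients-matrix construction that avoids both the dependency check and the per-packet header, whereas the sketch below simply redraws random matrices until one is invertible:

```python
import numpy as np

P = 257                                      # small prime field for illustration
rng = np.random.default_rng(5)

def inv_mod(a, p=P):
    return pow(int(a), p - 2, p)             # Fermat inverse, p prime

def matrix_inv_mod(A, p=P):
    """Gauss-Jordan inverse of A over GF(p); raises StopIteration if singular."""
    n = A.shape[0]
    M = np.hstack([A % p, np.eye(n, dtype=np.int64)])
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r, col] % p != 0)
        M[[col, piv]] = M[[piv, col]]
        M[col] = (M[col] * inv_mod(M[col, col])) % p
        for r in range(n):
            if r != col:
                M[r] = (M[r] - M[r, col] * M[col]) % p
    return M[:, n:]

n, block_len = 4, 8
data = rng.integers(0, P, (n, block_len))    # n source blocks
while True:                                  # redraw until invertible
    A = rng.integers(1, P, (n, n))
    try:
        A_inv = matrix_inv_mod(A)
        break
    except StopIteration:
        continue

encoded = (A @ data) % P                     # what peers would transmit
decoded = (A_inv @ encoded) % P
assert np.array_equal(decoded, data)
```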

  10. Analysis of partially observed clustered data using generalized estimating equations and multiple imputation

    PubMed Central

    Aloisio, Kathryn M.; Swanson, Sonja A.; Micali, Nadia; Field, Alison; Horton, Nicholas J.

    2015-01-01

    Clustered data arise in many settings, particularly within the social and biomedical sciences. As an example, multiple-source reports are commonly collected in child and adolescent psychiatric epidemiologic studies, where researchers use various informants (e.g. parent and adolescent) to provide a holistic view of a subject's symptomatology. Fitzmaurice et al. (1995) have described estimation of multiple source models using a standard generalized estimating equation (GEE) framework. However, these studies often have missing data due to the additional stages of consent and assent required. The usual GEE is unbiased when missingness is Missing Completely at Random (MCAR) in the sense of Little and Rubin (2002). This is a strong assumption that may not be tenable. Other options, such as weighted generalized estimating equations (WGEEs), are computationally challenging when missingness is non-monotone. Multiple imputation is an attractive method to fit incomplete data models while only requiring the less restrictive Missing at Random (MAR) assumption. Previously, estimation for partially observed clustered data was computationally challenging; however, recent developments in Stata have facilitated its use in practice. We demonstrate how to utilize multiple imputation in conjunction with a GEE to investigate the prevalence of disordered eating symptoms in adolescents reported by parents and adolescents, as well as factors associated with concordance and prevalence. The methods are motivated by the Avon Longitudinal Study of Parents and their Children (ALSPAC), a cohort study that enrolled more than 14,000 pregnant mothers in 1991-92 and has followed the health and development of their children at regular intervals. While point estimates were fairly similar to the GEE under MCAR, the MAR model had smaller standard errors, while requiring less stringent assumptions regarding missingness. PMID:25642154
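    A schematic Python version of the workflow (the article itself uses Stata): multiply impute, fit a GEE per completed data set, and pool with Rubin's rules. Variable names and the data-generating process below are invented, and a real analysis would impute via chained equations, not the crude marginal draw used here:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 400
family = np.repeat(np.arange(200), 2)      # cluster: one parent + one child
reporter = np.tile([0, 1], 200)            # 0 = parent report, 1 = adolescent
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * x + 0.3 * reporter)))
df = pd.DataFrame({"family": family, "reporter": reporter, "x": x,
                   "symptom": rng.binomial(1, p_true).astype(float)})
df.loc[rng.random(n) < 0.2, "symptom"] = np.nan    # ~20% missing reports

m, betas, variances = 20, [], []
for _ in range(m):
    imp = df.copy()
    miss = imp["symptom"].isna()
    # Crude illustrative imputation: Bernoulli draw at the observed rate.
    imp.loc[miss, "symptom"] = rng.binomial(1, imp["symptom"].mean(),
                                            miss.sum()).astype(float)
    X = sm.add_constant(imp[["x", "reporter"]])
    fit = sm.GEE(imp["symptom"], X, groups=imp["family"],
                 family=sm.families.Binomial(),
                 cov_struct=sm.cov_struct.Exchangeable()).fit()
    betas.append(fit.params.values)
    variances.append(np.diag(fit.cov_params()))

betas, variances = np.array(betas), np.array(variances)
pooled = betas.mean(axis=0)
within, between = variances.mean(axis=0), betas.var(axis=0, ddof=1)
pooled_se = np.sqrt(within + (1.0 + 1.0 / m) * between)   # Rubin's rules
print(np.round(pooled, 3), np.round(pooled_se, 3))
```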

  11. Coincidence between malignant perivascular epithelioid cell tumor arising in the gastric serosa and lung adenocarcinoma.

    PubMed

    Yamada, Sohsuke; Nabeshima, Atsunori; Noguchi, Hirotsugu; Nawata, Aya; Nishii, Hisae; Guo, Xin; Wang, Ke-Yong; Hisaoka, Masanori; Nakayama, Toshiyuki

    2015-01-28

    A 39-year-old male presented with a 4-mo history of both epigastralgia and back pain. Computed tomography showed a right lung nodule and an abdominal mass attached to the gastric wall, measuring approximately 30 mm and 70 mm in diameter, respectively. Since biopsy samples from the lung and abdomen revealed poorly differentiated adenocarcinoma and malignant tumor, clinicians first interpreted the abdominal mass as metastatic carcinoma, and a right lower lobectomy with subsequent resection of the mass was performed. Gross examination of both lesions displayed gray-whitish to yellow-whitish cut surfaces with hemorrhagic and necrotic foci, the mass being attached to the serosa of the lesser curvature on the gastric body. On microscopic examination, the lung tumor was composed of a proliferation of highly atypical epithelial cells with abundant eosinophilic cytoplasm, predominantly arranged in an acinar or solid growth pattern with vessel permeation, while the abdominal tumor consisted of sheets or nests of markedly atypical epithelioid cells with pleomorphic nuclei and abundant eosinophilic to clear cytoplasm, focally in a radial perivascular or infiltrative growth pattern. Immunohistochemically, the latter cells were positive for HMB45 or α-smooth muscle actin, but the former were not. Therefore, we finally made a diagnosis of malignant perivascular epithelioid cell tumor (PEComa) arising in the gastric serosa, combined with primary lung adenocarcinoma. Furthermore, a small papillary carcinoma of the thyroid gland was identified. The current case describes the coincidence of malignant PEComa with other carcinomas, posing a challenge in distinguishing it from metastatic tumor disease.

  12. Toward high throughput optical metamaterial assemblies.

    PubMed

    Fontana, Jake; Ratna, Banahalli R

    2015-11-01

    Optical metamaterials have unique engineered optical properties. These properties arise from the careful organization of plasmonic elements. Transitioning these properties from laboratory experiments to functional materials may lead to disruptive technologies for controlling light. A significant issue impeding the realization of optical metamaterial devices is the need for robust and efficient assembly strategies to govern the order of the nanometer-sized elements while enabling macroscopic throughput. This mini-review critically highlights recent approaches and challenges in creating these artificial materials. As the ability to assemble optical metamaterials improves, new unforeseen opportunities may arise for revolutionary optical devices.

  13. Chandrasekhar equations and computational algorithms for distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Burns, J. A.; Ito, K.; Powers, R. K.

    1984-01-01

    The Chandrasekhar equations arising in optimal control problems for linear distributed parameter systems are considered. The equations are derived via approximation theory. This approach is used to obtain existence, uniqueness, and strong differentiability of the solutions and provides the basis for a convergent computation scheme for approximating feedback gain operators. A numerical example is presented to illustrate these ideas.
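    In the finite-dimensional LQR analogue, the Chandrasekhar system replaces the matrix Riccati equation with coupled equations for the gain and a low-rank factor. A standard statement, in our notation (the paper develops the distributed parameter version via approximation theory), is:

```latex
% LQR with dynamics x' = Ax + Bu and cost \int (x^\top C^\top C\, x
% + u^\top R\, u)\, dt on [0, T]: instead of the n x n Riccati equation,
% integrate backward from T
\dot{K}(t) = -R^{-1} B^{\top} L(t)^{\top} L(t), \qquad K(T) = 0,
\qquad
\dot{L}(t) = -L(t)\,\bigl(A - B K(t)\bigr), \qquad L(T) = C,
% where K(t) is the feedback gain and L(t) has only p x n entries
% (p = number of outputs), which makes the factorization cheap when p << n.
```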

  14. [Cognitive neuroscience of aging. Contributions and challenges].

    PubMed

    Díaz, Fernando; Pereiro, Arturo X

    The cognitive neuroscience of aging is a young discipline that has emerged as a result of the combination of: A) the theoretical and explanatory frameworks proposed by the cognitive psychology perspective throughout the second half of the twentieth century; B) the designs and methodological procedures arising from experimental psychology and the need to test the hypotheses proposed from the cognitive psychology perspective; C) the contributions of the computer sciences to the explanation of brain functions; and D) the development and use of neuroimaging techniques that have enabled the recording of brain activity in humans while tasks that test some cognitive process or function are performed. An analysis of the impact of research conducted from this perspective over the last three decades has been carried out, including its shortcomings, as well as the potential directions and uses that will continue to drive this discipline in its description and explanation of the processes of cerebral and cognitive aging. Copyright © 2017 SEGG. Published by Elsevier España, S.L.U. All rights reserved.

  15. Ferrotoroidic ground state in a heterometallic {CrIIIDyIII6} complex displaying slow magnetic relaxation.

    PubMed

    Vignesh, Kuduva R; Soncini, Alessandro; Langley, Stuart K; Wernsdorfer, Wolfgang; Murray, Keith S; Rajaraman, Gopalan

    2017-10-18

    Toroidal quantum states are most promising for building quantum computing and information storage devices, as they are insensitive to homogeneous magnetic fields but interact with charge and spin currents, allowing this moment to be manipulated purely by electrical means. Coupling molecular toroids into larger toroidal moments via ferrotoroidic interactions can be pivotal not only to enhance ground state toroidicity, but also to develop materials displaying ferrotoroidic ordered phases, which sustain linear magneto-electric coupling and multiferroic behavior. However, engineering ferrotoroidic coupling is known to be a challenging task. Here we have isolated a {CrIIIDyIII6} complex that exhibits the much sought-after ferrotoroidic ground state with an enhanced toroidal moment, solely arising from intramolecular dipolar interactions. Moreover, a theoretical analysis of the observed sub-Kelvin zero-field hysteretic spin dynamics of {CrIIIDyIII6} reveals the pivotal role played by ferrotoroidic states in slowing down the magnetic relaxation, in spite of large calculated single-ion quantum tunneling rates.

  16. Parameter Estimation for Geoscience Applications Using a Measure-Theoretic Approach

    NASA Astrophysics Data System (ADS)

    Dawson, C.; Butler, T.; Mattis, S. A.; Graham, L.; Westerink, J. J.; Vesselinov, V. V.; Estep, D.

    2016-12-01

    Effective modeling of complex physical systems arising in the geosciences depends on knowing parameters which are often difficult or impossible to measure in situ. In this talk we focus on two such problems: estimating parameters for groundwater flow and contaminant transport, and estimating parameters within a coastal ocean model. The approach we will describe, proposed by collaborators D. Estep, T. Butler and others, relies on a novel stochastic inversion technique grounded in measure theory. In this approach, given a probability space on certain observable quantities of interest, one searches for the sets of highest probability in parameter space which give rise to these observables. When viewed as a mapping between sets, the stochastic inversion problem is well-posed in certain settings, but there are computational challenges related to the set construction. We will focus the talk on estimating scalar parameters and fields in a contaminant transport setting, and on estimating bottom friction in a complicated near-shore coastal application.
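    The set-based inversion idea can be caricatured in a few lines: push prior parameter samples through the forward map and retain the parameter region whose predicted quantity of interest (QoI) lands in the high-probability observation set. The forward model, prior range, and observation window below are invented, and this is not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(8)

def forward(k):
    """Toy forward map from a scalar friction-like parameter to a QoI."""
    return np.exp(-2.0 * k) + 0.1 * k

k_prior = rng.uniform(0.0, 2.0, 100_000)     # samples from a uniform prior
q = forward(k_prior)

# Suppose observations place the QoI in [0.39, 0.51] with high probability.
inverse_set = k_prior[(q > 0.39) & (q < 0.51)]
print(f"inverse image: k in [{inverse_set.min():.3f}, {inverse_set.max():.3f}]"
      f" ({inverse_set.size} of {k_prior.size} samples)")
```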

  17. International Standards for Genomes, Transcriptomes, and Metagenomes

    PubMed Central

    Mason, Christopher E.; Afshinnekoo, Ebrahim; Tighe, Scott; Wu, Shixiu; Levy, Shawn

    2017-01-01

    Challenges and biases in preparing, characterizing, and sequencing DNA and RNA can have significant impacts on research in genomics across all kingdoms of life, including experiments in single-cells, RNA profiling, and metagenomics (across multiple genomes). Technical artifacts and contamination can arise at each point of sample manipulation, extraction, sequencing, and analysis. Thus, the measurement and benchmarking of these potential sources of error are of paramount importance as next-generation sequencing (NGS) projects become more global and ubiquitous. Fortunately, a variety of methods, standards, and technologies have recently emerged that improve measurements in genomics and sequencing, from the initial input material to the computational pipelines that process and annotate the data. Here we review current standards and their applications in genomics, including whole genomes, transcriptomes, mixed genomic samples (metagenomes), and the modified bases within each (epigenomes and epitranscriptomes). These standards, tools, and metrics are critical for quantifying the accuracy of NGS methods, which will be essential for robust approaches in clinical genomics and precision medicine. PMID:28337071

  18. The HD molecule in small and medium cages of clathrate hydrates: Quantum dynamics studied by neutron scattering measurements and computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colognesi, Daniele; Celli, Milva; Ulivi, Lorenzo, E-mail: lorenzo.ulivi@isc.cnr.it

    2014-10-07

    We report inelastic neutron scattering (INS) measurements on molecular hydrogen deuteride (HD) trapped in binary cubic (sII) and hexagonal (sH) clathrate hydrates, performed at low temperature using two different neutron spectrometers in order to probe both energy and momentum transfer. The INS spectra of the binary clathrate samples exhibit a rich structure containing sharp bands arising from both the rotational transitions and the rattling modes of the guest molecule. For the clathrates with sII structure, there is very good agreement with rigorous fully quantum simulations which account for the subtle effects of the angular and radial anisotropy of the host cage on the HD microscopic dynamics. The sH clathrate sample presents a much greater challenge, due to uncertainties regarding the crystal structure, which is known only for similar crystals with a different promoter, but not for HD (or H2) plus methyl tert-butyl ether (MTBE-d12).

  19. Virus evolution and transmission in an ever more connected world

    PubMed Central

    Pybus, Oliver G.; Tatem, Andrew J.; Lemey, Philippe

    2015-01-01

    The frequency and global impact of infectious disease outbreaks, particularly those caused by emerging viruses, demonstrate the need for a better understanding of how spatial ecology and pathogen evolution jointly shape epidemic dynamics. Advances in computational techniques and the increasing availability of genetic and geospatial data are helping to address this problem, particularly when both information sources are combined. Here, we review research at the intersection of evolutionary biology, human geography and epidemiology that is working towards an integrated view of spatial incidence, host mobility and viral genetic diversity. We first discuss how empirical studies have combined viral spatial and genetic data, focusing particularly on the contribution of evolutionary analyses to epidemiology and disease control. Second, we explore the interplay between virus evolution and global dispersal in more depth for two pathogens: human influenza A virus and chikungunya virus. We discuss the opportunities for future research arising from new analyses of human transportation and trade networks, as well as the associated challenges in accessing and sharing relevant spatial and genetic data. PMID:26702033

  20. US Power Production at Risk from Water Stress in a Changing Climate.

    PubMed

    Ganguli, Poulomi; Kumar, Devashish; Ganguly, Auroop R

    2017-09-20

    Thermoelectric power production in the United States primarily relies on wet-cooled plants, which in turn require water below prescribed design temperatures, both for cooling and operational efficiency. Thus, power production in the US remains particularly vulnerable to water scarcity and rising stream temperatures under climate change and variability. Previous studies on the climate-water-energy nexus have primarily focused on mid- to end-century horizons and have not considered the full range of uncertainty in climate projections. Technology managers and energy policy makers are increasingly interested in decadal time scales to understand adaptation challenges and investment strategies. Here we develop a new approach that relies on a novel multivariate water stress index, which considers the joint probability of warmer and scarcer water, and computes uncertainties arising from climate model imperfections and intrinsic variability. Our assessments over the contiguous US suggest a consistent increase in water stress for power production, with about 27% of production severely impacted by the 2030s.
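    A toy version of a joint "warmer and scarcer" stress measure, computed as an empirical joint exceedance frequency. The thresholds and synthetic series below are invented; the paper's index and its uncertainty treatment are more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(9)
days = 3650
flow = rng.gamma(shape=4.0, scale=25.0, size=days)        # synthetic m^3/s
temp = (rng.normal(22.0, 4.0, size=days)
        + 0.02 * np.arange(days) / 365.0)                 # slight warming trend

T_MAX, Q_MIN = 27.0, 60.0    # hypothetical design temperature / demand flow
joint = np.mean((temp > T_MAX) & (flow < Q_MIN))
if_independent = np.mean(temp > T_MAX) * np.mean(flow < Q_MIN)
print(f"joint stress frequency {joint:.3f} (vs {if_independent:.3f} "
      "if temperature and flow were independent)")
```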

  1. Synthesis of tetra- and octa-aurated heteroaryl complexes towards probing aromatic indoliums

    PubMed Central

    Yuan, Jun; Sun, Tingting; He, Xin; An, Ke; Zhu, Jun; Zhao, Liang

    2016-01-01

    Polymetalated aromatic compounds are particularly challenging synthetic goals because of the limited thermodynamic stability of polyanionic species arising from strong electrostatic repulsion between adjacent carbanionic sites. Here we describe a facile synthesis of two polyaurated complexes including a tetra-aurated indole and an octa-aurated benzodipyrrole. The imido trinuclear gold(I) moiety exhibits nucleophilicity and undergoes an intramolecular attack on a gold(I)-activated ethynyl to generate polyanionic heteroaryl species. Their computed magnetic properties reveal the aromatic character in the five-membered ring. The incorporation of the aurated substituents at the nitrogen atom can convert non-aromaticity in the parent indolium into aromaticity in the aurated one because of hyperconjugation. Thus, the concept of hyperconjugative aromaticity is extended to heterocycles with transition metal substituents. More importantly, further analysis indicates that the aurated substituents can perform better than traditional main-group substituents. This work highlights the difference in aromaticity between polymetalated aryls and their organic prototypes. PMID:27186982

  2. Catecholaminergic challenge uncovers distinct Pavlovian and instrumental mechanisms of motivated (in)action

    PubMed Central

    Swart, Jennifer C; Froböse, Monja I; Cook, Jennifer L; Geurts, Dirk EM; Frank, Michael J; Cools, Roshan; den Ouden, Hanneke EM

    2017-01-01

    Catecholamines modulate the impact of motivational cues on action. Such motivational biases have been proposed to reflect cue-based, ‘Pavlovian’ effects. Here, we assess whether motivational biases may also arise from asymmetrical instrumental learning of active and passive responses following reward and punishment outcomes. We present a novel paradigm, allowing us to disentangle the impact of reward and punishment on instrumental learning from Pavlovian response biasing. Computational analyses showed that motivational biases reflect both Pavlovian and instrumental effects: reward and punishment cues promoted generalized (in)action in a Pavlovian manner, whereas outcomes enhanced instrumental (un)learning of chosen actions. These cue- and outcome-based biases were altered independently by the catecholamine enhancer methylphenidate. Methylphenidate’s effect varied across individuals with working memory span, a putative proxy of baseline dopamine synthesis capacity. Our study uncovers two distinct mechanisms by which motivation impacts behaviour, and helps refine current models of catecholaminergic modulation of motivated action. DOI: http://dx.doi.org/10.7554/eLife.22169.001 PMID:28504638
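
    A minimal sketch of how a cue-based Pavlovian bias and an instrumental value might combine into a single action weight; the functional form and parameter names below are our illustrative assumptions, not the authors' model specification.

        # Illustrative sketch only: one simple way a Pavlovian cue bias and an
        # instrumental value can combine into a 'go' action weight.
        def go_action_weight(q_go, v_cue, pavlovian_weight=0.3, go_bias=0.2):
            # q_go: instrumental value of acting, learned from outcomes.
            # v_cue: cue (state) value; positive for reward cues, negative for
            # punishment cues. Catecholamines are hypothesized to scale the bias.
            return q_go + go_bias + pavlovian_weight * v_cue

        print(go_action_weight(q_go=0.5, v_cue=+1.0))  # reward cue promotes action
        print(go_action_weight(q_go=0.5, v_cue=-1.0))  # punishment cue suppresses it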

  3. Direct single-shot phase retrieval from the diffraction pattern of separated objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leshem, Ben; Xu, Rui; Dallal, Yehonatan

    The non-crystallographic phase problem arises in numerous scientific and technological fields. An important application is coherent diffractive imaging. Recent advances in X-ray free-electron lasers allow capturing of the diffraction pattern from a single nanoparticle before it disintegrates, in so-called ‘diffraction before destruction’ experiments. Presently, the phase is reconstructed by iterative algorithms, imposing a non-convex computational challenge, or by Fourier holography, requiring a well-characterized reference field. Here we present a convex scheme for single-shot phase retrieval for two (or more) sufficiently separated objects, demonstrated in two dimensions. In our approach, the objects serve as unknown references to one another, reducing the phase problem to a solvable set of linear equations. We establish our method numerically and experimentally in the optical domain and demonstrate a proof-of-principle single-shot coherent diffractive imaging using X-ray free-electron laser pulses. Lastly, our scheme alleviates several limitations of current methods, offering a new pathway towards direct reconstruction of complex objects.

  4. Direct single-shot phase retrieval from the diffraction pattern of separated objects

    DOE PAGES

    Leshem, Ben; Xu, Rui; Dallal, Yehonatan; ...

    2016-02-22

    The non-crystallographic phase problem arises in numerous scientific and technological fields. An important application is coherent diffractive imaging. Recent advances in X-ray free-electron lasers allow capturing of the diffraction pattern from a single nanoparticle before it disintegrates, in so-called ‘diffraction before destruction’ experiments. Presently, the phase is reconstructed by iterative algorithms, imposing a non-convex computational challenge, or by Fourier holography, requiring a well-characterized reference field. Here we present a convex scheme for single-shot phase retrieval for two (or more) sufficiently separated objects, demonstrated in two dimensions. In our approach, the objects serve as unknown references to one another, reducing the phase problem to a solvable set of linear equations. We establish our method numerically and experimentally in the optical domain and demonstrate a proof-of-principle single-shot coherent diffractive imaging using X-ray free-electron laser pulses. Lastly, our scheme alleviates several limitations of current methods, offering a new pathway towards direct reconstruction of complex objects.
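
    The linearity exploited by such a scheme rests on a standard Fourier fact: the inverse transform of the measured intensity is the field's autocorrelation, and for well-separated objects the cross-correlation terms appear at the separation offset, cleanly split from the self-correlation terms. The NumPy sketch below (ours, with invented object shapes, not the paper's algorithm) just illustrates that separation.

        # Two well-separated objects: the autocorrelation recovered from the
        # diffraction intensity shows a cross-term peak at their separation vector.
        import numpy as np

        n = 64
        field = np.zeros((n, n))
        field[8:12, 8:12] = 1.0    # object A
        field[40:44, 40:44] = 2.0  # object B, well separated from A

        intensity = np.abs(np.fft.fft2(field))**2        # measured diffraction pattern
        ac = np.fft.fftshift(np.fft.ifft2(intensity).real)
        c = n // 2
        ac[c-8:c+8, c-8:c+8] = 0.0  # mask the central self-correlation region
        peak = np.unravel_index(np.argmax(ac), ac.shape)
        # Offset is (+/-32, +/-32) modulo the array size: the A-B separation.
        print("cross-correlation peak at offset:", peak[0] - c, peak[1] - c)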

  5. Elastic Model Transitions Using Quadratic Inequality Constrained Least Squares

    NASA Technical Reports Server (NTRS)

    Orr, Jeb S.

    2012-01-01

    A technique is presented for initializing multiple discrete finite element model (FEM) mode sets for certain types of flight dynamics formulations that rely on superposition of orthogonal modes for modeling the elastic response. Such approaches are commonly used for modeling launch vehicle dynamics, and challenges arise due to the rapidly time-varying nature of the rigid-body and elastic characteristics. By way of an energy argument, a quadratic inequality constrained least squares (LSQI) algorithm is employed to effect a smooth transition from one set of FEM eigenvectors to another with no requirement that the models be of similar dimension or that the eigenvectors be correlated in any particular way. The physically unrealistic and controversial method of eigenvector interpolation is completely avoided, and the discrete solution approximates that of the continuously varying system. The real-time computational burden is shown to be negligible due to convenient features of the solution method. Simulation results are presented, and applications to staging and other discontinuous mass changes are discussed.
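
    For readers unfamiliar with the LSQI problem class: a generic instance minimizes ||Ax - b||^2 subject to a quadratic inequality bound ||x||^2 <= alpha^2. The SciPy sketch below solves such a generic instance on random data; it illustrates the problem class only, not the specific energy-based formulation in the report.

        # Generic LSQI sketch: least squares under a quadratic inequality constraint.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        A = rng.normal(size=(20, 5))
        b = rng.normal(size=20)
        alpha = 1.0  # illustrative bound on the solution's quadratic norm

        res = minimize(
            lambda x: np.sum((A @ x - b)**2),
            x0=np.zeros(5),
            constraints=[{"type": "ineq", "fun": lambda x: alpha**2 - np.dot(x, x)}],
        )
        print(res.x, np.linalg.norm(res.x))  # the solution respects the bound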

  6. On the impact of water activity on reversal tolerant fuel cell anode performance and durability

    NASA Astrophysics Data System (ADS)

    Hong, Bo Ki; Mandal, Pratiti; Oh, Jong-Gil; Litster, Shawn

    2016-10-01

    Durability of polymer electrolyte fuel cells in automotive applications can be severely affected by hydrogen starvation arising due to transients during the drive-cycle. It causes individual cell voltage reversal, yielding water electrolysis and carbon corrosion reactions at the anode, ultimately leading to catastrophic cell failure. A popular material-based mitigation strategy is to employ a reversal tolerant anode (RTA) that includes oxygen evolution reaction (OER) catalyst (e.g., IrO2) to promote water electrolysis over carbon corrosion. Here we report that RTA performance surprisingly drops under not only water-deficient but also water-excess conditions. This presents a significant technical challenge since the most common triggers for cell reversal involve excess liquid water. Our findings from detailed electrochemical diagnostics and nano-scale X-ray computed tomography provide insight into how automotive fuel cells can overcome critical vulnerabilities using material-based solutions. Our work also highlights the need for improved materials, electrode designs, and operation strategies for robust RTAs.

  7. Lung cancer diagnosis on ovary mass: a case report

    PubMed Central

    2013-01-01

    Metastatic neoplasms to the ovary often cause diagnostic problems, in particular those large ovarian masses mimicking primary tumors. Most of these tumors arise from the digestive system or breast, while ovarian metastases from lung cancer are rare. We report a 37-year-old woman diagnosed with a right adnexal complex mass, with a subpleural nodule in the apical part of the left lower lobe at preoperative chest computed tomography. The patient underwent total abdominal hysterectomy with right salpingo-oophorectomy (ovarian mass 220 × 200 mm), total omentectomy, left ovarian biopsy, peritoneal random biopsies, and peritoneal washings for cytology. Pathologic and immunohistochemical examination of the ovarian specimen suggested the morphology and expression profile of metastatic lung adenocarcinoma, with intense positivity for Thyroid Transcription Factor-1 (TTF-1) and Cytokeratin 7 (CK7) staining. Fine needle biopsy of the lung nodule found epithelioid-like malignant cells, confirming the diagnosis of an ovarian metastasis from a primary lung cancer. This report focuses on the clinical and pathologic diagnostic challenge of distinguishing secondary from primary ovarian neoplasms. Issues on useful immunohistochemical stains are also discussed. PMID:23663245

  8. Age effects on explicit and implicit memory

    PubMed Central

    Ward, Emma V.; Berry, Christopher J.; Shanks, David R.

    2013-01-01

    It is well-documented that explicit memory (e.g., recognition) declines with age. In contrast, many argue that implicit memory (e.g., priming) is preserved in healthy aging. For example, priming on tasks such as perceptual identification is often not statistically different in groups of young and older adults. Such observations are commonly taken as evidence for distinct explicit and implicit learning/memory systems. In this article we discuss several lines of evidence that challenge this view. We describe how patterns of differential age-related decline may arise from differences in the ways in which the two forms of memory are commonly measured, and review recent research suggesting that under improved measurement methods, implicit memory is not age-invariant. Formal computational models are of considerable utility in revealing the nature of underlying systems. We report the results of applying single and multiple-systems models to data on age effects in implicit and explicit memory. Model comparison clearly favors the single-system view. Implications for the memory systems debate are discussed. PMID:24065942

  9. Tamping Ramping: Algorithmic, Implementational, and Computational Explanations of Phasic Dopamine Signals in the Accumbens

    PubMed Central

    Lloyd, Kevin; Dayan, Peter

    2015-01-01

    Substantial evidence suggests that the phasic activity of dopamine neurons represents reinforcement learning’s temporal difference prediction error. However, recent reports of ramp-like increases in dopamine concentration in the striatum when animals are about to act, or are about to reach rewards, appear to pose a challenge to established thinking. This is because the implied activity is persistently predictable by preceding stimuli, and so cannot arise as this sort of prediction error. Here, we explore three possible accounts of such ramping signals: (a) the resolution of uncertainty about the timing of action; (b) the direct influence of dopamine over mechanisms associated with making choices; and (c) a new model of discounted vigour. Collectively, these suggest that dopamine ramps may be explained, with only minor disturbance, by standard theoretical ideas, though urgent questions remain regarding their proximal cause. We suggest experimental approaches to disentangling which of the proposed mechanisms are responsible for dopamine ramps. PMID:26699940
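
    For reference, the temporal difference prediction error that phasic dopamine activity is taken to represent has the standard form

        $$\delta_t = r_{t+1} + \gamma V(s_{t+1}) - V(s_t),$$

    where r is the reward, V the learned state value, and γ the discount factor (standard background, not notation taken from the paper). A ramp that is persistently predictable from preceding stimuli is hard to reconcile with this quantity, which is the puzzle the paper addresses.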

  10. New approaches: innovations in cancer prevention, diagnosis, treatment, and support.

    PubMed

    Engelking, C

    1994-01-01

    To discuss and project changes in cancer care in the 21st century. Projections are based on synthesis of multiple scholarly, professional, and governmental information sources. Changes will be reflected in the areas of patient subgrouping for more effective prevention and treatment; a redesigned therapeutic paradigm; a mind-body renaissance emphasizing holism and quality of life; and an accentuated influence of ethics on oncology nursing practice arising from healthcare reform and new scientific understanding of the human gene. Opportunities for nurses resulting from these changes include roles as genetic-risk analysts, health-education media designers, patient readiness evaluators, technology accessors, partners in a holistic care center, and treatment options advisors. Nurses will need to respond to these challenges by expanding their knowledge base with respect to genetics and computer science, refining the interactive skills that are necessary to address the psychosocial aspects of cancer care, and assimilating new technology while designing strategies to minimize the dehumanizing consequences of technology dependency.

  11. Processes and Challenges in Identifying Learning Disabilities among Students Who Are English Language Learners in Three New York State Districts. Issues & Answers. REL 2010-No. 085

    ERIC Educational Resources Information Center

    Sanchez, Maria Teresa; Parker, Caroline; Akbayin, Bercem; McTigue, Anna

    2010-01-01

    Using interviews with district and school personnel and documents from state and district web sites in three districts in New York State, the study examines practices for identifying learning disabilities among students who are English language learners and the challenges that arise. Specifically, two research questions guided the project: (1)…

  12. Making the right decisions in a consolidating market.

    PubMed

    Kaufman, Kenneth; Grube, Mark E

    2009-07-01

    Market forces may lead to increased consolidation in the healthcare industry, creating both opportunities and challenges. Opportunities for small hospitals and health systems include partnering with stronger organizations, while for larger organizations, acquiring potentially undervalued hospitals can yield the benefits associated with increased size and scale. Potential barriers to success arise in three areas: strategy, finance, and operations. Healthcare executives must understand and be willing to fully address these challenges.

  13. GMLC Extreme Event Modeling -- Slow-Dynamics Models for Renewable Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korkali, M.; Min, L.

    The need for slow-dynamics models of renewable resources in cascade modeling arises essentially from the challenges associated with the increased use of solar and wind electric power. Indeed, the main challenge is that the power produced by wind and sunlight is not consistent; thus, renewable energy resources tend to have variable output power on many different timescales, including the timescales over which a cascade unfolds.

  14. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges

    PubMed Central

    Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.

    2017-01-01

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis and are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. PMID:27436868
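
    As a hedged illustration of the review's comparison, the sketch below contrasts a regression baseline with a tree ensemble; the synthetic features stand in for the 13 laboratory markers, since the institutional EHR data are not available to us.

        # Compare a regression baseline with a machine-learning model on
        # synthetic data (13 features echoing the review's 13 lab markers).
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=2000, n_features=13,
                                   n_informative=6, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        for model in (LogisticRegression(max_iter=1000),
                      RandomForestClassifier(n_estimators=300, random_state=0)):
            auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
            print(type(model).__name__, f"AUC = {auc:.3f}")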

  15. Schools as Sites for Recruiting Participants and Implementing Research.

    PubMed

    Bartlett, Robin; Wright, Tiffany; Olarinde, Tia; Holmes, Tara; Beamon, Emily R; Wallace, Debra

    2017-01-01

    Schools can be a valuable resource for recruitment of participants for research involving children, adolescents, and parents. Awareness of the benefits and challenges of working with schools can assist researchers in developing effective school partnerships. This article discusses the advantages of conducting research within the school system as well as the challenges that may also arise. Such challenges include developing key contacts, building relationships, logistical arrangements, and facilitating trust in the research topic and team. Suggestions for strategies to forge successful collaborative relationships with schools are provided.

  16. The implementation of AI technologies in computer wargames

    NASA Astrophysics Data System (ADS)

    Tiller, John A.

    2004-08-01

    Computer wargames involve some of the most in-depth analysis in general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much more deeply about military situations before taking action. A good AI opponent would also include the national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  17. Modelling cell motility and chemotaxis with evolving surface finite elements

    PubMed Central

    Elliott, Charles M.; Stinner, Björn; Venkataraman, Chandrasekhar

    2012-01-01

    We present a mathematical and a computational framework for the modelling of cell motility. The cell membrane is represented by an evolving surface, with the movement of the cell determined by the interaction of various forces that act normal to the surface. We consider external forces such as those that may arise owing to inhomogeneities in the medium and a pressure that constrains the enclosed volume, as well as internal forces that arise from the reaction of the cells' surface to stretching and bending. We also consider a protrusive force associated with a reaction–diffusion system (RDS) posed on the cell membrane, with cell polarization modelled by this surface RDS. The computational method is based on an evolving surface finite-element method. The general method can account for the large deformations that arise in cell motility and allows the simulation of cell migration in three dimensions. We illustrate applications of the proposed modelling framework and numerical method by reporting on numerical simulations of a model for eukaryotic chemotaxis and a model for the persistent movement of keratocytes in two and three space dimensions. Movies of the simulated cells can be obtained from http://homepages.warwick.ac.uk/∼maskae/CV_Warwick/Chemotaxis.html. PMID:22675164

  18. 77 FR 50637 - Schedule of Fees Authorized

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... direct and indirect costs must be taken into account in the computation of these costs. Since we last... thereof; or (4) Raise novel legal or policy issues arising out of legal mandates, the President's...

  19. A Hidden Surface Algorithm for Computer Generated Halftone Pictures

    DTIC Science & Technology

    This report describes techniques for converting data describing three-dimensional objects into data that can be used to generate two-dimensional halftone images. It deals with some problems that arise in black-and-white and color shading.

  20. Inequalities, assessment and computer algebra

    NASA Astrophysics Data System (ADS)

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in contemporary curricula. We consider the formal mathematical processes by which such inequalities are solved, and we consider the notation and syntax through which solutions are expressed. We review the extent to which current CAS can accurately solve these inequalities, and the form given to the solutions by the designers of this software. Finally, we discuss the functionality needed to deal with students' answers, i.e. to establish equivalence (or otherwise) of expressions representing unions of intervals. We find that while contemporary CAS accurately solve inequalities, there is a wide variety of notation used.
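
    A small sketch of the two tasks the paper examines, here using SymPy as one representative CAS (the paper itself surveys several systems): solving a single-variable inequality, and checking a student's answer by set equivalence rather than syntactic matching.

        # Solve an inequality as a union of intervals, then check a student
        # answer by comparing solution sets instead of comparing strings.
        from sympy import symbols, solve_univariate_inequality, Union, Interval, oo

        x = symbols('x', real=True)

        solution = solve_univariate_inequality(x**2 - 1 > 0, x, relational=False)
        print(solution)  # Union(Interval.open(-oo, -1), Interval.open(1, oo))

        student = Union(Interval.open(-oo, -1), Interval.open(1, oo))
        print(solution == student)  # True: equivalent unions of intervals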

  1. $\bar{d} - \bar{u}$ Flavor Asymmetry in the Proton in Chiral Effective Field Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salamu, Y.; Ji, Cheung-Ryong; Melnitchouk, Wally

    2015-09-01

    The $\bar{d} - \bar{u}$ flavor asymmetry in the proton arising from pion loops is computed using chiral effective field theory. The calculation includes both nucleon and Δ intermediate states, and uses both the fully relativistic and heavy baryon frameworks. The x dependence of $\bar{d} - \bar{u}$ extracted from the Fermilab E866 Drell–Yan data can be well reproduced in terms of a single transverse momentum cutoff parameter regulating the ultraviolet behavior of the loop integrals. In addition to the distribution at x > 0, corrections to the integrated asymmetry from zero momentum contributions are computed, which arise from pion rainbow and bubble diagrams at x = 0. These have not been accounted for in previous analyses, and can make important contributions to the lowest moment of $\bar{d} - \bar{u}$.
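
    For background (standard in this literature, not specific to this paper), the lowest moment of the asymmetry is what enters the violation of the Gottfried sum rule,

        $$S_G = \int_0^1 \frac{F_2^p(x) - F_2^n(x)}{x}\,dx = \frac{1}{3} - \frac{2}{3}\int_0^1 \left[\bar{d}(x) - \bar{u}(x)\right]dx,$$

    so zero-momentum (x = 0) contributions of the kind computed here feed directly into the integrated asymmetry.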

  2. Persistent Mullerian duct syndrome in a Miniature Schnauzer dog with signs of feminization and a Sertoli cell tumour.

    PubMed

    Vegter, A R; Kooistra, H S; van Sluijs, F J; van Bruggen, L W L; Ijzer, J; Zijlstra, C; Okkens, A C

    2010-06-01

    A 5-year-old male Miniature Schnauzer was presented with unilateral cryptorchidism and signs of feminization. Abdominal ultrasonography revealed an enlarged right testis and a large, fluid-filled cavity that appeared to arise from the prostate. Computed tomography revealed the cavity to be consistent with an enlarged uterine body, arising from the prostate, and showed two structures resembling uterine horns that terminated close to the adjacent testes. The dog had a normal male karyotype, 78 XY. Gonadohysterectomy was performed and both the surgical and the histological findings confirmed the presence of a uterus in this male animal, resulting in a diagnosis of persistent Mullerian duct syndrome (PMDS). The enlarged intra-abdominal testis contained a Sertoli cell tumour. Computed tomography proved to be an excellent diagnostic tool for PMDS.

  3. Reply to “Comment on ‘Axion induced oscillating electric dipole moments’”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, Christopher T.

    A recent paper of Flambaum, Roberts and Stadnik [1] claims there is no induced oscillating electric dipole moment (OEDM), e.g., for the electron, arising from the oscillating cosmic axion background via the anomaly. This claim is based upon the assumption that electric dipoles must always be defined by their coupling to static (constant in time) electric fields. The relevant Feynman diagram, as computed by [1], then becomes a total divergence and vanishes in momentum space. However, an OEDM does arise from the anomaly, coupled to time-dependent electric fields. It shares the decoupling properties with the anomaly. The full action, in an arbitrary gauge, was computed in [2], [3]. It is nonvanishing with a time-dependent outgoing photon, and yields physics, e.g., electric dipole radiation of an electron immersed in a cosmic axion field.

  4. High fidelity simulation and analysis of liquid jet atomization in a gaseous crossflow at intermediate Weber numbers

    NASA Astrophysics Data System (ADS)

    Li, Xiaoyi; Soteriou, Marios C.

    2016-08-01

    Recent advances in numerical methods coupled with the substantial enhancements in computing power and the advent of high performance computing have presented first principle, high fidelity simulation as a viable tool in the prediction and analysis of spray atomization processes. The credibility and potential impact of such simulations, however, has been hampered by the relative absence of detailed validation against experimental evidence. The numerical stability and accuracy challenges arising from the need to simulate the high liquid-gas density ratio across the sharp interfaces encountered in these flows are key reasons for this. In this work we challenge this status quo by presenting a numerical model able to deal with these challenges, employing it in simulations of liquid jet in crossflow atomization and performing extensive validation of its results against a carefully executed experiment with detailed measurements in the atomization region. We then proceed to the detailed analysis of the flow physics. The computational model employs the coupled level set and volume of fluid approach to directly capture the spatiotemporal evolution of the liquid-gas interface and the sharp-interface ghost fluid method to stably handle high liquid-air density ratio. Adaptive mesh refinement and Lagrangian droplet models are shown to be viable options for computational cost reduction. Moreover, high performance computing is leveraged to manage the computational cost. The experiment selected for validation eliminates the impact of inlet liquid and gas turbulence and focuses on the impact of the crossflow aerodynamic forces on the atomization physics. Validation is demonstrated by comparing column surface wavelengths, deformation, breakup locations, column trajectories and droplet sizes, velocities, and mass rates for a range of intermediate Weber numbers. Analysis of the physics is performed in terms of the instability and breakup characteristics and the features of downstream flow recirculation, and vortex shedding. Formation of "Λ" shape windward column waves is observed and explained by the combined upward and lateral surface motion. The existence of Rayleigh-Taylor instability as the primary mechanism for the windward column waves is verified for this case by comparing wavelengths from the simulations to those predicted by linear stability analyses. Physical arguments are employed to postulate that the type of instability manifested may be related to conditions such as the gas Weber number and the inlet turbulence level. The decreased column wavelength with increasing Weber number is found to cause enhanced surface stripping and early depletion of liquid core at higher Weber number. A peculiar "three-streak-two-membrane" liquid structure is identified at the lowest Weber number and explained as the consequence of the symmetric recirculation zones behind the jet column. It is found that the vortical flow downstream of the liquid column resembles a von Karman vortex street and that the coupling between the gas flow and droplet transport is weak for the conditions explored.

  5. High fidelity simulation and analysis of liquid jet atomization in a gaseous crossflow at intermediate Weber numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaoyi, E-mail: lixy2@utrc.utc.com; Soteriou, Marios C.

    Recent advances in numerical methods coupled with the substantial enhancements in computing power and the advent of high performance computing have presented first principle, high fidelity simulation as a viable tool in the prediction and analysis of spray atomization processes. The credibility and potential impact of such simulations, however, has been hampered by the relative absence of detailed validation against experimental evidence. The numerical stability and accuracy challenges arising from the need to simulate the high liquid-gas density ratio across the sharp interfaces encountered in these flows are key reasons for this. In this work we challenge this status quo by presenting a numerical model able to deal with these challenges, employing it in simulations of liquid jet in crossflow atomization and performing extensive validation of its results against a carefully executed experiment with detailed measurements in the atomization region. We then proceed to the detailed analysis of the flow physics. The computational model employs the coupled level set and volume of fluid approach to directly capture the spatiotemporal evolution of the liquid-gas interface and the sharp-interface ghost fluid method to stably handle high liquid-air density ratio. Adaptive mesh refinement and Lagrangian droplet models are shown to be viable options for computational cost reduction. Moreover, high performance computing is leveraged to manage the computational cost. The experiment selected for validation eliminates the impact of inlet liquid and gas turbulence and focuses on the impact of the crossflow aerodynamic forces on the atomization physics. Validation is demonstrated by comparing column surface wavelengths, deformation, breakup locations, column trajectories and droplet sizes, velocities, and mass rates for a range of intermediate Weber numbers. Analysis of the physics is performed in terms of the instability and breakup characteristics and the features of downstream flow recirculation, and vortex shedding. Formation of “Λ” shape windward column waves is observed and explained by the combined upward and lateral surface motion. The existence of Rayleigh-Taylor instability as the primary mechanism for the windward column waves is verified for this case by comparing wavelengths from the simulations to those predicted by linear stability analyses. Physical arguments are employed to postulate that the type of instability manifested may be related to conditions such as the gas Weber number and the inlet turbulence level. The decreased column wavelength with increasing Weber number is found to cause enhanced surface stripping and early depletion of liquid core at higher Weber number. A peculiar “three-streak-two-membrane” liquid structure is identified at the lowest Weber number and explained as the consequence of the symmetric recirculation zones behind the jet column. It is found that the vortical flow downstream of the liquid column resembles a von Karman vortex street and that the coupling between the gas flow and droplet transport is weak for the conditions explored.

  6. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, sparse tensorization methods [2] utilizing node-nested hierarchies, and sampling methods [4] for high-dimensional random variable spaces.
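
    Schematically (our notation, not necessarily the presentation's), the error in a computed statistic splits by the triangle inequality into a discretization part and a finite-sampling part,

        $$\bigl|\mathbb{E}[u] - \bar{u}_{h,M}\bigr| \le \bigl|\mathbb{E}[u] - \mathbb{E}[u_h]\bigr| + \bigl|\mathbb{E}[u_h] - \bar{u}_{h,M}\bigr|,$$

    where u_h denotes the finite-dimensional realization and \bar{u}_{h,M} the M-sample estimate of the statistic; computable a-posteriori bounds target each term separately.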

  7. Making classical and quantum canonical general relativity computable through a power series expansion in the inverse cosmological constant.

    PubMed

    Gambini, R; Pullin, J

    2000-12-18

    We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism invariant field theory. This theory is the Λ → ∞ limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at a quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.

  8. Bethe-Salpeter Eigenvalue Solver Package (BSEPACK) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SHAO, MEIYEU; YANG, CHAO

    2017-04-25

    The BSEPACK contains a set of subroutines for solving the Bethe-Salpeter eigenvalue (BSE) problem. This type of problem arises in the study of optical excitation of nanoscale materials. The BSE problem is a structured non-Hermitian eigenvalue problem. The BSEPACK software can be used to compute all or a subset of the eigenpairs of a BSE Hamiltonian. It can also be used to compute the optical absorption spectrum without computing BSE eigenvalues and eigenvectors explicitly. The package makes use of ScaLAPACK, LAPACK, and BLAS.
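
    The structure in question can be shown in a few lines: with a Hermitian block A and a complex symmetric coupling block B, the BSE Hamiltonian takes the paired block form below, and its spectrum typically comes in ± pairs. This dense NumPy sketch (ours; BSEPACK itself targets this structure at scale via ScaLAPACK) only illustrates the structure on a small random instance.

        # Small random instance of the structured non-Hermitian BSE eigenproblem
        # H = [[A, B], [-conj(B), -conj(A)]], A Hermitian, B complex symmetric.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 4
        M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        A = (M + M.conj().T) / 2 + 4 * n * np.eye(n)  # Hermitian, shifted positive
        S = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        B = (S + S.T) / 2                             # complex symmetric coupling

        H = np.block([[A, B], [-B.conj(), -A.conj()]])
        evals = np.linalg.eigvals(H)
        # With this definiteness, eigenvalues pair up as +lambda / -lambda.
        print(np.sort_complex(np.round(evals, 6)))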

  9. Easing The Calculation Of Bolt-Circle Coordinates

    NASA Technical Reports Server (NTRS)

    Burley, Richard K.

    1995-01-01

    The Bolt Circle Calculation (BOLT-CALC) computer program reduces the significant time consumed in manually computing the trigonometry of the rectangular Cartesian coordinates of holes in a bolt circle as shown on a blueprint or drawing. It eliminates the risk of computational errors, particularly in cases involving many holes or in cases in which coordinates are expressed to many significant digits. The program assists in many practical situations arising in machine shops. BOLT-CALC is written in BASIC; it has also been successfully compiled and implemented using Microsoft's QuickBasic v4.0.
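
    The underlying trigonometry is compact; a present-day re-creation in Python (our sketch, not the original BASIC source) might read:

        # Cartesian coordinates of n equally spaced holes on a bolt circle.
        import math

        def bolt_circle(center_x, center_y, radius, n_holes, start_angle_deg=0.0):
            coords = []
            for k in range(n_holes):
                theta = math.radians(start_angle_deg + 360.0 * k / n_holes)
                coords.append((center_x + radius * math.cos(theta),
                               center_y + radius * math.sin(theta)))
            return coords

        for x, y in bolt_circle(0.0, 0.0, radius=2.5, n_holes=6, start_angle_deg=15.0):
            print(f"x = {x:+.4f}, y = {y:+.4f}")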

  10. Legal Challenges and Pitfalls for Start-up Companies - 48 Common Questions and Answers.

    PubMed

    Staehelin, Matthias

    2014-12-01

    Transforming a business idea into reality requires a legal implementation plan. The following 48 questions and answers address key issues that typically arise in start-up situations. Early planning can help avoid costly mistakes.

  11. Data Streams: An Overview and Scientific Applications

    NASA Astrophysics Data System (ADS)

    Aggarwal, Charu C.

    In recent years, advances in hardware technology have facilitated the ability to collect data continuously. Simple transactions of everyday life such as using a credit card, a phone, or browsing the web lead to automated data storage. Similarly, advances in information technology have led to large flows of data across IP networks. In many cases, these large volumes of data can be mined for interesting and relevant information in a wide variety of applications. When the volume of the underlying data is very large, it leads to a number of computational and mining challenges. With increasing volume of the data, it is no longer possible to process the data efficiently by using multiple passes; rather, one can process a data item at most once. This leads to constraints on the implementation of the underlying algorithms. Therefore, stream mining algorithms typically need to be designed so that the algorithms work with one pass of the data. In most cases, there is an inherent temporal component to the stream mining process. This is because the data may evolve over time. This behavior of data streams is referred to as temporal locality. Therefore, a straightforward adaptation of one-pass mining algorithms may not be an effective solution to the task. Stream mining algorithms need to be carefully designed with a clear focus on the evolution of the underlying data. Another important characteristic of data streams is that they are often mined in a distributed fashion. Furthermore, the individual processors may have limited processing and memory. Examples of such cases include sensor networks, in which it may be desirable to perform in-network processing of data streams with limited processing and memory [1, 2]. This chapter will provide an overview of the key challenges in stream mining algorithms which arise from the unique setup in which these problems are encountered. This chapter is organized as follows. In the next section, we will discuss the generic challenges that stream mining poses to a variety of data management and data mining problems. The next section also deals with several issues which arise in the context of data stream management. In Sect. 3, we discuss several mining algorithms on the data stream model. Section 4 discusses various scientific applications of data streams. Section 5 discusses the research directions and conclusions.
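
    A concrete example of the one-pass constraint: Welford's classic single-scan update maintains a running mean and variance without ever revisiting an item, which is exactly the style of algorithm stream mining requires.

        # One-pass mean and variance (Welford's algorithm): each item is seen
        # exactly once; no second pass over the stream is ever needed.
        class RunningStats:
            def __init__(self):
                self.n, self.mean, self.m2 = 0, 0.0, 0.0

            def update(self, x):
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.m2 += delta * (x - self.mean)

            @property
            def variance(self):
                return self.m2 / (self.n - 1) if self.n > 1 else 0.0

        stats = RunningStats()
        for value in (2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0):
            stats.update(value)
        print(stats.mean, stats.variance)  # 5.0, 4.571...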

  12. Multiscale Modeling in Computational Biomechanics: Determining Computational Priorities and Addressing Current Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tawhai, Merryn; Bischoff, Jeff; Einstein, Daniel R.

    2009-05-01

    In this article, we describe some current multiscale modeling issues in computational biomechanics from the perspective of the musculoskeletal and respiratory systems and mechanotransduction. First, we outline the necessity of multiscale simulations in these biological systems. Then we summarize challenges inherent to multiscale biomechanics modeling, regardless of the subdiscipline, followed by computational challenges that are system-specific. We discuss some of the current tools that have been utilized to aid research in multiscale mechanics simulations, and the priorities to further the field of multiscale biomechanics computation.

  13. Neural decoding of collective wisdom with multi-brain computing.

    PubMed

    Eckstein, Miguel P; Das, Koel; Pham, Binh T; Peterson, Matthew F; Abbey, Craig K; Sy, Jocelyn L; Giesbrecht, Barry

    2012-01-02

    Group decisions and even aggregation of multiple opinions lead to greater decision accuracy, a phenomenon known as collective wisdom. Little is known about the neural basis of collective wisdom and whether its benefits arise in late decision stages or in early sensory coding. Here, we use electroencephalography and multi-brain computing with twenty humans making perceptual decisions to show that combining neural activity across brains increases decision accuracy paralleling the improvements shown by aggregating the observers' opinions. Although the largest gains result from an optimal linear combination of neural decision variables across brains, a simpler neural majority decision rule, ubiquitous in human behavior, results in substantial benefits. In contrast, an extreme neural response rule, akin to a group following the most extreme opinion, results in the least improvement with group size. Analyses controlling for number of electrodes and time-points while increasing number of brains demonstrate unique benefits arising from integrating neural activity across different brains. The benefits of multi-brain integration are present in neural activity as early as 200 ms after stimulus presentation in lateral occipital sites and no additional benefits arise in decision related neural activity. Sensory-related neural activity can predict collective choices reached by aggregating individual opinions, voting results, and decision confidence as accurately as neural activity related to decision components. Estimation of the potential for the collective to execute fast decisions by combining information across numerous brains, a strategy prevalent in many animals, shows large time-savings. Together, the findings suggest that for perceptual decisions the neural activity supporting collective wisdom and decisions arises in early sensory stages and that many properties of collective cognition are explainable by the neural coding of information across multiple brains. Finally, our methods highlight the potential of multi-brain computing as a technique to rapidly and in parallel gather increased information about the environment as well as to access collective perceptual/cognitive choices and mental states. Copyright © 2011 Elsevier Inc. All rights reserved.
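
    A toy simulation (ours, with Gaussian decision variables standing in for neural signals) reproduces the qualitative ordering reported: pooling the graded signals does best, a majority vote across per-brain decisions does nearly as well, and following the most extreme response gains least.

        # Three combination rules across simulated 'brains' on a known stimulus.
        import numpy as np

        rng = np.random.default_rng(3)
        n_trials, n_brains = 5000, 20
        signal = 0.3  # the true stimulus class is 'positive' on every trial
        dv = rng.normal(signal, 1.0, size=(n_trials, n_brains))

        avg_rule      = (dv.mean(axis=1) > 0).mean()                  # pool, then decide
        majority_rule = ((dv > 0).sum(axis=1) > n_brains / 2).mean()  # decide, then vote
        extreme_rule  = (dv[np.arange(n_trials),
                            np.abs(dv).argmax(axis=1)] > 0).mean()    # follow the loudest
        print(avg_rule, majority_rule, extreme_rule)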

  14. A simplified simulation model for a HPDC die with conformal cooling channels

    NASA Astrophysics Data System (ADS)

    Frings, Markus; Behr, Marek; Elgeti, Stefanie

    2017-10-01

    In general, the cooling phase of the high-pressure die casting process is based on complex physical phenomena: solidification of molten material; heat exchange between cast part, die and cooling fluid; turbulent flow inside the cooling channels that needs to be considered when computing the heat flux; interdependency of properties and temperature of the cooling liquid. Intuitively understanding and analyzing all of these effects when designing HPDC dies is not feasible. A remedy that has become available is numerical design, based for example on shape optimization methods. However, current computing power is not sufficient to perform optimization while at the same time fully resolving all physical phenomena. But since in HPDC suitable objective functions very often lead to integral values, e.g., average die temperature, this paper identifies possible simplifications in the modeling of the cooling phase. As a consequence, the computational effort is reduced to an acceptable level. A further aspect that arises in the context of shape optimization is the evaluation of shape gradients. The challenge here is to allow for large shape deformations without remeshing. In our approach, the cooling channels are described by their center lines. The flow profile of the cooling fluid is then estimated based on experimental data found in literature for turbulent pipe flows. In combination, the heat flux throughout cavity, die, and cooling channel can be described by one single advection-diffusion equation on a fixed mesh. The parameters in the equation are adjusted based on the position of cavity and cooling channel. Both results contribute towards a computationally efficient, yet accurate method, which can be employed within the frame of shape optimization of cooling channels in HPDC dies.
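
    The single equation referred to is, in generic form, an advection-diffusion balance for temperature,

        $$\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T = \nabla\cdot(\kappa\,\nabla T) + q,$$

    where the velocity u, the conductivity-like coefficient κ, and the source q are adjusted locally according to whether a point lies in the cavity, the die, or a cooling channel (our generic statement of the equation; the paper's exact parameterization may differ).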

  15. Anharmonic interatomic force constants and thermal conductivity from Grüneisen parameters: An application to graphene

    NASA Astrophysics Data System (ADS)

    Lee, Ching Hua; Gan, Chee Kwan

    2017-07-01

    Phonon-mediated thermal conductivity, which is of great technological relevance, arises fundamentally from anharmonic scattering rooted in the interatomic potentials. Despite its prevalence, accurate first-principles calculations of thermal conductivity remain challenging, primarily due to the high computational cost of anharmonic interatomic force constant (IFC) calculations. Meanwhile, the related anharmonic phenomenon of thermal expansion is much more tractable, being computable from the Grüneisen parameters associated with phonon frequency shifts due to crystal deformations. In this work, we propose an approach for computing the largest cubic IFCs from the Grüneisen parameter data. This allows an approximate determination of the thermal conductivity via a much less expensive route. The key insight is that although the Grüneisen parameters cannot possibly contain all the information on the cubic IFCs, being derivable from spatially uniform deformations, they can still unambiguously and accurately determine the largest and most physically relevant ones. By fitting the anisotropic Grüneisen parameter data along judiciously designed deformations, we can deduce (i.e., reverse-engineer) the dominant cubic IFCs and estimate three-phonon scattering amplitudes. We illustrate our approach by explicitly computing the largest cubic IFCs and thermal conductivity of graphene, especially for its out-of-plane (flexural) modes that exhibit anomalously large anharmonic shifts and thermal conductivity contributions. Our calculations on graphene not only exhibit reasonable agreement with established density-functional theory results, but they also present a pedagogical opportunity for introducing an elegant analytic treatment of the Grüneisen parameters of generic two-band models. Our approach can be readily extended to more complicated crystalline materials with nontrivial anharmonic lattice effects.
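
    For reference, the mode Grüneisen parameter that supplies the fitting data is the standard volume derivative of the phonon frequency,

        $$\gamma_i = -\frac{V}{\omega_i}\,\frac{\partial \omega_i}{\partial V},$$

    which is computable from spatially uniform deformations alone; the paper's contribution is to invert judiciously chosen anisotropic deformations of this kind for the dominant cubic force constants.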

  16. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  17. Consistency of the adiabatic theorem.

    PubMed

    Amin, M H S

    2009-06-05

    The adiabatic theorem provides the basis for the adiabatic model of quantum computation. Recently the conditions required for the adiabatic theorem to hold have become a subject of some controversy. Here we show that the reported violations of the adiabatic theorem all arise from resonant transitions between energy levels. In the absence of fast driven oscillations the traditional adiabatic theorem holds. Implications for adiabatic quantum computation are discussed.
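
    The 'traditional' condition at issue is usually quoted in the textbook form

        $$\max_{t\in[0,T]} \frac{\bigl|\langle m(t)|\,\partial_t H(t)\,|n(t)\rangle\bigr|}{\bigl[E_m(t) - E_n(t)\bigr]^2} \ll 1,$$

    for instantaneous eigenstates |n(t)⟩, |m(t)⟩ with energies E_n, E_m (standard background, not notation taken from the paper); the paper's point is that apparent violations trace to resonant transitions, which are absent in slowly driven systems.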

  18. Quantum information, cognition, and music.

    PubMed

    Dalla Chiara, Maria L; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe

    2015-01-01

    Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: (1) a comparison between classical probabilistic Turing machines and quantum Turing machines; (2) possible applications of the quantum computational semantics to cognitive problems; (3) parallelism in music.

  19. Quantum information, cognition, and music

    PubMed Central

    Dalla Chiara, Maria L.; Giuntini, Roberto; Leporini, Roberto; Negri, Eleonora; Sergioli, Giuseppe

    2015-01-01

    Parallelism represents an essential aspect of human mind/brain activities. One can recognize some common features between psychological parallelism and the characteristic parallel structures that arise in quantum theory and in quantum computation. The article is devoted to a discussion of the following questions: (1) a comparison between classical probabilistic Turing machines and quantum Turing machines; (2) possible applications of the quantum computational semantics to cognitive problems; (3) parallelism in music. PMID:26539139

  20. Biomechanics of compensatory mechanisms in spinal-pelvic complex

    NASA Astrophysics Data System (ADS)

    Ivanov, D. V.; Hominets, V. V.; Kirillova, I. V.; Kossovich, L. Yu; Kudyashev, A. L.; Teremshonok, A. V.

    2018-04-01

    A 3D geometric solid computer model of the spinal-pelvic complex was constructed on the basis of computed tomography data and full-body X-rays taken in the standing position. The constructed model was used for biomechanical analysis of the compensatory mechanisms arising in the spine with anteversion and retroversion of the pelvis. The results of the numerical biomechanical 3D modeling are in good agreement with the clinical data.

  1. Imaging of the oral cavity.

    PubMed

    Meesa, Indu Rekha; Srinivasan, Ashok

    2015-01-01

    The oral cavity is a challenging area in head and neck imaging because of its complex anatomy and the numerous pathophysiologies that involve its contents. This challenge is further compounded by the ubiquitous artifacts that arise from the dental amalgam, which compromise image quality. In this article, the anatomy of the oral cavity is discussed in brief, followed by a description of the imaging technique and some common pathologic abnormalities. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Automotive Fleet Fuel Consumption Model : Fuel For

    DOT National Transportation Integrated Search

    1978-01-01

    The computer model described in this report is a tool for determining the fuel conservation benefits arising from various hypothetical schedules of new car fuel economy standards. (Portions of this document are not fully legible)

  3. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    ERIC Educational Resources Information Center

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  4. Software Systems for High-performance Quantum Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S; Britt, Keith A

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  5. Insights into Parkinson's disease from computational models of the basal ganglia.

    PubMed

    Humphries, Mark D; Obeso, Jose Angel; Dreyer, Jakob Kisbye

    2018-04-17

    Movement disorders arise from the complex interplay of multiple changes to neural circuits. Successful treatments for these disorders could interact with these complex changes in myriad ways, and as a consequence their mechanisms of action and their amelioration of symptoms are incompletely understood. Using Parkinson's disease as a case study, we review here how computational models are a crucial tool for taming this complexity, across causative mechanisms, consequent neural dynamics and treatments. For mechanisms, we review models that capture the effects of losing dopamine on basal ganglia function; for dynamics, we discuss models that have transformed our understanding of how beta-band (15-30 Hz) oscillations arise in the parkinsonian basal ganglia. For treatments, we touch on the breadth of computational modelling work trying to understand the therapeutic actions of deep brain stimulation. Collectively, models from across all levels of description are providing a compelling account of the causes, symptoms and treatments for Parkinson's disease. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  6. Seminary Formation: A Case Study from the Pontifical Beda College, Rome

    ERIC Educational Resources Information Center

    Strange, Roderick

    2015-01-01

    This case study account reviews issues related to seminary formation and education at the Beda College, Rome, including Fundamentals of Formation, Community Life, Organizing Formation, Intellectual Formation, Spiritual Formation, Pastoral Formation, and the challenges arising in these fields.

  7. ISCR FY2005 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D E; McGraw, J R

    2006-02-02

    Large-scale scientific computation and all of the disciplines that support and help validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of simulation as a fundamental tool of scientific and engineering research is underscored in the President's Information Technology Advisory Committee (PITAC) June 2005 finding that "computational science has become critical to scientific leadership, economic competitiveness, and national security". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed, most notably the molecular dynamics simulation that sustained more than 100 Teraflop/s and won the 2005 Gordon Bell Prize. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use in an efficient manner. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to the core missions of LLNL than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In FY 2005, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for both brief and extended visits with the aim of encouraging long-term academic research agendas that address LLNL research priorities. Through these collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "hands and feet" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing Applications and Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other four institutes of the URP, the ISCR navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort. The pages of this annual report summarize the activities of the faculty members, postdoctoral researchers, students, and guests from industry and other laboratories who participated in LLNL's computational mission under the auspices of the ISCR during FY 2005.

  8. Nurses' views on challenging doctors' practice in an acute hospital.

    PubMed

    Churchman, J J; Doherty, C

    To explore the extent to which nurses are willing to challenge doctors' practice in everyday situations in an acute NHS hospital. Qualitative data were collected using in-depth interviews with 12 nurses in an acute NHS hospital in England. Participants believed that they challenged doctors' practice and acted as patients' advocates. However, the data revealed that nurses questioned doctors' practice only under specific circumstances. Nurses would not challenge doctors if they perceived that doing so would result in conflict or stress, or if they were afraid of the doctor or feared reprisal. Nurses are discouraged from challenging doctors' practice by the structural inequality arising from the gender division of labour and doctors' expert knowledge and status (medical dominance) in the workplace.

  9. A conceptual and computational model of moral decision making in human and artificial agents.

    PubMed

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we will elucidate a process whereby an agent can work through an ethical problem to reach a solution that takes account of ethically relevant factors. Copyright © 2010 Cognitive Science Society, Inc.

  10. Bilateral subclavian origin of the bronchial arteries combined with absence of other origins.

    PubMed

    Jie, Bing; Sun, Xi-Wen; Yu, Dong; Jiang, Sen

    2014-08-01

    There are numerous anatomical variations of the sites of origin of the bronchial arteries (BAs). A subclavian origin of a BA involves an aberrant artery that originates from the subclavian artery (SCA) or its branches. However, the aberrant artery usually originates directly from the SCA, and an SCA-origin BA arising from the branches of the SCA is rare. We herein present an extremely rare case of a right BA arising from the ipsilateral costocervical trunk, and a left BA arising from the ipsilateral thyrocervical trunk, in the absence of other origins of the BA. This anatomical variation was detected during pretherapeutic evaluation by multidetector-row computed tomography and confirmed by selective angiography. Recognition of these anatomic variations is important to surgical, diagnostic, and interventional radiologic procedures in the thorax.

  11. Visualisation of variable binding pockets on protein surfaces by probabilistic analysis of related structure sets.

    PubMed

    Ashford, Paul; Moss, David S; Alex, Alexander; Yeap, Siew K; Povia, Alice; Nobeli, Irene; Williams, Mark A

    2012-03-14

    Protein structures provide a valuable resource for rational drug design. For a protein with no known ligand, computational tools can predict surface pockets that are of suitable size and shape to accommodate a complementary small-molecule drug. However, pocket prediction against single static structures may miss features of pockets that arise from proteins' dynamic behaviour. In particular, ligand-binding conformations can be observed as transiently populated states of the apo protein, so it is possible to gain insight into ligand-bound forms by considering conformational variation in apo proteins. This variation can be explored by considering sets of related structures: computationally generated conformers, solution NMR ensembles, multiple crystal structures, homologues or homology models. It is non-trivial to compare pockets, either from different programs or across sets of structures. For a single structure, difficulties arise in defining a particular pocket's boundaries. For a set of conformationally distinct structures, the challenge is to make reasonable comparisons between them given that a perfect structural alignment is not possible. We have developed a computational method, Provar, that provides a consistent representation of predicted binding pockets across sets of related protein structures. The outputs are probabilities that each atom or residue of the protein borders a predicted pocket. These probabilities can be readily visualised on a protein using existing molecular graphics software. We show how Provar simplifies comparison of the outputs of different pocket prediction algorithms, of pockets across multiple simulated conformations and between homologous structures. We demonstrate the benefits of using multiple structures for protein-ligand and protein-protein interface analysis on a set of complexes and consider three case studies in detail: i) analysis of a kinase superfamily highlights the conserved occurrence of surface pockets at the active and regulatory sites; ii) a simulated ensemble of unliganded Bcl2 structures reveals extensions of a known ligand-binding pocket not apparent in the apo crystal structure; iii) visualisations of interleukin-2 and its homologues highlight conserved pockets at the known receptor interfaces and regions whose conformation is known to change on inhibitor binding. Through post-processing of the output of a variety of pocket prediction software, Provar provides a flexible approach to the analysis and visualisation of the persistence or variability of pockets in sets of related protein structures.
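
    As an illustration of the kind of post-processing described here, the sketch below computes per-atom pocket probabilities across a set of pre-aligned structures. It is a minimal reconstruction from the abstract's description, not Provar's actual code; `predict_pocket_atoms` is a hypothetical stand-in for the output of any pocket-prediction program.

    ```python
    # Minimal sketch of per-atom pocket probabilities across an ensemble.
    # Assumes all structures share a common atom indexing (conformers or
    # pre-aligned homologues); `predict_pocket_atoms` is a hypothetical
    # stand-in for any pocket-prediction program's per-structure output.
    from typing import Callable, Iterable, Set

    def pocket_probabilities(structures: Iterable[object],
                             n_atoms: int,
                             predict_pocket_atoms: Callable[[object], Set[int]]):
        """Fraction of structures in which each atom borders a predicted pocket."""
        counts = [0] * n_atoms
        n_structs = 0
        for s in structures:
            n_structs += 1
            for atom_idx in predict_pocket_atoms(s):
                counts[atom_idx] += 1
        return [c / n_structs for c in counts]

    # The per-atom probabilities could then be written into, e.g., the
    # B-factor column of a PDB file so that standard molecular-graphics
    # software can colour the surface by pocket persistence.
    ```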

  12. Anomaly Detection in Dynamic Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turcotte, Melissa

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the communication counts. In a sequential analysis, anomalous behavior is then identified from outlying behavior with respect to the fitted predictive probability models. Seasonality is again incorporated into the model and is treated as a changepoint model on the transition probabilities of a discrete time Markov process. Second stage analytics are then developed which combine anomalous edges to identify anomalous substructures in the network.
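
    The conjugate Bayesian counting model of the first stage can be illustrated with a Gamma-Poisson pair: a Gamma prior on a Poisson intensity yields a closed-form posterior and a negative binomial predictive. The sketch below is a minimal illustration of that idea, not the thesis's exact model; the prior values and the flagging threshold are assumptions.

    ```python
    # Sketch of a conjugate Gamma-Poisson monitor for one edge's count stream.
    # With a Gamma(a, b) prior (shape a, rate b) on a Poisson intensity, the
    # posterior after each count is Gamma(a + x, b + 1), and the posterior
    # predictive for the next count is negative binomial.
    from scipy.stats import nbinom

    def update(a, b, x):
        """Conjugate update for one observed count x."""
        return a + x, b + 1

    def predictive_tail_prob(a, b, x_new):
        """P(X >= x_new) under the Gamma(a, b) posterior predictive."""
        r, p = a, b / (b + 1.0)          # negative binomial parameters
        return nbinom.sf(x_new - 1, r, p)

    a, b = 1.0, 1.0                       # vague prior (assumed)
    for x in [2, 3, 1, 2, 4, 2, 3, 25]:   # final count is a burst
        surprise = predictive_tail_prob(a, b, x)
        if surprise < 1e-3:               # assumed flagging threshold
            print(f"count {x} flagged as anomalous (p = {surprise:.2e})")
        a, b = update(a, b, x)
    ```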

  13. Simulating Pelletization Strategies to Reduce the Biomass Supply Risk at America’s Biorefineries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob J. Jacobson; Shane Carnohan; Andrew Ford

    2014-07-01

    Demand for cellulosic ethanol and other advanced biofuels has been on the rise, due in part to federal targets enacted in 2005 and extended in 2007. The industry faces major challenges in meeting these worthwhile and ambitious targets. The challenges are especially severe in the logistics of timely feedstock delivery to biorefineries. Logistical difficulties arise from seasonal production that forces the biomass to be stored in uncontrolled field-side environments. In this storage format physical difficulties arise; transportation is hindered by the low bulk density of baled biomass and the unprotected material can decay leading to unpredictable losses. Additionally, uncertain yields and contractual difficulties can exacerbate these challenges making biorefineries a high-risk venture. Investors’ risk could limit business entry and prevent America from reaching the targets. This paper explores pelletizer strategies to convert the lignocellulosic biomass into a denser form more suitable for storage. The densification of biomass would reduce supply risks, and the new system would outperform conventional biorefinery supply systems. Pelletizer strategies exhibit somewhat higher costs, but the reduction in risk is well worth the extra cost if America is to grow the advanced biofuels industry in a sustainable manner.

  14. Challenges and considerations for the design and production of a purpose-optimized body-worn wrist-watch computer

    NASA Astrophysics Data System (ADS)

    Narayanaswami, Chandra; Raghunath, Mandayam T.

    2004-09-01

    We outline a collection of technological challenges in the design of wearable computers with a focus on one of the most desirable form-factors, the wrist watch. We describe our experience with building three generations of wrist watch computers. We built these research prototypes as platforms to investigate the fundamental limitations of wearable computing. Results of our investigations are presented in the form of challenges that have been overcome and those that still remain.

  15. Fast Inference with Min-Sum Matrix Product.

    PubMed

    Felzenszwalb, Pedro F; McAuley, Julian J

    2011-12-01

    The MAP inference problem in many graphical models can be solved efficiently using a fast algorithm for computing min-sum products of n × n matrices. The class of models in question includes cyclic and skip-chain models that arise in many applications. Although the worst-case complexity of the min-sum product operation is not known to be much better than O(n^3), an O(n^2.5) expected time algorithm was recently given, subject to some constraints on the input matrices. In this paper, we give an algorithm that runs in O(n^2 log n) expected time, assuming that the entries in the input matrices are independent samples from a uniform distribution. We also show that two variants of our algorithm are quite fast for inputs that arise in several applications. This leads to significant performance gains over previous methods in applications within computer vision and natural language processing.
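
    For reference, the baseline operation the paper accelerates is the naive O(n^3) min-sum ("tropical") matrix product. A minimal sketch of that baseline, not the paper's algorithm:

    ```python
    import numpy as np

    def min_sum_product(A, B):
        """Naive O(n^3) min-sum matrix product: C[i, j] = min_k A[i, k] + B[k, j]."""
        n = A.shape[0]
        C = np.empty((n, n))
        for i in range(n):
            # Broadcasting forms A[i, k] + B[k, j] for all k, j; min over k.
            C[i] = np.min(A[i][:, None] + B, axis=0)
        return C

    A = np.random.rand(4, 4)
    B = np.random.rand(4, 4)
    print(min_sum_product(A, B))
    ```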

  16. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
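
    The dependence of the tested hypotheses on the computing procedure is easy to demonstrate: for unbalanced data, different sums-of-squares conventions give different F statistics for the same fitted model. A minimal sketch using statsmodels on synthetic unbalanced data (the data and imbalance pattern are invented for illustration):

    ```python
    # Sketch: for unbalanced two-factor data, Type I ("sequential") and
    # Type II/III sums of squares test different hypotheses, so the ANOVA
    # tables disagree even though the fitted model is the same.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(1)
    # Unbalanced design: unequal cell counts across the 2 x 2 layout.
    df = pd.DataFrame({
        "A": ["a1"] * 12 + ["a2"] * 6,
        "B": ["b1", "b2"] * 6 + ["b1"] * 4 + ["b2"] * 2,
    })
    df["y"] = rng.normal(size=len(df)) + (df["A"] == "a2") * 1.0

    model = smf.ols("y ~ C(A) * C(B)", data=df).fit()
    print(anova_lm(model, typ=1))  # sequential: the order of terms matters
    print(anova_lm(model, typ=2))  # each main effect adjusted for the other
    print(anova_lm(model, typ=3))  # cell-means style; depends on contrast coding
    ```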

  17. Survival probability of diffusion with trapping in cellular neurobiology

    NASA Astrophysics Data System (ADS)

    Holcman, David; Marchewka, Avi; Schuss, Zeev

    2005-09-01

    The problem of diffusion with absorption and trapping sites arises in the theory of molecular signaling inside and on the membranes of biological cells. In particular, this problem arises in the case of spine-dendrite communication, where the number of calcium ions, modeled as random particles, is regulated across the spine microstructure by pumps, which play the role of killing sites, while the end of the dendritic shaft is an absorbing boundary. We develop a general mathematical framework for diffusion in the presence of absorption and killing sites and apply it to the computation of the time-dependent survival probability of ions. We also compute the ratio of the number of absorbed particles at a specific location to the number of killed particles. We show that the ratio depends on the distribution of killing sites. The biological consequence is that the position of the pumps regulates the fraction of calcium ions that reach the dendrite.
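
    The survival-probability framework lends itself to a simple Monte Carlo check. The sketch below simulates a one-dimensional random walk with an absorbing boundary (the dendritic shaft) and a single killing site (a pump); the geometry, killing probability, and step rule are illustrative assumptions, not the paper's model.

    ```python
    # Monte Carlo sketch: 1-D diffusion on [0, L] with an absorbing boundary
    # at 0 and a killing (pump) site that removes particles with probability
    # p_kill per visit. Estimates the fraction absorbed vs. killed.
    import random

    def simulate(n_particles=20000, L=20, start=10, kill_site=15, p_kill=0.2,
                 max_steps=10000, seed=0):
        rng = random.Random(seed)
        absorbed = killed = 0
        for _ in range(n_particles):
            x = start
            for _ in range(max_steps):
                x += rng.choice((-1, 1))      # unbiased random-walk step
                if x == 0:                    # absorbing boundary
                    absorbed += 1
                    break
                if x == L:                    # reflecting far end
                    x = L - 1
                if x == kill_site and rng.random() < p_kill:
                    killed += 1               # trapped by a "pump"
                    break
        return absorbed / n_particles, killed / n_particles

    p_abs, p_killed = simulate()
    print(f"absorbed: {p_abs:.3f}, killed: {p_killed:.3f}")
    ```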

  18. Clinical Challenges in the Growing Medical Marijuana Field.

    PubMed

    Barker, Jonathan

    2018-03-01

    Unique clinical challenges arise with the growing number of patients who possess medical marijuana cards. Medical marijuana patients with mental disorders can have worsening symptoms with marijuana use. Often there is sparse continuity of care between the patient and the medical marijuana practitioner. Lack of communication between the patient's treating practitioners and the practitioner who has authorized the medical marijuana can be problematic. This article is a discussion of the new clinical challenges practitioners are likely to encounter with the growing number of medical marijuana patients. [Full article available at http://rimed.org/rimedicaljournal-2018-03.asp].

  20. Developing Tomorrow's Integrated Community Health Systems: A Leadership Challenge for Public Health and Primary Care

    PubMed Central

    Welton, William E.; Kantner, Theodore A.; Katz, Sheila Moriber

    1997-01-01

    As the nation's health system moves away from earlier models to one grounded in population health and market-based systems of care, new challenges arise for public health professionals, primary care practitioners, health plan and institutional managers, and community leaders. Among the challenges are the need to develop creative concepts of organization and accountability and to assure that dynamic, system-oriented structures support the new kind of leadership that is required. Developing tomorrow's integrated community health systems will challenge the leadership skills and integrative abilities of public health professionals, primary care practitioners, and managers. These leaders and their new organizations must, in turn, assume increased accountability for improving community health. PMID:9184684

  1. Computer-Enriched Instruction (CEI) Is Better for Preview Material Instead of Review Material: An Example of a Biostatistics Chapter, the Central Limit Theorem

    ERIC Educational Resources Information Center

    See, Lai-Chu; Huang, Yu-Hsun; Chang, Yi-Hu; Chiu, Yeo-Ju; Chen, Yi-Fen; Napper, Vicki S.

    2010-01-01

    This study examines the timing of computer-enriched instruction (CEI), delivered before or after a traditional lecture, to determine the cross-over effect, period effect, and learning effect arising from the sequencing of instruction. A 2 x 2 cross-over design was used with CEI to teach the central limit theorem (CLT). Two sequences of graduate students in nursing…

  2. The Evolution of a Connectionist Model of Situated Human Language Understanding

    NASA Astrophysics Data System (ADS)

    Mayberry, Marshall R.; Crocker, Matthew W.

    The Adaptive Mechanisms in Human Language Processing (ALPHA) project features both experimental and computational tracks designed to complement each other in the investigation of the cognitive mechanisms that underlie situated human utterance processing. The models developed in the computational track replicate results obtained in the experimental track and, in turn, suggest further experiments by virtue of behavior that arises as a by-product of their operation.

  3. Solar ultraviolet radiation in a changing climate

    EPA Science Inventory

    The projected large increases in damaging ultraviolet radiation as a result of global emissions of ozone-depleting substances have been forestalled by the success of the Montreal Protocol. New challenges are now arising in relation to climate change. We highlight the complex inte...

  4. Susceptibility of Redundant Versus Singular Clock Domains Implemented in SRAM-Based FPGA TMR Designs

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; LaBel, Kenneth A.; Pellish, Jonathan

    2016-01-01

    We present the challenges that arise when using redundant clock domains due to their clock skew. Radiation data show that a singular clock domain (DTMR) provides an improved TMR methodology for SRAM-based FPGAs over redundant clocks.

  5. Global trade and assisted reproductive technologies: regulatory challenges in international surrogacy.

    PubMed

    Nelson, Erin

    2013-01-01

    International surrogacy is an increasingly common phenomenon and an important global health challenge. Legal rules are a key consideration for the participants in international surrogacy arrangements. In some cases the law can help to resolve the complex issues that arise in this context, but it is important to consider the role played by law in contributing to the complex conflicts that such arrangements can generate. © 2013 American Society of Law, Medicine & Ethics, Inc.

  6. Work-based learning: challenges and opportunities.

    PubMed

    Gallagher, Ann; Holland, Lesley

    This article discusses some of the challenges and opportunities arising from the development and implementation of an innovative work-based open and distance learning programme available exclusively to healthcare assistants working in general health and mental health practice. The programme is based on a partnership between the sponsoring organisation and the Open University. The focus is on the development of standards of proficiency, service user involvement, partnership working, skills development and the pedagogic implications of a work-based learning format.

  7. Context-specific and/or context-free challenges and opportunities in writing scholarly reviews in health care management: a conceptual note.

    PubMed

    Blair, John D

    2011-01-01

    Challenges and opportunities arise from the significantly different perspectives of context-specific versus context-free researchers and the literatures they contribute to. Reviews of one type or the other or both types of literatures may provide different understandings of the state of the art in a particular area of health care management. Suggestions for writing quality reviews are also included along with suggested topics for future reviews.

  8. Opportunities and challenges in developing deep learning models using electronic health records data: a systematic review.

    PubMed

    Xiao, Cao; Choi, Edward; Sun, Jimeng

    2018-06-08

    To conduct a systematic review of deep learning models for electronic health record (EHR) data, and illustrate various deep learning architectures for analyzing different data sources and their target applications. We also highlight ongoing research and identify open challenges in building deep learning models of EHRs. We searched PubMed and Google Scholar for papers on deep learning studies using EHR data published between January 1, 2010, and January 31, 2018. We summarize them according to these axes: types of analytics tasks, types of deep learning model architectures, special challenges arising from health data and tasks together with their potential solutions, and evaluation strategies. We surveyed and analyzed multiple aspects of the 98 articles we found and identified the following analytics tasks: disease detection/classification, sequential prediction of clinical events, concept embedding, data augmentation, and EHR data privacy. We then studied how deep architectures were applied to these tasks. We also discussed some special challenges arising from modeling EHR data and reviewed a few popular approaches. Finally, we summarized how performance evaluations were conducted for each task. Despite the early success in using deep learning for health analytics applications, there still exist a number of issues to be addressed. We discuss them in detail, including data and label availability, the interpretability and transparency of the model, and ease of deployment.

  9. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    PubMed

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2017-06-14

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider the task of predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.
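
    For a flavour of the kind of approach the review discusses, the sketch below fits a gradient-boosted tree model to laboratory markers, with missing values handled natively by the learner, one practical gap of standard regression. The file path and column names are hypothetical placeholders, not the authors' data.

    ```python
    # Sketch: a tree-ensemble risk model for post-MI mortality from routinely
    # measured labs. HistGradientBoostingClassifier (scikit-learn >= 1.0)
    # handles missing lab values natively. The CSV path and the "lab_" /
    # "died_1yr" column names are invented placeholders.
    import pandas as pd
    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("mi_cohort.csv")            # hypothetical EHR extract
    lab_cols = [c for c in df.columns if c.startswith("lab_")]
    X, y = df[lab_cols], df["died_1yr"]

    clf = HistGradientBoostingClassifier(max_iter=300, learning_rate=0.05)
    print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
    ```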

  10. Measurement Challenges in International Agreements

    NASA Astrophysics Data System (ADS)

    Luke, John

    2006-10-01

    Making measurements in support of international agreements can pose many challenges, from both a policy and a science point of view. Policy issues may arise because physics measurements made in the area of arms control or disarmament may be deemed too intrusive, since they could possibly reveal sensitive information about the material that is being interrogated. Therefore, agreements must include a framework for safeguarding against the potential release of this information. Most of the scientific issues center on the fact that it is desirable to make high quality measurements without any operator interaction. This leads to the development of instrumentation and software that are very stable and robust. Due to different concerns, policy and science priorities may be at odds with one another. Therefore, it is the scientist's challenge in this field to keep policy makers informed by conveying what is technically possible and what is not, in a manner that is easily understood and also negotiable. In this paper we will discuss some of the technology that has been developed to address some of these challenges in various international and model agreements. We will discuss the principle of an information barrier used in these measurement technologies to safeguard against the release of sensitive information. We will also discuss some of the pitfalls that may arise when policy is ill informed about the physical constraints on making measurements of nuclear materials.

  11. Fluid-Structure Interaction Modeling of Parachutes with Disreefing and Modified Geometric Porosity and Separation Aerodynamics of a Cover Jettisoned to the Spacecraft Wake

    NASA Astrophysics Data System (ADS)

    Fritze, Matthew D.

    Fluid-structure interaction (FSI) modeling of spacecraft parachutes involves a number of computational challenges. The canopy complexity created by the hundreds of gaps and slits, and the design-related modification of that geometric porosity by removal of some of the sails and panels, are among the formidable challenges. Disreefing from one stage to another when the parachute is used in multiple stages is another formidable challenge. This thesis addresses the computational challenges involved in disreefing of spacecraft parachutes and in the fully-open and reefed stages of the parachutes with modified geometric porosity. The special techniques developed to address these challenges are described and the FSI computations are reported. The thesis also addresses the modeling and computation challenges involved in the very early stages, where the sudden separation of a cover jettisoned to the spacecraft wake needs to be modeled. Higher-order temporal representations used in modeling the separation motion are described, and the computed separation and wake-induced forces acting on the cover are reported.

  12. High order discretization techniques for real-space ab initio simulations

    NASA Astrophysics Data System (ADS)

    Anderson, Christopher R.

    2018-03-01

    In this paper, we present discretization techniques to address numerical problems that arise when constructing ab initio approximations that use real-space computational grids. We present techniques to accommodate the singular nature of idealized nuclear and idealized electronic potentials, and we demonstrate the utility of using high order accurate grid based approximations to Poisson's equation in unbounded domains. To demonstrate the accuracy of these techniques, we present results for a Full Configuration Interaction computation of the dissociation of H2 using a computed, configuration dependent, orbital basis set.
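
    The payoff of high-order grid approximations can be seen in a small convergence test: a fourth-order central-difference stencil reduces the error far faster than a second-order one as the grid spacing shrinks. This generic sketch illustrates the idea, not the paper's particular discretization:

    ```python
    # Sketch: observed accuracy of 2nd- vs 4th-order central-difference
    # approximations of f''(x), the kind of stencil underlying high-order
    # real-space grid methods. Test function f(x) = sin(x) at x = 1.
    import numpy as np

    f, x = np.sin, 1.0
    exact = -np.sin(x)
    for h in [0.1, 0.05, 0.025]:
        d2_2nd = (f(x - h) - 2 * f(x) + f(x + h)) / h**2
        d2_4th = (-f(x - 2*h) + 16*f(x - h) - 30*f(x)
                  + 16*f(x + h) - f(x + 2*h)) / (12 * h**2)
        print(f"h={h:6.3f}  err2={abs(d2_2nd - exact):.2e}  "
              f"err4={abs(d2_4th - exact):.2e}")
    ```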

  13. Computer constructed imagery of distant plasma interaction boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grenstadt, E.W.; Schurr, H.D.; Tsugawa, R.K.

    1982-01-01

    Computer constructed sketches of plasma boundaries arising from the interaction between the solar wind and the magnetosphere can serve as both didactic and research tools. In particular, the structure of the earth's bow shock can be represented as a nonuniform surface according to the instantaneous orientation of the IMF, and temporal changes in structural distribution can be modeled as a sequence of sketches based on observed sequences of spacecraft-based measurements. Viewed rapidly, such a sequence of sketches can be the basis for representation of plasma processes by computer animation.

  14. Program design by a multidisciplinary team. [for structural finite element analysis on STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Voigt, S.

    1975-01-01

    The use of software engineering aids in the design of a structural finite-element analysis computer program for the STAR-100 computer is described. Nested functional diagrams were used to aid communication among design team members, and a standardized specification format was adopted to describe the modules designed by various members. This is a report of current work in which use of the functional diagrams provided continuity and helped resolve some of the problems arising in this long-running, part-time project.

  15. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the platforms most widely available to scientists are clusters, grids, and cloud systems. Such infrastructures currently are undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability and issues including data distribution, software heterogeneity, and ad hoc hardware availability commonly force scientists into the simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  16. Addressing unmet mental health and substance abuse needs: a partnered planning effort between grassroots community agencies, faith-based organizations, service providers, and academic institutions.

    PubMed

    Wong, Eunice C; Chung, Bowen; Stover, Gabriel; Stockdale, Susan; Jones, Felica; Litt, Paula; Klap, Ruth S; Patel, Kavita; Wells, Kenneth B

    2011-01-01

    To conduct a process evaluation of the Restoration Center Los Angeles, a community-academic partnered planning effort aimed at holistically addressing the unmet mental health and substance abuse needs of the Los Angeles African American community. Semi-structured interviews with open-ended questions on key domains of partnership effectiveness were conducted with a random stratified sample of participants varying by level of involvement. Eleven partners representing grassroots community agencies, faith-based organizations, service providers, and academic institutions. Common themes identified by an evaluation consultant and partners relating to partnership effectiveness, perceived benefits and costs, and future expectations. Findings underscore the importance of considering the potential issues that may arise with the increasing diversity of partners and perspectives. Many of the challenges and facilitating factors that arise within academic-community partnerships were similarly experienced between the diverse set of community partners. Challenges that affected partnership development between community-to-community partners included differences in expectations regarding the final goal of the project, trust-building, and the distribution of funds. Despite such challenges, partners were able to jointly develop a final set of recommendations for the creation of restoration centers, which was viewed as a major accomplishment. Limited guidance exists on how to navigate differences that arise between community members who have shared identities on some dimensions (eg, African American ethnicity, Los Angeles residence) but divergent identities on other dimensions (eg, formal church affiliation). With increasing diversity of community representation, careful attention needs to be dedicated to not only the development of academic-community partnerships but also community-community partnerships.

  17. Extreme Mechanics in Soft Pneumatic Robots and Soft Microfluidic Electronics and Sensors

    NASA Astrophysics Data System (ADS)

    Majidi, Carmel

    2012-02-01

    In the near future, machines and robots will be completely soft, stretchable, impact resistant, and capable of adapting their shape and functionality to changes in mission and environment. Similar to biological tissue and soft-body organisms, these next-generation technologies will contain no rigid parts and instead be composed entirely of soft elastomers, gels, fluids, and other non-rigid matter. Using a combination of rapid prototyping tools, microfabrication methods, and emerging techniques in so-called "soft lithography," scientists and engineers are currently introducing exciting new families of soft pneumatic robots, soft microfluidic sensors, and hyperelastic electronics that can be stretched to as much as 10x their natural length. Progress has been guided by an interdisciplinary collection of insights from chemistry, life sciences, robotics, microelectronics, and solid mechanics. In virtually every technology and application domain, mechanics and elasticity have a central role in governing functionality and design. Moreover, in contrast to conventional machines and electronics, soft pneumatic systems and microfluidics typically operate in the finite deformation regime, with materials stretching to several times their natural length. In this talk, I will review emerging paradigms in soft pneumatic robotics and soft microfluidic electronics and highlight modeling and design challenges that arise from the extreme mechanics of inflation, locomotion, sensor operation, and human interaction. I will also discuss perceived challenges and opportunities in a broad range of potential applications, from medicine to wearable computing.

  18. Preclinical Magnetic Resonance Imaging and Systems Biology in Cancer Research

    PubMed Central

    Albanese, Chris; Rodriguez, Olga C.; VanMeter, John; Fricke, Stanley T.; Rood, Brian R.; Lee, YiChien; Wang, Sean S.; Madhavan, Subha; Gusev, Yuriy; Petricoin, Emanuel F.; Wang, Yue

    2014-01-01

    Biologically accurate mouse models of human cancer have become important tools for the study of human disease. The anatomical location of various target organs, such as brain, pancreas, and prostate, makes determination of disease status difficult. Imaging modalities, such as magnetic resonance imaging, can greatly enhance diagnosis, and longitudinal imaging of tumor progression is an important source of experimental data. Even in models where the tumors arise in areas that permit visual determination of tumorigenesis, longitudinal anatomical and functional imaging can enhance the scope of studies by facilitating the assessment of biological alterations (such as changes in angiogenesis, metabolism, and cellular invasion) as well as tissue perfusion and diffusion. One of the challenges in preclinical imaging is the development of infrastructural platforms required for integrating in vivo imaging and therapeutic response data with ex vivo pathological and molecular data using a more systems-based multiscale modeling approach. Further challenges exist in integrating these data for computational modeling to better understand the pathobiology of cancer and to better effect its cure. We review the current applications of preclinical imaging and discuss the implications of applying functional imaging to visualize cancer progression and treatment. Finally, we provide new data from an ongoing preclinical drug study demonstrating how multiscale modeling can lead to a more comprehensive understanding of cancer biology and therapy. PMID:23219428

  19. How Heterogeneity Affects the Design of Hadoop MapReduce Schedulers: A State-of-the-Art Survey and Challenges.

    PubMed

    Pandey, Vaibhav; Saini, Poonam

    2018-06-01

    The MapReduce (MR) computing paradigm and its open source implementation Hadoop have become a de facto standard to process big data in a distributed environment. Initially, the Hadoop system was homogeneous in three significant aspects, namely, user, workload, and cluster (hardware). However, with the growing variety of MR jobs and the inclusion of differently configured nodes in the existing cluster, heterogeneity has become an essential part of Hadoop systems. The heterogeneity factors adversely affect the performance of a Hadoop scheduler and limit the overall throughput of the system. To overcome this problem, various heterogeneous Hadoop schedulers have been proposed in the literature. Existing survey works in this area mostly cover homogeneous schedulers and classify them on the basis of the quality-of-service parameters they optimize. Hence, there is a need to study the heterogeneous Hadoop schedulers on the basis of the various heterogeneity factors they consider. In this survey article, we first discuss different heterogeneity factors that typically exist in a Hadoop system and then explore various challenges that arise while designing the schedulers in the presence of such heterogeneity. Afterward, we present a comparative study of the heterogeneous scheduling algorithms available in the literature and classify them by the aforementioned heterogeneity factors.

  20. Solute-Solvent Charge-Transfer Excitations and Optical Absorption of Hydrated Hydroxide from Time-Dependent Density-Functional Theory.

    PubMed

    Opalka, Daniel; Sprik, Michiel

    2014-06-10

    The electronic structure of simple hydrated ions represents one of the most challenging problems in electronic-structure theory. Spectroscopic experiments identified the lowest excited state of the solvated hydroxide as a charge-transfer-to-solvent (CTTS) state. In the present work we report computations of the absorption spectrum of the solvated hydroxide ion, treating both solvent and solute strictly at the same level of theory. The average absorption spectrum up to 25 eV has been computed for samples taken from periodic ab initio molecular dynamics simulations. The experimentally observed CTTS state near the onset of the absorption threshold has been analyzed at the generalized-gradient approximation (GGA) and with a hybrid density-functional. Based on results for the lowest excitation energies computed with the HSE hybrid functional and a Davidson diagonalization scheme, the CTTS transition has been found 0.6 eV below the first absorption band of liquid water. The transfer of an electron to the solvent can be assigned to an excitation from the solute 2pπ orbitals, which are subject to a small energetic splitting due to the asymmetric solvent environment, to the significantly delocalized lowest unoccupied orbital of the solvent. The distribution of the centers of the excited state shows that CTTS along the OH(-) axis of the hydroxide ion is avoided. Furthermore, our simulations indicate that the systematic error arising in the calculated spectrum at the GGA originates from a poor description of the valence band energies in the solution.

  1. Integrated Hardware and Software for No-Loss Computing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    When an algorithm is distributed across multiple threads executing on many distinct processors, a loss of one of those threads or processors can potentially result in the total loss of all the incremental results up to that point. When implementation is massively hardware distributed, then the probability of a hardware failure during the course of a long execution is potentially high. Traditionally, this problem has been addressed by establishing checkpoints where the current state of some or part of the execution is saved. Then in the event of a failure, this state information can be used to recompute that point in the execution and resume the computation from that point. A serious problem that arises when one distributes a problem across multiple threads and physical processors is that it increases the likelihood of the algorithm failing through no fault of the scientist, as a result of hardware faults coupled with operating system problems. With good reason, scientists expect their computing tools to serve them and not the other way around. What is novel here is a unique combination of hardware and software that reformulates an application into a monolithic structure that can be monitored in real time and dynamically reconfigured in the event of a failure. This unique reformulation of hardware and software will provide advanced aeronautical technologies to meet the challenges of next-generation systems in aviation, for civilian and scientific purposes, in our atmosphere and in atmospheres of other worlds. In particular, with respect to NASA's manned flight to Mars, this technology addresses the critical requirements for improving safety and increasing reliability of manned spacecraft.
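
    The checkpoint/restart pattern described above can be sketched in a few lines: periodically persist the computation state so a restarted run resumes rather than recomputing from scratch. The state layout, file name, and checkpoint interval below are illustrative assumptions, not the abstract's system.

    ```python
    # Sketch of checkpoint/restart: persist state periodically; on startup,
    # resume from the last saved state if one exists.
    import os
    import pickle

    CKPT = "state.pkl"                             # illustrative file name

    def load_state():
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as fh:
                return pickle.load(fh)
        return {"step": 0, "partial_sum": 0.0}     # fresh start

    state = load_state()
    for step in range(state["step"], 1_000_000):
        state["partial_sum"] += step * 1e-6        # stand-in for real work
        state["step"] = step + 1
        if step % 10_000 == 0:                     # checkpoint interval
            with open(CKPT + ".tmp", "wb") as fh:
                pickle.dump(state, fh)
            os.replace(CKPT + ".tmp", CKPT)        # atomic swap avoids torn files
    ```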

  2. Cores Of Recurrent Events (CORE) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    CORE is a statistically supported computational method for finding recurrently targeted regions in massive collections of genomic intervals, such as those arising from DNA copy number analysis of single tumor cells or bulk tumor tissues.

  3. Families of returned defence force personnel: a changing landscape of challenges.

    PubMed

    Berle, David; Steel, Zachary

    2015-08-01

    This paper aims to identify the key challenges experienced by the families of defence force personnel following deployment. We undertook a selective review of four post-deployment challenges to the families of defence force personnel: (1) changes to relationships; (2) changes to family member roles and responsibilities; (3) adjustment of children and parenting challenges; and (4) anger, family conflict and violence. Emerging issues in the area of post-deployment adjustment are also discussed. Empirical studies of post-deployment family adjustment are lacking. Each of the reviewed challenges can contribute to psychological difficulties and precipitate contact with mental health services. The challenges faced by defence force personnel when returning from deployment arise within a family context. Clinicians should thoroughly assess these factors in families following deployment, but also recognise family strengths and resilience to these challenges. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  4. The Future of Electronic Device Design: Device and Process Simulation Find Intelligence on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A.

    1999-01-01

    We are on the path to meet the major challenges ahead for TCAD (technology computer aided design). The emerging computational grid will ultimately solve the challenge of limited computational power. The Modular TCAD Framework will solve the TCAD software challenge once TCAD software developers realize that there is no other way to meet industry's needs. The modular TCAD framework (MTF) also provides the ideal platform for solving the TCAD model challenge by rapid implementation of models in a partial differential equation solver.

  5. Coincidence between malignant perivascular epithelioid cell tumor arising in the gastric serosa and lung adenocarcinoma

    PubMed Central

    Yamada, Sohsuke; Nabeshima, Atsunori; Noguchi, Hirotsugu; Nawata, Aya; Nishii, Hisae; Guo, Xin; Wang, Ke-Yong; Hisaoka, Masanori; Nakayama, Toshiyuki

    2015-01-01

    A 39-year-old male presented with a 4-mo history of both epigastralgia and back pain. Computed tomography showed a right lung nodule and an abdominal mass attached to the gastric wall, measuring approximately 30 mm and 70 mm in diameter, respectively. Since biopsy samples from the lung and abdomen revealed poorly differentiated adenocarcinoma and a malignant tumor, clinicians first interpreted the abdominal mass as a metastatic carcinoma, and a right lower lobectomy followed by resection of the mass was performed. Gross examination of both lesions displayed gray-whitish to yellow-whitish cut surfaces with hemorrhagic and necrotic foci, and the mass was attached to the serosa of the lesser curvature on the gastric body. On microscopic examination, the lung tumor was composed of a proliferation of highly atypical epithelial cells having abundant eosinophilic cytoplasm, predominantly arranged in an acinar or solid growth pattern with vessel permeation, while the abdominal tumor consisted of sheets or nests of markedly atypical epithelioid cells having pleomorphic nuclei and abundant eosinophilic to clear cytoplasm, focally in a radial perivascular or infiltrative growth pattern. Immunohistochemically, the latter cells were positive for HMB45 or α-smooth muscle actin, but the former ones were not. Therefore, we finally made a diagnosis of malignant perivascular epithelioid cell tumor (PEComa) arising in the gastric serosa, combined with primary lung adenocarcinoma. Furthermore, a small papillary carcinoma of the thyroid gland was identified. The current case describes the coincidence of malignant PEComa with other carcinomas, posing a challenge in distinguishing it from metastatic tumor disease. PMID:25632212

  6. Bayesian Hierarchical Modeling for Big Data Fusion in Soil Hydrology

    NASA Astrophysics Data System (ADS)

    Mohanty, B.; Kathuria, D.; Katzfuss, M.

    2016-12-01

    Soil moisture datasets from remote sensing (RS) platforms (such as SMOS and SMAP) and reanalysis products from land surface models are typically available on a coarse spatial granularity of several square km. Ground-based sensors, on the other hand, provide observations on a finer spatial scale (meter scale or less) but are sparsely available. Soil moisture is affected by high variability due to complex interactions between geologic, topographic, vegetation and atmospheric variables. Hydrologic processes usually occur at a scale of 1 km or less, and therefore spatially ubiquitous and temporally periodic soil moisture products at this scale are required to aid local decision makers in agriculture, weather prediction and reservoir operations. Past literature has largely focused on downscaling RS soil moisture for the small extent of a field or a watershed, and hence the applicability of such products has been limited. The present study employs a spatial Bayesian Hierarchical Model (BHM) to derive soil moisture products at a spatial scale of 1 km for the state of Oklahoma by fusing point scale Mesonet data and coarse scale RS data for soil moisture and its auxiliary covariates such as precipitation, topography, soil texture and vegetation. It is seen that the BHM model handles change-of-support problems easily while performing accurate uncertainty quantification arising from measurement errors and imperfect retrieval algorithms. The computational challenge arising due to the large number of measurements is tackled by utilizing basis function approaches and likelihood approximations. The BHM model can be considered a complex Bayesian extension of traditional geostatistical prediction methods (such as Kriging) for large datasets in the presence of uncertainties.
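
    The basis-function device mentioned for tackling the computational challenge can be sketched simply: representing the spatial field with r fixed basis functions reduces the fitting cost from the O(n^3) of a full covariance model to roughly O(n r^2). The sketch below is a generic fixed-rank regression with invented data, not the study's BHM.

    ```python
    # Sketch of the basis-function trick for large spatial datasets: the field
    # is a combination of r fixed Gaussian bumps, so the solve involves only
    # an r x r system rather than an n x n covariance matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 5000, 50
    sites = rng.uniform(0, 100, size=(n, 2))            # observation locations
    y = np.sin(sites[:, 0] / 10) + 0.1 * rng.normal(size=n)

    centers = rng.uniform(0, 100, size=(r, 2))          # basis-function knots
    scale = 15.0
    d = np.linalg.norm(sites[:, None, :] - centers[None, :, :], axis=2)
    Phi = np.exp(-0.5 * (d / scale) ** 2)               # n x r design matrix

    lam = 1e-2                                          # ridge / prior precision
    coef = np.linalg.solve(Phi.T @ Phi + lam * np.eye(r), Phi.T @ y)
    pred = Phi @ coef
    print("RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
    ```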

  7. Model and controller reduction of large-scale structures based on projection methods

    NASA Astrophysics Data System (ADS)

    Gildin, Eduardo

    The design of low-order controllers for high-order plants is a challenging problem theoretically as well as from a computational point of view. Frequently, robust controller design techniques result in high-order controllers. It is then interesting to achieve reduced-order models and controllers while maintaining robustness properties. Controllers designed for large structures based on models obtained by finite-element techniques have large state-space dimensions. In this case, problems related to storage, accuracy and computational speed may arise. Thus, model reduction methods capable of addressing controller reduction problems are of primary importance to allow the practical applicability of advanced controller design methods for high-order systems. A challenging large-scale control problem that has emerged recently is the protection of civil structures, such as high-rise buildings and long-span bridges, from dynamic loadings such as earthquakes, high wind, heavy traffic, and deliberate attacks. Even though significant effort has been spent in the application of control theory to the design of civil structures in order to increase their safety and reliability, several challenging issues remain open problems for real-time implementation. This dissertation addresses the development of methodologies for controller reduction for real-time implementation in seismic protection of civil structures using projection methods. Three classes of schemes are analyzed for model and controller reduction: nodal truncation, singular value decomposition methods and Krylov-based methods. A family of benchmark problems for structural control is used as a framework for a comparative study of model and controller reduction techniques. It is shown that classical model and controller reduction techniques, such as balanced truncation, modal truncation and moment matching by Krylov techniques, yield reduced-order controllers that do not guarantee stability of the closed-loop system, that is, the reduced-order controller implemented with the full-order plant. A controller reduction approach is proposed that guarantees closed-loop stability. It is based on the concept of dissipativity (or positivity) of linear dynamical systems. Utilizing passivity preserving model reduction together with dissipative-LQG controllers, effective low-order optimal controllers are obtained. Results are shown through simulations.
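
    Among the singular value decomposition methods analyzed, balanced truncation is the canonical example. The sketch below implements its textbook square-root form for a small stable system; as the abstract notes, nothing in this construction by itself guarantees stability of the closed loop formed with the reduced controller, which is what motivates the dissipativity-based approach.

    ```python
    # Sketch of balanced truncation for a stable LTI system (A, B, C): solve
    # the two Lyapunov equations for the Gramians, balance them, and keep the
    # states with the largest Hankel singular values.
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

    def balanced_truncation(A, B, C, k):
        Wc = solve_continuous_lyapunov(A, -B @ B.T)    # controllability Gramian
        Wo = solve_continuous_lyapunov(A.T, -C.T @ C)  # observability Gramian
        Lc = cholesky(Wc, lower=True)
        Lo = cholesky(Wo, lower=True)
        U, s, Vt = svd(Lo.T @ Lc)                      # Hankel singular values
        S_half = np.diag(s[:k] ** -0.5)
        T = Lc @ Vt[:k].T @ S_half                     # truncating transform
        Ti = S_half @ U[:, :k].T @ Lo.T
        return Ti @ A @ T, Ti @ B, C @ T, s            # reduced (A, B, C)

    A = np.array([[-1.0, 0.5], [0.0, -3.0]])
    B = np.array([[1.0], [1.0]])
    C = np.array([[1.0, 0.0]])
    Ar, Br, Cr, hsv = balanced_truncation(A, B, C, 1)
    print("Hankel singular values:", hsv)
    ```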

  8. A spoonful of care ethics: The challenges of enriching medical education.

    PubMed

    van Reenen, Eva; van Nistelrooij, Inge

    2017-01-01

    Nursing Ethics has featured several discussions on what good care comprises and how to achieve good care practices. We should "nurse" ethics by continuously reflecting on the way we "do" ethics, which is what care ethicists have been doing over the past few decades and continue to do so. Ethics is not limited to nursing but extends to all caring professions. In 2011, Elin Martinsen argued in this journal that care should be included as a core concept in medical ethical terminology because of "the harm to which patients may be exposed owing to a lack of care in the clinical encounter," specifically between doctors and patients. However, Martinsen leaves the didactical challenges arising from such a venture open for further enquiry. In this article, we explore the challenges arising from implementing care-ethical insights into medical education. Medical education in the Netherlands is investigated through a "care-ethical lens". This means exploring the possibility of enriching medical education with care-ethical insights, while at the same time discovering possible challenges emerging from such an undertaking. Participants and research context: This paper has been written from the academic context of a master in care ethics and policy. Ethical considerations: We have tried to be fair and respectful to the authors discussed and take a neutral stance towards the findings portrayed. Several challenges are identified, which we narrow down to two types: didactical and non-didactical. In order to overcome these challenges, we must not underestimate the possible resistance to a paradigm shift. Our efforts should mainly target the learning that takes place in the clinical phases of medical training and should be accompanied by the creation of awareness in healthcare practice.

  9. The Effects of Race Conditions When Implementing Single-Source Redundant Clock Trees in Triple Modular Redundant Synchronous Architectures

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Kim, Hak S.; Phan, Anthony M.; Seidleck, Christina M.; Label, Kenneth A.; Pellish, Jonathan A.; Campola, Michael J.

    2016-01-01

    We present the challenges that arise when using redundant clock domains due to their time skew. Radiation data show that a singular clock domain provides an improved triple modular redundant (TMR) scheme over redundant clocks.

  10. Through the Front Door.

    ERIC Educational Resources Information Center

    Geller, Joseph

    2002-01-01

    Discusses challenges that arise in creating school entranceways that meld accessibility with attractiveness, noting the importance of considering both aesthetic impact and the design mandates of the Americans with Disabilities Act (ADA). Creative solutions include tying a walkway into a progressive stair; incorporating the ramp into a masonry…

  11. ENVIRONMENTAL IMMUNOCHEMISTRY AT THE U.S. EPA, NATIONAL EXPOSURE RESEARCH LABORATORY'S HUMAN EXPOSURE BRANCH

    EPA Science Inventory

    Immunochemical methods are responding to the changing needs of regulatory and monitoring programs and are meeting new analytical challenges as they arise. Recent advances in environmental immunochemistry have expanded the role of immunoassays from field screening methods to hig...

  12. EMERGING ENVIRONMENTAL CONTAMINANTS: ACHIEVEMENTS AND CHALLENGES WITH MASS SPECTROMETRY

    EPA Science Inventory

    Much has been achieved in the way of environmental protection over the last 30 years. However, as we learn more, new concerns arise. This presentation will discuss emerging contaminants that the U.S. Environmental Protection Agency (EPA) and other agencies are currently concerned...

  13. Generative Inferences Based on Learned Relations

    ERIC Educational Resources Information Center

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J.

    2017-01-01

    A key property of relational representations is their "generativity": From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from…

  14. Arts and Cultural Education at School in Europe

    ERIC Educational Resources Information Center

    Baidak, Nathalie; Horvath, Anna; Sharp, Caroline; Kearney, Caroline

    2009-01-01

    Education in European countries is subject to many competing demands which have an influence on the organisation and content of arts education. Increasing globalisation has brought both benefits and challenges, including those arising from increased international competition, migration and multiculturalism, advancements in technology and the…

  15. Uncertainty quantification of seabed parameters for large data volumes along survey tracks with a tempered particle filter

    NASA Astrophysics Data System (ADS)

    Dettmer, J.; Quijano, J. E.; Dosso, S. E.; Holland, C. W.; Mandolesi, E.

    2016-12-01

    Geophysical seabed properties are important for the detection and classification of unexploded ordnance. However, current surveying methods such as vertical seismic profiling, coring, or inversion are of limited use when surveying large areas with high spatial sampling density. We consider surveys based on a source and receiver array towed by an autonomous vehicle which produce large volumes of seabed reflectivity data that contain unprecedented and detailed seabed information. The data are analyzed with a particle filter, which requires efficient reflection-coefficient computation, efficient inversion algorithms and efficient use of computer resources. The filter quantifies information content of multiple sequential data sets by considering results from previous data along the survey track to inform the importance sampling at the current point. Challenges arise from environmental changes along the track where the number of sediment layers and their properties change. This is addressed by a trans-dimensional model in the filter which allows layering complexity to change along a track. Efficiency is improved by likelihood tempering of various particle subsets and including exchange moves (parallel tempering). The filter is implemented on a hybrid computer that combines central processing units (CPUs) and graphics processing units (GPUs) to exploit three levels of parallelism: (1) fine-grained parallel computation of spherical reflection coefficients with a GPU implementation of Levin integration; (2) updating particles by concurrent CPU processes which exchange information using automatic load balancing (coarse grained parallelism); (3) overlapping CPU-GPU communication (a major bottleneck) with GPU computation by staggering CPU access to the multiple GPUs. The algorithm is applied to spherical reflection coefficients for data sets along a 14-km track on the Malta Plateau, Mediterranean Sea. We demonstrate substantial efficiency gains over previous methods. [This research was supported in part by the U.S. Dept of Defense, through the Strategic Environmental Research and Development Program (SERDP).]
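
    As a minimal illustration of the likelihood-tempering idea, the Python sketch below performs one tempered particle update, assuming a toy one-dimensional Gaussian likelihood in place of the paper's reflection-coefficient forward model; the stage count, multinomial resampling, and jitter move are illustrative choices, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def log_likelihood(particles, datum, noise_sd=0.5):
          # Toy Gaussian likelihood standing in for the paper's
          # reflection-coefficient forward model (hypothetical).
          return -0.5 * ((datum - particles) / noise_sd) ** 2

      def tempered_update(particles, datum, n_stages=5):
          # One tempered importance-sampling update: the likelihood is
          # introduced gradually in n_stages increments of its exponent,
          # resampling and jittering at each stage to keep the ensemble
          # well balanced.
          delta = 1.0 / n_stages
          for _ in range(n_stages):
              logw = delta * log_likelihood(particles, datum)
              w = np.exp(logw - logw.max())
              w /= w.sum()
              idx = rng.choice(particles.size, size=particles.size, p=w)
              particles = particles[idx]
              particles = particles + 0.05 * rng.standard_normal(particles.size)
          return particles

      particles = rng.standard_normal(1000)              # prior ensemble
      particles = tempered_update(particles, datum=1.2)  # assimilate one datum
      print(particles.mean(), particles.std())

    In a full implementation, the exchange moves between particle subsets (the parallel-tempering ingredient) and the trans-dimensional layering updates would sit inside the stage loop.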

  16. Exploring the challenges faced by polytechnic students

    NASA Astrophysics Data System (ADS)

    Matore, Mohd Effendi @ Ewan Mohd; Khairani, Ahmad Zamri

    2015-02-01

    This study aims to identify further challenges, beyond those already identified, faced by students in seven polytechnics in Malaysia, as a continuation of previous research that had identified 52 main challenges faced by students using the Rasch Model. The explorative study focuses on challenges that are not included in the Mooney Problem Checklist (MPCL). A total of 121 polytechnic students submitted 183 written responses through the open questions provided. A further 252 students responded to the dichotomous questions regarding their views on the challenges faced. The data were analysed qualitatively using NVivo 8.0. The findings showed that students from Politeknik Seberang Perai (PSP) gave the highest number of responses, 56 (30.6%), and Politeknik Metro Kuala Lumpur (PMKL) the lowest, 2 (1.09%). Five dominant challenges were identified: the English language (32, 17.5%), learning (14, 7.7%), vehicles (13, 7.1%), information and communication technology (ICT) (13, 7.1%), and peers (11, 6.0%). This article, however, focuses on three of these challenges, namely the English language, vehicles, and computers and ICT, as the challenges of learning and peers had been analysed in the previous MPCL. The English language challenge concerned weaknesses in speech and fluency. The computer and ICT challenge covered weaknesses in mastering ICT and computers, as well as computer breakdowns and low-performance computers. The challenge of vehicles emphasized the unavailability of vehicles to attend lectures and go elsewhere, the lack of transportation services at the polytechnic, and not having a valid driving license. These challenges are very relevant and need to be discussed in an effort to prepare polytechnics for the ongoing process of polytechnic transformation.

  17. A minimally-resolved immersed boundary model for reaction-diffusion problems

    NASA Astrophysics Data System (ADS)

    Pal Singh Bhalla, Amneet; Griffith, Boyce E.; Patankar, Neelesh A.; Donev, Aleksandar

    2013-12-01

    We develop an immersed boundary approach to modeling reaction-diffusion processes in dispersions of reactive spherical particles, from the diffusion-limited to the reaction-limited setting. We represent each reactive particle with a minimally-resolved "blob" using many fewer degrees of freedom per particle than standard discretization approaches. More complicated or more highly resolved particle shapes can be built out of a collection of reactive blobs. We demonstrate numerically that the blob model can provide an accurate representation at low to moderate packing densities of the reactive particles, at a cost not much larger than solving a Poisson equation in the same domain. Unlike multipole expansion methods, our method does not require analytically computed Green's functions, but rather, computes regularized discrete Green's functions on the fly by using a standard grid-based discretization of the Poisson equation. This allows for great flexibility in implementing different boundary conditions, coupling to fluid flow or thermal transport, and the inclusion of other effects such as temporal evolution and even nonlinearities. We develop multigrid-based preconditioners for solving the linear systems that arise when using implicit temporal discretizations or studying steady states. In the diffusion-limited case the resulting linear system is a saddle-point problem, the efficient solution of which remains a challenge for suspensions of many particles. We validate our method by comparing to published results on reaction-diffusion in ordered and disordered suspensions of reactive spheres.
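
    To make the idea of computing regularized discrete Green's functions on the fly concrete, here is a minimal Python sketch: a Gaussian-regularized blob source is placed on a periodic grid and the Poisson equation is solved spectrally. The Gaussian kernel, periodic boundary, and FFT solver are assumptions made for illustration; the paper uses its own regularized kernel, boundary conditions, and grid solvers.

      import numpy as np

      def poisson_solve_periodic(f, L=1.0):
          # Solve lap(u) = -f on a periodic [0, L)^2 grid with FFTs;
          # the zero mode is dropped, so f must have zero mean.
          n = f.shape[0]
          k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
          kx, ky = np.meshgrid(k, k, indexing="ij")
          k2 = kx**2 + ky**2
          fh = np.fft.fft2(f)
          uh = np.zeros_like(fh)
          nz = k2 > 0
          uh[nz] = fh[nz] / k2[nz]
          return np.real(np.fft.ifft2(uh))

      n, L = 128, 1.0
      x = np.arange(n) * (L / n)
      X, Y = np.meshgrid(x, x, indexing="ij")
      eps = 4 * L / n                       # regularization width ~ grid scale
      blob = np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / (2 * eps**2))
      blob -= blob.mean()                   # enforce solvability on the torus
      G = poisson_solve_periodic(blob)      # regularized discrete Green's function
      print(G.max())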

  18. Sharing privacy-sensitive access to neuroimaging and genetics data: a review and preliminary validation

    PubMed Central

    Sarwate, Anand D.; Plis, Sergey M.; Turner, Jessica A.; Arbabshirani, Mohammad R.; Calhoun, Vince D.

    2014-01-01

    The growth of data sharing initiatives for neuroimaging and genomics represents an exciting opportunity to confront the “small N” problem that plagues contemporary neuroimaging studies while further understanding the role genetic markers play in the function of the brain. When it is possible, open data sharing provides the most benefits. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs) which preclude truly automated data mining. These DUAs arise because of concerns about the privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. An alternative approach is to only share data derivatives such as statistical summaries—the challenges here are to reformulate computational methods to quantify the privacy risks associated with sharing the results of those computations. For example, a derived map of gray matter is often as identifiable as a fingerprint. Thus alternative approaches to accessing data are needed. This paper reviews the relevant literature on differential privacy, a framework for measuring and tracking privacy loss in these settings, and demonstrates the feasibility of using this framework to calculate statistics on data distributed at many sites while still providing privacy. PMID:24778614

  19. Sharing privacy-sensitive access to neuroimaging and genetics data: a review and preliminary validation.

    PubMed

    Sarwate, Anand D; Plis, Sergey M; Turner, Jessica A; Arbabshirani, Mohammad R; Calhoun, Vince D

    2014-01-01

    The growth of data sharing initiatives for neuroimaging and genomics represents an exciting opportunity to confront the "small N" problem that plagues contemporary neuroimaging studies while further understanding the role genetic markers play in the function of the brain. When it is possible, open data sharing provides the most benefits. However, some data cannot be shared at all due to privacy concerns and/or risk of re-identification. Sharing other data sets is hampered by the proliferation of complex data use agreements (DUAs) which preclude truly automated data mining. These DUAs arise because of concerns about the privacy and confidentiality for subjects; though many do permit direct access to data, they often require a cumbersome approval process that can take months. An alternative approach is to only share data derivatives such as statistical summaries: the challenges here are to reformulate computational methods to quantify the privacy risks associated with sharing the results of those computations. For example, a derived map of gray matter is often as identifiable as a fingerprint. Thus alternative approaches to accessing data are needed. This paper reviews the relevant literature on differential privacy, a framework for measuring and tracking privacy loss in these settings, and demonstrates the feasibility of using this framework to calculate statistics on data distributed at many sites while still providing privacy.
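
    As a concrete instance of the differential-privacy machinery reviewed above, the sketch below releases a bounded site-level mean through the Laplace mechanism. The clipping bounds, epsilon, and the data themselves are made up; the paper's algorithms for multi-site statistics are more involved.

      import numpy as np

      rng = np.random.default_rng(0)

      def dp_mean(values, lo, hi, epsilon):
          # Release the mean of a bounded quantity with epsilon-differential
          # privacy via the Laplace mechanism: changing one subject moves
          # the clipped mean by at most (hi - lo) / n (the sensitivity).
          clipped = np.clip(values, lo, hi)
          sensitivity = (hi - lo) / len(values)
          return clipped.mean() + rng.laplace(scale=sensitivity / epsilon)

      # A site releasing the mean of some bounded derived measure
      # (values and bounds here are invented for illustration).
      site_data = rng.normal(0.6, 0.05, size=500)
      print(dp_mean(site_data, lo=0.0, hi=1.0, epsilon=0.5))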

  20. Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.

    2016-12-01

    The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to present, recently became available on Amazon cloud S3. This provides a new opportunity to rebuild the Hydro-NEXRAD software system, which enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The new cloud-based system can eliminate challenges previously faced in Hydro-NEXRAD data acquisition and processing: (1) temporal and spatial limitations arising from limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flows for past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.

  1. Translating stem cell research: challenges at the research frontier.

    PubMed

    Magnus, David

    2010-01-01

    This paper will address the translation of basic stem cell research into clinical research. While "stem cell" trials are sometimes used to describe established practices of bone marrow transplantation or transplantation of primary cells derived from bone marrow, for the purposes of this paper, I am primarily focusing on stem cell trials which are far less established, including the use of hESC-derived stem cells. The central ethical challenges in stem cell clinical trials arise in frontier research, not in standard, well-established areas of research.

  2. Employee vs independent contractor.

    PubMed

    Kolender, Ellen

    2012-01-01

    Finding qualified personnel for the cancer registry department has become increasingly difficult, as experienced abstractors retire and cancer diagnoses increase. Faced with hiring challenges, managers turn to teleworkers to fill positions and accomplish work in a timely manner. Suddenly, the hospital hires new legal staff and all telework agreements are disrupted. The question arises: Are teleworkers employees or independent contractors? Creating telework positions requires approval from the legal department and human resources. Caught off-guard in the last quarter of the year, I found myself again faced with hiring challenges.

  3. Challenges facing the distribution of an artificial-intelligence-based system for nursing.

    PubMed

    Evans, S

    1985-04-01

    The marketing and successful distribution of artificial-intelligence-based decision-support systems for nursing face special barriers and challenges. Issues that must be confronted arise particularly from the present culture of the nursing profession as well as the typical organizational structures in which nurses predominantly work. Generalizations in the literature based on the limited experience of physician-oriented artificial intelligence applications (predominantly in diagnosis and pharmacologic treatment) must be modified for applicability to other health professions.

  4. Climate change and evolutionary adaptation.

    PubMed

    Hoffmann, Ary A; Sgrò, Carla M

    2011-02-24

    Evolutionary adaptation can be rapid and potentially help species counter stressful conditions or realize ecological opportunities arising from climate change. The challenges are to understand when evolution will occur and to identify potential evolutionary winners as well as losers, such as species lacking adaptive capacity living near physiological limits. Evolutionary processes also need to be incorporated into management programmes designed to minimize biodiversity loss under rapid climate change. These challenges can be met through realistic models of evolutionary change linked to experimental data across a range of taxa.

  5. 78 FR 28569 - Nondiscrimination on the Basis of Age in Federally Assisted Programs or Activities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-15

    ... the rights and obligations of recipients thereof; or (4) raise novel legal or policy issues arising...)(1), or increases in funding as a result of changed computation of formula awards. (2) NEH will not...

  6. Stress analysis under component relative interference fit

    NASA Technical Reports Server (NTRS)

    Taylor, C. M.

    1978-01-01

    Finite-element computer program enables analysis of distortions and stresses occurring in components having relative interference. Program restricts itself to simple elements and axisymmetric loading situations. External inertial and thermal loads may be applied in addition to forces arising from interference conditions.

  7. New Factorization Techniques and Fast Serial and Parallel Algorithms for Operational Space Control of Robot Manipulators

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Djouani, Karim; Fried, George; Pontnau, Jean

    1997-01-01

    In this paper, a new factorization technique for computing the inverse of the mass matrix and the operational space mass matrix, as arising in the implementation of the operational space control scheme, is presented.

  8. Interplanetary Trajectories, Encke Method (ITEM)

    NASA Technical Reports Server (NTRS)

    Whitlock, F. H.; Wolfe, H.; Lefton, L.; Levine, N.

    1972-01-01

    Modified program has been developed using improved variation of Encke method which avoids accumulation of round-off errors and avoids numerical ambiguities arising from near-circular orbits of low inclination. Variety of interplanetary trajectory problems can be computed with maximum accuracy and efficiency.

  9. Computational communities: African-American cultural capital in computer science education

    NASA Astrophysics Data System (ADS)

    Lachney, Michael

    2017-10-01

    Enrolling the cultural capital of underrepresented communities in PK-12 technology and curriculum design has been a primary strategy for broadening the participation of students of color in U.S. computer science (CS) fields. This article examines two ways that African-American cultural capital and computing can be bridged in CS education. The first is community representation, using cultural capital to highlight students' social identities and networks through computational thinking. The second, computational integration, locates computation in cultural capital itself. I survey two risks - the appearance of shallow computing and the reproduction of assimilationist logics - that may arise when constructing one bridge without the other. To avoid these risks, I introduce the concept of computational communities by exploring areas in CS education that employ both strategies. This concept is then grounded in qualitative data from an after school program that connected CS to African-American cosmetology.

  10. Tracking Decimal Misconceptions: Strategic Instructional Choices

    ERIC Educational Resources Information Center

    Griffin, Linda B.

    2016-01-01

    Understanding the decimal system is challenging, requiring coordination of place-value concepts with features of whole-number and fraction knowledge (Moloney and Stacey 1997). Moreover, the learner must discern if and how previously learned concepts and procedures apply. The process is complex, and misconceptions will naturally arise. In a…

  11. Institutionalizing Equitable Policies and Practices for Contingent Faculty

    ERIC Educational Resources Information Center

    Kezar, Adrianna; Sam, Cecile

    2013-01-01

    This study is a qualitative inquiry into the institutionalization of equitable policies for non-tenure-track faculty. Through the theoretical framework of institutionalization, we examine factors and strategies forwarding various policies and practices and the challenges that arise. The results highlight themes throughout the stages of…

  12. Minority Adolescent Stress and Coping

    ERIC Educational Resources Information Center

    Gonzales, Nancy A.; George, Preethy E.; Fernandez, Aida Cristina; Huerta, Violeta L.

    2005-01-01

    Many of the stressful life events and daily hassles of adolescence are similar for youths despite differences in cultural background or place of residence. However, adolescents from diverse cultural groups often encounter unique challenges that arise from the particular cultural-ecological niches they inhabit by virtue of their ethnic group…

  13. MODIS EVI as a Surrogate for Net Primary Production across Precipitation Regimes

    USDA-ARS?s Scientific Manuscript database

    According to Global Climate Models (GCMs) the occurrence of extreme events of precipitation will be more frequent in the future. Therefore, important challenges arise regarding climate variability, which are mainly related to the understanding of ecosystem responses to changes in precipitation patte...

  14. Subscriptions Are Us: Content, Access, & Collections

    ERIC Educational Resources Information Center

    Thomas, Lisa Carlucci

    2012-01-01

    In a time of increasingly digital distribution, challenging questions arise regarding what people own, what they want to access to, and how they develop and maintain collections. What considerations influence their decision making, as individuals and libraries shift toward more subscription-oriented content? Digital access to e-books and…

  15. Primary Teachers, Policy, and Physical Education

    ERIC Educational Resources Information Center

    Petrie, Kirsten; lisahunter,

    2011-01-01

    This article focuses on the challenges arising for primary school teachers who have responsibility for teaching physical education (PE) and who are working in particularly complex and contestable policy contexts. In New Zealand provision of physical education is identified as occurring amidst multiple, and not necessarily compatible, sets of…

  16. The Awareness and Challenges of Cloud Computing Adoption on Tertiary Education in Malaysia

    NASA Astrophysics Data System (ADS)

    Hazreeni Hamzah, Nor; Mahmud, Maziah; Zukri, Shamsunarnie Mohamed; Yaacob, Wan Fairos Wan; Yacob, Jusoh

    2017-09-01

    This preliminary study aims to investigate awareness of the adoption of cloud computing among academicians in tertiary education in Malaysia. The study also explores the possible challenges faced by academicians while adopting this new technology. The pilot study was conducted with 40 lecturers at Universiti Teknologi MARA Kampus Kota Bharu (UiTMKB) using a self-administered questionnaire. The results found that almost half (40 percent) were not aware of the existence of cloud computing in the teaching and learning (T&L) process. The challenges confronting the adoption of cloud computing are data insecurity, unsolicited advertisement, lock-in, reluctance to eliminate staff positions, privacy concerns, reliability challenges, regulatory compliance concerns/user control, and institutional culture/resistance to change in technology. These possible challenges can be grouped into two major factors: security and dependency, and user control and mentality.

  17. Perioperative management of patients with severe hypophosphataemia secondary to oncogenic osteomalacia: Our experience and review of literature

    PubMed Central

    Verma, Alka; Tewari, Saipriya; Kannaujia, Ashish

    2017-01-01

    Oncogenic osteomalacia (OOM) is a rare paraneoplastic syndrome associated with mesenchymal tumours. It is characterised by phosphaturia, hypophosphataemia, decreased serum Vitamin D3 levels and severe osteomalacia. OOM-inducing tumours are usually benign, arising either from bone or soft tissue, with the extremities and craniofacial region being the most common sites. Surgical resection of the tumour remains the mainstay of treatment. Challenges for the anaesthesiologist arise when such patients are scheduled for surgical resection of the underlying tumour. All the perioperative dilemmas are directly related to the severe hypophosphataemia. We describe three such cases of OOM and their perioperative management. PMID:28794533

  18. Hydatid cyst of urinary bladder associated with pregnancy: a case report.

    PubMed

    Kanagal, Deepa V; Hanumanalu, Lokeshchandra C

    2010-07-01

    Echinococcosis, or hydatid disease, which is caused by the Echinococcus group of cestodes, is very rare in pregnancy. While the liver and lungs are commonly involved, other sites are rarely affected. The management of hydatid disease in pregnancy is challenging in view of its varied presentation and manifestation. We report a case of a hydatid cyst arising from the bladder, associated with pregnancy and presenting with abdominal pain. The cyst was surgically removed and a povidone-iodine bladder wash was given. The postoperative recovery was uneventful, with an ongoing pregnancy. This is, to our knowledge, the first reported case of a hydatid cyst arising from the bladder in association with pregnancy.

  19. Good, now keep going: challenging the status quo in STEM pipeline and access programs

    NASA Astrophysics Data System (ADS)

    Wiseman, Dawn; Herrmann, Randy

    2018-03-01

    This contribution engages in conversation with McMahon, Griese, and Kenyon (this issue) to consider how the SURE program they describe represents a pragmatic approach to addressing the issue of underrepresentation of Indigenous people in STEM post-secondary programs. We explore how such programs are generally positioned and how they might be positioned differently to challenge the status quo within Western post-secondary institutions. The challenge arises from moving beyond the immediate pragmatics of addressing an identifiable issue framed as a problem to considering how post-secondary institutions and people developing access recruitment programs might begin unlearning colonialism.

  20. Pharmacological Fingerprints of Contextual Uncertainty

    PubMed Central

    Ruge, Diane; Stephan, Klaas E.

    2016-01-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses. PMID:27846219
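
    A much-simplified stand-in for the kind of uncertainty computation described above is a beta-Bernoulli tracker of an event probability, with the posterior standard deviation as one read-out of uncertainty. The sketch below, with a hidden contextual switch halfway through, is a toy illustration only, not the hierarchical Bayesian model used in the study.

      import numpy as np

      rng = np.random.default_rng(0)

      a, b = 1.0, 1.0                    # uniform Beta prior
      p_true = 0.8
      for trial in range(200):
          if trial == 100:               # hidden contextual switch
              p_true = 0.2
          x = float(rng.random() < p_true)
          a, b = a + x, b + (1.0 - x)    # conjugate Beta update
      mean = a / (a + b)                 # belief about the event probability
      sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))  # posterior sd
      print(f"belief {mean:.2f}, uncertainty (sd) {sd:.3f}")

    A fixed-evidence tracker like this responds sluggishly after the switch; discounting the counts on each trial (a forgetting factor) is one standard way to let the estimate re-adapt.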

  1. A Stochastic Spiking Neural Network for Virtual Screening.

    PubMed

    Morro, A; Canals, V; Oliver, A; Alomar, M L; Galan-Prado, F; Ballester, P J; Rossello, J L

    2018-04-01

    Virtual screening (VS) has become a key computational tool in early drug design and screening performance is of high relevance due to the large volume of data that must be processed to identify molecules with the sought activity-related pattern. At the same time, the hardware implementations of spiking neural networks (SNNs) arise as an emerging computing technique that can be applied to parallelize processes that normally present a high cost in terms of computing time and power. Consequently, SNN represents an attractive alternative to perform time-consuming processing tasks, such as VS. In this brief, we present a smart stochastic spiking neural architecture that implements the ultrafast shape recognition (USR) algorithm, achieving two orders of magnitude of speed improvement with respect to USR software implementations. The neural system is implemented in hardware using field-programmable gate arrays, allowing a highly parallelized USR implementation. The results show that, due to the high parallelization of the system, millions of compounds can be checked in reasonable times. From these results, we can state that the proposed architecture arises as a feasible methodology to efficiently enhance time-consuming data-mining processes such as 3-D molecular similarity search.
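
    For reference, the USR signature that the hardware parallelizes can be computed in a few lines. The sketch below uses one common formulation of the descriptor (mean, standard deviation, and cube-rooted third central moment of the distances to four reference points) and random coordinates in place of real conformers.

      import numpy as np

      def usr_descriptor(coords):
          # 12-number USR shape signature: first three moments of the
          # distance distributions to four reference points.
          ctd = coords.mean(axis=0)                     # centroid
          d_ctd = np.linalg.norm(coords - ctd, axis=1)
          cst = coords[d_ctd.argmin()]                  # closest to centroid
          fct = coords[d_ctd.argmax()]                  # farthest from centroid
          d_fct = np.linalg.norm(coords - fct, axis=1)
          ftf = coords[d_fct.argmax()]                  # farthest from fct
          desc = []
          for ref in (ctd, cst, fct, ftf):
              d = np.linalg.norm(coords - ref, axis=1)
              mu = d.mean()
              desc += [mu, d.std(), np.cbrt(((d - mu) ** 3).mean())]
          return np.array(desc)

      def usr_similarity(a, b):
          # Score in (0, 1]; 1 means identical signatures.
          return 1.0 / (1.0 + np.abs(a - b).mean())

      rng = np.random.default_rng(0)
      mol1 = rng.normal(size=(30, 3))               # fake "conformer"
      mol2 = mol1 + 0.01 * rng.normal(size=(30, 3))
      print(usr_similarity(usr_descriptor(mol1), usr_descriptor(mol2)))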

  2. Challenges and Security in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Chang, Hyokyung; Choi, Euiin

    People want problems solved as soon as they arise. Ubiquitous computing is an IT paradigm intended to make this easier, and cloud computing is the technology that makes it even better and more powerful. Cloud computing, however, is still at an early stage of implementation and use, and it faces many challenges in technical matters and security issues. This paper looks at cloud computing security.

  3. Beyond the Workshop: Educational Policy in Situated Practice.

    ERIC Educational Resources Information Center

    Jenson, Jennifer; Lewis, Brian

    2001-01-01

    Identifies questions arising from implementation of computer-based technologies in Canadian schools--questions of public policy in an increasingly technocentric and commercialized environment, of investment in technological infrastructure, and of teachers' professional development and its effectiveness. Lists necessary factors for the success of…

  4. Towards a Computational Model of Sketching

    DTIC Science & Technology

    2000-01-01

    …interaction that sketching provides in human-to-human communication, multimodal research will rely heavily upon, and even drive, AI research. […] Dimensions of sketching: the power of sketching in human communication arises from the high bandwidth it provides [21]. There is high perceptual…

  5. An efficient numerical method for solving the Boltzmann equation in multidimensions

    NASA Astrophysics Data System (ADS)

    Dimarco, Giacomo; Loubère, Raphaël; Narski, Jacek; Rey, Thomas

    2018-01-01

    In this paper we deal with the extension of the Fast Kinetic Scheme (FKS) (Dimarco and Loubère, 2013 [26]), originally constructed for solving the BGK equation, to the more challenging case of the Boltzmann equation. The scheme combines a robust and fast method for treating the transport part, based on an innovative Lagrangian technique, supplemented with conservative fast spectral schemes to treat the collisional operator by means of an operator splitting approach. This approach, along with several implementation features related to the parallelization of the algorithm, permits the construction of an efficient simulation tool which is numerically tested against exact and reference solutions on classical problems arising in rarefied gas dynamics. We present results up to the 3D×3D case for unsteady flows for the Variable Hard Sphere model, which may serve as a benchmark for future comparisons between different numerical methods for solving the multidimensional Boltzmann equation. For this reason, we also provide for each problem studied details on the computational cost and memory consumption as well as comparisons with the BGK model or the limit model of compressible Euler equations.
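
    The transport/collision operator-splitting structure can be shown on a toy one-dimensional relaxation model. The sketch below Strang-splits first-order upwind advection and exact exponential relaxation toward a fixed equilibrium; it is a schematic stand-in assuming a scalar BGK-like equation, not the Boltzmann collision operator or the FKS Lagrangian transport.

      import numpy as np

      def transport(f, v, dx, dt):
          # First-order upwind step for f_t + v f_x = 0 (periodic domain).
          if v >= 0:
              return f - v * dt / dx * (f - np.roll(f, 1))
          return f - v * dt / dx * (np.roll(f, -1) - f)

      def relax(f, f_eq, tau, dt):
          # Exact solution of f_t = (f_eq - f) / tau over one step.
          return f_eq + (f - f_eq) * np.exp(-dt / tau)

      def strang_step(f, f_eq, v, dx, dt, tau):
          # Half transport, full relaxation, half transport: second order.
          f = transport(f, v, dx, dt / 2)
          f = relax(f, f_eq, tau, dt)
          return transport(f, v, dx, dt / 2)

      n, L = 200, 1.0
      dx = L / n
      x = np.arange(n) * dx
      f = np.exp(-200 * (x - 0.3) ** 2)     # initial profile
      f_eq = np.full(n, f.mean())           # toy fixed "equilibrium"
      for _ in range(100):
          f = strang_step(f, f_eq, v=1.0, dx=dx, dt=0.5 * dx, tau=0.05)
      print(f.min(), f.max())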

  6. Syllables and bigrams: orthographic redundancy and syllabic units affect visual word recognition at different processing levels.

    PubMed

    Conrad, Markus; Carreiras, Manuel; Tamm, Sascha; Jacobs, Arthur M

    2009-04-01

    Over the last decade, there has been increasing evidence for syllabic processing during visual word recognition. If syllabic effects prove to be independent from orthographic redundancy, this would seriously challenge the ability of current computational models to account for the processing of polysyllabic words. Three experiments are presented to disentangle effects of the frequency of syllabic units and orthographic segments in lexical decision. In Experiment 1 the authors obtained an inhibitory syllable frequency effect that was unaffected by the presence or absence of a bigram trough at the syllable boundary. In Experiments 2 and 3 an inhibitory effect of initial syllable frequency but a facilitative effect of initial bigram frequency emerged when manipulating 1 of the 2 measures and controlling for the other in Spanish words starting with consonant-vowel syllables. The authors conclude that effects of syllable frequency and letter-cluster frequency are independent and arise at different processing levels of visual word recognition. Results are discussed within the framework of an interactive activation model of visual word recognition. (c) 2009 APA, all rights reserved.

  7. Exact Open Quantum System Dynamics Using the Hierarchy of Pure States (HOPS).

    PubMed

    Hartmann, Richard; Strunz, Walter T

    2017-12-12

    We show that the general and numerically exact Hierarchy of Pure States method (HOPS) is very well applicable to calculate the reduced dynamics of an open quantum system. In particular, we focus on environments with a sub-Ohmic spectral density (SD) resulting in an algebraic decay of the bath correlation function (BCF). The universal applicability of HOPS, reaching from weak to strong coupling for zero and nonzero temperature, is demonstrated by solving the spin-boson model for which we find perfect agreement with other methods, each one suitable for a special regime of parameters. The challenges arising in the strong coupling regime are not only reflected in the computational effort needed for the HOPS method to converge but also in the necessity for an importance sampling mechanism, accounted for by the nonlinear variant of HOPS. In order to include nonzero-temperature effects in the strong coupling regime we found that it is highly favorable for the HOPS method to use the zero-temperature BCF and include temperature via a stochastic Hermitian contribution to the system Hamiltonian.

  8. Electrical stimulus artifact cancellation and neural spike detection on large multi-electrode arrays

    PubMed Central

    Grosberg, Lauren E.; Madugula, Sasidhar; Litke, Alan; Cunningham, John; Chichilnisky, E. J.; Paninski, Liam

    2017-01-01

    Simultaneous electrical stimulation and recording using multi-electrode arrays can provide a valuable technique for studying circuit connectivity and engineering neural interfaces. However, interpreting these measurements is challenging because the spike sorting process (identifying and segregating action potentials arising from different neurons) is greatly complicated by electrical stimulation artifacts across the array, which can exhibit complex and nonlinear waveforms, and overlap temporally with evoked spikes. Here we develop a scalable algorithm based on a structured Gaussian Process model to estimate the artifact and identify evoked spikes. The effectiveness of our methods is demonstrated in both real and simulated 512-electrode recordings in the peripheral primate retina with single-electrode and several types of multi-electrode stimulation. We establish small error rates in the identification of evoked spikes, with a computational complexity that is compatible with real-time data analysis. This technology may be helpful in the design of future high-resolution sensory prostheses based on tailored stimulation (e.g., retinal prostheses), and for closed-loop neural stimulation at a much larger scale than currently possible. PMID:29131818
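
    To make the artifact/spike separation problem concrete, here is a deliberately naive Python stand-in: the artifact is estimated by the across-trial average (valid here only because the simulated spikes occur at jittered latencies and the artifact is trial-invariant, both simplifying assumptions), and spike times are then read off the residuals. The paper's structured Gaussian Process model replaces this averaging with a principled, scalable estimator.

      import numpy as np

      rng = np.random.default_rng(1)
      n_trials, n_samples = 50, 120
      t = np.arange(n_samples, dtype=float)

      # Stimulus artifact, identical on every trial (a simplifying
      # assumption; real artifacts vary with stimulus and electrode).
      artifact = 60 * np.exp(-t / 15.0) * np.sin(2 * np.pi * t / 20.0)

      # Evoked spikes: a stereotyped waveform at a jittered latency.
      spike_shape = np.array([-8.0, -35.0, -55.0, -20.0, 12.0, 6.0])
      traces = np.tile(artifact, (n_trials, 1))
      traces += rng.normal(0, 2, (n_trials, n_samples))
      latencies = rng.integers(30, 80, n_trials)
      for i, lat in enumerate(latencies):
          traces[i, lat:lat + spike_shape.size] += spike_shape

      # Naive artifact estimate: the across-trial average (jittered
      # spikes largely average out). Subtract it, then read the spike
      # time off the residual trough.
      artifact_hat = traces.mean(axis=0)
      residuals = traces - artifact_hat
      detected = residuals.argmin(axis=1) - 2   # trough sits 2 samples in
      print("fraction within 2 samples:",
            np.mean(np.abs(detected - latencies) <= 2))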

  9. Virtual slide telepathology workstation of the future: lessons learned from teleradiology.

    PubMed

    Krupinski, Elizabeth A

    2009-08-01

    The clinical reading environment for the 21st century pathologist looks very different than it did even a few short years ago. Glass slides are quickly being replaced by digital "virtual slides," and the traditional light microscope is being replaced by the computer display. Numerous questions arise, however, when deciding exactly what this new digital display viewing environment will be like. Choosing a workstation for daily use in the interpretation of digital pathology images can be a very daunting task. Radiology went digital nearly 20 years ago and faced many of the same challenges, so there are lessons to be learned from these experiences. One major lesson is that there is no "one size fits all" workstation, so users must consider a variety of factors when choosing a workstation. In this article, we summarize some of the potentially critical elements in a pathology workstation and the characteristics one should be aware of and look for in the selection of one. Issues pertaining to both hardware and software aspects of medical workstations will be reviewed, particularly as they may impact the interpretation process.

  10. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
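
    To illustrate the kind of genotype-pattern filtering Gigwa exposes, the sketch below runs a variant query directly against MongoDB with pymongo, assuming an instance on localhost. The database, collection, and field names are hypothetical, invented for illustration; Gigwa's actual schema and its web interface differ.

      from pymongo import MongoClient

      # Collection and field names below are hypothetical, for
      # illustration only; Gigwa's actual MongoDB layout differs.
      client = MongoClient("mongodb://localhost:27017")
      variants = client["genotypes_db"]["variants"]

      # SNPs on chromosome 1 with a missense annotation where sample S1
      # is homozygous for the alternate allele.
      query = {
          "type": "SNP",
          "chrom": "1",
          "annotations.effect": "missense_variant",
          "genotypes.S1": "1/1",
      }
      for doc in variants.find(query).limit(10):
          print(doc["chrom"], doc["pos"], doc["ref"], doc["alt"])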

  11. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.
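
    A minimal end-to-end instance of the surrogate-based loop (design of experiments, surrogate construction, optimization of the surrogate) can be written with scikit-learn and SciPy. The one-dimensional "expensive" objective below is a made-up stand-in for a high-fidelity simulation, and the Gaussian-process surrogate is one choice among many discussed in the paper.

      import numpy as np
      from scipy.optimize import minimize
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_objective(x):
          # Stand-in for a costly high-fidelity simulation (made up).
          return np.sin(3 * x) + 0.5 * x**2

      # 1) Design of experiments: sample the expensive model sparsely.
      X = np.linspace(-2, 2, 8).reshape(-1, 1)
      y = expensive_objective(X).ravel()

      # 2) Construct the surrogate from the high-fidelity data.
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-8)
      gp.fit(X, y)

      # 3) Optimize the cheap surrogate instead of the expensive model.
      res = minimize(lambda x: gp.predict(np.atleast_2d(x))[0],
                     x0=[0.0], bounds=[(-2, 2)])
      print("surrogate minimum near x =", res.x[0])

    In practice the loop is iterated: the surrogate's proposed optimum is evaluated with the high-fidelity model, added to the design, and the surrogate is refit.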

  12. Final Technical Report: Mathematical Foundations for Uncertainty Quantification in Materials Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plechac, Petr; Vlachos, Dionisios G.

    We developed path-wise information theory-based and goal-oriented sensitivity analysis and parameter identification methods for complex high-dimensional dynamics, in particular for non-equilibrium extended molecular systems. The combination of these novel methodologies provided the first methods in the literature capable of handling UQ questions for stochastic complex systems with some or all of the following features: (a) multi-scale stochastic models such as (bio)chemical reaction networks, with a very large number of parameters, (b) spatially distributed systems such as Kinetic Monte Carlo or Langevin Dynamics, (c) non-equilibrium processes typically associated with coupled physico-chemical mechanisms, driven boundary conditions, hybrid micro-macro systems, etc. A particular computational challenge arises in simulations of multi-scale reaction networks and molecular systems. Mathematical techniques were applied to in silico prediction of novel materials with emphasis on the effect of microstructure on model uncertainty quantification (UQ). We outline acceleration methods to make calculations of real chemistry feasible, followed by two complementary tasks on structure optimization and microstructure-induced UQ.

  13. The neural mechanisms of learning from competitors.

    PubMed

    Howard-Jones, Paul A; Bogacz, Rafal; Yoo, Jee H; Leonards, Ute; Demetriou, Skevi

    2010-11-01

    Learning from competitors poses a challenge for existing theories of reward-based learning, which assume that rewarded actions are more likely to be executed in the future. Such a learning mechanism would disadvantage a player in a competitive situation because, since the competitor's loss is the player's gain, reward might become associated with an action the player should themselves avoid. Using fMRI, we investigated the neural activity of humans competing with a computer in a foraging task. We observed neural activity that represented the variables required for learning from competitors: the actions of the competitor (in the player's motor and premotor cortex) and the reward prediction error arising from the competitor's feedback. In particular, regions positively correlated with the unexpected loss of the competitor (which was beneficial to the player) included the striatum and those regions previously implicated in response inhibition. Our results suggest that learning in such contexts may involve the competitor's unexpected losses activating regions of the player's brain that subserve response inhibition, as the player learns to avoid the actions that produced them. Copyright 2010 Elsevier Inc. All rights reserved.
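
    The credit-assignment point can be made with a toy two-site foraging simulation: naively reinforcing the player's own payoff (the competitor's loss) would teach the player to avoid exactly the sites worth visiting, so learning must instead track the competitor's outcome. The payoff probabilities and learning rate below are arbitrary illustration values, not the task parameters of the study.

      import numpy as np

      rng = np.random.default_rng(0)
      p_reward = np.array([0.2, 0.8])  # hidden payoff of two foraging sites
      V = np.zeros(2)                  # player's estimate of site quality
      alpha = 0.1                      # learning rate

      for _ in range(500):
          a = rng.integers(2)                    # competitor picks a site
          r = float(rng.random() < p_reward[a])  # competitor's outcome
          # The player's own payoff is the competitor's loss (1 - r);
          # reinforcing that would teach the player to avoid good sites.
          # Learning must instead track the competitor's reward:
          V[a] += alpha * (r - V[a])             # prediction error on the
                                                 # competitor's outcome
      print(V, "-> the better site is", V.argmax())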

  14. Data Quality in Remote Sensing

    NASA Astrophysics Data System (ADS)

    Batini, C.; Blaschke, T.; Lang, S.; Albrecht, F.; Abdulmutalib, H. M.; Barsi, Á.; Szabó, G.; Kugler, Zs.

    2017-09-01

    The issue of data quality (DQ) is of growing importance in Remote Sensing (RS), due to the widespread use of digital services (incl. apps) that exploit remote sensing data. In this position paper a body of experts from the ISPRS Intercommission working group III/IVb "DQ" identifies, categorises and reasons about issues that are considered crucial for a RS research and application agenda. This ISPRS initiative builds on earlier work by other organisations such as IEEE, CEOS or GEO, in particular on the meritorious work of the Quality Assurance Framework for Earth Observation (QA4EO), which was established and endorsed by the Committee on Earth Observation Satellites (CEOS), but aims to broaden the view by including experts from computer science and particularly database science. The main activities and outcomes include: providing a taxonomy of DQ dimensions in the RS domain, achieving a global approach to DQ for heterogeneous-format RS data sets, investigating DQ dimensions in use, conceiving a methodology for managing cost-effective solutions on DQ in RS initiatives, and addressing future challenges on RS DQ dimensions arising in the new era of big Earth data.

  15. Protein complexes, big data, machine learning and integrative proteomics: lessons learned over a decade of systematic analysis of protein interaction networks.

    PubMed

    Havugimana, Pierre C; Hu, Pingzhao; Emili, Andrew

    2017-10-01

    Elucidation of the networks of physical (functional) interactions present in cells and tissues is fundamental for understanding the molecular organization of biological systems, the mechanistic basis of essential and disease-related processes, and for functional annotation of previously uncharacterized proteins (via guilt-by-association or -correlation). After a decade in the field, we felt it timely to document our own experiences in the systematic analysis of protein interaction networks. Areas covered: Researchers worldwide have contributed innovative experimental and computational approaches that have driven the rapidly evolving field of 'functional proteomics'. These include mass spectrometry-based methods to characterize macromolecular complexes on a global scale and sophisticated data analysis tools - most notably machine learning - that allow for the generation of high-quality protein association maps. Expert commentary: Here, we recount some key lessons learned, with an emphasis on successful workflows and on challenges arising from our own and other groups' ongoing efforts to generate, interpret and report proteome-scale interaction networks in increasingly diverse biological contexts.

  16. Electrical stimulus artifact cancellation and neural spike detection on large multi-electrode arrays.

    PubMed

    Mena, Gonzalo E; Grosberg, Lauren E; Madugula, Sasidhar; Hottowy, Paweł; Litke, Alan; Cunningham, John; Chichilnisky, E J; Paninski, Liam

    2017-11-01

    Simultaneous electrical stimulation and recording using multi-electrode arrays can provide a valuable technique for studying circuit connectivity and engineering neural interfaces. However, interpreting these measurements is challenging because the spike sorting process (identifying and segregating action potentials arising from different neurons) is greatly complicated by electrical stimulation artifacts across the array, which can exhibit complex and nonlinear waveforms, and overlap temporally with evoked spikes. Here we develop a scalable algorithm based on a structured Gaussian Process model to estimate the artifact and identify evoked spikes. The effectiveness of our methods is demonstrated in both real and simulated 512-electrode recordings in the peripheral primate retina with single-electrode and several types of multi-electrode stimulation. We establish small error rates in the identification of evoked spikes, with a computational complexity that is compatible with real-time data analysis. This technology may be helpful in the design of future high-resolution sensory prostheses based on tailored stimulation (e.g., retinal prostheses), and for closed-loop neural stimulation at a much larger scale than currently possible.

  17. Fully coupled methods for multiphase morphodynamics

    NASA Astrophysics Data System (ADS)

    Michoski, C.; Dawson, C.; Mirabito, C.; Kubatko, E. J.; Wirasaet, D.; Westerink, J. J.

    2013-09-01

    We present numerical methods for a system of equations consisting of the two dimensional Saint-Venant shallow water equations (SWEs) fully coupled to a completely generalized Exner formulation of hydrodynamically driven sediment discharge. This formulation is implemented by way of a discontinuous Galerkin (DG) finite element method, using a Roe Flux for the advective components and the unified form for the dissipative components. We implement a number of Runge-Kutta time integrators, including a family of strong stability preserving (SSP) schemes, and Runge-Kutta Chebyshev (RKC) methods. A brief discussion is provided regarding implementational details for generalizable computer algebra tokenization using arbitrary algebraic fluxes. We then run numerical experiments to show standard convergence rates, and discuss important mathematical and numerical nuances that arise due to prominent features in the coupled system, such as the emergence of nondifferentiable and sharp zero crossing functions, radii of convergence in manufactured solutions, and nonconservative product (NCP) formalisms. Finally we present a challenging application model concerning hydrothermal venting across metalliferous muds in the presence of chemical reactions occurring in low pH environments.
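
    For readers unfamiliar with the advective building block, the sketch below implements an interface flux for the 1D shallow water equations, using the simpler Rusanov (local Lax-Friedrichs) flux as a stand-in for the Roe flux of the paper and omitting the Exner sediment coupling entirely.

      import numpy as np

      g = 9.81

      def swe_flux(U):
          # Physical flux of the 1D shallow water equations, U = (h, hu).
          h, hu = U
          u = hu / h
          return np.array([hu, hu * u + 0.5 * g * h**2])

      def rusanov(UL, UR):
          # Local Lax-Friedrichs interface flux: central average plus
          # dissipation scaled by the fastest wave speed |u| + sqrt(g h).
          aL = abs(UL[1] / UL[0]) + np.sqrt(g * UL[0])
          aR = abs(UR[1] / UR[0]) + np.sqrt(g * UR[0])
          s = max(aL, aR)
          return 0.5 * (swe_flux(UL) + swe_flux(UR)) - 0.5 * s * (UR - UL)

      # Dam-break interface: deep water at rest meets shallow water at rest.
      print(rusanov(np.array([2.0, 0.0]), np.array([1.0, 0.0])))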

  18. Acquiring a 2D rolled equivalent fingerprint image from a non-contact 3D finger scan

    NASA Astrophysics Data System (ADS)

    Fatehpuria, Abhishika; Lau, Daniel L.; Hassebrook, Laurence G.

    2006-04-01

    The use of fingerprints as a biometric is both the oldest mode of computer aided personal identification and the most relied-upon technology in use today. But current fingerprint scanning systems have some challenging and peculiar difficulties. Often skin conditions and imperfect acquisition circumstances cause the captured fingerprint image to be far from ideal. Also some of the acquisition techniques can be slow and cumbersome to use and may not provide the complete information required for reliable feature extraction and fingerprint matching. Most of the difficulties arise due to the contact of the fingerprint surface with the sensor platen. To attain a fast-capture, non-contact, fingerprint scanning technology, we are developing a scanning system that employs structured light illumination as a means for acquiring a 3-D scan of the finger with sufficiently high resolution to record ridge-level details. In this paper, we describe the postprocessing steps used for converting the acquired 3-D scan of the subject's finger into a 2-D rolled equivalent image.

  19. Advancing the Theory of Nuclear Reactions with Rare Isotopes. From the Laboratory to the Cosmos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunes, Filomena

    2015-06-01

    The mission of the Topical Collaboration on the Theory of Reactions for Unstable iSotopes (TORUS) was to develop new methods to advance nuclear reaction theory for unstable isotopes—particularly the (d,p) reaction in which a deuteron, composed of a proton and a neutron, transfers its neutron to an unstable nucleus. After benchmarking the state-of-the-art theories, the TORUS collaboration found that there were no exact methods to study (d,p) reactions involving heavy targets; the difficulty arises from the long-range nature of the well known, yet subtle, Coulomb force. To overcome this challenge, the TORUS collaboration developed a new theory in which the complexity of treating the long-range Coulomb interaction is shifted to the calculation of so-called form factors. An efficient implementation for the computation of these form factors was a major achievement of the TORUS collaboration. All the new machinery developed is an essential ingredient for analysing (d,p) reactions involving heavy nuclei relevant for astrophysics, energy production, and stockpile stewardship.

  20. Analysis of Inlet-Compressor Acoustic Interactions Using Coupled CFD Codes

    NASA Technical Reports Server (NTRS)

    Suresh, A.; Townsend, S. E.; Cole, G. L.; Slater, J. W.; Chima, R.

    1998-01-01

    A problem that arises in the numerical simulation of supersonic inlets is the lack of a suitable boundary condition at the engine face. In this paper, a coupled approach, in which the inlet computation is coupled dynamically to a turbomachinery computation, is proposed as a means to overcome this problem. The specific application chosen for validation of this approach is the collapsing bump experiment performed at the University of Cincinnati. The computed results are found to be in reasonable agreement with experimental results. The coupled simulation results could also be used to aid development of a simplified boundary condition.

  1. Relative motion using analytical differential gravity

    NASA Technical Reports Server (NTRS)

    Gottlieb, Robert G.

    1988-01-01

    This paper presents a new approach to the computation of the motion of one satellite relative to another. The trajectory of the reference satellite is computed accurately subject to geopotential perturbations. This precise trajectory is used as a reference in computing the position of a nearby body, or bodies. The problem that arises in this approach is differencing nearly equal terms in the geopotential model, especially as the separation of the reference and nearby bodies approaches zero. By developing closed form expressions for differences in higher order and degree geopotential terms, the numerical problem inherent in the differencing approach is eliminated.
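
    The numerical hazard, and the closed-form fix, can be seen in miniature by differencing two point-mass potential terms mu/r at nearby radii. The constants below (Earth's GM, a 7000 km reference radius, a millimetre offset) are illustrative only; the paper's closed forms cover differences of full higher-degree and -order geopotential terms.

      mu = 398600.4418   # km^3/s^2, Earth's GM (illustrative use)
      r = 7000.0         # km, reference radius
      dr = 1e-6          # km, a 1 mm separation

      # Naive differencing of nearly equal potential terms loses digits.
      naive = mu / r - mu / (r + dr)

      # Algebraically equivalent closed form with no cancellation.
      closed = mu * dr / (r * (r + dr))

      print(f"naive : {naive:.15e}")
      print(f"closed: {closed:.15e}")

    As the separation shrinks, the naive difference retains only a few significant digits, while the algebraically rearranged form stays accurate to machine precision.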

  2. Automatic editing of manuals

    NASA Technical Reports Server (NTRS)

    Rich, R. P.

    1970-01-01

    The paper discusses the documentation problem that arises in preparing, in a timely fashion, the many items associated with a computer program, and in keeping them all correct and mutually consistent during the life of the program. The proposed approach to the problem is to collect all the necessary information into a single document, which is maintained with computer assistance during the life of the program and from which the required subdocuments can be extracted as desired. Implementation of this approach requires a package of programs for computer editorial assistance and is facilitated by certain programming practices that are discussed.

  3. Solving difficult problems creatively: a role for energy optimised deterministic/stochastic hybrid computing

    PubMed Central

    Palmer, Tim N.; O’Shea, Michael

    2015-01-01

    How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173

  4. Distributed computing environments for future space control systems

    NASA Technical Reports Server (NTRS)

    Viallefont, Pierre

    1993-01-01

    The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.

  5. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    NASA Technical Reports Server (NTRS)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  6. Strategies, Challenges and Prospects for Active Learning in the Computer-Based Classroom

    ERIC Educational Resources Information Center

    Holbert, K. E.; Karady, G. G.

    2009-01-01

    The introduction of computer-equipped classrooms into engineering education has brought with it a host of opportunities and issues. Herein, some of the challenges and successes for creating an environment for active learning within computer-based classrooms are described. The particular teaching approach developed for undergraduate electrical…

  7. The Effects of Race Conditions when Implementing Single-Source Redundant Clock Trees in Triple Modular Redundant Synchronous Architectures

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth A.; Pellish, Jonathan

    2016-01-01

    We present the challenges that arise from clock skew when redundant clock domains are used. Heavy-ion radiation data show that a single clock domain (DTMR) provides an improved TMR methodology for SRAM-based FPGAs compared with redundant clocks.

  8. Assessing the Quality of Expertise Differences in the Comprehension of Medical Visualizations

    ERIC Educational Resources Information Center

    Gegenfurtner, Andreas; Siewiorek, Anna; Lehtinen, Erno; Saljo, Roger

    2013-01-01

    Understanding how best to assess expertise, the situational variations of expertise, and the distinctive qualities of expertise that arise from particular workplace experiences presents an important challenge. Certainly, at this time, there is much interest in identifying standard occupational measures and competences, which are not well aligned…

  9. An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders

    ERIC Educational Resources Information Center

    Venker, Courtney E.; Kover, Sara T.

    2015-01-01

    Purpose: Eye-gaze methods have the potential to advance the study of neurodevelopmental disorders. Despite their increasing use, challenges arise in using these methods with individuals with neurodevelopmental disorders and in reporting sufficient methodological detail such that the resulting research is replicable and interpretable. Method: This…

  10. Religious Education and Religious Pluralism in the New Africa

    ERIC Educational Resources Information Center

    Asamoah-Gyadu, J. Kwabena

    2010-01-01

    This article examines some of the pertinent challenges arising out of personal experiences encountered through teaching religion and theology within an African environment. What the author describes as the "new Africa" in his title is a continent that has transitioned from slavery and colonialism into a global fraternity of democratic…

  11. Education and the Possibility of Outsider Understanding

    ERIC Educational Resources Information Center

    Bridges, David

    2009-01-01

    In education, issues to do with insider and outsider understanding arise in debates about religious education and about certain areas of research, and in arguments about education for international understanding. Here I challenge the dichotomy between insider and outsider, arguing that a more collectivist view of human identity combined with…

  12. Sustainability: Why the Language and Ethics of Sustainability Matter in the Geoscience Classroom

    ERIC Educational Resources Information Center

    Metzger, Ellen P.; Curren, Randall R.

    2017-01-01

    Because challenges to sustainability arise at the intersection of human and biophysical systems, they are inescapably embedded in social contexts and involve multiple stakeholders with diverse and often conflicting needs and value systems. Addressing complex and solution-resistant problems such as climate change, biodiversity loss, and…

  13. Digital Citizenship in the Afterschool Space: Implications for Education for Sustainable Development

    ERIC Educational Resources Information Center

    Howard, Patrick

    2015-01-01

    Education for sustainable development (ESD) challenges traditional curricula and formal schooling in important ways. ESD requires systemic thinking, interdisciplinarity and is strengthened through the contributions of all disciplines. As with any transformative societal and technological shift, new questions arise when educators are required to…

  14. Sawtooth Functions. Classroom Notes

    ERIC Educational Resources Information Center

    Hirst, Keith

    2004-01-01

    Using MAPLE enables students to consider many examples which would be very tedious to work out by hand. This applies to graph plotting as well as to algebraic manipulation. The challenge is to use these observations to develop the students' understanding of mathematical concepts. In this note an interesting relationship arising from inverse…
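    The record is truncated, so the exact relationship is not stated; a standard classroom example of a sawtooth built from inverse trigonometric functions, offered here only as a plausible illustration, is f(x) = arctan(tan x):

    ```python
    import math

    # arctan(tan x) climbs linearly with x, then drops by pi at each odd
    # multiple of pi/2 -- a sawtooth, easy to explore by plotting in MAPLE.
    for x in [0.0, 1.0, 1.5, 1.6, 3.0, 4.6, 4.8, 6.0]:
        print(f"x = {x:3.1f}   arctan(tan x) = {math.atan(math.tan(x)):+.3f}")
    ```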

  15. Open Educational Resources for Call Teacher Education: The iTILT Interactive Whiteboard Project

    ERIC Educational Resources Information Center

    Whyte, Shona; Schmid, Euline Cutrim; van Hazebrouck Thompson, Sanderin; Oberhofer, Margret

    2014-01-01

    This paper discusses challenges and opportunities arising during the development of open educational resources (OERs) to support communicative language teaching (CLT) with interactive whiteboards (IWBs). iTILT (interactive Technologies in Language Teaching), a European Lifelong Learning Project, has two main aims: (a) to promote "best…

  16. Seeing the Unseen: Molecular Visualization in Biology

    ERIC Educational Resources Information Center

    Finnan, Jeff; Taylor-Papp, Kim; Duran, Mesut

    2005-01-01

    In high school biology, students are challenged by many molecular concepts and structures. They meander through a number of molecular structures, some in macromolecular form: carbohydrates, amino acids, fatty acids, nucleotides. Student difficulties arise in part from an inability to visualize what they cannot easily see. Students struggle moving from…

  17. Creating Dialogue: The Role of Urban and Metropolitan Universities in Fostering Civil Society.

    ERIC Educational Resources Information Center

    Wong, Milton K.

    2003-01-01

    This keynote address encourages urban and metropolitan universities to be courageous and creative in meeting new social and economic challenges. Explores why universities, products of the Age of Reason, must now face the competitive pressures arising from a collapse of the Authority of Reason. (EV)

  18. Globalisation, Mergers and "Inadvertent Multi-Campus Universities": Reflections from Wales

    ERIC Educational Resources Information Center

    Zeeman, Nadine; Benneworth, Paul

    2017-01-01

    Multi-site universities face the challenge of integrating campuses that may have different profiles and orientations arising from place-specific attachments. Multi-campus universities created via mergers that seek to ensure long-term financial sustainability and to increase their attractiveness to students create a tension in campuses' purposes. We…

  19. Legal Challenges to Single-Sex Colleges Expected to Spread.

    ERIC Educational Resources Information Center

    Jaschik, Scott

    1990-01-01

    Court cases arising out of the current legal and political controversy over the Virginia Military Institute's policy of admitting only men are examined as they apply to the nation's three other publicly supported single-sex colleges: the Citadel (South Carolina), Mississippi University for Women, and Texas Woman's University. (DB)

  20. Adapting Manualized Treatments: Treating Anxiety Disorders among Native Americans

    ERIC Educational Resources Information Center

    De Coteau, Tami; Anderson, Jessiline; Hope, Debra

    2006-01-01

    Although there is a small but growing body of literature examining the psychopathology of anxiety among Native Americans, no data are available regarding the efficacy of empirically supported treatments for anxiety disorders among Native Americans. Moreover, exceptional challenges arise in adapting mainstream approaches to Native Americans, such…

  1. Convergence of Dynamic Vegetation Net Productivity Responses to Precipitation Variability from 10 Years of MODIS EVI

    USDA-ARS?s Scientific Manuscript database

    According to Global Climate Models (GCMs), extreme precipitation events will become more frequent in the future. Important challenges therefore arise regarding climate variability, mainly related to understanding ecosystem responses to changes in precipitation patte...

  2. School Transportation Issues, Laws and Concerns: Implications for Future Administrators

    ERIC Educational Resources Information Center

    Durick, Jody M.

    2010-01-01

    Nearly all building administrators are confronted with a variety of transportation issues. Challenges, concerns, and questions can arise from various aspects, ranging from student misbehaviors and transportation laws and their implications at the school level, to the importance and implementation of a school bus safety program. As new and upcoming future…

  3. Primary and Secondary Contamination Mechanisms in ASR Modeling and Design of Practical Management

    EPA Science Inventory

    Aquifer storage and recovery (ASR) is a useful water resource management option for water storage and reuse. Its increased use is recognized as an adaptation to the ever-increasing problem of water availability, both in timing and flow. Challenges in the ASR process may arise from...

  4. Critical Text Analysis: Linking Language and Cultural Studies

    ERIC Educational Resources Information Center

    Wharton, Sue

    2011-01-01

    Many UK universities offer degree programmes in English Language specifically for non-native speakers of English. Such programmes typically include not only language development but also development in various areas of content knowledge. A challenge that arises is to design courses in different areas that mutually support each other, thus…

  5. Inhibiting Intuitive Thinking in Mathematics Education

    ERIC Educational Resources Information Center

    Thomas, Michael O. J.

    2015-01-01

    The papers in this issue describe recent collaborative research into the role of inhibition of intuitive thinking in mathematics education. This commentary reflects on this research from a mathematics education perspective and draws attention to some of the challenges that arise in collaboration between research fields with different cultures,…

  6. Delivering the "Write" Message: The Memo and Transformational Leadership.

    ERIC Educational Resources Information Center

    Lamb, Bill; And Others

    Since the vast majority of conflict situations arise from a breakdown in the communication process, where one individual misunderstands something generated by another, the crucial challenge for leaders lies in avoiding misunderstandings. Leaders must remember, when formulating all sorts of correspondence, that when a "sender" forms a message and…

  7. Towards Constructions of Musical Childhoods: Diversity and Digital Technologies

    ERIC Educational Resources Information Center

    Young, Susan

    2009-01-01

    The changing economic, social, cultural and technological circumstances in which children live impact significantly on the ways in which early childhood is both viewed and experienced. Understanding the implications, the potentials, the challenges that arise as a consequence of the diversity and technological changes that characterise contemporary…

  8. Everyday Ethics in Dementia Day Care: Narratives of Crossing the Line.

    ERIC Educational Resources Information Center

    Hasselkus, Betty Risteen

    1997-01-01

    Examines the ethical aspects of the experience of providing day care to dementia patients. Results, based on telephone interviews (N=40), indicate that ethical challenges arise in everyday incidents when participants, staff, or family members "cross the line" of acceptable behavior. Staff responses ranged from benign manipulation to…

  9. Indigenous Autoethnography: Formulating Our Knowledge, Our Way

    ERIC Educational Resources Information Center

    Houston, Jennifer

    2007-01-01

    This paper seeks to engage the cultural interface where Indigenous knowledge meets Western academia, by questioning the validity of traditional research methods. Firstly, it is a response to the challenges facing Indigenous people confronted with the ethical and methodological issues arising from academic research. Secondly, it is a journey "into"…

  10. Conflict as a Catalyst for Learning

    ERIC Educational Resources Information Center

    Jehangir, Rashne R.

    2012-01-01

    The author challenges her students and herself to engage with tough issues like class, race, gender, disability, and homophobia. In this article, she discusses how she helps them learn from, and even embrace, the conflict that inevitably arises. Constructive management of classroom conflict begins with creating a cooperative learning environment…

  11. Consistent Query Answering of Conjunctive Queries under Primary Key Constraints

    ERIC Educational Resources Information Center

    Pema, Enela

    2014-01-01

    An inconsistent database is a database that violates one or more of its integrity constraints. In reality, violations of integrity constraints arise frequently under several different circumstances. Inconsistent databases have long posed the challenge of developing suitable tools for meaningful query answering. A principled approach for querying…
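    For intuition (a made-up toy instance, not drawn from the dissertation): under a primary key constraint, a repair keeps exactly one tuple per key value, and a consistent answer is one that appears in the query answer over every repair. A brute-force sketch for tiny relations:

    ```python
    from itertools import groupby, product

    emp = [("e1", "Sales"), ("e1", "HR"), ("e2", "Sales")]  # e1 violates the key

    def repairs(rel):
        # One repair per way of choosing a single tuple for each key value.
        groups = [list(g) for _, g in groupby(sorted(rel), key=lambda t: t[0])]
        return [list(choice) for choice in product(*groups)]

    def query(rel):
        # "Which employees work in Sales?"
        return {name for name, dept in rel if dept == "Sales"}

    consistent = set.intersection(*(query(r) for r in repairs(emp)))
    print(consistent)  # {'e2'} -- e1's department differs across repairs
    ```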

  12. Risky Business: Whose "Right Thing" Are We Talking about?

    ERIC Educational Resources Information Center

    Monseau, Virginia R.

    2008-01-01

    Questioning the dichotomy of right and wrong, Virginia R. Monseau explores the tensions that arise from choices teachers make when trying to "do the right thing." From warning teachers about difficult students to presenting sensitive materials that challenge student belief systems, Monseau advises educators to pay close attention,…

  13. Exhibiting the Field for Learning: Telling New York's Stories

    ERIC Educational Resources Information Center

    Saunders, Angharad

    2011-01-01

    This paper explores the challenges of engaging and assessing students in residential field learning. Fieldwork presents students with complex learning environments, wherein they are asked to participate in a variety of learning activities. Difficulties arise, however, over how to sustain engagement in field learning while simultaneously capturing…

  14. Knowledge and Vision in Teaching

    ERIC Educational Resources Information Center

    Kennedy, Mary M.

    2006-01-01

    The author challenges the role of knowledge in teaching by pointing out the variety of issues and concerns teachers must simultaneously address. Teachers use two strategies to manage their multidimensional space: They develop integrated habits and rules of thumb for handling situations as they arise, and they plan their lessons by envisioning them…

  15. City Slickers: Let the Cattle Speak for Themselves

    ERIC Educational Resources Information Center

    Woodcock, Ray

    2006-01-01

    "City Slickers", the classic movie starring Billy Crystal, portrays a man who rediscovers a part of himself during a two-week cattle drive adventure in the rugged American West. His rediscovery arises from the challenge itself, with minimal psychologically oriented discussion or "processing." The belief that such a thing can…

  16. Will Learning to Solve One-Step Equations Pose a Challenge to 8th Grade Students?

    ERIC Educational Resources Information Center

    Ngu, Bing Hiong; Phan, Huy P.

    2017-01-01

    Assimilating multiple interactive elements simultaneously in working memory to allow understanding to occur, while solving an equation, would impose a high cognitive load. "Element interactivity" arises from the interaction between elements within and across operational and relational lines. Moreover, operating with special features…

  17. Primary and Secondary Contamination Mechanisms for Consideration in ASR Modeling and Practical Management

    EPA Science Inventory

    Aquifer storage and recovery (ASR) is a useful water resource management option for water storage and reuse. Its increased use is recognized as an adaptation to the ever-increasing problem of water availability, both in timing and flow. Challenges in the ASR process may arise from...

  18. The Option of Rationality in the Source of Joy of Life.

    ERIC Educational Resources Information Center

    Bondergaard, Jette

    1998-01-01

    Examines challenges arising from a growing consciousness of environmental problems and their implications for education in general and for early childhood education specifically. Argues that cultures should consider both human rights and environmental protection, that they need to develop solidarity and create peaceful ways of living. Maintains…

  19. Students and Real World Applications: Still a Challenging Mix

    ERIC Educational Resources Information Center

    Galbraith, Peter

    2013-01-01

    Rhetoric about the importance of students being equipped to apply mathematics to relevant problems arising in their lives, individually, as citizens, and in the workplace has never been matched by serious policy or curricular support. This paper identifies and elaborates authenticity implications for addressing this issue, and describes aspects of…

  20. One-Loop Test of Quantum Black Holes in anti–de Sitter Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, James T.; Pando Zayas, Leopoldo A.; Rathee, Vimal

    Within 11-dimensional supergravity we compute the logarithmic correction to the entropy of magnetically charged asymptotically AdS4 black holes with arbitrary horizon topology. We find perfect agreement with the expected microscopic result arising from the dual field theory computation of the topologically twisted index. Our result relies crucially on a particular limit to the extremal black hole case and clarifies some aspects of quantum corrections in asymptotically AdS spacetimes.
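    Schematically (generic form only; the paper's specific coefficient and matching are not reproduced here), such one-loop tests compare a log-corrected entropy on the gravity side with the large-N expansion of the dual index:

    ```latex
    % Generic shape of a log-corrected black hole entropy; the constant C
    % is fixed by the one-loop computation on the gravity side and must
    % match the coefficient of the logarithm in the large-N expansion of
    % the dual topologically twisted index.
    \begin{equation}
      S_{\mathrm{BH}} \;=\; \frac{A_H}{4 G_N} \;+\; C \log\frac{A_H}{G_N} \;+\; \cdots
    \end{equation}
    ```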
