Sample records for sophisticated computer models

  1. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    Increasing computational power and a better understanding of mathematical and physical systems have resulted in a growing number of ocean models. Not long ago, modelers were like a secret organization, recognizing each other by codes and languages that only a select group of people could understand. Access to computational systems was limited: on one hand, equipment and computer time were expensive and restricted; on the other, they required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 80's. This availability of resources has led to much broader access to all kinds of models. Today computer speed, computing time, and algorithms no longer seem to be the problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution there are different models for the same phenomena from one office to the next, developed by different researchers, and the results do not differ substantially since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while preserving the model's properties, which makes it a very useful tool for diminishing the computations that must be carried out on "small" computational systems and for making sophisticated models available to a greater community.
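
    The record above centers on Proper Orthogonal Decomposition as a way to shrink a model's variable count. As a rough illustration of that idea (not code from the paper), the following NumPy sketch performs a snapshot POD on synthetic data via the singular value decomposition and keeps only the modes needed to retain 99% of the energy; all field names and sizes are made up.

    ```python
    # Illustrative only: a generic snapshot-POD reduction with NumPy.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "ocean model" snapshots: each column is the state (e.g., sea-surface
    # height on a flattened grid) at one time step.
    n_points, n_snapshots = 2000, 200
    t = np.linspace(0, 2 * np.pi, n_snapshots)
    x = np.linspace(0, 1, n_points)[:, None]
    snapshots = (np.sin(4 * np.pi * x) * np.cos(3 * t)
                 + 0.5 * np.cos(2 * np.pi * x) * np.sin(t)
                 + 0.01 * rng.standard_normal((n_points, n_snapshots)))

    # POD: subtract the mean field and take the SVD of the snapshot matrix.
    mean_field = snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

    # Keep the few modes that capture most of the variance ("energy").
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.99)) + 1
    modes = U[:, :r]                                # spatial POD modes
    coeffs = modes.T @ (snapshots - mean_field)     # reduced time coefficients

    reconstruction = mean_field + modes @ coeffs
    print(f"{r} modes retain 99% of the energy; "
          f"max reconstruction error = {np.abs(reconstruction - snapshots).max():.3e}")
    ```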

  2. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  3. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

    The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties, it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well-known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat, and the subsequent traveling of the spiral wave inside the simulated tissue, are studied.

  4. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer-based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  5. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    PubMed

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  7. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  8. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  9. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
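
    The record above describes mixing distributed message passing (MPI) with threading inside each process. The sketch below illustrates that hybrid pattern in Python, assuming the mpi4py package is available and using a thread pool over NumPy blocks as a stand-in for OpenMP threads; the "tissue" data and the relax_block work function are invented placeholders, not the cardiac code.

    ```python
    # Minimal sketch of the hybrid programming model (MPI processes + threads).
    from concurrent.futures import ThreadPoolExecutor

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each MPI rank owns a slab of the (toy) tissue state.
    local_state = np.random.default_rng(rank).random((4, 100_000))

    def relax_block(block):
        """Stand-in for the per-block reaction/diffusion work done by one thread."""
        return np.tanh(block).sum()

    # Threads work on blocks of the rank-local data (NumPy releases the GIL here).
    with ThreadPoolExecutor(max_workers=4) as pool:
        local_sum = sum(pool.map(relax_block, local_state))

    # MPI handles the inter-process (distributed-memory) communication phase.
    total = comm.allreduce(local_sum, op=MPI.SUM)
    if rank == 0:
        print(f"global reduction across {size} ranks: {total:.3f}")
    ```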

  10. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  11. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    ABSTRACT

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  12. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models are a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional, and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation and atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  13. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, the need for sophisticated computer analytics is required to keep track of the number and complex interplay among the molecular abnormalities. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard of care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  14. Available for the Apple II: FIRM: Florida InteRactive Modeler.

    ERIC Educational Resources Information Center

    Levy, C. Michael; And Others

    1983-01-01

    The Apple II microcomputer program described allows instructors with minimal programming experience to construct computer models of psychological phenomena for students to investigate. Use of these models eliminates the need to maintain/house/breed animals or purchase sophisticated laboratory equipment. Several content models are also described,…

  15. MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)

    EPA Science Inventory

    A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...

  16. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…

  17. Teaching for CAD Expertise

    ERIC Educational Resources Information Center

    Chester, Ivan

    2007-01-01

    CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…

  18. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.

  19. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium spectral resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of Fluxnet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes while the efficiency and simplicity of FluxNet make it ideal for operational use.
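
    FluxNet is described above as a neural-network implementation of Streamer for surface fluxes. Purely as an illustration of that emulator idea, the sketch below evaluates a small feed-forward network mapping a handful of atmospheric inputs to a flux; the layer sizes, inputs, and random weights are placeholders, not the trained FluxNet coefficients.

    ```python
    # Generic illustration of a FluxNet-style emulator: a small feed-forward network
    # mapping a few atmospheric/surface inputs to a surface flux estimate.
    import numpy as np

    rng = np.random.default_rng(1)

    def mlp_flux(inputs, W1, b1, W2, b2):
        """One hidden layer with tanh activation; returns a scalar flux estimate."""
        hidden = np.tanh(inputs @ W1 + b1)
        return hidden @ W2 + b2

    n_in, n_hidden = 5, 16       # e.g., solar zenith angle, cloud optical depth, ...
    W1 = rng.standard_normal((n_in, n_hidden)) * 0.1   # placeholder weights
    b1 = np.zeros(n_hidden)
    W2 = rng.standard_normal(n_hidden) * 0.1
    b2 = 0.0

    sample_inputs = np.array([0.5, 10.0, 0.8, 271.0, 0.6])  # hypothetical predictors
    print("emulated surface flux (arbitrary units):",
          mlp_flux(sample_inputs, W1, b1, W2, b2))
    ```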

  20. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.

  1. Cloud computing can simplify HIT infrastructure management.

    PubMed

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  2. Data management in the mission data system

    NASA Technical Reports Server (NTRS)

    Wagner, David A.

    2005-01-01

    As spacecraft evolve from simple embedded devices to become more sophisticated computing platforms with complex behaviors it is increasingly necessary to model and manage the flow of data, and to provide uniform models for managing data that promote adaptability, yet pay heed to the physical limitations of the embedded and space environments.

  3. Novel opportunities for computational biology and sociology in drug discovery☆

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  4. Recent advances in modeling languages for pathway maps and computable biological networks.

    PubMed

    Slater, Ted

    2014-02-01

    As our theories of systems biology grow more sophisticated, the models we use to represent them become larger and more complex. Modeling languages must therefore have the expressivity and flexibility required to represent these models in ways that support high-resolution annotation, and must provide for simulation and analysis that are sophisticated enough to allow researchers to master their data in the proper context. These languages also need to facilitate model sharing and collaboration, which is currently best done by using uniform data structures (such as graphs) and language standards. In this brief review, we discuss three of the most recent systems biology modeling languages to appear: BEL, PySB and BCML, and examine how they meet these needs. Copyright © 2014 Elsevier Ltd. All rights reserved.
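
    Of the three languages reviewed above, PySB is itself a Python package. As a hedged illustration of the rule-based style it expresses (assuming the pysb package is installed; the ligand-receptor model and all names are toy examples, not drawn from the review), a minimal model might look like this:

    ```python
    # Toy rule-based model in the PySB style (illustrative sketch only).
    from pysb import Model, Monomer, Parameter, Initial, Rule, Observable

    Model()  # PySB's self-exporting pattern: components register with this model

    Monomer('L', ['b'])          # ligand with one binding site
    Monomer('R', ['b'])          # receptor with one binding site

    Parameter('kf', 1e-3)        # binding rate constant (arbitrary value)
    Parameter('L_0', 100)
    Parameter('R_0', 200)

    Initial(L(b=None), L_0)
    Initial(R(b=None), R_0)

    # Binding rule: free L and R form a bound complex.
    Rule('L_binds_R', L(b=None) + R(b=None) >> L(b=1) % R(b=1), kf)

    Observable('LR_complex', L(b=1) % R(b=1))
    ```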

  5. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  6. MorphoHawk: Geometric-based Software for Manufacturing and More

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Arterburn

    2001-04-01

    Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery already is the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences, creating an unparalleled accumulation of data that allows people to associate images with life experiences, emotions and knowledge. Computers are powerful, rapid and tireless, but they still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting ‘what we see with what we know.’

  7. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  8. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include model parameters and high resolution input data together with the availability of powerful optimization and uncertainty analysis algorithms has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  9. Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts

    DOT National Transportation Integrated Search

    1978-03-01

    The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...

  10. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  11. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  12. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  13. Viewing CAD Drawings on the Internet

    ERIC Educational Resources Information Center

    Schwendau, Mark

    2004-01-01

    Computer aided design (CAD) has been producing 3-D models for years. AutoCAD software is frequently used to create sophisticated 3-D models. These CAD files can be exported as 3DS files for import into Autodesk's 3-D Studio Viz. In this program, the user can render and modify the 3-D model before exporting it out as a WRL (world file hyperlinked)…

  14. Efficient modeling of vector hysteresis using a novel Hopfield neural network implementation of Stoner–Wohlfarth-like operators

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2012-01-01

    Incorporation of hysteresis models in electromagnetic analysis approaches is indispensable to accurate field computation in complex magnetic media. Throughout those computations, vector nature and computational efficiency of such models become especially crucial when sophisticated geometries requiring massive sub-region discretization are involved. Recently, an efficient vector Preisach-type hysteresis model constructed from only two scalar models having orthogonally coupled elementary operators has been proposed. This paper presents a novel Hopfield neural network approach for the implementation of Stoner–Wohlfarth-like operators that could lead to a significant enhancement in the computational efficiency of the aforementioned model. Advantages of this approach stem from the non-rectangular nature of these operators that substantially minimizes the number of operators needed to achieve an accurate vector hysteresis model. Details of the proposed approach, its identification and experimental testing are presented in the paper. PMID:25685446
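
    The record above builds a vector hysteresis model from Hopfield-network implementations of Stoner–Wohlfarth-like operators. The sketch below is only a generic discrete Hopfield network (Hebbian storage plus asynchronous recall) to illustrate the network type involved; it is not the authors' operator implementation, and the patterns are arbitrary.

    ```python
    # Generic discrete Hopfield network sketch (illustrates the network type only).
    import numpy as np

    def hopfield_store(patterns):
        """Hebbian weight matrix for a set of +/-1 patterns."""
        n = patterns.shape[1]
        W = sum(np.outer(p, p) for p in patterns) / n
        np.fill_diagonal(W, 0.0)
        return W

    def hopfield_recall(W, state, n_iter=20):
        """Asynchronous sign updates until (hopefully) a stored pattern is recovered."""
        state = state.copy()
        rng = np.random.default_rng(0)
        for _ in range(n_iter):
            for i in rng.permutation(len(state)):
                state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    patterns = np.array([[1, -1, 1, -1, 1, -1],
                         [1, 1, 1, -1, -1, -1]])
    W = hopfield_store(patterns)
    noisy = np.array([1, -1, -1, -1, 1, -1])     # corrupted copy of pattern 0
    print("recalled:", hopfield_recall(W, noisy))
    ```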

  15. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.

  16. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects functioning under extreme conditions are examined, together with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models that reflect the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of the algorithmic system of the man as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of a man as a tracking system has been implemented. A model of the unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the "matlab-simulink" environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an autocontrol system consisting of three main interconnected subsystems: sensitive organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, back). A theoretical data model for predicting the level of the operator's information load in ergatic systems is proposed. It allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.

  17. Impact of Media Richness and Flow on E-Learning Technology Acceptance

    ERIC Educational Resources Information Center

    Liu, Su-Houn; Liao, Hsiu-Li; Pratt, Jean A.

    2009-01-01

    Advances in e-learning technologies parallel a general increase in sophistication by computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of…

  18. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  19. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    PubMed Central

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  20. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED... HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED... STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE

  1. Computational Modeling of Cultural Dimensions in Adversary Organizations

    DTIC Science & Technology

    2010-01-01

    Nodes”, In the Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, 1993. [8] Pearl, J. Probabilistic Reasoning in... the artificial life simulations; in contrast, models with only a few agents typically employ quite sophisticated cognitive agents capable of... decisions as to how to allocate scarce ISR assets (two Unmanned Air Systems, UAS) among the two Red activities while at the same

  2. Technology Acceptance in Social Work Education: Implications for the Field Practicum

    ERIC Educational Resources Information Center

    Colvin, Alex Don; Bullock, Angela N.

    2014-01-01

    The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…

  3. Models and Methodologies for Multimedia Courseware Production.

    ERIC Educational Resources Information Center

    Barker, Philip; Giller, Susan

    Many new technologies are now available for delivering and/or providing access to computer-based learning (CBL) materials. These technologies vary in sophistication in many important ways, depending upon the bandwidth that they provide, the interactivity that they offer, and the types of end-user connectivity that they support. Invariably,…

  4. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
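
    MOOSE itself is a C++ framework, but the Jacobian-free Newton-Krylov approach it employs can be illustrated in a few lines with SciPy's newton_krylov solver. The toy two-equation "coupled physics" residual below is invented for demonstration and has no connection to any actual MOOSE application.

    ```python
    # Generic Jacobian-free Newton-Krylov illustration using SciPy (not MOOSE itself).
    import numpy as np
    from scipy.optimize import newton_krylov

    def residual(u):
        """Residual of a tiny coupled nonlinear system in two unknowns."""
        x, y = u
        return np.array([
            np.cos(y) - x,           # "physics 1": x = cos(y)
            3.0 * y - np.sin(x) - 1.0,   # "physics 2", coupled back to x
        ])

    # newton_krylov approximates Jacobian-vector products by finite differences,
    # so no explicit Jacobian is ever formed.
    solution = newton_krylov(residual, np.array([0.5, 0.5]), f_tol=1e-8)
    print("solution:", solution,
          "| residual norm:", np.linalg.norm(residual(solution)))
    ```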

  5. Using genetic information while protecting the privacy of the soul.

    PubMed

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  6. The RCM: A Resource Management and Program Budgeting Approach for State and Local Educational Agencies.

    ERIC Educational Resources Information Center

    Chambers, Jay G.; Parrish, Thomas B.

    The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…

  7. Using Novel Word Context Measures to Predict Human Ratings of Lexical Proficiency

    ERIC Educational Resources Information Center

    Berger, Cynthia M.; Crossley, Scott A.; Kyle, Kristopher

    2017-01-01

    This study introduces a model of lexical proficiency based on novel computational indices related to word context. The indices come from an updated version of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES) and include associative, lexical, and semantic measures of word context. Human ratings of holistic lexical proficiency…

  8. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  9. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  10. Analysis of peristaltic waves and their role in migrating Physarum plasmodia

    NASA Astrophysics Data System (ADS)

    Lewis, Owen L.; Guy, Robert D.

    2017-07-01

    The true slime mold Physarum polycephalum exhibits a vast array of sophisticated manipulations of its intracellular cytoplasm. Growing microplasmodia of Physarum have been observed to adopt an elongated tadpole shape, then contract in a rhythmic, traveling wave pattern that resembles peristaltic pumping. This contraction drives a fast flow of non-gelated cytoplasm along the cell longitudinal axis. It has been hypothesized that this flow of cytoplasm is a driving factor in generating motility of the plasmodium. In this work, we use two different mathematical models to investigate how peristaltic pumping within Physarum may be used to drive cellular motility. We compare the relative phase of flow and deformation waves predicted by both models to similar phase data collected from in vivo experiments using Physarum plasmodia. The first is a PDE model based on a dimensional reduction of peristaltic pumping within a finite length chamber. The second is a more sophisticated computational model which accounts for more general shape changes, more complex cellular mechanics, and dynamically modulated adhesion to the underlying substrate. This model allows us to directly compute cell crawling speed. Both models suggest that a mechanical asymmetry in the cell is required to reproduce the experimental observations. Such a mechanical asymmetry is also shown to increase the potential for cellular migration, as measured by both stress generation and migration velocity.

  11. Combined Numerical/Analytical Perturbation Solutions of the Navier-Stokes Equations for Aerodynamic Ejector/Mixer Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence Justin

    1998-01-01

    In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development is reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary layer models but provides important two-dimensional information not available using quasi-1-D approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. The models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M. These experiments were performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements, and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.

  12. Recent advances in non-LTE stellar atmosphere models

    NASA Astrophysics Data System (ADS)

    Sander, Andreas A. C.

    2017-11-01

    In the last decades, stellar atmosphere models have become a key tool in understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. The increase in both computational power and our understanding of physical processes in stellar atmospheres has led to increasing complexity in the models. As a result, codes emerged that can tackle a wide range of stellar and wind parameters. After a brief address of the fundamentals of stellar atmosphere modeling, the current stage of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.

  13. Data mining: sophisticated forms of managed care modeling through artificial intelligence.

    PubMed

    Borok, L S

    1997-01-01

    Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.

  14. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  15. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework, enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  16. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  17. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
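
    As a rough illustration of the photogrammetric step described above, the sketch below back-projects an image pixel along its camera ray and intersects that ray with the known light-sheet plane to recover a 3D point. It assumes an idealized pinhole camera with known intrinsics and pose; the matrix values and function name are illustrative, not taken from the NASA process.

        import numpy as np

        def pixel_to_3d(u, v, K, R, t, plane_point, plane_normal):
            """Back-project image pixel (u, v) onto the light-sheet plane.

            K            : 3x3 camera intrinsic matrix
            R, t         : rotation (3x3) and translation (3,) mapping world -> camera
            plane_point  : any point on the light-sheet plane (world coordinates)
            plane_normal : normal of the light-sheet plane (world coordinates)
            """
            C = -R.T @ t                                     # camera center in world coords
            d = R.T @ np.linalg.inv(K) @ np.array([u, v, 1.0])  # pixel ray in world coords
            # Ray/plane intersection: C + s*d lies on the light-sheet plane
            s = np.dot(plane_normal, plane_point - C) / np.dot(plane_normal, d)
            return C + s * d

        # Example: camera at the origin looking down the world z-axis, light sheet at z = 1
        K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
        R, t = np.eye(3), np.zeros(3)
        print(pixel_to_3d(400, 260, K, R, t, np.array([0, 0, 1.0]), np.array([0, 0, 1.0])))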

  18. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  20. FROM REINFORCEMENT LEARNING MODELS OF THE BASAL GANGLIA TO THE PATHOPHYSIOLOGY OF PSYCHIATRIC AND NEUROLOGICAL DISORDERS

    PubMed Central

    Maia, Tiago V.; Frank, Michael J.

    2013-01-01

    Over the last decade and a half, reinforcement learning models have fostered an increasingly sophisticated understanding of the functions of dopamine and cortico-basal ganglia-thalamo-cortical (CBGTC) circuits. More recently, these models, and the insights that they afford, have started to be used to understand key aspects of several psychiatric and neurological disorders that involve disturbances of the dopaminergic system and CBGTC circuits. We review this approach and its existing and potential applications to Parkinson’s disease, Tourette’s syndrome, attention-deficit/hyperactivity disorder, addiction, schizophrenia, and preclinical animal models used to screen novel antipsychotic drugs. The approach’s proven explanatory and predictive power bodes well for the continued growth of computational psychiatry and computational neurology. PMID:21270784
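
    The dopamine-as-prediction-error idea at the core of these models can be illustrated with a minimal temporal-difference (TD) learning sketch; the chain task, learning rate, and discount factor below are illustrative choices, not parameters from the reviewed models.

        import numpy as np

        n_states, alpha, gamma = 5, 0.1, 0.95
        V = np.zeros(n_states)                 # state values learned by the critic

        def td_update(s, r, s_next, terminal):
            # delta is the reward-prediction error often likened to phasic dopamine
            target = r + (0.0 if terminal else gamma * V[s_next])
            delta = target - V[s]
            V[s] += alpha * delta
            return delta

        # Toy chain task: states 0..4, reward delivered on leaving the last state
        for episode in range(200):
            for s in range(n_states):
                r = 1.0 if s == n_states - 1 else 0.0
                td_update(s, r, min(s + 1, n_states - 1), terminal=(s == n_states - 1))

        print(np.round(V, 2))                  # values increase toward the rewarded state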

  1. The Crossroads between the Galactic Disk and Interstellar Space, Ablaze in 3/4 keV Light

    NASA Astrophysics Data System (ADS)

    Shelton, Robin L.

    2016-04-01

    The halo is the crossroads between the Galactic disk and intergalactic space. This region is inhabited by hot gas that has risen from the disk, gas heated in situ, and hot material that has fallen in from intergalactic space. Owing to high spectral resolution observations made by XMM-Newton, Suzaku, and Chandra of the hot plasma's 3/4 keV emission and absorption, increasingly sophisticated and CPU-intensive computer modeling, and an awareness that charge exchange can contaminate 3/4 keV observations, we are now better able to understand the hot halo gas than ever before. Spectral analyses indicate that the 3/4 keV emission comes from T ~ 2.2 million Kelvin gas. Although observations suggest that the gas may be convectively unstable and the spectra's temperature is similar to that predicted by recent sophisticated models of the galactic fountain, the observed emission measure is significantly brighter than that predicted by fountain models. This brightness disparity presents us with another type of crossroads: should we continue down the road of adding physics to already sophisticated modeling or should we seek out other sources? In this presentation, I will discuss the galactic fountain crossroads, note the latitudinal and longitudinal distribution of the hot halo gas, provide an update on charge exchange, and explain how shadowing observations have helped to fine-tune our understanding of the hot gas.

  2. Box compression analysis of world-wide data spanning 46 years

    Treesearch

    Thomas J. Urbanik; Benjamin Frank

    2006-01-01

    The state of the art among most industry citations of box compression estimation is the equation by McKee developed in 1963. Because of limitations in computing tools at the time the McKee equation was developed, the equation is a simplification, with many constraints, of a more general relationship. By applying the results of sophisticated finite element modeling, in...

  3. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straight-forward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  4. When Machines Think: Radiology's Next Frontier.

    PubMed

    Dreyer, Keith J; Geis, J Raymond

    2017-12-01

    Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. Large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. Showing how these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.

  5. Distributed geospatial model sharing based on open interoperability standards

    USGS Publications Warehouse

    Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin

    2009-01-01

    Numerous geospatial computational models have been developed based on sound principles and published in journals or presented in conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required for Web Processing Service (WPS) standards, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared on model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.

  6. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  7. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU, multicore and GPU and space adaptivity, multicore and GPU and space adaptivity and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, ie, complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165 and 498. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
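
    The authors' space-time adaptive algorithm is specific to their cardiac solver; purely as an illustration of time adaptivity, the sketch below uses generic step-doubling error control on a toy membrane-like equation. It is not the published method, and all tolerances and rate constants are made up.

        import numpy as np

        def adaptive_euler(f, y0, t0, t_end, dt0=1e-3, tol=1e-4):
            """Forward Euler with step-doubling error control (generic sketch)."""
            t, y, dt = t0, np.asarray(y0, float), dt0
            while t < t_end:
                dt = min(dt, t_end - t)
                y_big = y + dt * f(t, y)                        # one full step
                y_half = y + 0.5 * dt * f(t, y)                 # two half steps
                y_small = y_half + 0.5 * dt * f(t + 0.5 * dt, y_half)
                err = np.max(np.abs(y_small - y_big))
                if err <= tol:                                  # accept and enlarge the step
                    t, y = t + dt, y_small
                    dt *= 1.5
                else:                                           # reject and shrink the step
                    dt *= 0.5
            return y

        # Stand-in for a simple membrane equation dV/dt = -k (V - V_rest)
        print(adaptive_euler(lambda t, v: -50.0 * (v - (-85.0)), [0.0], 0.0, 0.2))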

  8. Simulating motivated cognition

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach from the perspective of information processing in the human brain is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.

  9. Modeling low-temperature geochemical processes: Chapter 2

    USGS Publications Warehouse

    Nordstrom, D. Kirk; Campbell, Kate M.

    2014-01-01

    This chapter provides an overview of geochemical modeling that applies to water–rock interactions under ambient conditions of temperature and pressure. Topics include modeling definitions, historical background, issues of activity coefficients, popular codes and databases, examples of modeling common types of water–rock interactions, and issues of model reliability. Examples include speciation, microbial redox kinetics and ferrous iron oxidation, calcite dissolution, pyrite oxidation, combined pyrite and calcite dissolution, dedolomitization, seawater–carbonate groundwater mixing, reactive-transport modeling in streams, modeling catchments, and evaporation of seawater. The chapter emphasizes limitations to geochemical modeling: that a proper understanding and ability to communicate model results well are as important as completing a set of useful modeling computations and that greater sophistication in model and code development is not necessarily an advancement. If the goal is to understand how a particular geochemical system behaves, it is better to collect more field data than rely on computer codes.

  10. The importance of structural anisotropy in computational models of traumatic brain injury.

    PubMed

    Carlsen, Rika W; Daphalapurkar, Nitin P

    2015-01-01

    Understanding the mechanisms of injury might prove useful in assisting the development of methods for the management and mitigation of traumatic brain injury (TBI). Computational head models can provide valuable insight into the multi-length-scale complexity associated with the primary nature of diffuse axonal injury. It involves understanding how the trauma to the head (at the centimeter length scale) translates to the white-matter tissue (at the millimeter length scale), and even further down to the axonal-length scale, where physical injury to axons (e.g., axon separation) may occur. However, to accurately represent the development of TBI, the biofidelity of these computational models is of utmost importance. There has been a focused effort to improve the biofidelity of computational models by including more sophisticated material definitions and implementing physiologically relevant measures of injury. This paper summarizes recent computational studies that have incorporated structural anisotropy in both the material definition of the white matter and the injury criterion as a means to improve the predictive capabilities of computational models for TBI. We discuss the role of structural anisotropy on both the mechanical response of the brain tissue and on the development of injury. We also outline future directions in the computational modeling of TBI.
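
    One way such anisotropy enters an injury criterion is by resolving the strain tensor along the local white-matter fiber direction rather than using a scalar strain measure. The sketch below shows that projection; the tensor values, fiber direction, and threshold are illustrative only.

        import numpy as np

        def axonal_strain(E, fiber_dir):
            """Normal strain resolved along a unit fiber direction: n . E . n."""
            n = np.asarray(fiber_dir, float)
            n /= np.linalg.norm(n)
            return float(n @ E @ n)

        # Strain tensor at one white-matter element (illustrative values)
        E = np.array([[0.05, 0.02, 0.00],
                      [0.02, -0.01, 0.01],
                      [0.00, 0.01, 0.03]])
        n = np.array([1.0, 1.0, 0.0])              # local fiber orientation, e.g. from DTI

        eps_axon = axonal_strain(E, n)
        print(eps_axon, eps_axon > 0.18)           # compare with an assumed injury threshold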

  11. Hybrid reduced order modeling for assembly calculations

    DOE PAGES

    Bang, Youngsuk; Abdel-Khalik, Hany S.; Jessee, Matthew A.; ...

    2015-08-14

    While the accuracy of assembly calculations has greatly improved due to the increase in computer power enabling more refined description of the phase space and use of more sophisticated numerical algorithms, the computational cost continues to increase, which limits the full utilization of their effectiveness for routine engineering analysis. Reduced order modeling is a mathematical vehicle that scales down the dimensionality of large-scale numerical problems to enable their repeated executions on small computing environments, often available to end users. This is done by capturing the most dominant underlying relationships between the model's inputs and outputs. Previous works demonstrated the use of reduced order modeling for a single-physics code, such as a radiation transport calculation. This paper extends those works to coupled code systems as currently employed in assembly calculations. Finally, numerical tests are conducted using realistic SCALE assembly models with resonance self-shielding, neutron transport, and nuclides transmutation/depletion models representing the components of the coupled code system.
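
    A common building block for this kind of reduced order modeling is a low-dimensional basis extracted from full-order snapshots via the singular value decomposition, onto which new results are projected. The sketch below shows only that generic ingredient, not the coupled-code ROM of the paper; the snapshot data are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        # Snapshot matrix: each column is a full-order model output for one input sample
        n_dof, n_snapshots = 500, 40
        snapshots = rng.standard_normal((n_dof, 3)) @ rng.standard_normal((3, n_snapshots))

        # Truncated SVD captures the dominant input/output relationships
        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.999)) + 1
        basis = U[:, :r]                        # reduced basis of rank r

        # Reduce and reconstruct one full-order result
        y_full = snapshots[:, 0]
        y_reduced = basis.T @ y_full            # r coefficients instead of n_dof values
        rel_err = np.linalg.norm(y_full - basis @ y_reduced) / np.linalg.norm(y_full)
        print(r, rel_err)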

  12. The Center for Computational Biology: resources, achievements, and challenges

    PubMed Central

    Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2011-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221

  13. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  14. Thermal Analysis

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The University of Georgia used NASTRAN, a COSMIC program that predicts how a design will stand up under stress, to develop a model for monitoring the transient cooling of vegetables. The winter use of passive solar heating for poultry houses is also under investigation by the Agricultural Engineering Dept. Another study involved thermal analysis of black and green nursery containers. The use of NASTRAN has encouraged student appreciation of sophisticated computer analysis.

  15. Development and Applications of the FV3 GEOS-5 Adjoint Modeling System

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Kim, Jong G.; Lin, Shian-Jiann; Errico, Ron; Gelaro, Ron; Kent, James; Coy, Larry; Doyle, Jim; Goldstein, Alex

    2017-01-01

    GMAO has developed a highly sophisticated adjoint modeling system based on the most recent version of the finite volume cubed sphere (FV3) dynamical core. This provides a mechanism for investigating sensitivity to initial conditions and examining observation impacts. It also allows for the computation of singular vectors and for the implementation of hybrid 4DVAR. In this work we will present the scientific assessment of the new adjoint system and show results from a number of research applications of the adjoint system.

  16. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    PubMed

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.
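
    A hedged sketch of the clustering stage is shown below: molecular descriptors are standardized, projected with principal component analysis, and grouped with k-means so that structurally diverse representatives can be picked. The descriptor matrix is a random placeholder and the cluster count is arbitrary; this is not the Cerius workflow itself.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)
        descriptors = rng.standard_normal((120, 20))   # placeholder: 120 hits x 20 descriptors

        X = StandardScaler().fit_transform(descriptors)
        scores = PCA(n_components=3).fit_transform(X)  # principal component scores

        labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(scores)

        # Pick one representative hit per cluster (closest to the cluster mean)
        for k in range(6):
            members = np.where(labels == k)[0]
            centre = scores[members].mean(axis=0)
            rep = members[np.argmin(np.linalg.norm(scores[members] - centre, axis=1))]
            print(f"cluster {k}: candidate index {rep}")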

  17. Efficient Unsteady Flow Visualization with High-Order Access Dependencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiang; Guo, Hanqi; Yuan, Xiaoru

    We present a novel model based on high-order access dependencies for efficient pathline computation in unsteady flow visualization. By taking longer access sequences into account to model more sophisticated data access patterns in particle tracing, our method greatly improves the accuracy and reliability in data access prediction. In our work, high-order access dependencies are calculated by tracing uniformly-seeded pathlines in both forward and backward directions in a preprocessing stage. The effectiveness of our proposed approach is demonstrated through a parallel particle tracing framework with high-order data prefetching. Results show that our method achieves higher data locality and hence improves the efficiency of pathline computation.
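
    The essence of a high-order access-dependency model can be sketched as a transition table conditioned on the last few block accesses, built from traced pathlines and queried to decide what to prefetch. The order-2 Python sketch below is a simplified stand-in for the paper's method, with made-up block IDs.

        from collections import Counter, defaultdict

        def build_order2_model(access_trace):
            """Count block transitions conditioned on the previous two accesses."""
            model = defaultdict(Counter)
            for a, b, c in zip(access_trace, access_trace[1:], access_trace[2:]):
                model[(a, b)][c] += 1
            return model

        def predict_next(model, last_two):
            counts = model.get(tuple(last_two))
            return counts.most_common(1)[0][0] if counts else None

        # Block IDs touched while tracing seed pathlines in a preprocessing pass
        trace = [3, 4, 7, 3, 4, 8, 3, 4, 7, 3, 4, 7, 2, 3, 4, 7]
        model = build_order2_model(trace)
        print(predict_next(model, [3, 4]))     # -> 7, the block to prefetch next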

  18. Need for evaluative methodologies in land use, regional resource and waste management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croke, E. J.

    The transfer of planning methodology from the research community to the practitioner very frequently takes the form of analytical and evaluative techniques and procedures. In the end, these become operational in the form of data acquisition, management and display systems, computational schemes that are codified in the form of manuals and handbooks, and computer simulation models. The complexity of the socioeconomic and physical processes that govern environmental resource and waste management has reinforced the need for computer-assisted, scientifically sophisticated planning models that are fully operational, dependent on an attainable data base, and accessible in terms of the resources normally available to practitioners of regional resource management, waste management, and land use planning. A variety of models and procedures that attempt to meet one or more of the needs of these practitioners are discussed.

  19. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  20. Acoustic backscatter models of fish: Gradual or punctuated evolution

    NASA Astrophysics Data System (ADS)

    Horne, John K.

    2004-05-01

    Sound-scattering characteristics of aquatic organisms are routinely investigated using theoretical and numerical models. Development of the inverse approach by van Holliday and colleagues in the 1970s catalyzed the development and validation of backscatter models for fish and zooplankton. As the understanding of biological scattering properties increased, so did the number and computational sophistication of backscatter models. The complexity of data used to represent modeled organisms has also evolved in parallel to model development. Simple geometric shapes representing body components or the whole organism have been replaced by anatomically accurate representations derived from imaging sensors such as computer-aided tomography (CAT) scans. In contrast, Medwin and Clay (1998) recommend that fish and zooplankton should be described by simple theories and models, without acoustically superfluous extensions. Since van Holliday's early work, how have data and computational complexity influenced accuracy and precision of model predictions? How has the understanding of aquatic organism scattering properties increased? Significant steps in the history of model development will be identified and changes in model results will be characterized and compared. [Work supported by ONR and the Alaska Fisheries Science Center.]

  1. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    NASA Technical Reports Server (NTRS)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  2. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species

    PubMed Central

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien

    2017-01-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities. PMID:29112973

  3. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    PubMed

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  4. Solvable Family of Driven-Dissipative Many-Body Systems.

    PubMed

    Foss-Feig, Michael; Young, Jeremy T; Albert, Victor V; Gorshkov, Alexey V; Maghrebi, Mohammad F

    2017-11-10

    Exactly solvable models have played an important role in establishing the sophisticated modern understanding of equilibrium many-body physics. Conversely, the relative scarcity of solutions for nonequilibrium models greatly limits our understanding of systems away from thermal equilibrium. We study a family of nonequilibrium models, some of which can be viewed as dissipative analogues of the transverse-field Ising model, in that an effectively classical Hamiltonian is frustrated by dissipative processes that drive the system toward states that do not commute with the Hamiltonian. Surprisingly, a broad and experimentally relevant subset of these models can be solved efficiently. We leverage these solutions to compute the effects of decoherence on a canonical trapped-ion-based quantum computation architecture, and to prove a no-go theorem on steady-state phase transitions in a many-body model that can be realized naturally with Rydberg atoms or trapped ions.

  5. Solvable Family of Driven-Dissipative Many-Body Systems

    NASA Astrophysics Data System (ADS)

    Foss-Feig, Michael; Young, Jeremy T.; Albert, Victor V.; Gorshkov, Alexey V.; Maghrebi, Mohammad F.

    2017-11-01

    Exactly solvable models have played an important role in establishing the sophisticated modern understanding of equilibrium many-body physics. Conversely, the relative scarcity of solutions for nonequilibrium models greatly limits our understanding of systems away from thermal equilibrium. We study a family of nonequilibrium models, some of which can be viewed as dissipative analogues of the transverse-field Ising model, in that an effectively classical Hamiltonian is frustrated by dissipative processes that drive the system toward states that do not commute with the Hamiltonian. Surprisingly, a broad and experimentally relevant subset of these models can be solved efficiently. We leverage these solutions to compute the effects of decoherence on a canonical trapped-ion-based quantum computation architecture, and to prove a no-go theorem on steady-state phase transitions in a many-body model that can be realized naturally with Rydberg atoms or trapped ions.

  6. An integrated decision support system for TRAC: A proposal

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    Optimal allocation and usage of resources is a key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (Travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient, not completely automated, and manpower intensive, and the potential for data inconsistency exists. A system is proposed which suggests a means to integrate all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources. The systems integration proposal is examined in detail.

  7. Designer drugs: the evolving science of drug discovery.

    PubMed

    Wanke, L A; DuBose, R F

    1998-07-01

    Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.

  8. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculation were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  9. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating in a self consistent manner the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space exposed planetary surfaces is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near surface soil and rock surfaces are compared with the rock/soil model, MESS.2.

  10. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the dynamic, acoustic, high-pressure, high-rotational-speed, and other load components, using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.
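
    A minimal sketch of the "deterministic shape with statistically varying coefficients" idea is given below: coefficients are sampled from assumed distributions and Monte Carlo sampling yields percentiles of the composite peak load. The distributions, frequencies, and amplitudes are illustrative and unrelated to the actual CLS code.

        import numpy as np

        rng = np.random.default_rng(42)
        n_samples, t = 20000, np.linspace(0.0, 1.0, 200)

        # Deterministic load shape with statistically varying coefficients (illustrative)
        a = rng.normal(1.0, 0.10, n_samples)          # steady pressure scale
        b = rng.lognormal(0.0, 0.25, n_samples)       # dynamic/acoustic amplitude scale
        phase = rng.uniform(0.0, 2 * np.pi, n_samples)

        loads = a[:, None] + 0.2 * b[:, None] * np.sin(2 * np.pi * 5 * t + phase[:, None])
        peaks = loads.max(axis=1)                     # peak composite load per sample

        print("median peak:", np.median(peaks), "99th percentile:", np.percentile(peaks, 99))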

  11. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  12. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    PubMed

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of the user, who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of a computer in real time requires development of appropriate hardware and software to connect the medical instrumentarium with the computer and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.

  13. A performance comparison of scalar, vector, and concurrent vector computers including supercomputers for modeling transport of reactive contaminants in groundwater

    NASA Astrophysics Data System (ADS)

    Tripathi, Vijay S.; Yeh, G. T.

    1993-06-01

    Sophisticated and highly computation-intensive models of transport of reactive contaminants in groundwater have been developed in recent years. Application of such models to real-world contaminant transport problems, e.g., simulation of groundwater transport of 10-15 chemically reactive elements (e.g., toxic metals) and relevant complexes and minerals in two and three dimensions over a distance of several hundred meters, requires high-performance computers including supercomputers. Although not widely recognized as such, the computational complexity and demand of these models compare with well-known computation-intensive applications including weather forecasting and quantum chemical calculations. A survey of the performance of a variety of available hardware, as measured by the run times for a reactive transport model HYDROGEOCHEM, showed that while supercomputers provide the fastest execution times for such problems, relatively low-cost reduced instruction set computer (RISC) based scalar computers provide the best performance-to-price ratio. Because supercomputers like the Cray X-MP are inherently multiuser resources, often the RISC computers also provide much better turnaround times. Furthermore, RISC-based workstations provide the best platforms for "visualization" of groundwater flow and contaminant plumes. The most notable result, however, is that current workstations costing less than $10,000 provide performance within a factor of 5 of a Cray X-MP.

  14. TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Jones, N.; Ames, D. P.

    2015-12-01

    Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
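
    Below is a hedged sketch of how a modeling job might be handed to HTCondor from Python by writing a standard submit description and invoking condor_submit; it is not the CondorPy or TethysCluster API, and the executable and argument names are hypothetical.

        import subprocess
        from pathlib import Path

        def submit_model_run(executable, args, job_name, workdir="."):
            """Write a minimal HTCondor submit description and queue it (generic sketch)."""
            submit_text = "\n".join([
                f"executable = {executable}",
                f"arguments  = {args}",
                f"output     = {job_name}.out",
                f"error      = {job_name}.err",
                f"log        = {job_name}.log",
                "queue",
            ])
            submit_file = Path(workdir) / f"{job_name}.submit"
            submit_file.write_text(submit_text + "\n")
            # Hand the job to the local HTCondor scheduler
            return subprocess.run(["condor_submit", str(submit_file)],
                                  capture_output=True, text=True)

        # Hypothetical model wrapper script and arguments
        result = submit_model_run("./run_hydro_model.sh", "--scenario wet_year", "hydro_run_01")
        print(result.stdout or result.stderr)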

  15. Application of a new model for groundwater age distributions: Modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin, California

    USGS Publications Warehouse

    Ginn, T.R.; Woolfenden, L.

    2002-01-01

    A project for modeling and isotopic analysis of artificial recharge in the Rialto-Colton basin aquifer in California is discussed. The Rialto-Colton aquifer has been divided into four primary and significant flowpaths following the general direction of groundwater flow from NW to SE. The introductory investigation includes sophisticated chemical reaction modeling, with highly simplified flow path simulation. A comprehensive reactive transport model with the established set of geochemical reactions over the whole aquifer will also be developed for treating both reactions and transport realistically. This will be completed by making use of HBGC123D implemented with an isotopic calculation step to compute Carbon-14 (C14) and stable Carbon-13 (C13) contents of the water. Computed carbon contents will also be calibrated against the measured carbon contents for assessment of the amount of imported recharge into the Linden pond.
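
    The carbon-14 part of such an analysis ultimately rests on radioactive decay; the sketch below shows the uncorrected decay relation (half-life 5,730 years) used to turn measured activities into apparent ages, before any geochemical corrections of the kind the reactive-transport modeling provides.

        import math

        HALF_LIFE_C14 = 5730.0                       # years
        LAMBDA = math.log(2.0) / HALF_LIFE_C14       # decay constant, 1/yr

        def apparent_age(a_measured, a_initial=100.0):
            """Uncorrected radiocarbon age from C-14 activities (percent modern carbon)."""
            return math.log(a_initial / a_measured) / LAMBDA

        def decayed_activity(a_initial, years):
            """Remaining C-14 activity after a given travel time."""
            return a_initial * math.exp(-LAMBDA * years)

        print(round(apparent_age(60.0)))             # roughly 4,200 years for 60 pmC
        print(round(decayed_activity(100.0, 1000)))  # activity after 1,000 years of travel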

  16. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  17. The Next Computer Revolution.

    ERIC Educational Resources Information Center

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  18. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
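
    As one concrete (and deliberately simplified) example of the kind of pipeline these hybrid systems refine, the sketch below computes common time-domain EMG features and classifies windows with a support vector machine; the signals are synthetic stand-ins for real recordings.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)

        def features(window):
            rms = np.sqrt(np.mean(window**2))
            zero_crossings = np.sum(window[:-1] * window[1:] < 0)
            waveform_length = np.sum(np.abs(np.diff(window)))
            return [rms, zero_crossings, waveform_length]

        # Synthetic stand-in: class 0 = low-amplitude activity, class 1 = stronger bursts
        X, y = [], []
        for label, scale in [(0, 0.3), (1, 1.0)]:
            for _ in range(100):
                window = scale * rng.standard_normal(256)
                X.append(features(window))
                y.append(label)

        clf = SVC(kernel="rbf", C=1.0)
        print(cross_val_score(clf, np.array(X), np.array(y), cv=5).mean())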

  19. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  20. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is now used in a multitude of fields, from computer games and films to interactive multimedia, with growing human and social significance. Authoring complex and subtle facial expressions remains challenging and fraught with problems. As a result, most facial animation is currently authored with general-purpose computer animation techniques, which often limits the production quality and quantity of facial animation. Despite growing computer power, facial understanding, software sophistication, and newly emerging face-centric methods, the field remains immature. This paper therefore surveys facial animation experts and categorizes current and emerging techniques to define the recent state of the field, observed bottlenecks, and developing methods. The paper further presents a real-time simulation model of human worry and howling, with a detailed discussion of astonishment, sorrow, annoyance, and panic perception.

  1. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is an experience-based, relatively straightforward, largely automatic type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.

  2. Preliminary Work for Modeling the Propellers of an Aircraft as a Noise Source in an Acoustic Boundary Element Analysis

    NASA Technical Reports Server (NTRS)

    Vlahopoulos, Nickolas; Lyle, Karen H.; Burley, Casey L.

    1998-01-01

    An algorithm for generating appropriate velocity boundary conditions for an acoustic boundary element analysis from the kinematics of an operating propeller is presented. It constitutes the initial phase of integrating sophisticated rotorcraft models into a conventional boundary element analysis. Currently, the pressure field is computed by a linear approximation. An initial validation of the developed process was performed by comparing numerical results to test data for the external acoustic pressure on the surface of a tilt-rotor aircraft for one flight condition.

  3. Handling of Atomic Data

    NASA Astrophysics Data System (ADS)

    Rauch, T.; Deetjen, J. L.

    2003-01-01

    State-of-the-art NLTE model atmosphere codes have arrived at a high level of ``numerical'' sophistication and are an adequate tool to analyze the available high-quality spectra from the infrared to the X-ray wavelength range. Present computational capacities allow calculations that include all elements from hydrogen up to the iron group, and the lack of reliable atomic data has become a crucial problem for further progress. We summarize briefly the available sources of atomic data and how these are implemented in the Tübingen Model Atmosphere Package (TMAP).

  4. Petascale computation performance of lightweight multiscale cardiac models using hybrid programming models.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-01-01

    Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI, which is in contrast to our results using complex physiological models. Thus, with regard to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase, as will HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster-than-real-time multiscale cardiac simulations on these systems using hybrid programming models.
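    As a rough illustration of the two-level hybrid decomposition this record discusses, the following is a minimal Python sketch that pairs MPI ranks (via mpi4py) with a thread pool inside each rank. The array size, the toy cell-update rule, and the choice of four threads per rank are assumptions for demonstration only, not the authors' implementation.

```python
# Minimal hybrid-parallel sketch: MPI ranks for distributed memory,
# a thread pool inside each rank for shared-memory work.
# Requires: mpi4py, numpy.  Run with e.g.  mpirun -n 4 python hybrid_sketch.py
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_CELLS = 100_000                          # illustrative: total "cells" in the model
local = np.random.rand(N_CELLS // size)    # each rank owns a slice of the state


def update_chunk(chunk):
    """Illustrative per-cell update (stand-in for an ionic-model time step)."""
    return chunk + 0.01 * (1.0 - chunk) * chunk


def threaded_update(state, n_threads=4):
    """Split this rank's slice across threads (the OpenMP-like level)."""
    pieces = np.array_split(state, n_threads)
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        return np.concatenate(list(pool.map(update_chunk, pieces)))


for step in range(10):
    local = threaded_update(local)
    # MPI level: combine a global diagnostic across ranks each step.
    global_sum = comm.allreduce(local.sum(), op=MPI.SUM)
    if rank == 0:
        print(f"step {step}: global sum = {global_sum:.3f}")
```

    Note that in CPython the thread level only pays off when the per-chunk work releases the GIL (for example, heavy NumPy or compiled kernels), which loosely mirrors the record's finding that adding a threading layer is not automatically beneficial.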

  5. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, one that is more powerful and potentially dangerous than any other. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or as small as a few thousand targeted systems. In all botnets, the members can surreptitiously communicate with each other and with their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model with the jump-diffusion infection spread model to portray botnet propagation.
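    The paper's combined MSEIR/jump-diffusion propagation model is not reproduced here; as a minimal sketch of the compartmental (MSEIR) ingredient alone, the following forward-Euler integration uses invented rates and host counts purely for illustration.

```python
# Toy MSEIR compartmental model of botnet spread (M: hardened/immune hosts,
# S: susceptible, E: compromised but dormant, I: active bots, R: cleaned/patched).
# All parameter values are illustrative, not taken from the paper.
import numpy as np

def mseir_step(state, dt, delta=0.01, beta=0.3, sigma=0.2, gamma=0.05):
    m, s, e, i, r = state
    n = state.sum()
    dm = -delta * m                      # hardened hosts slowly become susceptible
    ds = delta * m - beta * s * i / n    # infection pressure from active bots
    de = beta * s * i / n - sigma * e    # dormant bots await activation
    di = sigma * e - gamma * i           # activation and clean-up
    dr = gamma * i
    return state + dt * np.array([dm, ds, de, di, dr])

state = np.array([2000.0, 7000.0, 0.0, 10.0, 0.0])   # initial host populations
for day in range(200):
    state = mseir_step(state, dt=1.0)
print("active bots after 200 days:", round(state[3]))
```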

  6. Goals and Objectives for Computing in the Associated Colleges of the St. Lawrence Valley.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    A forecast of the computing requirements of the Associated Colleges of the St. Lawrence Valley, an analysis of their needs, and specifications for a joint computer system are presented. Problems encountered included the lack of resources and computer sophistication at the member schools and a dearth of experience with long-term computer consortium…

  7. FDTD Modeling of LEMP Propagation in the Earth-Ionosphere Waveguide With Emphasis on Realistic Representation of Lightning Source

    NASA Astrophysics Data System (ADS)

    Tran, Thang H.; Baba, Yoshihiro; Somu, Vijaya B.; Rakov, Vladimir A.

    2017-12-01

    The finite difference time domain (FDTD) method in the 2-D cylindrical coordinate system was used to compute the nearly full-frequency-bandwidth vertical electric field and azimuthal magnetic field waveforms produced on the ground surface by lightning return strokes. The lightning source was represented by the modified transmission-line model with linear current decay with height, which was implemented in the FDTD computations as an appropriate vertical phased-current-source array. The conductivity of the atmosphere was assumed to increase exponentially with height, with different conductivity profiles being used for daytime and nighttime conditions. The fields were computed at distances ranging from 50 to 500 km. Sky waves (reflections from the ionosphere) were identified in the computed waveforms and used to estimate apparent ionospheric reflection heights. It was found that our model reproduces reasonably well the daytime electric field waveforms measured at different distances and simulated (using a more sophisticated propagation model) by Qin et al. (2017). The sensitivity of model predictions to changes in the parameters of the atmospheric conductivity profile, as well as the influences of the lightning source characteristics (current waveshape parameters, return-stroke speed, and channel length) and of the ground conductivity, were also examined.
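    A minimal sketch of the lightning source representation named in this record, the modified transmission-line model with linear current decay (MTLL), is given below. The channel-base waveform is a standard Heidler function; the channel height, return-stroke speed, and all waveshape parameters are illustrative assumptions rather than the values used in the paper.

```python
# Modified transmission-line model with linear current decay (MTLL):
#   I(z, t) = (1 - z/H) * I(0, t - z/v)  for t >= z/v, else 0.
# The channel-base waveform below is a Heidler function with assumed parameters.
import numpy as np

H = 7.5e3        # channel height [m] (assumed)
V = 1.5e8        # return-stroke speed [m/s] (assumed)

def heidler(t, i0=12e3, tau1=0.4e-6, tau2=50e-6, n=2):
    """Channel-base current i(0, t); eta is the peak-correction factor."""
    eta = np.exp(-(tau1 / tau2) * (n * tau2 / tau1) ** (1.0 / n))
    x = (t / tau1) ** n
    return np.where(t > 0, (i0 / eta) * x / (1 + x) * np.exp(-t / tau2), 0.0)

def mtll_current(z, t):
    """Current at height z and time t according to MTLL."""
    retarded = t - z / V
    return np.where(retarded > 0, (1 - z / H) * heidler(retarded), 0.0)

t = np.linspace(0, 100e-6, 2001)
print("peak current 2 km above ground: %.1f kA" % (mtll_current(2e3, t).max() / 1e3))
```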

  8. Laser Powered Launch Vehicle Performance Analyses

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Liu, Jiwen; Wang, Ten-See (Technical Monitor)

    2001-01-01

    The purpose of this study is to establish the technical ground for modeling the physics of laser-powered pulse detonation phenomena. Laser-powered propulsion systems involve complex fluid dynamics, thermodynamics, and radiative transfer processes. Successful predictions of the performance of laser-powered launch vehicle concepts depend on sophisticated models that reflect the underlying flow physics, including laser ray tracing and focusing, inverse Bremsstrahlung (IB) effects, finite-rate air chemistry, thermal non-equilibrium, plasma radiation, and detonation wave propagation. The proposed work will extend the baseline numerical model into an efficient design analysis tool. The proposed model is suitable for 3-D analysis using parallel computing methods.

  9. Numerical Simulation of Shock/Detonation-Deformable-Particle Interaction with Constrained Interface Reinitialization

    NASA Astrophysics Data System (ADS)

    Zhang, Ju; Jackson, Thomas; Balachandar, Sivaramakrishnan

    2015-06-01

    We will develop a computational model built upon our verified and validated in-house SDT code to provide an improved description of multiphase blast wave dynamics in which solid particles are considered deformable and can even undergo phase transitions. Our SDT computational framework includes a reactive compressible flow solver with sophisticated material interface tracking capability and realistic equations of state (EOS), such as the Mie-Gruneisen EOS, for multiphase flow modeling. The behavior of the diffuse interface models of Shukla et al. (2010) and Tiwari et al. (2013) at different shock impedance ratios will first be examined and characterized. The recent constrained interface reinitialization of Shukla (2014) will then be developed to examine whether the conservation property can be improved. This work was supported in part by the U.S. Department of Energy and by the Defense Threat Reduction Agency.

  10. Bootstrapping in a language of thought: a formal model of numerical concept learning.

    PubMed

    Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D

    2012-05-01

    In acquiring number words, children exhibit a qualitative leap in which they transition from understanding a few number words, to possessing a rich system of interrelated numerical concepts. We present a computational framework for understanding this inductive leap as the consequence of statistical inference over a sufficiently powerful representational system. We provide an implemented model that is powerful enough to learn number word meanings and other related conceptual systems from naturalistic data. The model shows that bootstrapping can be made computationally and philosophically well-founded as a theory of number learning. Our approach demonstrates how learners may combine core cognitive operations to build sophisticated representations during the course of development, and how this process explains observed developmental patterns in number word learning. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  12. Predictive models in urology.

    PubMed

    Cestari, Andrea

    2013-01-01

    Predictive modeling is emerging as an important knowledge-based technology in healthcare. The interest in the use of predictive modeling reflects advances on different fronts, such as the availability of health information from increasingly complex databases and electronic health records, a better understanding of causal or statistical predictors of health, disease processes and multifactorial models of ill-health, and developments in nonlinear computer models using artificial intelligence or neural networks. These new computer-based forms of modeling are increasingly able to establish technical credibility in clinical contexts. The current state of knowledge is still quite young in understanding the likely future direction of how this so-called 'machine intelligence' will evolve, and therefore how today's relatively sophisticated predictive models will evolve in response to improvements in technology, which is advancing along a wide front. Predictive models in urology are gaining progressive popularity not only for academic and scientific purposes but also in clinical practice, with the introduction of several nomograms dealing with the main fields of onco-urology.

  13. Classical and all-floating FETI methods for the simulation of arterial tissues

    PubMed Central

    Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf

    2015-01-01

    High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in the understanding of the function of cardiovascular components in health and disease. However, the computational effort to handle fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility to derive a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery, which is – as most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has advantages not only concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case where we apply a well-known anisotropic model to the realistic geometry of an artery. Although FETI methods are well suited to artery simulations, we also discuss some limitations concerning the dependence on material parameters. PMID:26751957

  14. MS-CASPT2 study of hole transfer in guanine-indole complexes using the generalized Mulliken-Hush method: effective two-state treatment.

    PubMed

    Butchosa, C; Simon, S; Blancafort, L; Voityuk, A

    2012-07-12

    Because hole transfer from nucleobases to amino acid residues in DNA-protein complexes can prevent oxidative damage of DNA in living cells, computational modeling of the process is of high interest. We performed MS-CASPT2 calculations of several model structures of π-stacked guanine and indole and derived electron-transfer (ET) parameters for these systems using the generalized Mulliken-Hush (GMH) method. We show that the two-state model commonly applied to treat thermal ET between adjacent donor and acceptor is of limited use for the considered systems because of the small gap between the ground and first excited states in the indole radical cation. The ET parameters obtained within the two-state GMH scheme can deviate significantly from the corresponding matrix elements of the two-state effective Hamiltonian based on the GMH treatment of three adiabatic states. The computed values of diabatic energies and electronic couplings provide benchmarks to assess the performance of less sophisticated computational methods.
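    For reference, a minimal sketch of the two-state GMH expression discussed in this record is shown below; the input values in the example are placeholders, not results from the MS-CASPT2 calculations.

```python
# Two-state generalized Mulliken-Hush (GMH) electronic coupling:
#   H_DA = |mu_12| * dE_12 / sqrt(dmu_12**2 + 4 * mu_12**2)
# Inputs are adiabatic quantities projected on the charge-transfer direction.
# The numbers in the example are placeholders, not values from the paper.
import math

def gmh_coupling(de12, mu12, dmu12):
    """Electronic coupling (same units as de12) from the two-state GMH formula.

    de12  : vertical energy gap between the two adiabatic states
    mu12  : transition dipole moment between them (CT-direction component)
    dmu12 : difference of the adiabatic state dipole moments (same component)
    """
    return abs(mu12) * de12 / math.sqrt(dmu12 ** 2 + 4.0 * mu12 ** 2)

# Placeholder example: gap 0.50 eV, mu12 = 1.2 D, dmu12 = 10.0 D
print(f"H_DA = {gmh_coupling(0.50, 1.2, 10.0):.3f} eV")
```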

  15. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  16. The Matter Simulation (R)evolution

    PubMed Central

    2018-01-01

    To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime. PMID:29532014

  17. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments.

    PubMed

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent tendency in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies, and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. Under this perspective, we present early results on the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of the behaviour that results from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  18. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments

    NASA Astrophysics Data System (ADS)

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent tendency in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies, and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. Under this perspective, we present early results on the application of reservoir computing as an efficient tool to understand how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of the behaviour that results from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.
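    Reservoir computing, as used in the two records above, is commonly realized as an echo state network; the following is a minimal NumPy sketch under assumed hyperparameters (reservoir size, spectral radius, leak rate) and a toy one-step-ahead prediction task, not the robot-learning architecture of the paper.

```python
# Minimal echo state network (a common reservoir-computing architecture):
# a fixed random recurrent "reservoir" plus a linear readout trained by
# ridge regression.  The task (predict sin(t+dt) from sin(t)) and all
# hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius to 0.9

def run_reservoir(inputs, leak=0.3):
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

t = np.arange(0, 60, 0.1)
u, y = np.sin(t[:-1]), np.sin(t[1:])           # one-step-ahead prediction task
X = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("training MSE:", float(np.mean((X @ W_out - y) ** 2)))
```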

  19. Nonadditivity of van der Waals forces on liquid surfaces

    NASA Astrophysics Data System (ADS)

    Venkataram, Prashanth S.; Whitton, Jeremy D.; Rodriguez, Alejandro W.

    2016-09-01

    We present an approach for modeling nanoscale wetting and dewetting of textured solid surfaces that exploits recently developed, sophisticated techniques for computing exact long-range dispersive van der Waals (vdW) or (more generally) Casimir forces in arbitrary geometries. We apply these techniques to solve the variational formulation of the Young-Laplace equation and predict the equilibrium shapes of liquid-vacuum interfaces near solid gratings. We show that commonly employed methods of computing vdW interactions based on additive Hamaker or Derjaguin approximations, which neglect important electromagnetic boundary effects, can result in large discrepancies in the shapes and behaviors of liquid surfaces compared to exact methods.
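    For contrast with the exact methods described in this record, a minimal sketch of the additive Hamaker approximation for two parallel half-spaces is given below; the Hamaker constant is an assumed order of magnitude, not a value from the paper.

```python
# Additive (Hamaker) approximation that the record's exact methods improve upon:
# the non-retarded vdW interaction energy per unit area between two parallel
# half-spaces separated by a gap d is  W(d) = -A / (12 * pi * d**2).
# The Hamaker constant below is a typical order of magnitude, not a fitted value.
import math

A_HAMAKER = 1e-19  # J, assumed order of magnitude for a solid/vacuum/solid system

def hamaker_energy_per_area(d):
    """vdW energy per unit area [J/m^2] at separation d [m]."""
    return -A_HAMAKER / (12.0 * math.pi * d ** 2)

def hamaker_pressure(d):
    """vdW pressure [Pa] = -dW/dd (negative, i.e. attractive)."""
    return -A_HAMAKER / (6.0 * math.pi * d ** 3)

for d_nm in (1, 5, 20):
    d = d_nm * 1e-9
    print(f"d = {d_nm:2d} nm: W = {hamaker_energy_per_area(d):.2e} J/m^2, "
          f"P = {hamaker_pressure(d):.2e} Pa")
```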

  20. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper poses the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…

  1. Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.

    ERIC Educational Resources Information Center

    McDonough, Kristin

    The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…

  2. Computational ecology as an emerging science

    PubMed Central

    Petrovskii, Sergei; Petrovskaya, Natalia

    2012-01-01

    It has long been recognized that numerical modelling and computer simulations can be used as a powerful research tool to understand, and sometimes to predict, the tendencies and peculiarities in the dynamics of populations and ecosystems. It has been, however, much less appreciated that the context of modelling and simulations in ecology is essentially different from those that normally exist in other natural sciences. In our paper, we review the computational challenges arising in modern ecology in the spirit of computational mathematics, i.e. with our main focus on the choice and use of adequate numerical methods. Somewhat paradoxically, the complexity of ecological problems does not always require the use of complex computational methods. This paradox, however, can be easily resolved if we recall that application of sophisticated computational methods usually requires clear and unambiguous mathematical problem statement as well as clearly defined benchmark information for model validation. At the same time, many ecological problems still do not have mathematically accurate and unambiguous description, and available field data are often very noisy, and hence it can be hard to understand how the results of computations should be interpreted from the ecological viewpoint. In this scientific context, computational ecology has to deal with a new paradigm: conventional issues of numerical modelling such as convergence and stability become less important than the qualitative analysis that can be provided with the help of computational techniques. We discuss this paradigm by considering computational challenges arising in several specific ecological applications. PMID:23565336

  3. Electromechanical models of the ventricles

    PubMed Central

    Constantino, Jason; Gurev, Viatcheslav

    2011-01-01

    Computational modeling has traditionally played an important role in dissecting the mechanisms for cardiac dysfunction. Ventricular electromechanical models, likely the most sophisticated virtual organs to date, integrate detailed information across the spatial scales of cardiac electrophysiology and mechanics and are capable of capturing the emergent behavior and the interaction between electrical activation and mechanical contraction of the heart. The goal of this review is to provide an overview of the latest advancements in multiscale electromechanical modeling of the ventricles. We first detail the general framework of multiscale ventricular electromechanical modeling and describe the state of the art in computational techniques and experimental validation approaches. The powerful utility of ventricular electromechanical models in providing a better understanding of cardiac function is then demonstrated by reviewing the latest insights obtained by these models, focusing primarily on the mechanisms by which mechanoelectric coupling contributes to ventricular arrhythmogenesis, the relationship between electrical activation and mechanical contraction in the normal heart, and the mechanisms of mechanical dyssynchrony and resynchronization in the failing heart. Computational modeling of cardiac electromechanics will continue to complement basic science research and clinical cardiology and holds promise to become an important clinical tool aiding the diagnosis and treatment of cardiac disease. PMID:21572017

  4. Experience with a sophisticated computer based authoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, P.R.

    1984-04-01

    In the November 1982 issue of ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.

  5. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  6. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  7. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    NASA Astrophysics Data System (ADS)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application are presented here, and progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content-based image retrieval, and data mining. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" for reliably computing optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape-invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust capture and parameterization of geometric/colour invariant object attributes for reliable automated object learning and discrimination can benefit greatly from the GEOGINE progressive automated model generation computational kernel. The main operational advantages over previous, similar approaches are: 1) progressive automated invariant model generation, 2) an invariant minimal complete description set for computational efficiency, and 3) arbitrary model precision for robust object description and identification.

  8. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  9. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.
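    As a deliberately simple example of the kind of differential-equation modeling this record alludes to (not a model from the article), the sketch below integrates logistic tumour growth with a constant-rate therapy term; all parameters are invented.

```python
# Toy differential-equation model of tumour growth with treatment:
#   dN/dt = r*N*(1 - N/K) - kill*N, integrated with forward Euler.
# Growth rate, carrying capacity, and kill rate are illustrative only.
def simulate_tumour(n0=1e6, r=0.05, k=1e9, kill=0.0, days=365, dt=0.1):
    n = n0
    for _ in range(int(days / dt)):
        n += dt * (r * n * (1.0 - n / k) - kill * n)
    return n

print(f"cells after one year, untreated: {simulate_tumour(kill=0.0):.3e}")
print(f"cells after one year, treated:   {simulate_tumour(kill=0.02):.3e}")
```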

  10. Simulation Of Seawater Intrusion With 2D And 3D Models: Nauru Island Case Study

    NASA Astrophysics Data System (ADS)

    Ghassemi, F.; Jakeman, A. J.; Jacobson, G.; Howard, K. W. F.

    1996-03-01

    With the advent of large computing capacities during the past few decades, sophisticated models have been developed for the simulation of seawater intrusion in coastal and island aquifers. Currently, several models are commercially available for the simulation of this problem. This paper describes the mathematical basis and application of the SUTRA and HST3D models to simulate seawater intrusion in Nauru Island, in the central Pacific Ocean. A comparison of the performance and limitations of these two models in simulating a real problem indicates that three-dimensional simulation of seawater intrusion with the HST3D model has the major advantage of being able to specify natural boundary conditions as well as pumping stresses. However, HST3D requires a small grid size and short time steps in order to maintain numerical stability and accuracy. These requirements lead to solution of a large set of linear equations that requires the availability of powerful computing facilities in terms of memory and computing speed. Combined results of the two simulation models indicate a safe pumping rate of 400 m3/d for the aquifer on Nauru Island, where additional fresh water is presently needed for the rehabilitation of mined-out land.

  11. Wires in the soup: quantitative models of cell signaling

    PubMed Central

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, and their unraveling requires sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  12. Evolution of Computational Toxicology-from Primitive ...

    EPA Pesticide Factsheets

    Presentation at the Health Canada seminar in Ottawa, ON, Canada, on Nov. 15, 2016, on the Evolution of Computational Toxicology - from Primitive Beginnings to Sophisticated Application

  13. Evaluating coastal and river valley communities evacuation network performance using macroscopic productivity.

    DOT National Transportation Integrated Search

    2017-06-30

    The ever-increasing processing speed and computational power of computers and simulation systems has led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...

  14. Computers for Interactive Learning.

    ERIC Educational Resources Information Center

    Grabowski, Barbara; Aggen, William

    1984-01-01

    Analyzes features of computer-based interactive video including sophisticated answer judging, diagnostic feedback, simulation, animation, audible tones, touch sensitive screen, function keys, and video enhancements, and matches these to the characteristics and pedagogical styles of learners. The learner characteristics discussed include internal…

  15. Toward a molecular programming language for algorithmic self-assembly

    NASA Astrophysics Data System (ADS)

    Patitz, Matthew John

    Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.
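    A minimal sketch of the Tile Assembly Model's cooperative attachment rule at temperature 2 is given below; the tile set (a seed corner, strength-2 row and column glues, and one filler tile) is a standard textbook-style construction chosen only to show the mechanics, not a system from the thesis.

```python
# Minimal abstract Tile Assembly Model (aTAM) sketch at temperature tau = 2.
# A tile attaches at an empty site if the summed strength of its glues that
# match already-placed neighbours is >= tau.  The tile set is illustrative.
TAU = 2

# Each tile: side -> (glue label, strength) or None for no glue.
TILES = {
    "seed":   {"N": ("col", 2), "E": ("row", 2), "S": None,       "W": None},
    "row":    {"N": ("a", 1),   "E": ("row", 2), "S": None,       "W": ("row", 2)},
    "col":    {"N": ("col", 2), "E": ("b", 1),   "S": ("col", 2), "W": None},
    "filler": {"N": ("a", 1),   "E": ("b", 1),   "S": ("a", 1),   "W": ("b", 1)},
}
OFFSET = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
OPPOSITE = {"N": "S", "S": "N", "E": "W", "W": "E"}

def binding_strength(pos, tile, assembly):
    """Total strength of glues matching (same label and strength) placed neighbours."""
    total = 0
    for side, (dx, dy) in OFFSET.items():
        nbr = assembly.get((pos[0] + dx, pos[1] + dy))
        if nbr is None or tile[side] is None:
            continue
        if TILES[nbr][OPPOSITE[side]] == tile[side]:
            total += tile[side][1]
    return total

def grow(width=6, height=6):
    """Grow from the seed until no tile can attach inside the window."""
    assembly = {(0, 0): "seed"}
    changed = True
    while changed:
        changed = False
        for x in range(width):
            for y in range(height):
                if (x, y) in assembly:
                    continue
                for name, tile in TILES.items():
                    if binding_strength((x, y), tile, assembly) >= TAU:
                        assembly[(x, y)] = name
                        changed = True
                        break
    return assembly

result = grow()
for y in reversed(range(6)):                 # print first letters, top row first
    print(" ".join(result.get((x, y), ".")[0] for x in range(6)))
```

    The row and column glues have strength 2, so those tiles attach to a single neighbour, while the filler tile carries only strength-1 glues and therefore needs two neighbours at once: the cooperative binding that makes temperature-2 systems computationally powerful.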

  16. Connecting Numerical Relativity and Data Analysis of Gravitational Wave Detectors

    NASA Astrophysics Data System (ADS)

    Shoemaker, Deirdre; Jani, Karan; London, Lionel; Pekowsky, Larne

    Gravitational waves deliver information in exquisite detail about astrophysical phenomena, among them the collision of two black holes, a system completely invisible to the eyes of electromagnetic telescopes. Models that predict gravitational wave signals from likely sources are crucial for the success of this endeavor. Modeling binary black hole sources of gravitational radiation requires solving the Einstein equations of General Relativity using powerful computer hardware and sophisticated numerical algorithms. This proceeding presents where we are in understanding ground-based gravitational waves resulting from the merger of black holes and the implications of these sources for the advent of gravitational-wave astronomy.

  17. The Tangent Linear and Adjoint of the FV3 Dynamical Core: Development and Applications

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel

    2018-01-01

    GMAO (NASA's Global Modeling and Assimilation Office) has developed a highly sophisticated adjoint modeling system based on the most recent version of the finite volume cubed sphere (FV3) dynamical core. This provides a mechanism for investigating sensitivity to initial conditions and examining observation impacts. It also allows for the computation of singular vectors and for the implementation of hybrid 4DVAR (4-Dimensional Variational Assimilation). In this work we present the scientific assessment of the new adjoint system and show results from a number of research applications of the adjoint system.

  18. Electron collisions with atoms, ions, molecules, and surfaces: Fundamental science empowering advances in technology

    PubMed Central

    Bartschat, Klaus; Kushner, Mark J.

    2016-01-01

    Electron collisions with atoms, ions, molecules, and surfaces are critically important to the understanding and modeling of low-temperature plasmas (LTPs), and so in the development of technologies based on LTPs. Recent progress in obtaining experimental benchmark data and the development of highly sophisticated computational methods is highlighted. With the cesium-based diode-pumped alkali laser and remote plasma etching of Si3N4 as examples, we demonstrate how accurate and comprehensive datasets for electron collisions enable complex modeling of plasma-using technologies that empower our high-technology–based society. PMID:27317740

  19. Probabilistic models of cognition: conceptual foundations.

    PubMed

    Chater, Nick; Tenenbaum, Joshua B; Yuille, Alan

    2006-07-01

    Remarkable progress in the mathematics and computer science of probability has led to a revolution in the scope of probabilistic models. In particular, 'sophisticated' probabilistic methods apply to structured relational systems such as graphs and grammars, of immediate relevance to the cognitive sciences. This Special Issue outlines progress in this rapidly developing field, which provides a potentially unifying perspective across a wide range of domains and levels of explanation. Here, we introduce the historical and conceptual foundations of the approach, explore how the approach relates to studies of explicit probabilistic reasoning, and give a brief overview of the field as it stands today.

  20. An analog retina model for detecting dim moving objects against a bright moving background

    NASA Technical Reports Server (NTRS)

    Searfus, R. M.; Colvin, M. E.; Eeckman, F. H.; Teeters, J. L.; Axelrod, T. S.

    1991-01-01

    We are interested in applications that require the ability to track a dim target against a bright, moving background. Since the target signal will be less than or comparable to the variations in the background signal intensity, sophisticated techniques must be employed to detect the target. We present an analog retina model that adapts to the motion of the background in order to enhance targets that have a velocity difference with respect to the background. Computer simulation results and our preliminary concept of an analog 'Z' focal plane implementation are also presented.

  1. An expert system for municipal solid waste management simulation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, M.C.; Chang, N.B.

    1996-12-31

    Optimization techniques have usually been used to model a complicated metropolitan solid waste management system and to search for the best dynamic combination of waste recycling, facility siting, and system operation, but they require sophisticated and well-defined interrelationships in the modeling process. This paper instead applies Concurrent Object-Oriented Simulation (COOS), a new simulation software construction method, to bridge the gap between the physical system and its computer representation. A case study of the Kaohsiung solid waste management system in Taiwan illustrates the analytical methodology of COOS and its implementation in the creation of an expert system.

  2. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
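    In the spirit of the book's programming exercises, the following is a minimal sketch of one canonical Earth surface process model, linear hillslope diffusion dz/dt = D d2z/dx2, solved with an explicit finite-difference scheme; the diffusivity, grid, and initial scarp profile are assumptions for illustration.

```python
# Linear hillslope diffusion, dz/dt = D * d2z/dx2, a canonical Earth-surface
# process model, integrated with an explicit finite-difference scheme.
# Diffusivity, grid spacing, and the initial scarp profile are assumed values.
import numpy as np

D = 0.01                       # m^2/yr, assumed hillslope diffusivity
dx = 1.0                       # m
dt = 0.4 * dx**2 / (2 * D)     # within the explicit stability limit dt <= dx^2/(2D)

x = np.arange(0.0, 100.0, dx)
z = np.where(x < 50.0, 10.0, 0.0)        # initial fault scarp, 10 m high

def step(z):
    """One explicit time step; end points act as fixed-elevation boundaries."""
    z_new = z.copy()
    z_new[1:-1] += dt * D * (z[2:] - 2 * z[1:-1] + z[:-2]) / dx**2
    return z_new

for _ in range(int(5000 / dt)):          # evolve for about 5000 years
    z = step(z)
print("scarp mid-slope gradient after ~5 kyr:", float((z[49] - z[51]) / (2 * dx)))
```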

  4. Restoring Fort Frontenac in 3D: Effective Usage of 3D Technology for Heritage Visualization

    NASA Astrophysics Data System (ADS)

    Yabe, M.; Goins, E.; Jackson, C.; Halbstein, D.; Foster, S.; Bazely, S.

    2015-02-01

    This paper is composed of three elements: 3D modeling, web design, and heritage visualization. The aim is to use computer graphics design to inform and create an interest in historical visualization by rebuilding Fort Frontenac using 3D modeling and interactive design. The final model will be integrated into an interactive website to learn more about the fort's historic importance. It is apparent that using computer graphics can save time and money when it comes to historical visualization. Visitors do not have to travel to the actual archaeological buildings. They can simply use the Web in their own home to learn about this information virtually. Meticulously following historical records to create a sophisticated restoration of archaeological buildings will draw viewers into visualizations, such as the historical world of Fort Frontenac. As a result, it allows the viewers to effectively understand the fort's social system, habits, and historical events.

  5. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
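    As a toy version of the Bayesian-network assessment idea mentioned in this record, the sketch below updates belief in a single latent skill from observed task outcomes; the prior and conditional probabilities are invented, and a deployed system would use a full network tied to an ontology.

```python
# Minimal Bayesian-network-style assessment: one latent binary "skill" node and
# conditionally independent observed task outcomes.  All probabilities are
# invented for illustration.
def posterior_skill(outcomes, prior=0.5, p_correct_given_skill=0.85,
                    p_correct_given_no_skill=0.30):
    """Return P(skill | observed pass/fail outcomes) by direct enumeration."""
    like_skill, like_no = prior, 1.0 - prior
    for passed in outcomes:
        like_skill *= p_correct_given_skill if passed else 1 - p_correct_given_skill
        like_no *= p_correct_given_no_skill if passed else 1 - p_correct_given_no_skill
    return like_skill / (like_skill + like_no)

# A trainee passes 4 of 5 simulated tasks:
print(f"P(skill | evidence) = {posterior_skill([True, True, False, True, True]):.2f}")
```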

  6. Supercomputer use in orthopaedic biomechanics research: focus on functional adaptation of bone.

    PubMed

    Hart, R T; Thongpreda, N; Van Buskirk, W C

    1988-01-01

    The authors describe two biomechanical analyses carried out using numerical methods. One is an analysis of the stress and strain in a human mandible, and the other analysis involves modeling the adaptive response of a sheep bone to mechanical loading. The computing environment required for the two types of analyses is discussed. It is shown that a simple stress analysis of a geometrically complex mandible can be accomplished using a minicomputer. However, more sophisticated analyses of the same model with dynamic loading or nonlinear materials would require supercomputer capabilities. A supercomputer is also required for modeling the adaptive response of living bone, even when simple geometric and material models are used.

  7. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  8. Cancer's Big Data Problem

    DOE PAGES

    Breaux, Justin H. S.

    2017-03-15

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  9. Cancer's Big Data Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breaux, Justin H. S.

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  10. VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data

    PubMed Central

    Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel

    2014-01-01

    This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198

  11. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  12. Use of Computer-Assisted Technologies (CAT) to Enhance Social, Communicative, and Language Development in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Ploog, Bertram O.; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J.

    2013-01-01

    Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavignet, A.A.; Wick, C.J.

    In current practice, pressure drops in the mud circulating system and the settling velocity of cuttings are calculated with simple rheological models and simple equations. Wellsite computers now allow more sophistication in drilling computations. In this paper, experimental results on the settling velocity of spheres in drilling fluids are reported, along with rheograms done over a wide range of shear rates. The flow curves are fitted to polynomials and general methods are developed to predict friction losses and settling velocities as functions of the polynomial coefficients. These methods were incorporated in a software package that can handle any rig configuration system, including riser booster. Graphic displays show the effect of each parameter on the performance of the circulating system.

  14. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.

  15. A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0

    PubMed Central

    Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.

    2014-01-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072

  16. Application of CFD to a generic hypersonic flight research study

    NASA Technical Reports Server (NTRS)

    Green, Michael J.; Lawrence, Scott L.; Dilley, Arthur D.; Hawkins, Richard W.; Walker, Mary M.; Oberkampf, William L.

    1993-01-01

    Computational analyses have been performed for the initial assessment of flight research vehicle concepts that satisfy requirements for potential hypersonic experiments. Results were obtained from independent analyses at NASA Ames, NASA Langley, and Sandia National Labs, using sophisticated time-dependent Navier-Stokes and parabolized Navier-Stokes methods. Careful study of a common problem consisting of hypersonic flow past a slightly blunted conical forebody was undertaken to estimate the level of uncertainty in the computed results, and to assess the capabilities of current computational methods for predicting boundary-layer transition onset. Results of this study in terms of surface pressure and heat transfer comparisons, as well as comparisons of boundary-layer edge quantities and flow-field profiles are presented here. Sensitivities to grid and gas model are discussed. Finally, representative results are presented relating to the use of Computational Fluid Dynamics in the vehicle design and the integration/support of potential experiments.

  17. High-Performance Computing Act of 1991. Report of the Senate Committee on Commerce, Science, and Transportation on S. 272. Senate, 102d Congress, 1st Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.

    This report discusses Senate Bill no. 272, which provides for a coordinated federal research and development program to ensure continued U.S. leadership in high-performance computing. High performance computing is defined as representing the leading edge of technological advancement in computing, i.e., the most sophisticated computer chips, the…

  18. Simple calculation of ab initio melting curves: Application to aluminum.

    PubMed

    Robert, Grégory; Legrand, Philippe; Arnault, Philippe; Desbiens, Nicolas; Clérouin, Jean

    2015-03-01

    We present a simple, fast, and promising method to compute the melting curves of materials with ab initio molecular dynamics. It is based on the two-phase thermodynamic model of Lin et al [J. Chem. Phys. 119, 11792 (2003)] and its improved version given by Desjarlais [Phys. Rev. E 88, 062145 (2013)]. In this model, the velocity autocorrelation function is utilized to calculate the contribution of the nuclei motion to the entropy of the solid and liquid phases. It is then possible to find the thermodynamic conditions of equal Gibbs free energy between these phases, defining the melting curve. The first benchmark on the face-centered cubic melting curve of aluminum from 0 to 300 GPa demonstrates how to obtain an accuracy of 5%-10%, comparable to the most sophisticated methods, for a much lower computational cost.
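
    As a rough illustration of the workflow this record describes, the Python sketch below (not the authors' code; the function names, the single-time-origin autocorrelation, and the linear interpolation of the crossing point are simplifying assumptions) chains the three steps the abstract mentions: a velocity autocorrelation function from molecular-dynamics velocities, a vibrational density of states from its Fourier transform, and a melting temperature located where the solid and liquid Gibbs free energies become equal.

      # Illustrative sketch only: outline of a two-phase thermodynamic
      # melting-point estimate. Velocity arrays from separate solid- and
      # liquid-phase ab initio MD runs are assumed to be available.
      import numpy as np

      def vacf(v):
          """Normalized velocity autocorrelation function.
          v: velocities with shape (n_steps, n_atoms, 3)."""
          acf = np.array([np.sum(v[dt] * v[0]) for dt in range(v.shape[0])])
          return acf / acf[0]

      def vibrational_dos(acf, dt_fs):
          """Vibrational density of states as the Fourier transform of the VACF."""
          return np.abs(np.fft.rfft(acf)) * dt_fs

      def melting_temperature(temps, g_solid, g_liquid):
          """Temperature at which the solid and liquid Gibbs free energies cross."""
          dg = np.asarray(g_solid) - np.asarray(g_liquid)
          for i in range(len(temps) - 1):
              if dg[i] * dg[i + 1] <= 0.0:       # sign change brackets the crossing
                  frac = dg[i] / (dg[i] - dg[i + 1])
                  return temps[i] + frac * (temps[i + 1] - temps[i])
          return None                            # no crossing in the sampled range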

  19. Forecasting the Occurrence of Severe Haze Events in Asia using Machine Learning Algorithms

    NASA Astrophysics Data System (ADS)

    Walton, A. L.

    2016-12-01

    Particulate pollution has become a serious environmental issue in many Asian countries in recent decades, threatening human health and frequently causing low-visibility or haze days that disrupt work, outdoor, and school activities as well as air, road, and sea transportation. Ultimately preventing such severe haze requires many difficult tasks to be accomplished by all involved countries or parties, dealing with trade and negotiation, emission control, energy consumption, transportation, and land and plantation management, among others. Before these difficult measures can take place, it is more practical to reduce the economic loss by developing the skill to predict the occurrence of such events with reasonable accuracy, so that effective mitigation or adaptation measures can be implemented ahead of time. "Traditional" numerical models based on fluid dynamics and explicit or parameterized representations of physicochemical processes can certainly be used for this task. However, the significant and complicated spatiotemporal variability associated with these events, the propagation of numerical and parameterization errors through model integration, and the computational demand all pose serious challenges to using these models for this interdisciplinary task. On the other hand, large quantities of meteorological, hydrological, atmospheric aerosol and composition, and surface visibility data from in-situ observations, reanalyses, or satellite retrievals have become available to the community. These data may still be insufficient for evaluating and improving certain important aspects of the "traditional" models. Nevertheless, they can likely already support the development of an alternative, "task-oriented," and computationally efficient forecasting skill based on deep machine learning, avoiding the need to deal directly with the complicated interplay across multiple process layers. I will present a case study of applying machine learning techniques to predict the occurrence of severe haze events in Asia.
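
    As a minimal sketch of the "task-oriented" approach described above (and much simpler than the deep learning models the abstract refers to), the Python example below trains a small neural-network classifier on tabular predictors of severe haze. The file name, feature names, and model choice are illustrative assumptions.

      # Illustrative sketch only: a simple haze-occurrence classifier trained on
      # hypothetical daily records with a 0/1 "severe_haze" label.
      import pandas as pd
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.metrics import classification_report

      df = pd.read_csv("haze_daily_records.csv")          # hypothetical data file
      features = ["wind_speed", "rel_humidity", "temperature",
                  "boundary_layer_height", "aerosol_optical_depth", "precipitation"]
      X, y = df[features].to_numpy(), df["severe_haze"].to_numpy()

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.2, stratify=y, random_state=0)

      model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
      model.fit(X_train, y_train)
      print(classification_report(y_test, model.predict(X_test)))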

  20. Forecasting the Occurrence of Severe Haze Events in Asia using Machine Learning Algorithms

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2017-12-01

    Particulate pollution has become a serious environmental issue in many Asian countries in recent decades, threatening human health and frequently causing low-visibility or haze days that disrupt work, outdoor, and school activities as well as air, road, and sea transportation. Ultimately preventing such severe haze requires many difficult tasks to be accomplished by all involved countries or parties, dealing with trade and negotiation, emission control, energy consumption, transportation, and land and plantation management, among others. Before these difficult measures can take place, it is more practical to reduce the economic loss by developing the skill to predict the occurrence of such events with reasonable accuracy, so that effective mitigation or adaptation measures can be implemented ahead of time. "Traditional" numerical models based on fluid dynamics and explicit or parameterized representations of physicochemical processes can certainly be used for this task. However, the significant and complicated spatiotemporal variability associated with these events, the propagation of numerical and parameterization errors through model integration, and the computational demand all pose serious challenges to using these models for this interdisciplinary task. On the other hand, large quantities of meteorological, hydrological, atmospheric aerosol and composition, and surface visibility data from in-situ observations, reanalyses, or satellite retrievals have become available to the community. These data may still be insufficient for evaluating and improving certain important aspects of the "traditional" models. Nevertheless, they can likely already support the development of an alternative, "task-oriented," and computationally efficient forecasting skill based on deep machine learning, avoiding the need to deal directly with the complicated interplay across multiple process layers. I will present a case study of applying machine learning techniques to predict the occurrence of severe haze events in Asia.

  21. Effect of two viscosity models on lethality estimation in sterilization of liquid canned foods.

    PubMed

    Calderón-Alvarado, M P; Alvarado-Orozco, J M; Herrera-Hernández, E C; Martínez-González, G M; Miranda-López, R; Jiménez-Islas, H

    2016-09-01

    A numerical study of 2D natural convection in cylindrical cavities during the sterilization of liquid foods was performed. The mathematical model was based on momentum and energy balances and predicts both the heating dynamics of the slowest heating zone (SHZ) and the lethal rate achieved in homogeneous liquid canned foods. Two levels of sophistication were proposed for viscosity modelling: 1) using an average viscosity and 2) using an Arrhenius-type model to include the effect of temperature on viscosity. The remaining thermodynamic properties were kept constant. The governing equations were spatially discretized via orthogonal collocation (OC) with a mesh size of 25 × 25. Computational simulations were performed using proximate and thermodynamic data for carrot-orange soup, broccoli-cheddar soup, tomato puree, and cream-style corn. Flow patterns, isothermals, heating dynamics of the SHZ, and the sterilization rate achieved were compared for both viscosity models in the cases studied. The dynamics of the coldest point and the lethal rate F0 in all food fluids studied were approximately equal in both cases, although the second level of sophistication is closer to the physical behavior. The model accuracy compared favorably with the reported sterilization time for cream-style corn packed in a 303 × 406 can size, predicting 66 min versus an experimental time of 68 min at a retort temperature of 121.1 ℃. © The Author(s) 2016.
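
    The two viscosity treatments and the lethality measure mentioned above can be summarized in a short sketch. The Python code below is illustrative only; the parameter values are placeholders rather than the paper's data, and the F0 expression is the standard lethality integral with a reference temperature of 121.1 °C and z = 10 °C.

      # Illustrative sketch only: constant vs. Arrhenius-type viscosity, and
      # the standard F0 lethality integral over a time-temperature history.
      import numpy as np

      def viscosity_constant(mu_avg):
          """Level 1: a single average viscosity, independent of temperature."""
          return lambda T_kelvin: mu_avg

      def viscosity_arrhenius(mu_ref, Ea, T_ref):
          """Level 2: mu(T) = mu_ref * exp((Ea/R) * (1/T - 1/T_ref)), T in kelvin."""
          R = 8.314  # J/(mol K)
          return lambda T_kelvin: mu_ref * np.exp((Ea / R) * (1.0 / T_kelvin - 1.0 / T_ref))

      def lethality_F0(times_min, temps_C, T_ref_C=121.1, z_C=10.0):
          """F0 = integral of 10**((T - T_ref)/z) dt, trapezoidal rule, minutes."""
          rates = 10.0 ** ((np.asarray(temps_C, dtype=float) - T_ref_C) / z_C)
          dt = np.diff(np.asarray(times_min, dtype=float))
          return float(np.sum(0.5 * (rates[1:] + rates[:-1]) * dt))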

  1. Current Status on the use of Parallel Computing in Turbulent Reacting Flow Computations Involving Sprays, Monte Carlo PDF and Unstructured Grids. Chapter 4

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The state of the art in multidimensional combustor modeling, as evidenced by the level of sophistication employed in terms of modeling and numerical accuracy considerations, is also dictated by the available computer memory and turnaround times afforded by present-day computers. With the aim of advancing the current multi-dimensional computational tools used in the design of advanced technology combustors, a solution procedure is developed that combines the novelty of coupled CFD/spray/scalar Monte Carlo PDF (Probability Density Function) computations on unstructured grids with the ability to run on parallel architectures. In this approach, the mean gas-phase velocity and turbulence fields are determined from a standard turbulence model, the joint composition of species and enthalpy from the solution of a modeled PDF transport equation, and a Lagrangian-based dilute spray model is used for the liquid-phase representation. Gas-turbine combustor flows are often characterized by complex interactions among various physical processes: the coupling between the liquid and gas phases, droplet vaporization, turbulent mixing, heat release associated with chemical kinetics, and radiative heat transfer associated with highly absorbing and radiating species, among others. The rate-controlling processes often interact with each other at disparate time and length scales. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and liquid-phase evaporation in many practical combustion devices.

  2. Progress in Computational Electron-Molecule Collisions

    NASA Astrophysics Data System (ADS)

    Rescigno, T. N.

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  3. Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.

    ERIC Educational Resources Information Center

    Strenglein, Denise

    1980-01-01

    It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…

  4. New Ways of Using Computers in Language Teaching. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Boswood, Tim, Ed.

    A collection of classroom approaches and activities using computers for language learning is presented. Some require sophisticated installations, but most do not, and most use software readily available on most workplace computer systems. The activities were chosen because they use sound language learning strategies. The book is divided into five…

  5. Evaluation of CFD to Determine Two-Dimensional Airfoil Characteristics for Rotorcraft Applications

    NASA Technical Reports Server (NTRS)

    Smith, Marilyn J.; Wong, Tin-Chee; Potsdam, Mark; Baeder, James; Phanse, Sujeet

    2004-01-01

    The efficient prediction of helicopter rotor performance, vibratory loads, and aeroelastic properties still relies heavily on the use of comprehensive analysis codes by the rotorcraft industry. These comprehensive codes utilize look-up tables to provide two-dimensional aerodynamic characteristics. Typically these tables are comprised of a combination of wind tunnel data, empirical data and numerical analyses. The potential to rely more heavily on numerical computations based on Computational Fluid Dynamics (CFD) simulations has become more of a reality with the advent of faster computers and more sophisticated physical models. The ability of five different CFD codes applied independently to predict the lift, drag and pitching moments of rotor airfoils is examined for the SC1095 airfoil, which is utilized in the UH-60A main rotor. Extensive comparisons with the results of ten wind tunnel tests are performed. These CFD computations are found to be as good as experimental data in predicting many of the aerodynamic performance characteristics. Four turbulence models were examined (Baldwin-Lomax, Spalart-Allmaras, Menter SST, and k-omega).

  6. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.

  7. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  8. Railroads and the Environment : Estimation of Fuel Consumption in Rail Transportation : Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  9. Home, Hearth and Computing.

    ERIC Educational Resources Information Center

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  10. Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows

    DTIC Science & Technology

    2015-06-01

    sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well...Manna, M. “Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows,” AIAA Journal, Vol. 32, No. 2, Feb. 1994

  11. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  12. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  13. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  14. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  15. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  16. Ontology-based vector space model and fuzzy query expansion to retrieve knowledge on medical computational problem solutions.

    PubMed

    Bratsas, Charalampos; Koutkias, Vassilis; Kaimakamis, Evangelos; Bamidis, Panagiotis; Maglaveras, Nicos

    2007-01-01

    Medical Computational Problem (MCP) solving is related to medical problems and their computerized algorithmic solutions. In this paper, an extension of an ontology-based model to fuzzy logic is presented, as a means to enhance the information retrieval (IR) procedure in semantic management of MCPs. We present herein the methodology followed for the fuzzy expansion of the ontology model, the fuzzy query expansion procedure, as well as an appropriate ontology-based Vector Space Model (VSM) that was constructed for efficient mapping of user-defined MCP search criteria and MCP acquired knowledge. The relevant fuzzy thesaurus is constructed by calculating the simultaneous occurrences of terms and the term-to-term similarities derived from the ontology that utilizes UMLS (Unified Medical Language System) concepts by using Concept Unique Identifiers (CUI), synonyms, semantic types, and broader-narrower relationships for fuzzy query expansion. The current approach constitutes a sophisticated advance for effective, semantics-based MCP-related IR.
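
    A toy version of the retrieval machinery described above can make the idea concrete. The Python sketch below is not the authors' system: the tiny vocabulary, the similarity values standing in for the fuzzy thesaurus, and the max-min expansion rule are all assumptions. It expands a one-term query through a term-to-term similarity matrix and then ranks documents by cosine similarity in a term-document vector space.

      # Illustrative sketch only: fuzzy query expansion over a small
      # term-document vector space model.
      import numpy as np

      terms = ["regression", "classification", "survival", "prognosis"]
      doc_term = np.array([            # rows = documents, columns = term weights
          [0.9, 0.0, 0.1, 0.0],
          [0.0, 0.8, 0.0, 0.3],
          [0.0, 0.2, 0.7, 0.6],
      ])
      sim = np.array([                 # stand-in fuzzy thesaurus (term-to-term similarity)
          [1.0, 0.3, 0.1, 0.1],
          [0.3, 1.0, 0.2, 0.4],
          [0.1, 0.2, 1.0, 0.6],
          [0.1, 0.4, 0.6, 1.0],
      ])

      def expand_query(q, sim):
          """Max-min fuzzy expansion: each term receives its strongest support."""
          q = np.asarray(q, dtype=float)
          return np.max(np.minimum(q[:, None], sim), axis=0)

      def rank(query, doc_term):
          """Cosine-similarity ranking of documents against the expanded query."""
          scores = doc_term @ query / (
              np.linalg.norm(doc_term, axis=1) * np.linalg.norm(query) + 1e-12)
          return np.argsort(-scores), scores

      q = expand_query([0.0, 1.0, 0.0, 0.0], sim)   # user asked only for "classification"
      order, scores = rank(q, doc_term)
      print(order, np.round(scores, 3))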

  17. Multiscale methods for gore curvature calculations from FSI modeling of spacecraft parachutes

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kolesar, Ryan; Boswell, Cody; Kanai, Taro; Montel, Kenneth

    2014-12-01

    There are now some sophisticated and powerful methods for computer modeling of parachutes. These methods are capable of addressing some of the most formidable computational challenges encountered in parachute modeling, including fluid-structure interaction (FSI) between the parachute and air flow, design complexities such as those seen in spacecraft parachutes, and operational complexities such as use in clusters and disreefing. One should be able to extract from a reliable full-scale parachute modeling any data or analysis needed. In some cases, however, the parachute engineers may want to perform quickly an extended or repetitive analysis with methods based on simplified models. Some of the data needed by a simplified model can very effectively be extracted from a full-scale computer modeling that serves as a pilot. A good example of such data is the circumferential curvature of a parachute gore, where a gore is the slice of the parachute canopy between two radial reinforcement cables running from the parachute vent to the skirt. We present the multiscale methods we devised for gore curvature calculation from FSI modeling of spacecraft parachutes. The methods include those based on the multiscale sequentially-coupled FSI technique and using NURBS meshes. We show how the methods work for the fully-open and two reefed stages of the Orion spacecraft main and drogue parachutes.
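
    For readers who want a feel for what a circumferential curvature value involves at its most elementary level, the Python sketch below estimates curvature from three neighboring points sampled across a canopy gore by fitting the circle through them. This is only an assumption-laden stand-in; the multiscale, NURBS-based methods in the paper are far more elaborate.

      # Illustrative sketch only: curvature (1/R) of the circle through three
      # 3D points, e.g. points sampled across a gore circumference.
      import numpy as np

      def curvature_from_three_points(p1, p2, p3):
          a = np.linalg.norm(p2 - p3)
          b = np.linalg.norm(p1 - p3)
          c = np.linalg.norm(p1 - p2)
          s = 0.5 * (a + b + c)
          area = np.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0))  # Heron's formula
          if area == 0.0:
              return 0.0                      # collinear points: zero curvature
          return 4.0 * area / (a * b * c)     # circumradius R = a*b*c / (4*area)

      pts = [np.array([0.0, 0.0, 0.0]),
             np.array([1.0, 0.2, 0.0]),
             np.array([2.0, 0.0, 0.0])]
      print(curvature_from_three_points(*pts))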

  18. A Computing Method for Sound Propagation Through a Nonuniform Jet Stream

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Liu, C. H.

    1974-01-01

    Understanding the principles of jet noise propagation is an essential ingredient of systematic noise reduction research. High speed computer methods offer a unique potential for dealing with complex real life physical systems whereas analytical solutions are restricted to sophisticated idealized models. The classical formulation of sound propagation through a jet flow was found to be inadequate for computer solutions and a more suitable approach was needed. Previous investigations selected the phase and amplitude of the acoustic pressure as dependent variables requiring the solution of a system of nonlinear algebraic equations. The nonlinearities complicated both the analysis and the computation. A reformulation of the convective wave equation in terms of a new set of dependent variables is developed with a special emphasis on its suitability for numerical solutions on fast computers. The technique is very attractive because the resulting equations are linear in nonwaving variables. The computer solution to such a linear system of algebraic equations may be obtained by well-defined and direct means which are conservative of computer time and storage space. Typical examples are illustrated and computational results are compared with available numerical and experimental data.

  19. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record output from CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  20. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  1. Image-Based Predictive Modeling of Heart Mechanics.

    PubMed

    Wang, V Y; Nielsen, P M F; Nash, M P

    2015-01-01

    Personalized biophysical modeling of the heart is a useful approach for noninvasively analyzing and predicting in vivo cardiac mechanics. Three main developments support this style of analysis: state-of-the-art cardiac imaging technologies, modern computational infrastructure, and advanced mathematical modeling techniques. In vivo measurements of cardiac structure and function can be integrated using sophisticated computational methods to investigate mechanisms of myocardial function and dysfunction, and can aid in clinical diagnosis and developing personalized treatment. In this article, we review the state-of-the-art in cardiac imaging modalities, model-based interpretation of 3D images of cardiac structure and function, and recent advances in modeling that allow personalized predictions of heart mechanics. We discuss how using such image-based modeling frameworks can increase the understanding of the fundamental biophysics behind cardiac mechanics, and assist with diagnosis, surgical guidance, and treatment planning. Addressing the challenges in this field will require a coordinated effort from both the clinical-imaging and modeling communities. We also discuss future directions that can be taken to bridge the gap between basic science and clinical translation.

  2. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  3. Numerical models of volcanic eruption plumes: inter-comparison and sensitivity

    NASA Astrophysics Data System (ADS)

    Costa, Antonio; Suzuki, Yujiro; Folch, Arnau; Cioni, Raffaello

    2016-10-01

    The accurate description of the dynamics of convective plumes developed during explosive volcanic eruptions represents one of the most crucial and intriguing challenges in volcanology. Eruptive plume dynamics are significantly affected by complex interactions with the surrounding atmosphere, in the case of both strong eruption columns, rising vertically above the tropopause, and weak volcanic plumes, developing within the troposphere and often following bended trajectories. The understanding of eruptive plume dynamics is pivotal for estimating mass flow rates of volcanic sources, a crucial aspect for tephra dispersion models used to assess aviation safety and tephra fallout hazard. For these reasons, several eruption column models have been developed in the past decades, including the more recent sophisticated computational fluid dynamic models.

  4. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  5. Improving Undergraduate Computer Instruction: Experiments and Strategies

    ERIC Educational Resources Information Center

    Kalman, Howard K.; Ellis, Maureen L.

    2007-01-01

    Today, undergraduate students enter college with increasingly more sophisticated computer skills compared to their counterparts of 20 years ago. However, many instructors are still using traditional instructional strategies to teach this new generation. This research study discusses a number of strategies that were employed to teach a…

  6. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present an application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.

  7. Verification of a three-dimensional FEM model for FBGs in PANDA fibers by transversal load experiments

    NASA Astrophysics Data System (ADS)

    Fischer, Bennet; Hopf, Barbara; Lindner, Markus; Koch, Alexander W.; Roths, Johannes

    2017-04-01

    A 3D FEM model of an FBG in a PANDA fiber with an extended fiber length of 25.4 mm is presented. Simulating long fiber lengths with limited computer power is achieved by using an iterative solver and by optimizing the FEM mesh. For verification purposes, the model is adapted to a configuration with transversal loads on the fiber. The 3D FEM model results correspond with experimental data and with the results of an additional 2D FEM plain strain model. In further studies, this 3D model shall be applied to more sophisticated situations, for example to study the temperature dependence of surface-glued or embedded FBGs in PANDA fibers that are used for strain-temperature decoupling.

  8. Choice Rules and Accumulator Networks

    PubMed Central

    2015-01-01

    This article presents a preference accumulation model that can be used to implement a number of different multi-attribute heuristic choice rules, including the lexicographic rule, the majority of confirming dimensions (tallying) rule and the equal weights rule. The proposed model differs from existing accumulators in terms of attribute representation: Leakage and competition, typically applied only to preference accumulation, are also assumed to be involved in processing attribute values. This allows the model to perform a range of sophisticated attribute-wise comparisons, including comparisons that compute relative rank. The ability of a preference accumulation model composed of leaky competitive networks to mimic symbolic models of heuristic choice suggests that these 2 approaches are not incompatible, and that a unitary cognitive model of preferential choice, based on insights from both these approaches, may be feasible. PMID:28670592
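
    The accumulator dynamics described above can be sketched in a few lines. The Python code below is a generic leaky, competing accumulator, not the published model: the attribute inputs, parameter values, noise, and clipping rule are illustrative assumptions chosen only to show how leakage and lateral inhibition enter the update.

      # Illustrative sketch only: a leaky, competing preference accumulator.
      import numpy as np

      def leaky_competing_accumulation(inputs, leak=0.1, inhibition=0.2,
                                       noise_sd=0.05, n_steps=200, seed=0):
          """inputs: momentary evidence for each option, shape (n_options,).
          Each step: A <- A + input - leak*A - inhibition*(sum of rivals) + noise,
          with activations clipped at zero."""
          rng = np.random.default_rng(seed)
          a = np.zeros_like(inputs, dtype=float)
          for _ in range(n_steps):
              rivals = a.sum() - a                      # competition from the other options
              a = a + inputs - leak * a - inhibition * rivals \
                  + rng.normal(0.0, noise_sd, size=a.shape)
              a = np.maximum(a, 0.0)                    # activations cannot go negative
          return a

      # Example: three options whose evidence slightly favors the second one.
      print(leaky_competing_accumulation(np.array([0.30, 0.45, 0.25])))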

  9. Computational reacting gas dynamics

    NASA Technical Reports Server (NTRS)

    Lam, S. H.

    1993-01-01

    In the study of high speed flows at high altitudes, such as those encountered by re-entry spacecraft, the interaction of chemical reactions and other non-equilibrium processes in the flow field with the gas dynamics is crucial. Generally speaking, problems of this level of complexity must resort to numerical methods for solutions, using sophisticated computational fluid dynamics (CFD) codes. The difficulties introduced by reacting gas dynamics can be classified under three distinct headings: (1) the usually inadequate knowledge of the reaction rate coefficients in the non-equilibrium reaction system; (2) the vastly larger number of unknowns involved in the computation and the expected stiffness of the equations; and (3) the interpretation of the detailed reacting CFD numerical results. The research performed accepts the premise that reacting flows of practical interest in the future will in general be too complex or 'untractable' for traditional analytical developments. The power of modern computers must be exploited. However, instead of focusing solely on the construction of numerical solutions of full-model equations, attention is also directed to the 'derivation' of the simplified model from the given full-model. In other words, the present research aims to utilize computations to do tasks which have traditionally been done by skilled theoreticians: to reduce an originally complex full-model system into an approximate but otherwise equivalent simplified model system. The tacit assumption is that once the appropriate simplified model is derived, the interpretation of the detailed reacting CFD numerical results will become much easier. The approach of the research is called computational singular perturbation (CSP).

  10. Student Thinking Processes. The Influence of Immediate Computer Access on Students' Thinking. First- and Second-Year Findings. ACOT Report #3.

    ERIC Educational Resources Information Center

    Tierney, Robert J.

    This 2-year longitudinal study explored whether computers promote more sophisticated thinking, and examined how students' thinking changes as they become experienced computer users. The first-year study examined the thinking process of four ninth-grade Apple Classrooms of Tomorrow (ACOT) students. The second-year study continued following these…

  11. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  12. Radar Model of Asteroid 216 Kleopatra

    NASA Technical Reports Server (NTRS)

    2000-01-01

    These images show several views from a radar-based computer model of asteroid 216 Kleopatra. The object, located in the main asteroid belt between Mars and Jupiter, is about 217 kilometers (135 miles) long and about 94 kilometers (58 miles) wide, or about the size of New Jersey.

    This dog bone-shaped asteroid is an apparent leftover from an ancient, violent cosmic collision. Kleopatra is one of several dozen asteroids whose coloring suggests they contain metal.

    A team of astronomers observing Kleopatra used the 305-meter (1,000-foot) telescope of the Arecibo Observatory in Puerto Rico to bounce encoded radio signals off Kleopatra. Using sophisticated computer analysis techniques, they decoded the echoes, transformed them into images, and assembled a computer model of the asteroid's shape.

    The images were obtained when Kleopatra was about 171 million kilometers (106 million miles) from Earth. This model is accurate to within about 15 kilometers (9 miles).

    The Arecibo Observatory is part of the National Astronomy and Ionosphere Center, operated by Cornell University, Ithaca, N.Y., for the National Science Foundation. The Kleopatra radar observations were supported by NASA's Office of Space Science, Washington, DC. JPL is managed for NASA by the California Institute of Technology in Pasadena.

  13. Electronic Networking as an Avenue of Enhanced Professional Interchange.

    ERIC Educational Resources Information Center

    Ratcliff, James L.

    Electronic networking is communication between two or more people that involves one or more telecommunications media. There is electronic networking software available for most computers, including IBM, Apple, and Radio Shack personal computers. Depending upon the sophistication of the hardware and software used, individuals and groups can…

  14. Computer Aided Tests.

    ERIC Educational Resources Information Center

    Steinke, Elisabeth

    An approach to using the computer to assemble German tests is described. The purposes of the system would be: (1) an expansion of the bilingual lexical memory bank to list and store idioms of all degrees of difficulty, with frequency data and with complete and sophisticated retrieval possibility for assembly; (2) the creation of an…

  15. The effects of different representations on static structure analysis of computer malware signatures.

    PubMed

    Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban

    2013-01-01

    The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analysis were performed on publicly available tools and Weka.
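
    A minimal sketch may help to visualize the representation the authors use. The Python code below maps signature bytes onto the 20-letter amino-acid alphabet and scores two such "artificial protein" strings with a small global (Needleman-Wunsch-style) alignment. The byte-to-letter mapping, the scoring parameters, and the example hex strings are assumptions; the paper's analyses used standard, publicly available bioinformatics tools rather than this toy aligner.

      # Illustrative sketch only: signatures as artificial protein sequences,
      # compared with a simple global alignment score.
      AMINO = "ACDEFGHIKLMNPQRSTVWY"        # 20-letter protein alphabet

      def bytes_to_protein(data: bytes) -> str:
          """Map each byte onto one of the 20 amino-acid letters."""
          return "".join(AMINO[b % 20] for b in data)

      def align_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
          """Global (Needleman-Wunsch) alignment score between two sequences."""
          cols = len(b) + 1
          prev = [j * gap for j in range(cols)]
          for i in range(1, len(a) + 1):
              cur = [i * gap] + [0] * (cols - 1)
              for j in range(1, cols):
                  diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  cur[j] = max(diag, prev[j] + gap, cur[j - 1] + gap)
              prev = cur
          return prev[-1]

      sig_a = bytes_to_protein(bytes.fromhex("e8000000005b81eb"))   # hypothetical signature
      sig_b = bytes_to_protein(bytes.fromhex("e9120000005b81ee"))   # hypothetical signature
      print(align_score(sig_a, sig_b))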

  16. The Effects of Different Representations on Static Structure Analysis of Computer Malware Signatures

    PubMed Central

    Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban

    2013-01-01

    The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analysis were performed on publicly available tools and Weka. PMID:23983644

  17. Using Machine Learning in Adversarial Environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren Leon Davis

    Intrusion/anomaly detection systems are among the first lines of cyber defense. Commonly, they either use signatures or machine learning (ML) to identify threats, but fail to account for sophisticated attackers trying to circumvent them. We propose to embed machine learning within a game theoretic framework that performs adversarial modeling, develops methods for optimizing operational response based on ML, and integrates the resulting optimization codebase into the existing ML infrastructure developed by the Hybrid LDRD. Our approach addresses three key shortcomings of ML in adversarial settings: 1) resulting classifiers are typically deterministic and, therefore, easy to reverse engineer; 2) ML approaches only address the prediction problem, but do not prescribe how one should operationalize predictions, nor account for operational costs and constraints; and 3) ML approaches do not model attackers’ responses and can be circumvented by sophisticated adversaries. The principal novelty of our approach is to construct an optimization framework that blends ML, operational considerations, and a model predicting attackers’ reactions, with the goal of computing an optimal moving target defense. One important challenge is to construct a model of an adversary that is tractable, yet realistic. We aim to advance the science of attacker modeling by considering game-theoretic methods, and by engaging experimental subjects with red-teaming experience in trying to actively circumvent an intrusion detection system, and learning a predictive model of such circumvention activities. In addition, we will generate metrics to test that a particular model of an adversary is consistent with available data.

  18. Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2014-01-01

    Coarse-grained (CG) modeling is a well-acknowledged simulation approach for getting insight into long-time scale protein folding events at reasonable computational cost. Depending on the design of a CG model, the simulation protocols vary from highly case-specific protocols, requiring user-defined assumptions about the folding scenario, to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe the framework protocol for the simulations of long-term dynamics of globular proteins, with the use of the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated using experimental data for protein folding model systems; the prediction results agreed well with the experimental results.

  19. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  20. Ground-water models as a management tool in Florida

    USGS Publications Warehouse

    Hutchinson, C.B.

    1984-01-01

    Highly sophisticated computer models provide powerful tools for analyzing historic data and for simulating future water levels, water movement, and water chemistry under stressed conditions throughout the ground-water system in Florida. Models that simulate the movement of heat and subsidence of land in response to aquifer pumping also have potential for application to hydrologic problems in the State. Florida, with 20 ground-water modeling studies reported since 1972, has applied computer modeling techniques to a variety of water-resources problems. Models in Florida generally have been used to provide insight to problems of water supply, contamination, and impact on the environment. The model applications range from site-specific studies, such as estimating contamination by wastewater injection at St. Petersburg, to a regional model of the entire State that may be used to assess broad-scale environmental impact of water-resources development. Recently, groundwater models have been used as management tools by the State regulatory authority to permit or deny development of water resources. As modeling precision, knowledge, and confidence increase, the use of ground-water models will shift more and more toward regulation of development and enforcement of environmental laws. (USGS)

  1. The origins of computer weather prediction and climate modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Peter

    2008-03-20

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  2. The origins of computer weather prediction and climate modeling

    NASA Astrophysics Data System (ADS)

    Lynch, Peter

    2008-03-01

    Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.

  3. A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.

    PubMed

    Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P

    2014-09-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  4. Application of NASTRAN to TFTR toroidal field coil structures

    NASA Technical Reports Server (NTRS)

    Chen, S. J.; Lee, E.

    1978-01-01

    The primary applied loads on the TF coils were electromagnetic and thermal. The complex structure and the tremendous applied loads necessitated computer-based solutions to the design problems. In the early stage of the TF coil design, many simplified finite element models were developed for the purpose of investigating the effects of material properties, supporting schemes, and coil case material on the stress levels in the case and in the copper coil. In the more sophisticated models that followed the parametric and scoping studies, isoparametric elements such as QUAD4, HEX8, and HEXA were used. The analysis results from using these finite element models and the NASTRAN system were considered accurate enough to provide timely design information.

  5. From naive to sophisticated behavior in multiagents-based financial market models

    NASA Astrophysics Data System (ADS)

    Mansilla, R.

    2000-09-01

    The behavior of the physical complexity and the mutual information function of the outcome of a model of heterogeneous, inductive rational agents inspired by the El Farol Bar problem and the Minority Game is studied. The first quantity is a measure rooted in the Kolmogorov-Chaitin theory and the second is a measure related to Shannon's information entropy. Extensive computer simulations were done, as a result of which an ansatz for the physical complexity of the type C(l) = l^α is proposed, and the dependence of the exponent α on the parameters of the model is established. The accuracy of our results and the relationship with the behavior of the mutual information function as a measure of the time correlation of agents' choices are discussed.
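
    The power-law ansatz C(l) = l^α mentioned above is straightforward to fit once complexity values have been measured. The short Python sketch below estimates the exponent as the slope of log C(l) against log l; the C(l) values are synthetic placeholders, not output of the market model.

      # Illustrative sketch only: estimating the exponent of C(l) = l**alpha
      # by linear regression in log-log space.
      import numpy as np

      l = np.array([2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
      C = np.array([1.8, 3.1, 5.6, 9.8, 17.5, 31.0])   # placeholder complexity values

      alpha, log_prefactor = np.polyfit(np.log(l), np.log(C), 1)
      print(f"estimated exponent alpha = {alpha:.3f}")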

  6. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD code.

  7. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allows anyone with an Internet connection and Chrome browser to take advantage of…

  8. Evaluation of Imagine Learning English, a Computer-Assisted Instruction of Language and Literacy for Kindergarten Students

    ERIC Educational Resources Information Center

    Longberg, Pauline Oliphant

    2012-01-01

    As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…

  9. Microcomputer Based Computer-Assisted Learning System: CASTLE.

    ERIC Educational Resources Information Center

    Garraway, R. W. T.

    The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…

  10. Understanding valence-shell electron-pair repulsion (VSEPR) theory using origami molecular models

    NASA Astrophysics Data System (ADS)

    Endah Saraswati, Teguh; Saputro, Sulistyo; Ramli, Murni; Praseptiangga, Danar; Khasanah, Nurul; Marwati, Sri

    2017-01-01

    Valence-shell electron-pair repulsion (VSEPR) theory is conventionally used to predict molecular geometry. However, it is difficult to explore the full implications of this theory by simply drawing chemical structures. Here, we introduce origami modelling as a more accessible approach for exploration of the VSEPR theory. Our technique is simple, readily accessible and inexpensive compared with other sophisticated methods such as computer simulation or commercial three-dimensional modelling kits. This method can be implemented in chemistry education at both the high school and university levels. We discuss the example of a simple molecular structure prediction for ammonia (NH3). Using the origami model, both molecular shape and the scientific justification can be visualized easily. This ‘hands-on’ approach to building molecules will help promote understanding of VSEPR theory.

  11. Implementation and use of direct-flow connections in a coupled ground-water and surface-water model

    USGS Publications Warehouse

    Swain, Eric D.

    1994-01-01

    The U.S. Geological Survey's MODFLOW finite-difference ground-water flow model has been coupled with three surface-water packages - the MODBRANCH, River, and Stream packages - to simulate surface water and its interaction with ground water. Prior to the development of the coupling packages, the only interaction between these modeling packages was that leakage values could be passed between MODFLOW and the three surface-water packages. To facilitate wider and more flexible uses of the models, a computer program was developed and added to MODFLOW to allow direct flows or stages to be passed between any of the packages and MODFLOW. The flows or stages calculated in one package can be set as boundary discharges or stages to be used in another package. Several modeling packages can be used in the same simulation depending upon the level of sophistication needed in the various reaches being modeled. This computer program is especially useful when any of the River, Stream, or MODBRANCH packages are used to model a river flowing directly into or out of wetlands in direct connection with the aquifer and represented in the model as an aquifer block. A field case study is shown to illustrate an application.

  12. Artificial tektites: an experimental technique for capturing the shapes of spinning drops

    NASA Astrophysics Data System (ADS)

    Baldwin, Kyle A.; Butler, Samuel L.; Hill, Richard J. A.

    2015-01-01

    Determining the shapes of a rotating liquid droplet bound by surface tension is an archetypal problem in the study of the equilibrium shapes of a spinning and charged droplet, a problem that unites models of the stability of the atomic nucleus with the shapes of astronomical-scale, gravitationally-bound masses. The shapes of highly deformed droplets and their stability must be calculated numerically. Although the accuracy of such models has increased with the use of progressively more sophisticated computational techniques and increases in computing power, direct experimental verification is still lacking. Here we present an experimental technique for making wax models of these shapes using diamagnetic levitation. The wax models resemble splash-form tektites, glassy stones formed from molten rock ejected from asteroid impacts. Many tektites have elongated or `dumb-bell' shapes due to their rotation mid-flight before solidification, just as we observe here. Measurements of the dimensions of our wax `artificial tektites' show good agreement with equilibrium shapes calculated by our numerical model, and with previous models. These wax models provide the first direct experimental validation for numerical models of the equilibrium shapes of spinning droplets, of importance to fundamental physics and also to studies of tektite formation.

  13. A comparison of transport algorithms for premixed, laminar steady state flames

    NASA Technical Reports Server (NTRS)

    Coffee, T. P.; Heimerl, J. M.

    1980-01-01

    The effects of different methods of approximating multispecies transport phenomena in models of premixed, laminar, steady state flames were studied. Five approximation methods that span a wide range of computational complexity were developed. Identical data for individual species properties were used for each method. Each approximation method is employed in the numerical solution of a set of five H2-O2-N2 flames. For each flame the computed species and temperature profiles, as well as the computed flame speeds, are found to be very nearly independent of the approximation method used. This does not indicate that transport phenomena are unimportant, but rather that the selection of the input values for the individual species transport properties is more important than the selection of the method used to approximate the multispecies transport. Based on these results, a sixth approximation method was developed that is computationally efficient and provides results extremely close to the most sophisticated and precise method used.
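
    Because the abstract turns on how multispecies transport is approximated, a brief sketch may help. It shows the widely used mixture-averaged (Curtiss-Hirschfelder type) diffusion approximation, offered only as an illustration of this class of schemes; it is not claimed to be any of the five (or the sixth) methods compared in the paper, and all numbers are illustrative.

      # Minimal sketch of a mixture-averaged diffusion approximation.
      import numpy as np

      def mixture_averaged_diffusion(X, Y, D_binary):
          """D_k,mix = (1 - Y_k) / sum_{j != k} X_j / D_jk.
          X, Y: mole and mass fractions (length N); D_binary: N x N symmetric matrix."""
          N = len(X)
          D_mix = np.empty(N)
          for k in range(N):
              denom = sum(X[j] / D_binary[k, j] for j in range(N) if j != k)
              D_mix[k] = (1.0 - Y[k]) / denom
          return D_mix

      # Illustrative values only (cm^2/s), not real H2-O2-N2 data.
      X = np.array([0.2, 0.1, 0.7])
      Y = np.array([0.02, 0.15, 0.83])
      D = np.array([[0.0, 2.0, 1.8],
                    [2.0, 0.0, 0.21],
                    [1.8, 0.21, 0.0]])
      print(mixture_averaged_diffusion(X, Y, D))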

  14. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  15. Characteristics and Levels of Sophistication: An Analysis of Chemistry Students' Ability to Think with Mental Models

    ERIC Educational Resources Information Center

    Wang, Chia-Yu; Barrow, Lloyd H.

    2011-01-01

    This study employed a case-study approach to reveal how an ability to think with mental models contributes to differences in students' understanding of molecular geometry and polarity. We were interested in characterizing features and levels of sophistication regarding first-year university chemistry learners' mental modeling behaviors while the…

  16. Using adaptive grid in modeling rocket nozzle flow

    NASA Technical Reports Server (NTRS)

    Chow, Alan S.; Jin, Kang-Ren

    1992-01-01

    The mechanical behavior of a rocket motor internal flow field results in a system of nonlinear partial differential equations which cannot be solved analytically. However, this system of equations, called the Navier-Stokes equations, can be solved numerically. The accuracy and the convergence of the solution of the system of equations will depend largely on how precisely the sharp gradients in the domain of interest can be resolved. With the advances in computer technology, more sophisticated algorithms are available to improve the accuracy and convergence of the solutions. Adaptive grid generation is one of the schemes which can be incorporated into the algorithm to enhance the capability of numerical modeling. It is equivalent to putting intelligence into the algorithm to optimize the use of computer memory. With this scheme, the finite difference domain of the flow field, called the grid, need be neither very fine nor strategically placed at the locations of sharp gradients. The grid is self-adapting as the solution evolves. This scheme significantly improves the methodology of solving flow problems in rocket nozzles by taking the refinement part of grid generation out of the hands of computational fluid dynamics (CFD) specialists and placing it in the computer algorithm itself.
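
    The adaptive-grid idea can be illustrated in one dimension. The sketch below (not the paper's algorithm) redistributes grid points by equidistributing an arc-length monitor function, so points cluster where the solution has sharp gradients.

      # Minimal 1-D adaptive grid sketch via equidistribution of a monitor function.
      import numpy as np

      def adapt_grid(x, u, n_new):
          w = np.sqrt(1.0 + np.gradient(u, x) ** 2)       # arc-length monitor function
          s = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
          s /= s[-1]                                      # normalized "arc length"
          targets = np.linspace(0.0, 1.0, n_new)
          return np.interp(targets, s, x)                 # invert s(x) to get x(s)

      x = np.linspace(0.0, 1.0, 201)
      u = np.tanh(50.0 * (x - 0.5))                       # solution with a sharp internal layer
      x_new = adapt_grid(x, u, 41)                        # points concentrate near x = 0.5
      print(x_new)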

  17. Computational Modeling and Neuroimaging Techniques for Targeting during Deep Brain Stimulation

    PubMed Central

    Sweet, Jennifer A.; Pace, Jonathan; Girgis, Fady; Miller, Jonathan P.

    2016-01-01

    Accurate surgical localization of the varied targets for deep brain stimulation (DBS) is a process undergoing constant evolution, with increasingly sophisticated techniques to allow for highly precise targeting. However, despite the fastidious placement of electrodes into specific structures within the brain, there is increasing evidence to suggest that the clinical effects of DBS are likely due to the activation of widespread neuronal networks directly and indirectly influenced by the stimulation of a given target. Selective activation of these complex and inter-connected pathways may further improve the outcomes of currently treated diseases by targeting specific fiber tracts responsible for a particular symptom in a patient-specific manner. Moreover, the delivery of such focused stimulation may aid in the discovery of new targets for electrical stimulation to treat additional neurological, psychiatric, and even cognitive disorders. As such, advancements in surgical targeting, computational modeling, engineering designs, and neuroimaging techniques play a critical role in this process. This article reviews the progress of these applications, discussing the importance of target localization for DBS, and the role of computational modeling and novel neuroimaging in improving our understanding of the pathophysiology of diseases, and thus paving the way for improved selective target localization using DBS. PMID:27445709

  18. A Novel Multilayer Correlation Maximization Model for Improving CCA-Based Frequency Recognition in SSVEP Brain-Computer Interface.

    PubMed

    Jiao, Yong; Zhang, Yu; Wang, Yu; Wang, Bei; Jin, Jing; Wang, Xingyu

    2018-05-01

    Multiset canonical correlation analysis (MsetCCA) has been successfully applied to optimize the reference signals by extracting common features from multiple sets of electroencephalogram (EEG) data for steady-state visual evoked potential (SSVEP) recognition in brain-computer interface applications. To avoid extracting possible noise components as common features, this study proposes a sophisticated extension of MsetCCA, called the multilayer correlation maximization (MCM) model, for further improving SSVEP recognition accuracy. MCM combines the advantages of both CCA and MsetCCA by carrying out three layers of correlation maximization. The first layer extracts the stimulus frequency-related information using CCA between EEG samples and sine-cosine reference signals. The second layer learns the reference signals by extracting common features with MsetCCA. The third layer re-optimizes the reference signal set using CCA with the sine-cosine reference signals again. An experimental study was carried out to validate the effectiveness of the proposed MCM model in comparison with the standard CCA and MsetCCA algorithms. The superior performance of MCM demonstrates its promising potential for the development of an improved SSVEP-based brain-computer interface.
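
    For readers unfamiliar with CCA-based SSVEP recognition, the sketch below implements only the first (standard CCA) layer described above: it correlates multichannel EEG with sine-cosine reference signals at each candidate frequency and picks the best match. It is a minimal illustration using scikit-learn, not the authors' MCM code.

      # Minimal standard-CCA frequency recognition sketch (first layer only).
      import numpy as np
      from sklearn.cross_decomposition import CCA

      def cca_corr(eeg, ref):
          """Largest canonical correlation between EEG (samples x channels)
          and reference signals (samples x 2*harmonics)."""
          cca = CCA(n_components=1)
          u, v = cca.fit_transform(eeg, ref)
          return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

      def recognize(eeg, fs, candidate_freqs, n_harmonics=2):
          t = np.arange(eeg.shape[0]) / fs
          scores = []
          for f in candidate_freqs:
              ref = np.column_stack(
                  [np.sin(2 * np.pi * h * f * t) for h in range(1, n_harmonics + 1)]
                  + [np.cos(2 * np.pi * h * f * t) for h in range(1, n_harmonics + 1)])
              scores.append(cca_corr(eeg, ref))
          return candidate_freqs[int(np.argmax(scores))]

      # Synthetic 8-channel EEG dominated by a 10 Hz SSVEP, illustration only.
      fs = 250
      t = np.arange(0, 2, 1 / fs)
      eeg = np.column_stack([np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
                             for _ in range(8)])
      print(recognize(eeg, fs, [8.0, 10.0, 12.0]))   # expected: 10.0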

  19. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  20. SoftWAXS: a computational tool for modeling wide-angle X-ray solution scattering from biomolecules.

    PubMed

    Bardhan, Jaydeep; Park, Sanghyun; Makowski, Lee

    2009-10-01

    This paper describes a computational approach to estimating wide-angle X-ray solution scattering (WAXS) from proteins, which has been implemented in a computer program called SoftWAXS. The accuracy and efficiency of SoftWAXS are analyzed for analytically solvable model problems as well as for proteins. Key features of the approach include a numerical procedure for performing the required spherical averaging and explicit representation of the solute-solvent boundary and the surface of the hydration layer. These features allow the Fourier transform of the excluded volume and hydration layer to be computed directly and with high accuracy. This approach will allow future investigation of different treatments of the electron density in the hydration shell. Numerical results illustrate the differences between this approach to modeling the excluded volume and a widely used model that treats the excluded-volume function as a sum of Gaussians representing the individual atomic excluded volumes. Comparison of the results obtained here with those from explicit-solvent molecular dynamics clarifies shortcomings inherent to the representation of solvent as a time-averaged electron-density profile. In addition, an assessment is made of how the calculated scattering patterns depend on input parameters such as the solute-atom radii, the width of the hydration shell and the hydration-layer contrast. These results suggest that obtaining predictive calculations of high-resolution WAXS patterns may require sophisticated treatments of solvent.
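
    For contrast with the numerical spherical-averaging procedure described above, the sketch below evaluates the classical Debye formula, which gives the orientationally averaged intensity for a set of point scatterers. It is only a textbook illustration and does not include SoftWAXS's excluded-volume or hydration-layer treatment; all inputs are toy values.

      # Minimal Debye-formula sketch: I(q) = sum_ij f_i f_j sin(q r_ij) / (q r_ij).
      import numpy as np

      def debye_intensity(q, coords, f):
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          qr = q * d
          # sinc-like kernel with the correct limit of 1 at r_ij = 0
          kernel = np.where(qr > 0, np.sin(qr) / np.where(qr > 0, qr, 1.0), 1.0)
          return float(f @ kernel @ f)

      coords = np.random.rand(50, 3) * 20.0      # toy "atom" positions (angstroms)
      f = np.full(50, 6.0)                       # toy constant form factors
      q_values = np.linspace(0.05, 1.5, 30)
      intensity = [debye_intensity(q, coords, f) for q in q_values]
      print(intensity[:3])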

  1. Parallel computer processing and modeling: applications for the ICU

    NASA Astrophysics Data System (ADS)

    Baxter, Grant; Pranger, L. Alex; Draghic, Nicole; Sims, Nathaniel M.; Wiesmann, William P.

    2003-07-01

    Current patient monitoring procedures in hospital intensive care units (ICUs) generate vast quantities of medical data, much of which is considered extemporaneous and not evaluated. Although sophisticated monitors to analyze individual types of patient data are routinely used in the hospital setting, this equipment lacks high order signal analysis tools for detecting long-term trends and correlations between different signals within a patient data set. Without the ability to continuously analyze disjoint sets of patient data, it is difficult to detect slow-forming complications. As a result, the early onset of conditions such as pneumonia or sepsis may not be apparent until the advanced stages. We report here on the development of a distributed software architecture test bed and software medical models to analyze both asynchronous and continuous patient data in real time. Hardware and software has been developed to support a multi-node distributed computer cluster capable of amassing data from multiple patient monitors and projecting near and long-term outcomes based upon the application of physiologic models to the incoming patient data stream. One computer acts as a central coordinating node; additional computers accommodate processing needs. A simple, non-clinical model for sepsis detection was implemented on the system for demonstration purposes. This work shows exceptional promise as a highly effective means to rapidly predict and thereby mitigate the effect of nosocomial infections.

  2. WE-D-303-00: Computational Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, John; Brigham and Women’s Hospital and Dana-Farber Cancer Institute, Boston, MA

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: (1) Understand the need and requirements of computational phantoms in medical physics research; (2) Discuss the developments and applications of computational phantoms; (3) Know the promises and limitations of computational phantoms in solving complex problems.

  3. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data.

    PubMed

    Tom, Jennifer A; Sinsheimer, Janet S; Suchard, Marc A

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework.
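
    The reweighting idea can be illustrated with a minimal sketch (not the authors' dynamic iterative reweighting MCMC): stored posterior draws from a stratified analysis are recycled toward a joint target by self-normalized importance weights. Here log_p_joint and log_p_strat are assumed, user-supplied log densities evaluated at each stored draw, and the toy example uses Gaussian targets only to show the mechanics.

      # Minimal importance-reweighting/resampling sketch.
      import numpy as np

      def reweight(draws, log_p_joint, log_p_strat):
          """Resample stored draws with weights proportional to p_joint / p_strat."""
          log_w = log_p_joint - log_p_strat
          log_w -= log_w.max()                      # stabilize before exponentiating
          w = np.exp(log_w)
          w /= w.sum()
          idx = np.random.choice(len(draws), size=len(draws), replace=True, p=w)
          return draws[idx], w

      # Toy example: draws from N(0, 1) reweighted toward a N(1, 1) "joint" target.
      draws = np.random.randn(5000)
      lp_joint = -0.5 * (draws - 1.0) ** 2
      lp_strat = -0.5 * draws ** 2
      new_draws, w = reweight(draws, lp_joint, lp_strat)
      print(new_draws.mean())                       # approximately 1.0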

  4. Reuse, Recycle, Reweigh: Combating Influenza through Efficient Sequential Bayesian Computation for Massive Data

    PubMed Central

    Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.

    2015-01-01

    Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992

  5. Application of foam-extend on turbulent fluid-structure interaction

    NASA Astrophysics Data System (ADS)

    Rege, K.; Hjertager, B. H.

    2017-12-01

    Turbulent flow around flexible structures is likely to induce structural vibrations which may eventually lead to fatigue failure. In order to assess the fatigue life of these structures, it is necessary to take into account not only the action of the flow on the structure but also the influence of the vibrating structure on the fluid flow. This is achieved by performing fluid-structure interaction (FSI) simulations. In this work, we have investigated the capability of an FSI toolkit for the finite volume computational fluid dynamics software foam-extend to simulate turbulence-induced vibrations of a flexible structure. A large-eddy simulation (LES) turbulence model has been applied to a basic FSI problem of a flexible wall placed in a confined, turbulent flow. This problem was simulated for 2.32 seconds; the short simulation required over 200 computation hours using 20 processor cores. This shows that simulating FSI with LES is possible, but also computationally demanding. In order to make turbulent FSI simulations with foam-extend more applicable, more sophisticated turbulence models and/or faster FSI iteration schemes should be applied.

  6. Past, present and prospect of an Artificial Intelligence (AI) based model for sediment transport prediction

    NASA Astrophysics Data System (ADS)

    Afan, Haitham Abdulmohsin; El-shafie, Ahmed; Mohtar, Wan Hanna Melini Wan; Yaseen, Zaher Mundher

    2016-10-01

    An accurate model for sediment prediction is a priority for all hydrological researchers. Many conventional methods have shown an inability to achieve an accurate prediction of suspended sediment. These methods are unable to capture the behaviour of sediment transport in rivers due to the complexity, noise, non-stationarity, and dynamism of the sediment pattern. In the past two decades, Artificial Intelligence (AI) and computational approaches have become remarkable tools for developing accurate models. These approaches are considered a powerful means of solving any non-linear model, as they can deal easily with large amounts of data and sophisticated models. This paper is a review of all AI approaches that have been applied in sediment modelling. The current research focuses on the development of AI applications in sediment transport. In addition, the review identifies major challenges and opportunities for prospective research. Throughout the reviewed literature, complementary models have proved superior to classical modelling approaches.

  7. Adding a solar-radiance function to the Hošek-Wilkie skylight model.

    PubMed

    Hošek, Lukáš; Wilkie, Alexander

    2013-01-01

    One prerequisite for realistic renderings of outdoor scenes is the proper capturing of the sky's appearance. Currently, an explicit simulation of light scattering in the atmosphere isn't computationally feasible, and won't be in the foreseeable future. Captured luminance patterns have proven their usefulness in practice but can't meet all user needs. To fill this capability gap, computer graphics technology has employed analytical models of sky-dome luminance patterns for more than two decades. For technical reasons, such models deal with only the sky dome's appearance, though, and exclude the solar disc. The widely used model proposed by Arcot Preetham and colleagues employed a separately derived analytical formula for adding a solar emitter of suitable radiant intensity. Although this yields reasonable results, the formula is derived in a manner that doesn't exactly match the conditions in their sky-dome model. But the more sophisticated a skylight model is and the more subtly it can represent different conditions, the more the solar radiance should exactly match the skylight's conditions. Toward that end, researchers propose a solar-radiance function that exactly matches a recently published high-quality analytical skylight model.

  8. Processing power limits social group size: computational evidence for the cognitive costs of sociality

    PubMed Central

    Dávid-Barrett, T.; Dunbar, R. I. M.

    2013-01-01

    Sociality is primarily a coordination problem. However, the social (or communication) complexity hypothesis suggests that the kinds of information that can be acquired and processed may limit the size and/or complexity of social groups that a species can maintain. We use an agent-based model to test the hypothesis that the complexity of information processed influences the computational demands involved. We show that successive increases in the kinds of information processed allow organisms to break through the glass ceilings that otherwise limit the size of social groups: larger groups can only be achieved at the cost of more sophisticated kinds of information processing that are disadvantageous when optimal group size is small. These results simultaneously support both the social brain and the social complexity hypotheses. PMID:23804623

  9. An analytical study of electric vehicle handling dynamics

    NASA Technical Reports Server (NTRS)

    Greene, J. E.; Segal, D. J.

    1979-01-01

    Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EV's were defined for various battery and powertrain packaging approaches applied to a range of weight distribution and inertial properties which characterize a generic class of EV's. Computer simulations of structured maneuvers were performed for predicting handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.
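
    As a hedged illustration of the "elementary linearized models" mentioned above, the sketch below evaluates the classical two-degree-of-freedom bicycle model's understeer gradient and steady-state yaw-rate gain. The parameter values are invented and the calculation is not taken from the report; it only shows why a forward weight bias tends toward understeer.

      # Minimal linear bicycle-model sketch with illustrative parameters.
      import numpy as np

      g = 9.81
      m, L = 1400.0, 2.5                  # vehicle mass (kg), wheelbase (m)
      a, b = 1.0, 1.5                     # CG-to-front / CG-to-rear axle distances (m)
      C_f, C_r = 80e3, 80e3               # front / rear axle cornering stiffness (N/rad)

      W_f, W_r = m * g * b / L, m * g * a / L        # static axle loads (N)
      K_us = W_f / C_f - W_r / C_r                   # understeer gradient (rad per g)
      print("understeer" if K_us > 0 else "oversteer", K_us)

      # Steady-state yaw-rate gain r/delta = V / (L + K_us * V**2 / g)
      V = np.linspace(5, 40, 8)                      # speed (m/s)
      print(V / (L + K_us * V**2 / g))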

  10. An Application of Latent Variable Structural Equation Modeling for Experimental Research in Educational Technology

    ERIC Educational Resources Information Center

    Lee, Hyeon Woo

    2011-01-01

    As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…

  11. 75 FR 63067 - Interpretation of “Children's Product”

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... a level of sophistication required to operate the locomotives. Additionally, the commenters note... railroad hobbyists, the costs involved, and the level of sophistication required to operate them. Model...

  12. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline aware language and an implementation architecture using open source standards.

  13. A Primer on High-Throughput Computing for Genomic Selection

    PubMed Central

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massively parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized genetic gain). Eventually, HTC may change our view of data analysis as well as decision-making in the post-genomic era of selection programs in animals and plants, or in the study of complex diseases in humans. PMID:22303303
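
    A minimal sketch of the batch-processing/pipelining idea described above, using only the Python standard library rather than a cluster scheduler such as HTCondor; run_trait is a hypothetical stand-in for one single-trait genomic evaluation job.

      # Minimal sketch: evaluate several traits concurrently across worker processes.
      from concurrent.futures import ProcessPoolExecutor

      def run_trait(trait):
          # Placeholder for a real single-trait genomic evaluation
          # (for example, a subprocess call to an external solver).
          return f"finished evaluation for trait {trait}"

      if __name__ == "__main__":
          traits = ["milk_yield", "fat_pct", "protein_pct", "fertility"]
          with ProcessPoolExecutor(max_workers=4) as pool:
              for result in pool.map(run_trait, traits):
                  print(result)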

  14. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  15. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  16. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  17. An intelligent multi-media human-computer dialogue system

    NASA Technical Reports Server (NTRS)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  18. Unit cell-based computer-aided manufacturing system for tissue engineering.

    PubMed

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.
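
    The unit cell-based design idea can be sketched in a few lines (this is not the authors' CAM system): a single voxelized unit cell containing one pore is tiled into a scaffold volume, from which a porosity estimate follows directly. Cell size and pore radius are arbitrary placeholders.

      # Minimal voxel sketch of a unit cell-based porous scaffold.
      import numpy as np

      def unit_cell(n=20, pore_radius=0.35):
          """Boolean voxel cube (True = material) with a spherical pore in the centre."""
          x = np.linspace(-0.5, 0.5, n)
          X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
          return (X**2 + Y**2 + Z**2) > pore_radius**2

      def scaffold(cells=(5, 5, 3), n=20):
          return np.tile(unit_cell(n), cells)     # repeat the unit cell along each axis

      volume = scaffold()
      porosity = 1.0 - volume.mean()
      print(volume.shape, f"porosity = {porosity:.2f}")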

  19. Two Anatomically and Computationally Distinct Learning Signals Predict Changes to Stimulus-Outcome Associations in Hippocampus.

    PubMed

    Boorman, Erie D; Rajendran, Vani G; O'Reilly, Jill X; Behrens, Tim E

    2016-03-16

    Complex cognitive processes require sophisticated local processing but also interactions between distant brain regions. It is therefore critical to be able to study distant interactions between local computations and the neural representations they act on. Here we report two anatomically and computationally distinct learning signals in lateral orbitofrontal cortex (lOFC) and the dopaminergic ventral midbrain (VM) that predict trial-by-trial changes to a basic internal model in hippocampus. To measure local computations during learning and their interaction with neural representations, we coupled computational fMRI with trial-by-trial fMRI suppression. We find that suppression in a medial temporal lobe network changes trial-by-trial in proportion to stimulus-outcome associations. During interleaved choice trials, we identify learning signals that relate to outcome type in lOFC and to reward value in VM. These intervening choice feedback signals predicted the subsequent change to hippocampal suppression, suggesting a convergence of signals that update the flexible representation of stimulus-outcome associations. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. A Study of Upgraded Phenolic Curing for RSRM Nozzle Rings

    NASA Technical Reports Server (NTRS)

    Smartt, Ziba

    2000-01-01

    A thermochemical cure model for predicting temperature and degree of cure profiles in curing phenolic parts was developed, validated and refined over several years. The model supports optimization of cure cycles and allows input of properties based upon the types of material and the process by which these materials are used to make nozzle components. The model has been refined to use sophisticated computer graphics to demonstrate the changes in temperature and degree of cure during the curing process. The effort discussed in the paper will be the conversion from an outdated solid modeling input program and SINDA analysis code to an integrated solid modeling and analysis package (I-DEAS solid model and TMG). Also discussed will be the incorporation of updated material properties obtained during full scale curing tests into the cure models and the results for all the Reusable Solid Rocket Motor (RSRM) nozzle rings.

  1. Improved Regional Seismic Event Locations Using 3-D Velocity Models

    DTIC Science & Technology

    1999-12-15

    regional velocity model to estimate event hypocenters. Travel times for the regional phases are calculated using a sophisticated eikonal finite...can greatly improve estimates of event locations. Our algorithm calculates travel times using a finite difference approximation of the eikonal ...such as IASP91 or J-B. 3-D velocity models require more sophisticated travel time modeling routines; thus, we use a 3-D eikonal equation solver

  2. Nursing informatics: the trend of the future.

    PubMed

    Nagelkerk, J; Ritola, P M; Vandort, P J

    1998-01-01

    Nursing informatics is a combination of computer, information, and nursing sciences. This new and expanding field addresses the efficient and effective use of information for nurses. Preparing nurses for computerization is essential to confront an explosion of sophisticated computerized technology in the workplace. It is critical in a competitive health care market for preparing nurses to use the most cost-effective methods. A model is presented that identifies six essential factors for preparing nurses for computerization. Strong leadership, effective communication, organized training sessions, established time frames, planned change, and tailored software are the essential factors to consider for development of a successful educational program.

  3. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  4. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    PubMed

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops, which makes the program respond flexibly to user input. The programming is described with the aid of a 2-compartment PK simulation. Examples of more sophisticated simulation programs are also given, in which the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who program only occasionally, this offers the possibility of building the computer simulation, together with the flexible interactive simulation algorithm, for clinical pharmacological teaching in the field of PK/PD models.
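
    A minimal sketch of the kind of model being taught, written in Python with SciPy rather than in the LabVIEW environment used by the authors: a two-compartment pharmacokinetic model after an intravenous bolus, with purely illustrative parameter values.

      # Minimal two-compartment PK sketch (IV bolus, first-order elimination).
      import numpy as np
      from scipy.integrate import solve_ivp

      k10, k12, k21 = 0.15, 0.10, 0.05      # 1/h: elimination and inter-compartment rates
      V1 = 10.0                             # L: central volume of distribution

      def rhs(t, y):
          a1, a2 = y                        # drug amounts in central and peripheral compartments
          return [-(k10 + k12) * a1 + k21 * a2,
                  k12 * a1 - k21 * a2]

      sol = solve_ivp(rhs, (0.0, 24.0), [100.0, 0.0], t_eval=np.linspace(0.0, 24.0, 97))
      concentration = sol.y[0] / V1         # central-compartment concentration (mg/L)
      print(concentration[:5])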

  5. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  6. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  7. Evolution of a minimal parallel programming model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
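
    The self-scheduled master/worker pattern described above can be sketched in plain Python (this is not the ADLB library or its API): a master fills a shared queue and idle workers pull tasks as they become free, so the load balances itself dynamically.

      # Minimal self-scheduled task-pool sketch with multiprocessing queues.
      import multiprocessing as mp

      def worker(task_q, result_q):
          while True:
              task = task_q.get()
              if task is None:              # sentinel: no more work
                  break
              result_q.put(task * task)     # stand-in for a real computation

      if __name__ == "__main__":
          task_q, result_q = mp.Queue(), mp.Queue()
          procs = [mp.Process(target=worker, args=(task_q, result_q)) for _ in range(4)]
          for p in procs:
              p.start()
          for t in range(100):
              task_q.put(t)                 # master enqueues work units
          for _ in procs:
              task_q.put(None)              # one sentinel per worker
          results = [result_q.get() for _ in range(100)]
          for p in procs:
              p.join()
          print(sum(results))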

  8. Simulation tools for particle-based reaction-diffusion dynamics in continuous space

    PubMed Central

    2014-01-01

    Particle-based reaction-diffusion algorithms facilitate the modeling of the diffusional motion of individual molecules and the reactions between them in cellular environments. A physically realistic model, depending on the system at hand and the questions asked, would require different levels of modeling detail such as particle diffusion, geometrical confinement, particle volume exclusion or particle-particle interaction potentials. Higher levels of detail usually correspond to an increased number of parameters and higher computational cost. Certain systems, however, require these investments to be modeled adequately. Here we present a review of the current field of particle-based reaction-diffusion software packages operating on continuous space. Four nested levels of modeling detail are identified that capture increasing amounts of detail. Their applicability to different biological questions is discussed, ranging from straight diffusion simulations to sophisticated and expensive models that bridge towards coarse-grained molecular dynamics. PMID:25737778
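
    At the lowest level of detail mentioned above (free diffusion plus a simple bimolecular reaction rule), a particle-based reaction-diffusion step can be sketched as follows. This is a generic illustration in the spirit of Smoluchowski/Doi-type schemes, not code from any of the reviewed packages; the pairing rule is deliberately crude and all parameters are arbitrary.

      # Minimal Brownian dynamics sketch with an A + B -> C reaction radius.
      import numpy as np

      D, dt, R = 1.0, 1e-3, 0.05              # diffusion coefficient, time step, reaction radius
      rng = np.random.default_rng(0)
      A = rng.random((200, 3))
      B = rng.random((200, 3))
      n_C = 0

      for step in range(1000):
          A += rng.normal(scale=np.sqrt(2 * D * dt), size=A.shape)   # Brownian displacements
          B += rng.normal(scale=np.sqrt(2 * D * dt), size=B.shape)
          dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
          ia, ib = np.where(dists < R)
          reacted_a, reacted_b = np.unique(ia), np.unique(ib)
          n_react = min(len(reacted_a), len(reacted_b))               # crude pairing, illustration only
          if n_react:
              A = np.delete(A, reacted_a[:n_react], axis=0)
              B = np.delete(B, reacted_b[:n_react], axis=0)
              n_C += n_react

      print(n_C, "C particles formed")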

  9. A Physiological Signal Transmission Model to be Used for Specific Diagnosis of Cochlear Impairments

    NASA Astrophysics Data System (ADS)

    Saremi, Amin; Stenfelt, Stefan

    2011-11-01

    Many of the sophisticated characteristics of the human auditory system are attributed to the cochlea, and most patients with a hearing loss suffer from impairments that originate in the cochlea (sensorineural loss). Despite this, today's clinical diagnostic methods do not probe the specific origins of such cochlear lesions. The aim of this research is to introduce a physiological signal transmission model to be used clinically as a tool for the diagnosis of cochlear losses. This model enables simulation of the different bio-mechano-electrical processes that occur in the auditory organ of Corti inside the cochlea. What makes this model different from many available computational models is its fidelity to physiology, since the ultimate goal is to model each individual physiological phenomenon. This includes passive basilar membrane (BM) vibration, outer hair cell processes such as nonlinear mechanoelectrical transduction (MET) and active amplification by the somatic motor, as well as vibration-to-neural conversion at the inner hair cells.

  10. Evolution of a minimal parallel programming model

    DOE PAGES

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-04-30

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.

  11. That Elusive, Eclectic Thing Called Thermal Environment: What a Board Should Know About It

    ERIC Educational Resources Information Center

    Schutte, Frederick

    1970-01-01

    Discussion of proper thermal environment for protection of sophisticated educational equipment such as computer and data-processing machines, magnetic tapes, closed-circuit television and video tape communications systems.

  12. School Architecture: New Activities Dictate New Designs.

    ERIC Educational Resources Information Center

    Hill, Robert

    1984-01-01

    Changing educational requirements have led to many school building design developments in recent years, including technologically sophisticated music and computer rooms, large school kitchens, and Title IX-mandated equal facilities available for both sexes. (MLF)

  13. Fast automated analysis of strong gravitational lenses with convolutional neural networks.

    PubMed

    Hezaveh, Yashar D; Levasseur, Laurence Perreault; Marshall, Philip J

    2017-08-30

    Quantifying image distortions caused by strong gravitational lensing (the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures) and estimating the corresponding matter distribution of these structures (the 'gravitational lens') has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the 'singular isothermal ellipsoid' density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
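
    As an illustration of the general approach (not the authors' architecture), the sketch below defines a small convolutional regression network in Keras that maps a lens image to five singular-isothermal-ellipsoid parameters. The layer sizes, image size and output count are placeholders, and training data would have to be simulated separately.

      # Minimal CNN regression sketch for lens-parameter estimation (illustrative only).
      import tensorflow as tf

      model = tf.keras.Sequential([
          tf.keras.Input(shape=(96, 96, 1)),              # single-band lens image
          tf.keras.layers.Conv2D(32, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(128, activation="relu"),
          tf.keras.layers.Dense(5),                       # e.g. Einstein radius, ellipticity, centre
      ])
      model.compile(optimizer="adam", loss="mse")
      model.summary()
      # model.fit(simulated_images, true_parameters, ...)  # training data from lens simulations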

  14. Allen Newell's Program of Research: The Video-Game Test.

    PubMed

    Gobet, Fernand

    2017-04-01

    Newell (1973) argued that progress in psychology was slow because research focused on experiments trying to answer binary questions, such as serial versus parallel processing. In addition, not enough attention was paid to the strategies used by participants, and there was a lack of theories implemented as computer models offering sufficient precision for being tested rigorously. He proposed a three-headed research program: to develop computational models able to carry out the task they aimed to explain; to study one complex task in detail, such as chess; and to build computational models that can account for multiple tasks. This article assesses the extent to which the papers in this issue advance Newell's program. While half of the papers devote much attention to strategies, several papers still average across them, a capital sin according to Newell. The three courses of action he proposed were not popular in these papers: Only two papers used computational models, with no model being both able to carry out the task and to account for human data; there was no systematic analysis of a specific video game; and no paper proposed a computational model accounting for human data in several tasks. It is concluded that, while they use sophisticated methods of analysis and discuss interesting results, overall these papers contribute only little to Newell's program of research. In this respect, they reflect the current state of psychology and cognitive science. This is a shame, as Newell's ideas might help address the current crisis of lack of replication and fraud in psychology. Copyright © 2017 The Author. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  15. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  16. Integrating the iPod Touch in K-12 Education: Visions and Vices

    ERIC Educational Resources Information Center

    Banister, Savilla

    2010-01-01

    Advocates of ubiquitous computing have long been documenting classroom benefits of one-to-one ratios of students to handheld or laptop computers. The recent sophisticated capabilities of the iPod Touch, iPhone, and iPad have encouraged further speculation on exactly how K-12 teaching and learning might be energized by such devices. This paper…

  17. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because computer hardware has become so powerful that ever more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward further increases in the sophistication of computer applications and, consequently, of software. To obtain better visibility into software systems and to improve their structure for better testing, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail, without losing precision of expression. This is done through the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automating updates of significant portions of the documentation for better software change control.

  18. Web Service Model for Plasma Simulations with Automatic Post Processing and Generation of Visual Diagnostics*

    NASA Astrophysics Data System (ADS)

    Exby, J.; Busby, R.; Dimitrov, D. A.; Bruhwiler, D.; Cary, J. R.

    2003-10-01

    We present our design and initial implementation of a web service model for running particle-in-cell (PIC) codes remotely from a web browser interface. PIC codes have grown significantly in complexity and now often require parallel execution on multiprocessor computers, which in turn requires sophisticated post-processing and data analysis. A significant amount of time and effort is required for a physicist to develop all the necessary skills, at the expense of actually doing research. Moreover, parameter studies with a computationally intensive code justify the systematic management of results with an efficient way to communicate them among a group of remotely located collaborators. Our initial implementation uses the OOPIC Pro code [1], Linux, Apache, MySQL, Python, and PHP. The Interactive Data Language is used for visualization. [1] D.L. Bruhwiler et al., Phys. Rev. ST-AB 4, 101302 (2001). * This work is supported by DOE grant # DE-FG02-03ER83857 and by Tech-X Corp. ** Also University of Colorado.

  19. 4D Origami by Smart Embroidery.

    PubMed

    Stoychev, Georgi; Razavi, Mir Jalil; Wang, Xianqiao; Ionov, Leonid

    2017-09-01

    There exist many methods for processing materials: extrusion, injection molding, fiber spinning, and 3D printing, to name a few. In most cases, materials with a static, fixed shape are produced. However, numerous advanced applications require customized elements with reconfigurable shape. The few available techniques capable of overcoming this problem are expensive and/or time-consuming. Here, the use of one of the most ancient structuring technologies, embroidery, is proposed to generate sophisticated patterns of active materials and, in this way, to achieve complex actuation. By combining experiments and computational modeling, the fundamental rules that can predict the folding behavior of sheets with a variety of stitch patterns are elucidated. It is demonstrated that theoretical mechanics analysis is only suitable for predicting the behavior of the simplest experimental setups, whereas computer modeling gives better predictions for more complex cases. Finally, the applicability of the rules is shown by designing basic origami structures and wrinkling substrates with controlled thermal insulation properties. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Design applications for supercomputers

    NASA Technical Reports Server (NTRS)

    Studerus, C. J.

    1987-01-01

    The complexity of codes for solutions of real aerodynamic problems has progressed from simple two-dimensional models to three-dimensional inviscid and viscous models. As the algorithms used in the codes increased in accuracy, speed and robustness, the codes were steadily incorporated into standard design processes. The highly sophisticated codes, which provide solutions to the truly complex flows, require computers with large memory and high computational speed. The advent of high-speed supercomputers, which makes solving these complex flows more practical, permits the introduction of the codes into the design system at an earlier stage. Results are presented for several codes that either have already been introduced into the design process or are rapidly becoming part of it. The codes fall into the areas of turbomachinery aerodynamics and hypersonic propulsion. In the former category, results are presented for three-dimensional inviscid and viscous flows through nozzle and unducted fan bladerows. In the latter category, results are presented for two-dimensional inviscid and viscous flows for hypersonic vehicle forebodies and engine inlets.

  1. Data communication requirements for the advanced NAS network

    NASA Technical Reports Server (NTRS)

    Levin, Eugene; Eaton, C. K.; Young, Bruce

    1986-01-01

    The goal of the Numerical Aerodynamic Simulation (NAS) Program is to provide a powerful computational environment for advanced research and development in aeronautics and related disciplines. The present NAS system consists of a Cray 2 supercomputer connected by a data network to a large mass storage system, to sophisticated local graphics workstations, and by remote communications to researchers throughout the United States. The program plan is to continue acquiring the most powerful supercomputers as they become available. In the 1987/1988 time period it is anticipated that a computer with 4 times the processing speed of a Cray 2 will be obtained and by 1990 an additional supercomputer with 16 times the speed of the Cray 2. The implications of this 20-fold increase in processing power on the data communications requirements are described. The analysis was based on models of the projected workload and system architecture. The results are presented together with the estimates of their sensitivity to assumptions inherent in the models.

  2. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections.

    PubMed

    Stacul, Stefano; Squeglia, Nunziante

    2018-02-15

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.
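    For readers unfamiliar with the hyperbolic modulus reduction curve mentioned above, the following is a minimal sketch; the small-strain modulus and reference strain are illustrative placeholders, not values from the paper.

```python
# Hyperbolic modulus reduction: the secant shear modulus decays with shear strain as
#   G(gamma) = G0 / (1 + gamma / gamma_ref)
# G0 is the small-strain modulus and gamma_ref a reference strain (both assumed here).

def secant_shear_modulus(gamma, g0=50e6, gamma_ref=1e-3):
    """Secant shear modulus [Pa] at shear strain `gamma` (dimensionless)."""
    return g0 / (1.0 + gamma / gamma_ref)

for gamma in (1e-5, 1e-4, 1e-3, 1e-2):
    print(f"gamma={gamma:.0e}  G/G0={secant_shear_modulus(gamma) / 50e6:.3f}")
```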

  3. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
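    The graph-based dependency models mentioned above can be illustrated with a toy fault-to-test dependency matrix: a passing test exonerates the faults it covers, and the remaining candidates are the intersection of the fault sets implicated by the failed tests. All component and test names below are hypothetical.

```python
# Toy dependency-model diagnosis: which faults are consistent with the test outcomes?
# The dependency matrix (which faults each test can detect) is invented for illustration.
DEPENDS_ON = {
    "test_battery_voltage": {"battery", "alternator"},
    "test_charging_current": {"alternator", "wiring"},
    "test_ecu_heartbeat": {"ecu"},
}

def diagnose(test_outcomes):
    """test_outcomes maps test name -> True (pass) / False (fail)."""
    suspects = set().union(*DEPENDS_ON.values())
    for test, passed in test_outcomes.items():
        if passed:
            suspects -= DEPENDS_ON[test]          # exonerate faults a passing test covers
    candidates = [DEPENDS_ON[t] & suspects for t, ok in test_outcomes.items() if not ok]
    return set.intersection(*candidates) if candidates else set()

print(diagnose({"test_battery_voltage": False,
                "test_charging_current": False,
                "test_ecu_heartbeat": True}))      # -> {'alternator'}
```

    The hybrid approach in the paper couples this kind of dependency reasoning with quantitative simulation models; the sketch shows only the dependency-graph half of that idea.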

  4. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections

    PubMed Central

    2018-01-01

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ. PMID:29462857

  5. Alert Response to Motion Onset in the Retina

    PubMed Central

    Chen, Eric Y.; Marre, Olivier; Fisher, Clark; Schwartz, Greg; Levy, Joshua; da Silveira, Rava Azeredo

    2013-01-01

    Previous studies have shown that motion onset is very effective at capturing attention and is more salient than smooth motion. Here, we find that this salience ranking is present already in the firing rate of retinal ganglion cells. By stimulating the retina with a bar that appears, stays still, and then starts moving, we demonstrate that a subset of salamander retinal ganglion cells, fast OFF cells, responds significantly more strongly to motion onset than to smooth motion. We refer to this phenomenon as an alert response to motion onset. We develop a computational model that predicts the time-varying firing rate of ganglion cells responding to the appearance, onset, and smooth motion of a bar. This model, termed the adaptive cascade model, consists of a ganglion cell that receives input from a layer of bipolar cells, represented by individual rectified subunits. Additionally, both the bipolar and ganglion cells have separate contrast gain control mechanisms. This model captured the responses to our different motion stimuli over a wide range of contrasts, speeds, and locations. The alert response to motion onset, together with its computational model, introduces a new mechanism of sophisticated motion processing that occurs early in the visual system. PMID:23283327
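    A stripped-down sketch of the subunit idea behind the adaptive cascade model follows: each bipolar-like subunit filters the stimulus and is rectified, and the ganglion-cell rate is a rectified sum of subunit outputs. The filters, weights, and threshold are placeholders, and the paper's separate contrast gain control stages are omitted.

```python
# Minimal rectified-subunit cascade: bipolar-like subunits -> rectification -> ganglion sum.
# The published model also includes contrast gain control at both stages (omitted here).
import numpy as np

rng = np.random.default_rng(0)
stimulus = rng.standard_normal((200, 32))            # time x space (e.g. bar positions)

n_subunits, filt_len = 8, 20
spatial_profiles = np.stack([np.exp(-0.5 * ((np.arange(32) - c) / 2.0) ** 2)
                             for c in np.linspace(4, 28, n_subunits)])
temporal_filter = np.exp(-np.arange(filt_len) / 5.0)  # crude temporal decay (placeholder)

def ganglion_rate(stim):
    drive = stim @ spatial_profiles.T                 # time x subunit
    filtered = np.array([np.convolve(drive[:, k], temporal_filter, mode="full")[:len(stim)]
                         for k in range(n_subunits)]).T
    subunit_out = np.maximum(filtered, 0.0)           # rectify each subunit
    return np.maximum(subunit_out.mean(axis=1) - 0.2, 0.0)   # rectified summed output

print(ganglion_rate(stimulus)[:5])
```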

  6. The prediction of zenith range refraction from surface measurements of meteorological parameters. [mathematical models of atmospheric refraction used to improve spacecraft tracking space navigation

    NASA Technical Reports Server (NTRS)

    Berman, A. L.

    1976-01-01

    In the last two decades, increasingly sophisticated deep space missions have placed correspondingly stringent requirements on navigational accuracy. As part of the effort to increase navigational accuracy, and hence the quality of radiometric data, much effort has been expended in an attempt to understand and compute the tropospheric effect on range (and hence range rate) data. The general approach adopted has been that of computing a zenith range refraction, and then mapping this refraction to any arbitrary elevation angle via an empirically derived function of elevation. The prediction of zenith range refraction derived from surface measurements of meteorological parameters is presented. Refractivity is separated into wet (water vapor pressure) and dry (atmospheric pressure) components. The integration of dry refractivity is shown to be exact. Attempts to integrate wet refractivity directly prove ineffective; however, several empirical models developed by the author and other researchers at JPL are discussed. The best current wet refraction model is here considered to be a separate day/night model, which is proportional to surface water vapor pressure and inversely proportional to surface temperature. Methods are suggested that might improve the accuracy of the wet range refraction model.
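    To make the dry/wet split concrete, here is a small illustrative calculation in the spirit of the approach described above; the coefficients are generic textbook-style values, not those of the JPL models, so treat them as assumptions.

```python
# Illustrative zenith range correction split into dry and wet parts.
# Coefficients are generic, assumed values, not the models developed at JPL.

def zenith_dry_delay_m(pressure_hpa):
    """Dry zenith delay in metres, roughly proportional to surface pressure."""
    return 0.002277 * pressure_hpa

def zenith_wet_delay_m(vapor_pressure_hpa, temperature_k, k=0.05):
    """Crude wet zenith delay in metres: proportional to e, inversely to T (k assumed)."""
    return k * vapor_pressure_hpa / temperature_k * 100.0

p, e, t = 1013.25, 12.0, 293.15
print(f"dry ~ {zenith_dry_delay_m(p):.3f} m, wet ~ {zenith_wet_delay_m(e, t):.3f} m")
```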

  7. Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology

    NASA Astrophysics Data System (ADS)

    Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.

    2015-12-01

    Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms such as the northeastern US will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records for the northeastern US demonstrates that an objective computational model, judged by standard performance measures, can identify ice storm events from upper-air circulation patterns with a relatively high degree of accuracy and can provide insights into the relationships between key climate variables associated with ice storms.
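    As a rough illustration of pairing feature selection with a stochastic (ensemble) classifier, not the authors' exact pipeline, a scikit-learn sketch might look like the following; the features and labels are synthetic stand-ins for profile-derived variables and historical ice storm days.

```python
# Minimal sketch: feature selection + an ensemble classifier for
# "ice storm day" vs "non-event day". Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(1)
# Columns could stand in for, e.g., 850 hPa temperature, surface temperature,
# precipitable water, geopotential thickness, ... (hypothetical ordering).
X = rng.standard_normal((500, 12))
y = (X[:, 0] - X[:, 1] + 0.5 * rng.standard_normal(500) > 0.8).astype(int)

model = Pipeline([
    ("select", SelectKBest(f_classif, k=5)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```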

  8. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer- aided design, computer-aided manufacturing, and computer- aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  9. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  10. High-level user interfaces for transfer function design with semantics.

    PubMed

    Salama, Christof Rezk; Keller, Maik; Kohlmann, Peter

    2006-01-01

    Many sophisticated techniques for the visualization of volumetric data such as medical data have been published. While existing techniques are mature from a technical point of view, managing the complexity of visual parameters is still difficult for non-expert users. To this end, this paper presents new ideas to facilitate the specification of optical properties for direct volume rendering. We introduce an additional level of abstraction for parametric models of transfer functions. The proposed framework allows visualization experts to design high-level transfer function models which can intuitively be used by non-expert users. The results are user interfaces which provide semantic information for specialized visualization problems. The proposed method is based on principal component analysis as well as on concepts borrowed from computer animation.
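    One way to read the principal-component idea in this abstract: collect several expert-designed transfer functions as vectors, run PCA on them, and expose the leading component weight as a semantic slider. The sketch below uses synthetic opacity curves and is an assumption about the general idea, not the authors' implementation.

```python
# Sketch: PCA over a bank of expert-designed opacity transfer functions; a single
# "semantic" slider then moves along the first principal component. Data are synthetic.
import numpy as np

bins = np.linspace(0, 1, 256)                       # normalized intensity axis
# A few hand-tuned-looking opacity curves (Gaussian bumps at different intensities).
bank = np.stack([np.exp(-0.5 * ((bins - c) / 0.05) ** 2) for c in (0.3, 0.4, 0.5, 0.6)])

mean_tf = bank.mean(axis=0)
centered = bank - mean_tf
_, _, components = np.linalg.svd(centered, full_matrices=False)   # principal components

def transfer_function(slider, component=0):
    """Opacity curve obtained by moving `slider` along one principal component."""
    return np.clip(mean_tf + slider * components[component], 0.0, 1.0)

print(transfer_function(0.5)[:5])
```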

  11. DYNA3D: A computer code for crashworthiness engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hallquist, J.O.; Benson, D.J.

    1986-09-01

    A finite element program with crashworthiness applications has been developed at LLNL. DYNA3D, an explicit, fully vectorized, finite deformation structural dynamics program, has four capabilities that are critical for the efficient and realistic modeling of crash phenomena: (1) fully optimized nonlinear solid, shell, and beam elements for representing a structure; (2) a broad range of constitutive models for simulating material behavior; (3) sophisticated contact algorithms for impact interactions; (4) a rigid body capability to represent the bodies away from the impact region at a greatly reduced cost without sacrificing accuracy in the momentum calculations. Basic methodologies of the program are briefly presented along with several crashworthiness calculations. Efficiencies of the Hughes-Liu and Belytschko-Tsay shell formulations are considered.

  12. Airburst-Generated Tsunamis

    NASA Astrophysics Data System (ADS)

    Berger, Marsha; Goodman, Jonathan

    2018-04-01

    This paper examines the question of whether smaller asteroids that burst in the air over water can generate tsunamis that could pose a threat to distant locations. Such airburst-generated tsunamis are qualitatively different from the more frequently studied earthquake-generated tsunamis, and differ as well from tsunamis generated by asteroids that strike the ocean. Numerical simulations are presented using the shallow water equations in several settings, demonstrating very little tsunami threat from this scenario. We also analyze a model problem with an explicit solution that demonstrates and explains the same phenomena found in the computations. We discuss the question of whether compressibility and dispersion are important effects that should be included, and show results from a more sophisticated model problem using the linearized Euler equations that begins to address this.
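    For readers unfamiliar with the shallow water equations used in these simulations, the toy one-dimensional Lax-Friedrichs solver below propagates a small hump of water. It illustrates only the governing equations, not the paper's numerical method or any airburst forcing; the grid, time step, and initial condition are assumptions.

```python
# Toy 1-D shallow water solver (Lax-Friedrichs) for conserved variables (h, hu).
# Illustrates the equations only; grid, time step and initial hump are assumed.
import numpy as np

g = 9.81
nx, dx, dt, steps = 400, 50.0, 0.5, 400
x = np.arange(nx) * dx
h = 100.0 + np.exp(-((x - 10000.0) / 500.0) ** 2)   # flat ocean depth plus a small hump
hu = np.zeros(nx)

def flux(h, hu):
    u = hu / h
    return np.stack([hu, hu * u + 0.5 * g * h ** 2])

for _ in range(steps):
    q = np.stack([h, hu])
    f = flux(h, hu)
    # Lax-Friedrichs update with periodic boundaries for simplicity.
    q = 0.5 * (np.roll(q, -1, axis=1) + np.roll(q, 1, axis=1)) \
        - dt / (2 * dx) * (np.roll(f, -1, axis=1) - np.roll(f, 1, axis=1))
    h, hu = q

print("max surface perturbation after run:", (h - 100.0).max())
```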

  13. The Cerebellum: Adaptive Prediction for Movement and Cognition

    PubMed Central

    Sokolov, Arseny A.; Miall, R. Chris; Ivry, Richard B.

    2017-01-01

    Over the past 30 years, cumulative evidence has indicated that cerebellar function extends beyond sensorimotor control. This view has emerged from studies of neuroanatomy, neuroimaging, neuropsychology and brain stimulation, with the results implicating the cerebellum in domains as diverse as attention, language, executive function and social cognition. Although the literature provides sophisticated models of how the cerebellum helps refine movements, it remains unclear how the core mechanisms of these models can be applied when considering a broader conceptualization of cerebellar function. In light of recent multidisciplinary findings, we consider two key concepts that have been suggested as general computational principles of cerebellar function, prediction and error-based learning, examining how these might be relevant in the operation of cognitive cerebro-cerebellar loops. PMID:28385461

  14. Computation of repetitions and regularities of biologically weighted sequences.

    PubMed

    Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K

    2006-01-01

    Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
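    To give a concrete flavour of pattern matching in a weighted sequence (the second of the three problems listed): every position assigns a probability to each symbol, the probability of an occurrence is the product over the pattern, and matches are positions whose probability exceeds a threshold. The naive scan below is for illustration only and is far simpler than the algorithms in the paper.

```python
# Naive pattern matching in a weighted (probabilistic) DNA sequence: report positions
# where the pattern occurs with probability above a threshold. Data are illustrative.

weighted_seq = [                                  # each position: symbol -> probability
    {"A": 0.9, "C": 0.1},
    {"A": 0.5, "T": 0.5},
    {"G": 1.0},
    {"A": 0.8, "G": 0.2},
    {"T": 1.0},
]

def occurrences(pattern, seq, threshold=0.25):
    hits = []
    for i in range(len(seq) - len(pattern) + 1):
        prob = 1.0
        for j, symbol in enumerate(pattern):
            prob *= seq[i + j].get(symbol, 0.0)
            if prob < threshold:                  # probabilities only shrink, so stop early
                break
        if prob >= threshold:
            hits.append((i, prob))
    return hits

print(occurrences("AGA", weighted_seq))           # -> [(1, 0.4)]
```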

  15. Towards a large-scale scalable adaptive heart model using shallow tree meshes

    NASA Astrophysics Data System (ADS)

    Krause, Dorian; Dickopf, Thomas; Potse, Mark; Krause, Rolf

    2015-10-01

    Electrophysiological heart models are sophisticated computational tools that place high demands on the computing hardware due to the high spatial resolution required to capture the steep depolarization front. To address this challenge, we present a novel adaptive scheme for resolving the depolarization front accurately using adaptivity in space. Our adaptive scheme is based on locally structured meshes. These tensor meshes in space are organized in a parallel forest of trees, which allows us to resolve complicated geometries and to realize high variations in the local mesh sizes with a minimal memory footprint in the adaptive scheme. We discuss both a non-conforming mortar element approximation and a conforming finite element space and present an efficient technique for the assembly of the respective stiffness matrices using matrix representations of the inclusion operators into the product space on the so-called shallow tree meshes. We analyze the parallel performance and scalability for a two-dimensional ventricle slice as well as for a full large-scale heart model. Our results demonstrate that the method has good performance and high accuracy.

  16. Impact of wall thickness and saccular geometry on the computational wall stress of descending thoracic aortic aneurysms.

    PubMed

    Shang, Eric K; Nathan, Derek P; Sprinkle, Shanna R; Fairman, Ronald M; Bavaria, Joseph E; Gorman, Robert C; Gorman, Joseph H; Jackson, Benjamin M

    2013-09-10

    Wall stress calculated using finite element analysis has been used to predict rupture risk of aortic aneurysms. Prior models often assume uniform aortic wall thickness and fusiform geometry. We examined the effects of including local wall thickness, intraluminal thrombus, calcifications, and saccular geometry on peak wall stress (PWS) in finite element analysis of descending thoracic aortic aneurysms. Computed tomographic angiography of descending thoracic aortic aneurysms (n=10 total, 5 fusiform and 5 saccular) underwent 3-dimensional reconstruction with custom algorithms. For each aneurysm, an initial model was constructed with uniform wall thickness. Experimental models explored the addition of variable wall thickness, calcifications, and intraluminal thrombus. Each model was loaded with 120 mm Hg pressure, and von Mises PWS was computed. The mean PWS of uniform wall thickness models was 410 ± 111 kPa. The imposition of variable wall thickness increased PWS (481 ± 126 kPa, P<0.001). Although the addition of calcifications was not statistically significant (506 ± 126 kPa, P=0.07), the addition of intraluminal thrombus to variable wall thickness (359 ± 86 kPa, P ≤ 0.001) reduced PWS. A final model incorporating all features also reduced PWS (368 ± 88 kPa, P<0.001). Saccular geometry did not increase diameter-normalized stress in the final model (77 ± 7 versus 67 ± 12 kPa/cm, P=0.22). Incorporation of local wall thickness can significantly increase PWS in finite element analysis models of thoracic aortic aneurysms. Incorporating variable wall thickness, intraluminal thrombus, and calcifications significantly impacts computed PWS of thoracic aneurysms; sophisticated models may, therefore, be more accurate in assessing rupture risk. Saccular aneurysms did not demonstrate a significantly higher normalized PWS than fusiform aneurysms.

  17. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy the verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a protocol of verification for the measurement-only blind quantum computing.

  18. Addition of equilibrium air to an upwind Navier-Stokes code and other first steps toward a more generalized flow solver

    NASA Technical Reports Server (NTRS)

    Rosen, Bruce S.

    1991-01-01

    An upwind three-dimensional volume Navier-Stokes code is modified to facilitate modeling of complex geometries and flow fields represented by proposed National Aerospace Plane concepts. Code enhancements include an equilibrium air model, a generalized equilibrium gas model and several schemes to simplify treatment of complex geometric configurations. The code is also restructured for inclusion of an arbitrary number of independent and dependent variables. This latter capability is intended for eventual use to incorporate nonequilibrium/chemistry gas models, more sophisticated turbulence and transition models, or other physical phenomena which will require inclusion of additional variables and/or governing equations. Comparisons of computed results with experimental data and results obtained using other methods are presented for code validation purposes. Good correlation is obtained for all of the test cases considered, indicating the success of the current effort.

  19. Economic analysis of linking operating room scheduling and hospital material management information systems for just-in-time inventory control.

    PubMed

    Epstein, R H; Dexter, F

    2000-08-01

    Operating room (OR) scheduling information systems can decrease perioperative labor costs. Material management information systems can decrease perioperative inventory costs. We used computer simulation to investigate whether using the OR schedule to trigger purchasing of perioperative supplies is likely to further decrease perioperative inventory costs, as compared with using sophisticated, stand-alone material management inventory control. Although we designed the simulations to favor financially linking the information systems, we found that this strategy would be expected to decrease inventory costs substantively only for items of high price ($1000 each) and volume (>1000 used each year). Because expensive items typically have different models and sizes, each of which is used by a hospital less often than this, for almost all items there will be no benefit to making daily adjustments to the order volume based on booked cases. We conclude that, in a hospital with a sophisticated material management information system, OR managers will probably achieve greater cost reductions from focusing on negotiating less expensive purchase prices for items than on trying to link the OR information system with the hospital's material management information system to achieve just-in-time inventory control.

  20. National Combustion Code: A Multidisciplinary Combustor Design System

    NASA Technical Reports Server (NTRS)

    Stubbs, Robert M.; Liu, Nan-Suey

    1997-01-01

    The Internal Fluid Mechanics Division conducts both basic research and technology, and system technology research for aerospace propulsion systems components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.

  1. Transmission loss optimization in acoustic sandwich panels

    NASA Astrophysics Data System (ADS)

    Makris, S. E.; Dym, C. L.; MacGregor Smith, J.

    1986-06-01

    Considering the sound transmission loss (TL) of a sandwich panel as the single objective, different optimization techniques are examined and a sophisticated computer program is used to find the optimum TL. Also, for one of the possible case studies such as core optimization, closed-form expressions are given between TL and the core-design variables for different sets of skins. The significance of these functional relationships lies in the fact that the panel designer can bypass the necessity of using a sophisticated software package in order to assess explicitly the dependence of the TL on core thickness and density.
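    As a point of reference for such closed-form relations (this is the generic single-leaf field-incidence mass law, not the sandwich-panel expressions derived in the paper), TL is often approximated as 20 log10(f m) - 47 dB, with f in Hz and m the surface mass in kg/m^2; the helper below evaluates it, and the constant is the commonly quoted approximation rather than a value from this work.

```python
# Field-incidence mass law for a single panel (a commonly quoted approximation,
# not the sandwich-panel expressions of the paper): TL ~ 20*log10(f*m) - 47 dB.
import math

def mass_law_tl_db(frequency_hz, surface_mass_kg_m2):
    return 20.0 * math.log10(frequency_hz * surface_mass_kg_m2) - 47.0

for f in (125, 500, 2000):
    print(f, "Hz ->", round(mass_law_tl_db(f, surface_mass_kg_m2=10.0), 1), "dB")
```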

  2. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    NASA Astrophysics Data System (ADS)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely-packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emissions estimates from signals with a variety of nearfield influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to home in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.

  3. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  4. Computational dynamics of soft machines

    NASA Astrophysics Data System (ADS)

    Hu, Haiyan; Tian, Qiang; Liu, Cheng

    2017-06-01

    Soft machine refers to a kind of mechanical system made of soft materials to complete sophisticated missions, such as handling a fragile object and crawling along a narrow tunnel corner, under low cost control and actuation. Hence, soft machines have raised great challenges to computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimensions, with an emphasis on the efficient computations of the nonlinear elastic force vector of finite elements. The validations of the proposed approaches are given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.

  5. Regional-Scale Modeling at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Adler, R.; Baker, D.; Braun, S.; Chou, M.-D.; Jasinski, M. F.; Jia, Y.; Kakar, R.; Karyampudi, M.; Lang, S.

    2003-01-01

    Over the past decade, the Goddard Mesoscale Modeling and Dynamics Group has used a popular regional scale model, MM5, to study precipitation processes. Our group is making contributions to the MM5 by incorporating the following physical and numerical packages: improved Goddard cloud processes, a land processes model (Parameterization for Land-Atmosphere-Cloud Exchange - PLACE), efficient but sophisticated radiative processes, conservation of hydrometeor mass (water budget), four-dimensional data assimilation for rainfall, and better computational methods for trace gas transport. At NASA Goddard, the MM5 has been used to study: (1) the impact of initial conditions, assimilation of satellite-derived rainfall, and cumulus parameterizations on rapidly intensifying oceanic cyclones, hurricanes and typhoons, (2) the dynamic and thermodynamic processes associated with the development of narrow cold frontal rainbands, (3) regional climate and water cycles, (4) the impact of vertical transport by clouds and lightning on trace gas distribution/production associated with South and North American mesoscale convective systems, (5) the development of a westerly wind burst (WWB) that occurred during the TOGA COARE and the diurnal variation of precipitation in the tropics, (6) a Florida sea breeze convective event and a Mid-US flood event using a sophisticated land surface model, (7) the influence of soil heterogeneity on land surface energy balance in the southwest GCIP region, (8) explicit simulations (with 1.33 to 4 km horizontal resolution) of hurricanes Bob (1991) and Bonnie (1998), (9) a heavy precipitation event over Taiwan, and (10) to make real time forecasts for a major NASA field program. In this paper, the modifications and simulated cases will be described and discussed.

  6. A Frontal Dopamine System for Reflective Exploratory Behavior

    PubMed Central

    Blanco, Nathaniel J.; Love, Bradley C.; Cooper, Jessica A.; McGeary, John E.; Knopik, Valerie S.; Maddox, W. Todd

    2015-01-01

    The COMT gene modulates dopamine levels in prefrontal cortex with Met allele carriers having lower COMT enzyme activity and, therefore, higher dopamine levels compared to Val/Val homozygotes. Concordantly, Val/Val homozygotes tend to perform worse and display increased (interpreted as inefficient) frontal activation in certain cognitive tasks. In a sample of 209 participants, we test the hypothesis that Met carriers will be advantaged in a decision-making task that demands sequencing exploratory and exploitive choices to minimize uncertainty about the reward structure in the environment. Previous work suggests that optimal performance depends on limited cognitive resources supported by prefrontal systems. If so, Met carriers should outperform Val/Val homozygotes, particularly under dual-task conditions that tax limited cognitive resources. In accord with these a priori predictions, Met carriers were more resilient in the face of cognitive load, continuing to explore in a sophisticated manner. We fit computational models that embody sophisticated reflective and simple reflexive strategies to further evaluate participants' exploration behavior. The Ideal Actor model reflectively updates beliefs and plans ahead, taking into account the information gained by each choice and making choices that maximize long-term payoffs. In contrast, the Naïve Reinforcement Learning (RL) model instantiates the reflexive account of choice, in which the values of actions are based only on the rewards experienced so far. Its beliefs are updated reflexively in response to observed changes in rewards. Converging with standard analyses, Met carriers were best characterized by the Ideal Actor model, whereas Val/Val homozygotes were best characterized by the Naive RL model, particularly under dual-task conditions. PMID:26004676
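    The reflexive Naive RL account described above boils down to a delta-rule value update with a softmax choice rule; the sketch below shows that update in isolation. The learning rate, temperature, and two-option bandit are assumptions, and the Ideal Actor's forward planning is deliberately absent.

```python
# Reflexive value learning in the spirit of the Naive RL account: each option's value
# is nudged toward the reward just received; choices are a softmax over current values.
# Learning rate, temperature and the bandit rewards are illustrative assumptions.
import math
import random

random.seed(0)
values = {"option_A": 0.0, "option_B": 0.0}
alpha, temperature = 0.2, 0.5
true_means = {"option_A": 1.0, "option_B": 0.3}

def softmax_choice(vals):
    weights = {k: math.exp(v / temperature) for k, v in vals.items()}
    r = random.random() * sum(weights.values())
    for k, w in weights.items():
        r -= w
        if r <= 0:
            return k
    return k

for _ in range(200):
    choice = softmax_choice(values)
    reward = random.gauss(true_means[choice], 0.5)
    values[choice] += alpha * (reward - values[choice])   # reflexive delta-rule update

print(values)   # option_A should end up with the higher learned value
```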

  7. Theoretical Characterization of Visual Signatures (Muzzle Flash)

    NASA Astrophysics Data System (ADS)

    Kashinski, D. O.; Scales, A. N.; Vanderley, D. L.; Chase, G. M.; di Nallo, O. E.; Byrd, E. F. C.

    2014-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet and infrared spectra of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. We are currently employing quantum chemistry methods at various levels of sophistication to optimize molecular geometries, compute vibrational frequencies, and determine the optical spectra of specific gas-phase molecules and radicals of interest. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A comparison of computational results to experimental values found in the literature is used to assess the effect of basis set and functional choice on calculation accuracy. The current status of this work will be presented at the conference. Work supported by the ARL and USMA.
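    The excitation-energy step of the workflow described above can be sketched with an open-source package such as PySCF; the paper does not name its code, so the package, molecule, functional, and basis set below are all placeholder assumptions.

```python
# Hedged TD-DFT sketch with PySCF (an example package, not necessarily the one used
# in this work): ground-state DFT for a small molecule, then low-lying excitations.
from pyscf import gto, dft, tddft

mol = gto.M(atom="O 0 0 0; H 0 -0.757 0.587; H 0 0.757 0.587",  # placeholder molecule
            basis="6-31g")                                       # placeholder basis set
mf = dft.RKS(mol)
mf.xc = "b3lyp"            # placeholder functional choice
mf.kernel()                # ground-state SCF

td = tddft.TDDFT(mf)
td.nstates = 5             # number of excited states to solve for
td.kernel()
print("excitation energies (Hartree):", td.e)
```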

  8. Theoretical Characterization of Visual Signatures and Calculation of Approximate Global Harmonic Frequency Scaling Factors

    NASA Astrophysics Data System (ADS)

    Kashinski, D. O.; Nelson, R. G.; Chase, G. M.; di Nallo, O. E.; Byrd, E. F. C.

    2016-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled harmonic frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). Calculation of approximate global harmonic frequency scaling factors for specific DFT functionals is also in progress. A full statistical analysis and reliability assessment of computational results is currently underway. Work supported by the ARL, DoD-HPCMP, and USMA.

  9. Cerebellar contributions to motor control and language comprehension: searching for common computational principles.

    PubMed

    Moberget, Torgeir; Ivry, Richard B

    2016-04-01

    The past 25 years have seen the functional domain of the cerebellum extend beyond the realm of motor control, with considerable discussion of how this subcortical structure contributes to cognitive domains including attention, memory, and language. Drawing on evidence from neuroanatomy, physiology, neuropsychology, and computational work, sophisticated models have been developed to describe cerebellar function in sensorimotor control and learning. In contrast, mechanistic accounts of how the cerebellum contributes to cognition have remained elusive. Inspired by the homogeneous cerebellar microanatomy and a desire for parsimony, many researchers have sought to extend mechanistic ideas from motor control to cognition. One influential hypothesis centers on the idea that the cerebellum implements internal models, representations of the context-specific dynamics of an agent's interactions with the environment, enabling predictive control. We briefly review cerebellar anatomy and physiology, to review the internal model hypothesis as applied in the motor domain, before turning to extensions of these ideas in the linguistic domain, focusing on speech perception and semantic processing. While recent findings are consistent with this computational generalization, they also raise challenging questions regarding the nature of cerebellar learning, and may thus inspire revisions of our views on the role of the cerebellum in sensorimotor control. © 2016 New York Academy of Sciences.

  10. A light-stimulated synaptic device based on graphene hybrid phototransistor

    NASA Astrophysics Data System (ADS)

    Qin, Shuchao; Wang, Fengqiu; Liu, Yujie; Wan, Qing; Wang, Xinran; Xu, Yongbing; Shi, Yi; Wang, Xiaomu; Zhang, Rong

    2017-09-01

    Neuromorphic chips refer to an unconventional computing architecture that is modelled on biological brains. They are increasingly employed for processing sensory data for machine vision, context cognition, and decision making. Despite rapid advances, neuromorphic computing has remained largely an electronic technology, making it a challenge to access the superior computing features provided by photons, or to directly process vision data that has increasing importance to artificial intelligence. Here we report a novel light-stimulated synaptic device based on a graphene-carbon nanotube hybrid phototransistor. Significantly, the device can respond to optical stimuli in a highly neuron-like fashion and exhibits flexible tuning of both short- and long-term plasticity. These features combined with the spatiotemporal processability make our device a capable counterpart to today’s electrically-driven artificial synapses, with superior reconfigurable capabilities. In addition, our device allows for generic optical spike processing, which provides a foundation for more sophisticated computing. The silicon-compatible, multifunctional photosensitive synapse opens up a new opportunity for neural networks enabled by photonics and extends current neuromorphic systems in terms of system complexities and functionalities.

  11. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    ...technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or... level graphics language. 7.4 User Interface Design Tools: As user interfaces have become more sophisticated, they have become harder to develop. ... Stephen M. Pizer, editors. Proceedings 1986 Workshop on Interactive 3D Graphics, October 1986. J. S. Dumas. Designing User Interface Software. Prentice...

  12. Use of Microcomputers and Personal Computers in Pacing

    PubMed Central

    Sasmor, L.; Tarjan, P.; Mumford, V.; Smith, E.

    1983-01-01

    This paper describes the evolution from the early discrete circuit pacemaker of the past to the sophisticated microprocessor based pacemakers of today. The necessary computerized supporting instrumentation is also described. Technological and economical reasons for this evolution are discussed.

  13. Adaptive Language Games with Robots

    NASA Astrophysics Data System (ADS)

    Steels, Luc

    2010-11-01

    This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade going from simple simulations of lexicon formation with animallike cybernetic robots to sophisticated grammatical experiments with humanoid robots.

  14. Toward a New Voice

    ERIC Educational Resources Information Center

    Murphy, Patti

    2007-01-01

    Frequently linked to sophisticated speech communication devices resembling laptop computers, augmentative and alternative communication (AAC) encompasses a spectrum of tools and strategies ranging from pointing, writing, gestures, and facial expressions to sign language, manual alphabet boards, picture symbols, and photographs used to convey…

  15. V/STOL AND digital avionics system for UH-1H

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1978-01-01

    A hardware and software system for the Bell UH-1H helicopter was developed that provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal area navigation, guidance and control research. Two Sperry 1819B general purpose digital computers were used. One contains the development software that performs all the specified system flight computations. The second computer is available to NASA for experimental programs that run simultaneously with the other computer programs and which may, at the push of a button, replace selected computer computations. Other features that provide research flexibility include keyboard selectable gains and parameters and software generated alphanumeric and CRT displays.

  16. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  17. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  18. The Iodine Spectrum: A New Look at an Old Topic

    NASA Astrophysics Data System (ADS)

    Long, George; Sauder, Deborah; Shalhoub, George M.; Stout, Roland; Hamby Towns, Marcy; Zielinski, Theresa Julia

    1999-06-01

    This paper describes a new approach to the traditional iodine gas absorption spectrum experiment often performed in undergraduate physical chemistry labs. The approach is student centered and designed to emphasize the conceptual richness in this classic experiment. It gives students the opportunity to examine the conceptual and mathematical connections between spectroscopic data and quantum models by organizing the material in conceptual chunks, which they work through sequentially. Students use symbolic mathematics software, Mathcad, to expedite the sophisticated numerical calculations required. The curricular chunks were specifically constructed to make the sophisticated concepts embedded in the project accessible. The focus activities remind the students of information they already know and require them to employ both paper and pencil and computer worksheets to complete calculations. Five Mathcad templates provide a rich mathematical treatment of the topics in this experiment. This paper describes how the documents MorsePotential.mcd, BirgeSponer.mcd, IodineSpectrum.mcd, FranckCondonBackground.mcd, and FranckCondonComputation.mcd are used during the three weeks in which this experiment can be performed by a typical physical chemistry student. Although originally designed to use the WWW to disseminate information and promote interaction among physical chemistry students at geographically dispersed institutions, this segmented focus-question approach to the iodine experiment has also been used by a physical chemistry class at a single campus. In both formats, faculty noticed decreased anxiety of the students towards the experiment and an increase in the quality of laboratory reports that indicated better understanding of the chemical concepts.

  19. Computer simulation studies of the growth of strained layers by molecular-beam epitaxy

    NASA Astrophysics Data System (ADS)

    Faux, D. A.; Gaynor, G.; Carson, C. L.; Hall, C. K.; Bernholc, J.

    1990-08-01

    Two new types of discrete-space Monte Carlo computer simulation are presented for the modeling of the early stages of strained-layer growth by molecular-beam epitaxy. The simulations are more economical on computer resources than continuous-space Monte Carlo or molecular dynamics. Each model is applied to the study of growth onto a substrate in two dimensions with use of Lennard-Jones interatomic potentials. Up to seven layers are deposited for a variety of lattice mismatches, temperatures, and growth rates. Both simulations give similar results. At small lattice mismatches (<~4%) the growth is in registry with the substrate, while at high mismatches (>~6%) the growth is incommensurate with the substrate. At intermediate mismatches, a transition from registered to incommensurate growth is observed which commences at the top of the crystal and propagates down to the first layer. Faster growth rates are seen to inhibit this transition. The growth mode is van der Merwe (layer-by-layer) at 2% lattice mismatch, but at larger mismatches Volmer-Weber (island) growth is preferred. The Monte Carlo simulations are assessed in the light of these results and the ease at which they can be extended to three dimensions and to more sophisticated potentials is discussed.
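    A toy two-dimensional Metropolis step with a Lennard-Jones pair potential conveys the basic ingredients mentioned in the abstract (pair energies and thermal acceptance), though it is far simpler than the deposition simulations studied; every parameter below is an illustrative assumption.

```python
# Toy 2-D Metropolis Monte Carlo with a Lennard-Jones pair potential (reduced units).
# Much simpler than the deposition simulations described; all parameters are assumed.
import numpy as np

rng = np.random.default_rng(3)
positions = rng.uniform(0, 6.0, size=(20, 2))     # 20 atoms in a 6x6 box (no PBC)
kT, max_step = 0.5, 0.1

def lj_energy(pos):
    diff = pos[:, None, :] - pos[None, :, :]
    r2 = (diff ** 2).sum(-1)
    iu = np.triu_indices(len(pos), k=1)            # each pair counted once, no self-pairs
    inv6 = 1.0 / r2[iu] ** 3
    return float(np.sum(4.0 * (inv6 ** 2 - inv6)))

energy = lj_energy(positions)
for _ in range(2000):
    i = rng.integers(len(positions))
    trial = positions.copy()
    trial[i] += rng.uniform(-max_step, max_step, size=2)
    e_trial = lj_energy(trial)
    # Metropolis acceptance: always accept downhill moves, uphill with Boltzmann weight.
    if e_trial < energy or rng.random() < np.exp(-(e_trial - energy) / kT):
        positions, energy = trial, e_trial

print("final energy (reduced units):", round(energy, 3))
```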

  20. WE-D-303-01: Development and Application of Digital Human Phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segars, P.

    2015-06-15

    Modern medical physics deals with complex problems such as 4D radiation therapy and imaging quality optimization. Such problems involve a large number of radiological parameters, and anatomical and physiological breathing patterns. A major challenge is how to develop, test, evaluate and compare various new imaging and treatment techniques, which often involves testing over a large range of radiological parameters as well as varying patient anatomies and motions. It would be extremely challenging, if not impossible, both ethically and practically, to test every combination of parameters and every task on every type of patient under clinical conditions. Computer-based simulation using computational phantoms offers a practical technique with which to evaluate, optimize, and compare imaging technologies and methods. Within simulation, the computerized phantom provides a virtual model of the patient’s anatomy and physiology. Imaging data can be generated from it as if it was a live patient using accurate models of the physics of the imaging and treatment process. With sophisticated simulation algorithms, it is possible to perform virtual experiments entirely on the computer. By serving as virtual patients, computational phantoms hold great promise in solving some of the most complex problems in modern medical physics. In this proposed symposium, we will present the history and recent developments of computational phantom models, share experiences in their application to advanced imaging and radiation applications, and discuss their promises and limitations. Learning Objectives: Understand the need and requirements of computational phantoms in medical physics research; Discuss the developments and applications of computational phantoms; Know the promises and limitations of computational phantoms in solving complex problems.

  1. High accuracy mantle convection simulation through modern numerical methods - II: realistic models and problems

    NASA Astrophysics Data System (ADS)

    Heister, Timo; Dannberg, Juliane; Gassmöller, Rene; Bangerth, Wolfgang

    2017-08-01

    Computations have helped elucidate the dynamics of Earth's mantle for several decades already. The numerical methods that underlie these simulations have greatly evolved within this time span, and today include dynamically changing and adaptively refined meshes, sophisticated and efficient solvers, and parallelization to large clusters of computers. At the same time, many of the methods - discussed in detail in a previous paper in this series - were developed and tested primarily using model problems that lack many of the complexities that are common to the realistic models our community wants to solve today. With several years of experience solving complex and realistic models, we here revisit some of the algorithm designs of the earlier paper and discuss the incorporation of more complex physics. In particular, we re-consider time stepping and mesh refinement algorithms, evaluate approaches to incorporate compressibility, and discuss dealing with strongly varying material coefficients, latent heat, and how to track chemical compositions and heterogeneities. Taken together and implemented in a high-performance, massively parallel code, the techniques discussed in this paper then allow for high resolution, 3-D, compressible, global mantle convection simulations with phase transitions, strongly temperature dependent viscosity and realistic material properties based on mineral physics data.

  2. On the kinematics of scalar iso-surfaces in turbulent flow

    NASA Astrophysics Data System (ADS)

    Blakeley, Brandon C.; Riley, James J.; Storti, Duane W.; Wang, Weirong

    2017-11-01

    The behavior of scalar iso-surfaces in turbulent flows is of fundamental interest and importance in a number of problems, e.g., the stoichiometric surface in non-premixed reactions, and the turbulent/non-turbulent interface in localized turbulent shear flows. Of particular interest here is the behavior of the average surface area per unit volume, Σ. We report on the use of direct numerical simulations and sophisticated surface tracking techniques to directly compute Σ and model its evolution. We consider two different scalar configurations in decaying, isotropic turbulence: first, the iso-surface is initially homogenous and isotropic in space, second, the iso-surface is initially planar. A novel method of computing integral properties from regularly-sampled values of a scalar function is leveraged to provide accurate estimates of Σ. Guided by simulation results, modeling is introduced from two perspectives. The first approach models the various terms in the evolution equation for Σ, while the second uses Rice's theorem to model Σ directly. In particular, the two principal effects on the evolution of Σ, i.e., the growth of the surface area due to local surface stretching, and the ultimate decay due to molecular destruction, are addressed.
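
    For orientation, a commonly used transport equation for the surface density per unit volume has the schematic form below; the terms actually modelled in the paper may differ in detail.

        \frac{\partial \Sigma}{\partial t}
          + \nabla \cdot \left( \langle \mathbf{u} \rangle_s \, \Sigma \right)
          = \langle a_T \rangle_s \, \Sigma \; - \; D_\Sigma ,
        \qquad
        a_T = \left( \delta_{ij} - n_i n_j \right) \frac{\partial u_i}{\partial x_j},

    where a_T is the tangential strain rate responsible for local surface stretching, n the unit normal to the iso-surface, and D_\Sigma a molecular destruction term.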

  3. Science & Technology Review January/February 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bearinger, J P

    2009-11-30

    This month's issue has the following articles: (1) Innovative Materials Rise to the Radiation Challenge - Commentary by Bruce Warner; (2) The Hunt for Better Radiation Detection - New materials will help radiation detectors pick up weak signals and accurately identify illicit radioactive sources; (3) Time-Critical Technology Identifies Deadly Bloodborne Pathogens - A portable device can simultaneously distinguish up to five bloodborne pathogens in just minutes; (4) Defending Computer Networks against Attack - A Laboratory effort takes a new approach to detecting increasingly sophisticated cyber attacks; and (5) Imaging Cargo's Inner Secrets - Livermore-University of California collaborators are modeling a new radiographic technique for identifying nuclear materials concealed inside cargo containers.

  4. Microbiome Tools for Forensic Science.

    PubMed

    Metcalf, Jessica L; Xu, Zhenjiang Z; Bouslimani, Amina; Dorrestein, Pieter; Carter, David O; Knight, Rob

    2017-09-01

    Microbes are present at every crime scene and have been used as physical evidence for over a century. Advances in DNA sequencing and computational approaches have led to recent breakthroughs in the use of microbiome approaches for forensic science, particularly in the areas of estimating postmortem intervals (PMIs), locating clandestine graves, and obtaining soil and skin trace evidence. Low-cost, high-throughput technologies allow us to accumulate molecular data quickly and to apply sophisticated machine-learning algorithms, building generalizable predictive models that will be useful in the criminal justice system. In particular, integrating microbiome and metabolomic data has excellent potential to advance microbial forensics. Copyright © 2017. Published by Elsevier Ltd.
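
    As a hedged illustration of the kind of predictive model described (not the authors' actual pipeline), the sketch below trains a random-forest regressor to predict postmortem interval from microbial taxon abundances; the feature matrix and targets are synthetic placeholders.

    ```python
    # Illustrative PMI regression from microbiome features (synthetic data only;
    # not the authors' models or data).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.random((60, 200))        # 60 samples x 200 taxa (stand-in abundances)
    y = rng.uniform(0, 30, 60)       # postmortem interval in days (stand-in)

    model = RandomForestRegressor(n_estimators=500, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
    print("MAE per fold (days):", -scores)
    ```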

  5. A new scheme for the parameterization of the turbulent planetary boundary layer in the GLAS fourth order GCM

    NASA Technical Reports Server (NTRS)

    Helfand, H. M.

    1985-01-01

    Methods being used to increase the horizontal and vertical resolution and to implement more sophisticated parameterization schemes for general circulation models (GCM) run on newer, more powerful computers are described. Attention is focused on the NASA-Goddard Laboratory for Atmospherics fourth order GCM. A new planetary boundary layer (PBL) model has been developed which features explicit resolution of two or more layers. Numerical models are presented for parameterizing the turbulent vertical heat, momentum and moisture fluxes at the earth's surface and between the layers in the PBL model. An extended Monin-Obukhov similarity scheme is applied to express the relationships between the lowest levels of the GCM and the surface fluxes. On-line weather prediction experiments are to be run to test the effects of the higher resolution thereby obtained for dynamic atmospheric processes.
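
    For context, surface turbulent fluxes in such schemes are commonly written in bulk form, with transfer coefficients obtained from Monin-Obukhov similarity functions of the stability parameter z/L; the extended scheme in the paper may differ in detail.

        \tau = \rho\,C_D\,|\mathbf{U}_1|^2, \qquad
        H = -\rho\,c_p\,C_H\,|\mathbf{U}_1|\,(\theta_1 - \theta_s), \qquad
        E = -\rho\,C_E\,|\mathbf{U}_1|\,(q_1 - q_s),

    where subscript 1 denotes the lowest model level, subscript s the surface, and C_D, C_H and C_E depend on the roughness lengths and on z/L.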

  6. Wall interference correction improvements for the ONERA main wind tunnels

    NASA Technical Reports Server (NTRS)

    Vaucheret, X.

    1982-01-01

    This paper describes improved methods of calculating wall interference corrections for the ONERA large windtunnels. The mathematical description of the model and its sting support has become more sophisticated. An increasing number of singularities is used until an agreement between theoretical and experimental signatures of the model and sting on the walls of the closed test section is obtained. The singularity decentering effects are calculated when the model reaches large angles of attack. The porosity factor cartography on the perforated walls deduced from the measured signatures now replaces the reference tests previously carried out in larger tunnels. The porosity factors obtained from the blockage terms (signatures at zero lift) and from the lift terms are in good agreement. In each case (model + sting + test section), wall corrections are now determined, before the tests, as a function of the fundamental parameters M, CS, CZ. During the windtunnel tests, the corrections are quickly computed from these functions.

  7. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. It is supported by a large community of users who have generated an extensive collection of well-documented packages and functions. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  8. Simplified jet-A kinetic mechanism for combustor application

    NASA Technical Reports Server (NTRS)

    Lee, Chi-Ming; Kundu, Krishna; Ghorashi, Bahman

    1993-01-01

    Successful modeling of combustion and emissions in gas turbine engine combustors requires an adequate description of the reaction mechanism. For hydrocarbon oxidation, detailed mechanisms are only available for the simplest types of hydrocarbons such as methane, ethane, acetylene, and propane. These detailed mechanisms contain a large number of chemical species participating simultaneously in many elementary kinetic steps. Current computational fluid dynamic (CFD) models must include fuel vaporization, fuel-air mixing, chemical reactions, and complicated boundary geometries. To simulate these conditions, a very sophisticated computer model is required, one that demands large computer memory capacity and long run times. Therefore, gas turbine combustion modeling has frequently been simplified by using global reaction mechanisms, which can predict only the quantities of interest: heat release rates, flame temperature, and emissions. Jet fuels are wide-boiling-range hydrocarbons with ranges extending through those of gasoline and kerosene. These fuels are chemically complex, often containing more than 300 components. Jet fuel typically can be characterized as containing 70 vol pct paraffin compounds and 25 vol pct aromatic compounds. A five-step Jet-A fuel mechanism which involves pyrolysis and subsequent oxidation of paraffin and aromatic compounds is presented here. This mechanism is verified by comparison with Jet-A fuel ignition delay time experimental data and with species concentrations obtained from flametube experiments. This five-step mechanism appears to be better than the current one- and two-step mechanisms.
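
    Each step of such a global mechanism is typically assigned a rate of the generic quasi-global form shown below; the actual rate constants and concentration exponents of the five-step Jet-A mechanism are those reported in the paper and are not reproduced here.

        \dot{\omega} = A\,T^{\,b}\exp\!\left(-\frac{E_a}{RT}\right)[\mathrm{Fuel}]^{\,m}\,[\mathrm{O_2}]^{\,n}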

  9. Studies of the low-energy quasiparticle excitations in high-temperature superconducting cuprates with scanning tunneling spectroscopy and magnetization measurements

    NASA Astrophysics Data System (ADS)

    Boev, Ivan Krasimirov

    In the present PhD work, three sophisticated models based on the "Engineering" modeling approach have been utilized to conveniently describe and thoroughly analyze details of Lightning events at the CN Tower. Both the CN Tower and the Lightning Channel are represented by a number of series-connected Transmission Line sections in order to account for the variations in the shape of the tower and for plasma processes that take place within the Lightning Channel. A sum of two Heidler functions is used to describe the "uncontaminated" Return Stroke current, which is injected at the attachment point between the CN Tower and the Lightning Channel. Reflections and refractions at all points of mismatched impedances are considered until their contribution becomes less than 1% of the originally injected current wave. In the proposed models, the problem with the current discontinuity at the Lightning Channel front, commonly taken care of by introducing a "turn-on" term when computing radiation fields, is uniquely treated by introducing reflected and transmitted components. For the first time, variable speed of propagation of the Return Stroke current front has been considered and its influence upon the predicted current distributions along the whole Lightning Channel path and upon the radiated distant fields analyzed. Furthermore, as another novelty, computation of the electromagnetic field is accomplished in Cartesian Coordinates. This fact permits relaxing the requirement on the verticality of the Lightning Channel, normally imposed in Cylindrical Coordinates. Therefore, it becomes possible to study without difficulty the influence of a slanted Lightning Channel upon the surrounding electromagnetic field. Since the proposed sophisticated Five-Section Model has the capability to represent very closely the structure of the CN Tower and to emulate faithfully the shape of, as well as physical processes within the Lightning Channel, it is believed to have the potential of truthfully reproducing observed fields. The developed modeling approach can be easily adapted to study the anticipated radiated fields at tall structures even before construction.

  10. Synthetic analog computation in living cells.

    PubMed

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
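
    As an illustrative (not circuit-specific) description of logarithmically linear sensing, the input-output relation can be written as

        y \;\approx\; \alpha \,\ln\!\left(1 + \frac{u}{K}\right),
        \qquad
        \Delta y \;\approx\; \alpha\,\frac{\Delta u}{u} \quad (u \gg K),

    so that equal fractional changes of the input produce equal increments of the output, which is the Weber's-law behaviour noted above.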

  11. Process model comparison and transferability across bioreactor scales and modes of operation for a mammalian cell bioprocess.

    PubMed

    Craven, Stephen; Shirsat, Nishikant; Whelan, Jessica; Glennon, Brian

    2013-01-01

    A Monod kinetic model, logistic equation model, and statistical regression model were developed for a Chinese hamster ovary cell bioprocess operated under three different modes of operation (batch, bolus fed-batch, and continuous fed-batch) and grown on two different bioreactor scales (3 L bench-top and 15 L pilot-scale). The Monod kinetic model was developed for all modes of operation under study and predicted cell density, glucose, glutamine, lactate, and ammonia concentrations well for the bioprocess. However, it was computationally demanding due to the large number of parameters necessary to produce a good model fit. The transferability of the Monod kinetic model structure and parameter set across bioreactor scales and modes of operation was investigated and a parameter sensitivity analysis performed. The experimentally determined parameters had the greatest influence on model performance. They changed with scale and mode of operation, but were easily calculated. The remaining parameters, which were fitted using a differential evolutionary algorithm, were not as crucial. Logistic equation and statistical regression models were investigated as alternatives to the Monod kinetic model. They were less computationally intensive to develop due to the absence of a large parameter set. However, modeling of the nutrient and metabolite concentrations proved to be troublesome due to the logistic equation model structure and the inability of both models to incorporate a feed. The complexity, computational load, and effort required for model development have to be balanced with the necessary level of model sophistication when choosing which model type to develop for a particular application. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
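
    A minimal batch Monod-type sketch is given below to make the structure of such a model concrete; the parameter values, the reduced species set, and the yield-based lactate term are illustrative assumptions, not the fitted CHO model from the study.

    ```python
    # Minimal Monod-type batch model: viable cells X, glucose S, lactate L.
    # Parameter values are assumed for illustration.
    from scipy.integrate import solve_ivp

    mu_max, K_s, Y_xs, Y_ls, k_d = 0.04, 1.0, 1.0e6, 0.8, 0.005   # assumed values

    def rhs(t, y):
        X, S, L = y
        mu = mu_max * S / (K_s + S)      # Monod specific growth rate (1/h)
        dX = (mu - k_d) * X              # growth minus death
        dS = -mu * X / Y_xs              # glucose consumption
        dL = -Y_ls * dS                  # lactate production
        return [dX, dS, dL]

    sol = solve_ivp(rhs, (0.0, 200.0), [2.0e5, 25.0, 0.0], max_step=1.0)
    print(sol.y[:, -1])                  # final cell density, glucose, lactate
    ```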

  12. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  13. At Home in the Cell.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1999-01-01

    Argues that biologists' understanding of the cell has become richer over the past 30 years. Describes how genetic engineering and sophisticated computer technology have provided an increased knowledge of genes, gene products, components of cells, and the structure and function of proteins. (CCM)

  14. The High-Tech Surge. Focus on Careers.

    ERIC Educational Resources Information Center

    Vo, Chuong-Dai Hong

    1996-01-01

    The computer industry is growing at a phenomenal rate as technology advances and prices fall, stimulating unprecedented demand from business, government, and individuals. Higher levels of education will be the key to securing employment as organizations increasingly rely on sophisticated technology. (Author)

  15. Chandelier: Picturing Potential

    ERIC Educational Resources Information Center

    Tebbs, Trevor J.

    2014-01-01

    The author--artist, scientist, educator, and visual-spatial thinker--describes the genesis of, and provides insight into, an innovative, strength-based, visually dynamic computer-aided communication system called Chandelier©. This system is the powerful combination of a sophisticated, user-friendly software program and an organizational…

  16. General Multimechanism Reversible-Irreversible Time-Dependent Constitutive Deformation Model Being Developed

    NASA Technical Reports Server (NTRS)

    Saleeb, A. F.; Arnold, Steven M.

    2001-01-01

    Since most advanced material systems (for example metallic-, polymer-, and ceramic-based systems) being currently researched and evaluated are for high-temperature airframe and propulsion system applications, the required constitutive models must account for both reversible and irreversible time-dependent deformations. Furthermore, since an integral part of continuum-based computational methodologies (be they microscale- or macroscale-based) is an accurate and computationally efficient constitutive model to describe the deformation behavior of the materials of interest, extensive research efforts have been made over the years on the phenomenological representations of constitutive material behavior in the inelastic analysis of structures. From a more recent and comprehensive perspective, the NASA Glenn Research Center in conjunction with the University of Akron has emphasized concurrently addressing three important and related areas: that is, 1) Mathematical formulation; 2) Algorithmic developments for updating (integrating) the external (e.g., stress) and internal state variables; 3) Parameter estimation for characterizing the model. This concurrent perspective to constitutive modeling has enabled the overcoming of the two major obstacles to fully utilizing these sophisticated time-dependent (hereditary) constitutive models in practical engineering analysis. These obstacles are: 1) Lack of efficient and robust integration algorithms; 2) Difficulties associated with characterizing the large number of required material parameters, particularly when many of these parameters lack obvious or direct physical interpretations.

  17. Optimizing radiotherapy protocols using computer automata to model tumour cell death as a function of oxygen diffusion processes.

    PubMed

    Paul-Gilloteaux, Perrine; Potiron, Vincent; Delpon, Grégory; Supiot, Stéphane; Chiavassa, Sophie; Paris, François; Costes, Sylvain V

    2017-05-23

    The concept of hypofractionation is gaining momentum in radiation oncology centres, enabled by recent advances in radiotherapy apparatus. The gain of efficacy of this innovative treatment must be defined. We present a computer model based on translational murine data for in silico testing and optimization of various radiotherapy protocols with respect to tumour resistance and the microenvironment heterogeneity. This model combines automata approaches with image processing algorithms to simulate the cellular response of tumours exposed to ionizing radiation, modelling the alteration of oxygen permeabilization in blood vessels against repeated doses, and introducing mitotic catastrophe (as opposed to arbitrary delayed cell-death) as a means of modelling radiation-induced cell death. Published data describing cell death in vitro as well as tumour oxygenation in vivo are used to inform parameters. Our model is validated by comparing simulations to in vivo data obtained from the radiation treatment of mice transplanted with human prostate tumours. We then predict the efficacy of untested hypofractionation protocols, hypothesizing that tumour control can be optimized by adjusting daily radiation dosage as a function of the degree of hypoxia in the tumour environment. Further biological refinement of this tool will permit the rapid development of more sophisticated strategies for radiotherapy.
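
    A minimal sketch of the kind of grid-based oxygen diffusion step such an automaton requires is shown below; the geometry, parameter values and vessel handling are illustrative assumptions, and the published model's cell-response and mitotic-catastrophe rules are not reproduced.

    ```python
    # Explicit finite-difference oxygen diffusion on a cell grid with two fixed
    # vessel sites acting as sources (illustrative parameters only).
    import numpy as np

    n, D, dx, dt = 100, 2.0e-9, 20e-6, 0.01    # grid size, m^2/s, m, s (assumed)
    consumption = 1.0e-3                       # uptake per step (arbitrary units)
    p_O2 = np.zeros((n, n))
    vessels = np.zeros((n, n), dtype=bool)
    vessels[20, 30] = vessels[70, 60] = True   # two hypothetical vessel locations

    for step in range(5000):
        p_O2[vessels] = 40.0                   # clamp vessel pO2 (assumed, mmHg)
        lap = (np.roll(p_O2, 1, 0) + np.roll(p_O2, -1, 0)
               + np.roll(p_O2, 1, 1) + np.roll(p_O2, -1, 1) - 4 * p_O2) / dx**2
        p_O2 = np.maximum(p_O2 + dt * D * lap - consumption, 0.0)
    ```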

  18. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  19. Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.

    DOT National Transportation Integrated Search

    1985-06-01

    The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...

  20. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    PubMed Central

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. Aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties of security service recommendations in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  1. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    PubMed

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, the efficient social organization can enable more flexible, secure, and collaborative networking. Aforementioned advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system-based social model. Meanwhile, social networking enhances the complexity and security risks of fog computing services, creating difficulties of security service recommendations in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  2. The politics of participation in watershed modeling.

    PubMed

    Korfmacher, K S

    2001-02-01

    While researchers and decision-makers increasingly recognize the importance of public participation in environmental decision-making, there is less agreement about how to involve the public. One of the most controversial issues is how to involve citizens in producing scientific information. Although this question is relevant to many areas of environmental policy, it has come to the fore in watershed management. Increasingly, the public is becoming involved in the sophisticated computer modeling efforts that have been developed to inform watershed management decisions. These models typically have been treated as technical inputs to the policy process. However, model-building itself involves numerous assumptions, judgments, and decisions that are relevant to the public. This paper examines the politics of public involvement in watershed modeling efforts and proposes five guidelines for good practice for such efforts. Using these guidelines, I analyze four cases in which different approaches to public involvement in the modeling process have been attempted and make recommendations for future efforts to involve communities in watershed modeling. Copyright 2001 Springer-Verlag

  3. Failover in Cellular Automata

    NASA Astrophysics Data System (ADS)

    Kumar, Shailesh; Rao, Shrisha

    This paper studies a phenomenon called failover, and shows that this phenomenon (in particular, stateless failover) can be modeled by Game of Life cellular automata. This is the first time that this sophisticated real-life system behavior has been modeled in abstract terms. A cellular automaton (CA) configuration is constructed that exhibits emergent failover. The configuration is based on standard Game of Life rules. Gliders and glider-guns form the core messaging structure in the configuration. The blinker is represented as the basic computational unit, and it is shown how it can be recreated in case of a failure. Stateless failover using the primary-backup mechanism is demonstrated. The details of the CA components used in the configuration and its working are described, and a simulation of the complete configuration is also presented.
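
    The update rule underlying the configuration is the standard Game of Life step, sketched below together with the blinker used as the basic computational unit; the glider guns and the primary/backup wiring of the paper's configuration are not reproduced here.

    ```python
    # Standard Game of Life step on a toroidal grid, plus the period-2 blinker.
    import numpy as np

    def life_step(grid):
        """One synchronous update of Conway's Game of Life."""
        nbrs = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                   if (dy, dx) != (0, 0))
        return (nbrs == 3) | (grid & (nbrs == 2))

    blinker = np.zeros((8, 8), dtype=bool)
    blinker[4, 3:6] = True                           # the basic computational unit
    print(life_step(life_step(blinker))[4, 3:6])     # period 2: back to the original
    ```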

  4. The Cerebellum: Adaptive Prediction for Movement and Cognition.

    PubMed

    Sokolov, Arseny A; Miall, R Chris; Ivry, Richard B

    2017-05-01

    Over the past 30 years, cumulative evidence has indicated that cerebellar function extends beyond sensorimotor control. This view has emerged from studies of neuroanatomy, neuroimaging, neuropsychology, and brain stimulation, with the results implicating the cerebellum in domains as diverse as attention, language, executive function, and social cognition. Although the literature provides sophisticated models of how the cerebellum helps refine movements, it remains unclear how the core mechanisms of these models can be applied when considering a broader conceptualization of cerebellar function. In light of recent multidisciplinary findings, we examine how two key concepts that have been suggested as general computational principles of cerebellar function (prediction and error-based learning) might be relevant in the operation of cognitive cerebro-cerebellar loops. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Multilocus Association Mapping Using Variable-Length Markov Chains

    PubMed Central

    Browning, Sharon R.

    2006-01-01

    I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests. PMID:16685642

  6. Multilocus association mapping using variable-length Markov chains.

    PubMed

    Browning, Sharon R

    2006-06-01

    I propose a new method for association-based gene mapping that makes powerful use of multilocus data, is computationally efficient, and is straightforward to apply over large genomic regions. The approach is based on the fitting of variable-length Markov chain models, which automatically adapt to the degree of linkage disequilibrium (LD) between markers to create a parsimonious model for the LD structure. Edges of the fitted graph are tested for association with trait status. This approach can be thought of as haplotype testing with sophisticated windowing that accounts for extent of LD to reduce degrees of freedom and number of tests while maximizing information. I present analyses of two published data sets that show that this approach can have better power than single-marker tests or sliding-window haplotypic tests.

  7. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.

  8. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    DOE PAGES

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    2017-08-30

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. We report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.

  9. Fast automated analysis of strong gravitational lenses with convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Hezaveh, Yashar D.; Levasseur, Laurence Perreault; Marshall, Philip J.

    2017-08-01

    Quantifying image distortions caused by strong gravitational lensing—the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures—and estimating the corresponding matter distribution of these structures (the ‘gravitational lens’) has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the ‘singular isothermal ellipsoid’ density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
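
    A schematic of the kind of network involved is sketched below: a small convolutional regressor mapping a lens image to the five SIE parameters (Einstein radius, two ellipticity components, two centroid coordinates). The layer sizes, the choice of PyTorch, and the placeholder loss are illustrative assumptions rather than the published architecture or training setup.

    ```python
    # Schematic CNN regressor for the five SIE lens parameters (illustrative
    # architecture only, not the published network).
    import torch
    import torch.nn as nn

    class LensParamNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4),
            )
            self.head = nn.Sequential(nn.Flatten(), nn.Linear(64 * 4 * 4, 128),
                                      nn.ReLU(), nn.Linear(128, 5))

        def forward(self, x):
            return self.head(self.features(x))

    net = LensParamNet()
    images = torch.randn(8, 1, 96, 96)        # a batch of simulated lens images
    params = net(images)                      # (8, 5) predicted SIE parameters
    loss = nn.functional.mse_loss(params, torch.zeros_like(params))  # placeholder
    loss.backward()
    ```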

  10. Microgravity

    NASA Image and Video Library

    1999-05-26

    Looking for a faster computer? How about an optical computer that processes data streams simultaneously and works with the speed of light? In space, NASA researchers have formed optical thin films. By turning these thin films into very fast optical computer components, scientists could improve computer tasks, such as pattern recognition. Dr. Hossin Abdeldayem, physicist at NASA/Marshall Space Flight Center (MSFC) in Huntsville, AL, is working with lasers as part of an optical system for pattern recognition. These systems can be used for automated fingerprinting, photographic scanning and the development of sophisticated artificial intelligence systems that can learn and evolve. Photo credit: NASA/Marshall Space Flight Center (MSFC)

  11. Biomolecular computing systems: principles, progress and potential.

    PubMed

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  12. Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism

    EPA Science Inventory

    A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...

  13. High-Tech Conservation: Information-Age Tools Have Revolutionized the Work of Ecologists.

    ERIC Educational Resources Information Center

    Chiles, James R.

    1992-01-01

    Describes a new direction for conservation efforts influenced by the advance of the information age and the introduction of many technologically sophisticated information collecting devices. Devices include microscopic computer chips, miniature electronic components, and Earth-observation satellite. (MCO)

  14. Using Visual Basic to Teach Programming for Geographers.

    ERIC Educational Resources Information Center

    Slocum, Terry A.; Yoder, Stephen C.

    1996-01-01

    Outlines reasons why computer programming should be taught to geographers. These include experience using macro (scripting) languages and sophisticated visualization software, and developing a deeper understanding of general hardware and software capabilities. Discusses the distinct advantages and few disadvantages of the programming language…

  15. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  16. Resistance Is Futile

    ERIC Educational Resources Information Center

    O'Hanlon, Charlene

    2009-01-01

    How odd it should seem that even today, with interactive whiteboards, content management systems, wireless broadband, handhelds, and every sort of sophisticated computing device penetrating and improving the classroom experience for students, David Roh, general manager for Follett Digital Resources can still say, "There are hundreds of…

  17. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  18. Radial mixing in turbomachines

    NASA Astrophysics Data System (ADS)

    Segaert, P.; Hirsch, Ch.; Deruyck, J.

    1991-03-01

    A method for computing the effects of radial mixing in a turbomachinery blade row has been developed. The method fits in the framework of a quasi-3D flow computation and hence is applied in a corrective fashion to through flow distributions. The method takes into account both secondary flows and turbulent diffusion as possible sources of mixing. Secondary flow velocities determine the magnitude of the convection terms in the energy redistribution equation while a turbulent diffusion coefficient determines the magnitude of the diffusion terms. Secondary flows are computed by solving a Poisson equation for a secondary streamfunction on a transversal S3-plane, whereby the right-hand side axial vorticity is composed of different contributions, each associated with a particular flow region: inviscid core flow, end-wall boundary layers, profile boundary layers and wakes. The turbulent mixing coefficient is estimated by a semi-empirical correlation. Secondary flow theory is applied to the VUB cascade test case and comparisons are made between the computational results and the extensive experimental data available for this test case. This comparison shows that the secondary flow computations yield reliable predictions of the secondary flow pattern, both qualitatively and quantitatively, taking into account the limitations of the model. However, the computations show that use of a uniform mixing coefficient has to be replaced by a more sophisticated approach.
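
    In schematic form, the secondary-flow step solves, on the transversal S3-plane (with generic coordinate conventions that may differ from the paper's),

        \nabla^{2}\psi = -\,\omega_{s}, \qquad
        v = \frac{\partial \psi}{\partial z}, \quad
        w = -\frac{\partial \psi}{\partial y},

    where \omega_{s} is the streamwise (axial) vorticity assembled from the inviscid-core, end-wall boundary-layer, profile boundary-layer and wake contributions, and (v, w) are the secondary velocities that feed the convection terms of the energy redistribution equation.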

  19. Deep Part Load Flow Analysis in a Francis Model turbine by means of two-phase unsteady flow simulations

    NASA Astrophysics Data System (ADS)

    Conrad, Philipp; Weber, Wilhelm; Jung, Alexander

    2017-04-01

    Hydropower plants are indispensable to stabilize the grid by reacting quickly to changes of the energy demand. However, an extension of the operating range towards high and deep part load conditions without fatigue of the hydraulic components is desirable to increase their flexibility. In this paper a model-sized Francis turbine at low discharge operating conditions (Q/QBEP = 0.27) is analyzed by means of computational fluid dynamics (CFD). Unsteady two-phase simulations for two Thoma-number conditions are conducted. Stochastic pressure oscillations, observed on the test rig at low discharge, require sophisticated numerical models together with small time steps, large grid sizes and long simulation times to cope with these fluctuations. In this paper the BSL-EARSM model (Explicit Algebraic Reynolds Stress) was applied as a compromise between scale resolving and two-equation turbulence models with respect to computational effort and accuracy. Simulation results are compared to pressure measurements showing reasonable agreement in resolving the frequency spectra and amplitude. Inner blade vortices were predicted successfully in shape and size. Surface streamlines in blade-to-blade view are presented, giving insight into the formation of the inner blade vortices. The acquired time dependent pressure fields can be used for quasi-static structural analysis (FEA) for fatigue calculations in the future.

  20. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  1. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries.

    PubMed

    Drawert, Brian; Engblom, Stefan; Hellander, Andreas

    2012-06-22

    Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner- and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction- and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature geometry and mesh handling external software (Comsol Multiphysics) provides for a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
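
    To make the underlying formalism concrete, the toy below samples a Reaction-Diffusion Master Equation for a single species on a 1-D chain of voxels with a direct Gillespie method. URDME itself is a Matlab/C framework on unstructured meshes, so this Python sketch, with assumed rates, only illustrates the kind of process being simulated.

    ```python
    # Toy RDME trajectory: diffusion jumps and degradation of one species on a
    # 1-D chain of voxels, sampled with the direct Gillespie method.
    import numpy as np

    rng = np.random.default_rng(1)
    n_vox, d_rate, deg_rate, t_end = 20, 1.0, 0.05, 50.0   # assumed rates
    x = np.zeros(n_vox, dtype=int)
    x[0] = 200                                  # all molecules start in voxel 0
    idx = np.arange(n_vox)
    t = 0.0

    while t < t_end:
        left = d_rate * x * (idx > 0)           # jump-left propensities
        right = d_rate * x * (idx < n_vox - 1)  # jump-right propensities
        decay = deg_rate * x                    # degradation propensities
        props = np.concatenate([left, right, decay]).astype(float)
        total = props.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)       # time to the next event
        k = rng.choice(props.size, p=props / total)
        i = k % n_vox
        if k < n_vox:                           # jump left
            x[i] -= 1; x[i - 1] += 1
        elif k < 2 * n_vox:                     # jump right
            x[i] -= 1; x[i + 1] += 1
        else:                                   # degradation
            x[i] -= 1

    print(x)
    ```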

  2. Activities of Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: 1. Automated Reasoning for Autonomous Systems: Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. 2. Human-Centered Computing: Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. 3. High Performance Computing and Networking: Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  3. High-Fidelity Roadway Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit

    2010-01-01

    Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated by professional artists manually using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, reducing the tedious manual editing needed for road modeling dramatically. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not the geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals, and as a result, the generated roads can support not only general applications such as games and simulations in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and surrounding environment. Then the proposed method generates in real time 3D roads that have both high visual and geometrical fidelities. This paper discusses in detail the procedures that convert 2D roads specified in shape files into 3D roads, as well as civil engineering road design principles. The proposed method can be used in many applications that have stringent requirements on high precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.
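
    As a hedged sketch of one ingredient of such a pipeline (not the authors' algorithm), the code below drapes a 2D centreline over a placeholder terrain function and clamps the longitudinal grade to a design limit; shapefile reading, cross-section generation and true vertical-curve design are omitted.

    ```python
    # Lift a 2D centreline onto terrain and limit the longitudinal grade
    # (placeholder terrain and a simple clamping rule; illustrative only).
    import numpy as np

    def terrain_height(x, y):
        """Placeholder terrain model; a real pipeline would sample a DEM."""
        return 5.0 * np.sin(0.01 * x) + 3.0 * np.cos(0.015 * y)

    xy = np.column_stack([np.linspace(0, 2000, 201), np.linspace(0, 500, 201)])
    z = terrain_height(xy[:, 0], xy[:, 1])

    dist = np.hypot(*np.diff(xy, axis=0).T)    # spacing between successive vertices
    max_grade = 0.06                           # assumed 6% design limit
    for _ in range(10):                        # forward/backward clamping passes
        for i in range(1, len(z)):
            z[i] = np.clip(z[i], z[i - 1] - max_grade * dist[i - 1],
                                 z[i - 1] + max_grade * dist[i - 1])
        for i in range(len(z) - 2, -1, -1):
            z[i] = np.clip(z[i], z[i + 1] - max_grade * dist[i],
                                 z[i + 1] + max_grade * dist[i])

    centreline_3d = np.column_stack([xy, z])   # 3D road centreline vertices
    ```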

  4. A frontal dopamine system for reflective exploratory behavior.

    PubMed

    Blanco, Nathaniel J; Love, Bradley C; Cooper, Jessica A; McGeary, John E; Knopik, Valerie S; Maddox, W Todd

    2015-09-01

    The COMT gene modulates dopamine levels in prefrontal cortex with Met allele carriers having lower COMT enzyme activity and, therefore, higher dopamine levels compared to Val/Val homozygotes. Concordantly, Val/Val homozygotes tend to perform worse and display increased (interpreted as inefficient) frontal activation in certain cognitive tasks. In a sample of 209 participants, we test the hypothesis that Met carriers will be advantaged in a decision-making task that demands sequencing exploratory and exploitive choices to minimize uncertainty about the reward structure in the environment. Previous work suggests that optimal performance depends on limited cognitive resources supported by prefrontal systems. If so, Met carriers should outperform Val/Val homozygotes, particularly under dual-task conditions that tax limited cognitive resources. In accord with these a priori predictions, Met carriers were more resilient in the face of cognitive load, continuing to explore in a sophisticated manner. We fit computational models that embody sophisticated reflective and simple reflexive strategies to further evaluate participants' exploration behavior. The Ideal Actor model reflectively updates beliefs and plans ahead, taking into account the information gained by each choice and making choices that maximize long-term payoffs. In contrast, the Naïve Reinforcement Learning (RL) model instantiates the reflexive account of choice, in which the values of actions are based only on the rewards experienced so far. Its beliefs are updated reflexively in response to observed changes in rewards. Converging with standard analyses, Met carriers were best characterized by the Ideal Actor model, whereas Val/Val homozygotes were best characterized by the Naive RL model, particularly under dual-task conditions. Copyright © 2015 Elsevier Inc. All rights reserved.
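
    To make the reflexive account concrete, the following sketch is my own illustration of a delta-rule learner, not the Naïve RL model fitted in the paper: each option's value is driven only by the rewards experienced so far, with a hypothetical learning rate, reward structure, and greedy choice rule.

      # Minimal sketch (my illustration, not the authors' fitted model): a naive
      # delta-rule reinforcement-learning update in which each option's value
      # depends only on the rewards experienced so far (the reflexive account).
      import random

      def naive_rl_choices(rewards, n_options=2, alpha=0.3, n_trials=100, seed=0):
          """rewards: function(option, trial) -> payoff; returns list of choices."""
          rng = random.Random(seed)
          values = [0.0] * n_options
          choices = []
          for t in range(n_trials):
              # Greedy choice with random tie-breaking (no planning ahead).
              best = max(values)
              choice = rng.choice([i for i, v in enumerate(values) if v == best])
              payoff = rewards(choice, t)
              # Delta rule: move the chosen value toward the observed payoff.
              values[choice] += alpha * (payoff - values[choice])
              choices.append(choice)
          return choices

      if __name__ == "__main__":
          # Hypothetical reward structure: option 1 pays more on average.
          picks = naive_rl_choices(lambda option, trial: 1.0 if option == 1 else 0.5)
          print("fraction choosing option 1:", sum(picks) / len(picks))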

  5. A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].

    DOT National Transportation Integrated Search

    2013-01-01

    The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...

  6. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  7. Suicide Prevention in a Treatment Setting.

    ERIC Educational Resources Information Center

    Litman, Robert E.

    1995-01-01

    The author anticipates that sophisticated interactive computer programs will be effective in improving screening and case finding of the suicidal and that they will become invaluable in improving training for primary care providers and outpatient mental health workers. Additionally, improved communication networks will help maintain continuity of…

  8. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The dose rate was then integrated to give a number that could be compared with measurements made using thermal luminescent dosimeters (TLDs)…

  9. Modeling the evolution of channel shape: Balancing computational efficiency with hydraulic fidelity

    USGS Publications Warehouse

    Wobus, C.W.; Kean, J.W.; Tucker, G.E.; Anderson, R. Scott

    2008-01-01

    The cross-sectional shape of a natural river channel controls the capacity of the system to carry water off a landscape, to convey sediment derived from hillslopes, and to erode its bed and banks. Numerical models that describe the response of a landscape to changes in climate or tectonics therefore require formulations that can accommodate evolution of channel cross-sectional geometry. However, fully two-dimensional (2-D) flow models are too computationally expensive to implement in large-scale landscape evolution models, while available simple empirical relationships between width and discharge do not adequately capture the dynamics of channel adjustment. We have developed a simplified 2-D numerical model of channel evolution in a cohesive, detachment-limited substrate subject to steady, unidirectional flow. Erosion is assumed to be proportional to boundary shear stress, which is calculated using an approximation of the flow field in which log-velocity profiles are assumed to apply along vectors that are perpendicular to the local channel bed. Model predictions of the velocity structure, peak boundary shear stress, and equilibrium channel shape compare well with predictions of a more sophisticated but more computationally demanding ray-isovel model. For example, the mean velocities computed by the two models are consistent to within ~3%, and the predicted peak shear stress is consistent to within ~7%. Furthermore, the shear stress distributions predicted by our model compare favorably with available laboratory measurements for prescribed channel shapes. A modification to our simplified code in which the flow includes a high-velocity core allows the model to be extended to estimate shear stress distributions in channels with large width-to-depth ratios. Our model is efficient enough to incorporate into large-scale landscape evolution codes and can be used to examine how channels adjust both cross-sectional shape and slope in response to tectonic and climatic forcing. Copyright 2008 by the American Geophysical Union.
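
    The log-profile closure described above can be illustrated with a small sketch. The code below is my own toy calculation, not the authors' model: it back-calculates a boundary shear stress from a depth-averaged velocity using the law of the wall, with a hypothetical roughness length and flow values.

      # Minimal sketch (my illustration): boundary shear stress tau_b = rho * u*^2,
      # with the shear velocity u* inverted from a depth-averaged log-law profile.
      import math

      RHO = 1000.0      # water density, kg/m^3
      KAPPA = 0.41      # von Karman constant

      def shear_velocity(mean_velocity, depth, z0):
          """Invert the depth-averaged log law  U = (u*/kappa) * (ln(h/z0) - 1)."""
          return KAPPA * mean_velocity / (math.log(depth / z0) - 1.0)

      def boundary_shear_stress(mean_velocity, depth, z0=0.001):
          u_star = shear_velocity(mean_velocity, depth, z0)
          return RHO * u_star ** 2   # Pa

      if __name__ == "__main__":
          # Hypothetical flow: 1 m deep, 0.8 m/s depth-averaged velocity.
          print(round(boundary_shear_stress(0.8, 1.0), 2), "Pa")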

  10. Simple deterministic models and applications. Comment on "Coupled disease-behavior dynamics on complex networks: A review" by Z. Wang et al.

    NASA Astrophysics Data System (ADS)

    Yang, Hyun Mo

    2015-12-01

    Currently, discrete modelling is widely accepted thanks to access to computers with huge storage capacity and high-performance processors and to the easy implementation of algorithms, allowing increasingly sophisticated models to be developed and simulated. Wang et al. [7] present a review of dynamics in complex networks, focusing on the interaction between disease dynamics and human behavioral and social dynamics. In an extensive review of human behavior in response to disease dynamics, the authors briefly describe the complex dynamics found in the literature: well-mixed population networks, where spatial structure can be neglected, and other networks that account for heterogeneity in spatially distributed populations. As control mechanisms are implemented, such as social distancing due to 'social contagion', quarantine, non-pharmaceutical interventions and vaccination, adaptive behavior can arise in the human population, which can readily be taken into account in the dynamics formulated for networked populations.
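
    For readers unfamiliar with the simple deterministic models the comment refers to, the sketch below is my own illustration of an SIR compartmental model integrated with forward Euler steps; the transmission and recovery rates are hypothetical.

      # Minimal sketch (my illustration): a simple deterministic SIR model
      # integrated with forward Euler steps; beta and gamma are hypothetical.
      def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=0.1):
          s, i, r = s0, i0, 0.0
          history = []
          steps_per_day = int(round(1 / dt))
          for step in range(int(days / dt)):
              ds = -beta * s * i
              di = beta * s * i - gamma * i
              dr = gamma * i
              s, i, r = s + ds * dt, i + di * dt, r + dr * dt
              if step % steps_per_day == 0:
                  history.append((s, i, r))   # record once per day
          return history

      if __name__ == "__main__":
          trajectory = sir()
          peak_infected = max(i for _, i, _ in trajectory)
          print("peak infected fraction:", round(peak_infected, 3))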

  11. Data Assimilation by delay-coordinate nudging

    NASA Astrophysics Data System (ADS)

    Pazo, Diego; Lopez, Juan Manuel; Carrassi, Alberto

    2016-04-01

    A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time step. Numerical experiments with a low-order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, even when using an un-optimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, the intuitive functioning and the reduced computational cost of standard nudging, making it a potential alternative especially in the field of seasonal-to-decadal predictions with large Earth system models that limit the use of more sophisticated data assimilation procedures.
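
    The basic idea can be sketched with a toy example. The code below is my own construction, not the authors' scheme: it nudges a Lorenz-63 model using both the current observation and a delayed one, and the gains, delay length, and the assumption that only the x component is observed are all hypothetical choices.

      # Minimal sketch (my construction): an Euler-stepped Lorenz-63 model nudged
      # toward the current observation plus a delayed (past) observation.
      def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = state
          return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

      def nudged_step(state, obs_now, obs_past, dt=0.01, k_now=5.0, k_delay=2.0):
          """One Euler step of the model plus relaxation toward present and
          delayed observations; only the x component is assumed observed."""
          dx, dy, dz = lorenz(state)
          x, y, z = state
          forcing = k_now * (obs_now - x) + k_delay * (obs_past - x)
          return (x + (dx + forcing) * dt, y + dy * dt, z + dz * dt)

      if __name__ == "__main__":
          truth, model = (1.0, 1.0, 20.0), (8.0, -3.0, 12.0)
          delay, buffer = 10, []
          for step in range(2000):
              truth = tuple(s + d * 0.01 for s, d in zip(truth, lorenz(truth)))
              buffer.append(truth[0])                     # observed x component
              past = buffer[-delay] if len(buffer) >= delay else buffer[0]
              model = nudged_step(model, obs_now=truth[0], obs_past=past)
          print("final x error:", abs(truth[0] - model[0]))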

  12. Quantifying chemical uncertainties in simulations of the ISM

    NASA Astrophysics Data System (ADS)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions for which improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.
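
    One simple way to quantify such uncertainties is Monte Carlo sampling of the rate coefficients. The sketch below is my own illustration, not the machinery used in the talk: it samples two hypothetical coefficients from log-normal distributions and propagates them through a toy steady-state abundance ratio.

      # Minimal sketch (my illustration): Monte Carlo propagation of rate-coefficient
      # uncertainties through a toy steady-state abundance, n_X = k_form / k_destroy.
      import math
      import random

      def sample_lognormal(rng, nominal, uncertainty_factor):
          """Sample a rate coefficient with a multiplicative 1-sigma uncertainty."""
          sigma = math.log(uncertainty_factor)
          return nominal * math.exp(rng.gauss(0.0, sigma))

      def abundance_spread(n_samples=10000, seed=1):
          rng = random.Random(seed)
          results = []
          for _ in range(n_samples):
              k_form = sample_lognormal(rng, nominal=1e-9, uncertainty_factor=1.5)
              k_destroy = sample_lognormal(rng, nominal=1e-10, uncertainty_factor=2.0)
              results.append(k_form / k_destroy)   # toy steady-state abundance ratio
          results.sort()
          return (results[n_samples // 2],
                  results[int(0.16 * n_samples)],
                  results[int(0.84 * n_samples)])

      if __name__ == "__main__":
          median, lo, hi = abundance_spread()
          print("median %.1f, 16-84%% range %.1f-%.1f" % (median, lo, hi))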

  13. Workshop on Models for Plasma Spectroscopy

    NASA Astrophysics Data System (ADS)

    1993-09-01

    A meeting was held at St. John's College, Oxford, from Monday 27th to Thursday 30th of September 1993 to bring together a group of physicists working on computational modelling of plasma spectroscopy. The group came from the UK, France, Israel and the USA. The meeting was organized by myself, Dr. Steven Rose of RAL and Dr. R.W. Lee of LLNL. It was funded by the U.S. European Office of Aerospace Research and Development and by LLNL. The meeting grew out of a wish by a group of core participants to make available to practicing plasma physicists (particularly those engaged in the design and analysis of experiments) sophisticated numerical models of plasma physics. Additional plasma physicists attended the meeting in Oxford by invitation. These were experimentalists and users of plasma physics simulation codes whose input to the meeting was to advise the core group as to what was really needed.

  14. A sustainable city implantation for Vienna, Austria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levine, R.S.; Radmard, T.; Dumreicher, H.

    1995-11-01

    This paper presents a prototype of the Sustainable City Implantation of the future, which the city of Vienna, Austria is interested in considering as a solution to a long-standing urban problem. Developed conceptually through numerous architectural design studio projects and field studies, the Sustainable City Implantation is inspired by the historic medieval Italian hilltown. This city-as-a-hill prototype, rendered through sophisticated and flexible computer models, offers the promise of overcoming many of the puzzles and conundrums plaguing urban designers and social ecologists in their efforts to find the proper form and scale for charting the pathway to sustainability in nature and the built environment. The new holistic, people-centered urban model has been developed to synthesize new urban concepts with technological means to develop humane, sustainable cities. This preliminary study develops and verifies this model together with its process for a significant urban site in the city of Vienna.

  15. An investigation into multi-dimensional prediction models to estimate the pose error of a quadcopter in a CSP plant setting

    NASA Astrophysics Data System (ADS)

    Lock, Jacobus C.; Smit, Willie J.; Treurnicht, Johann

    2016-05-01

    The Solar Thermal Energy Research Group (STERG) is investigating ways to make heliostats cheaper to reduce the total cost of a concentrating solar power (CSP) plant. One avenue of research is to use unmanned aerial vehicles (UAVs) to automate and assist with the heliostat calibration process. To do this, the pose estimation error of each UAV must be determined and integrated into a calibration procedure. A computer vision (CV) system is used to measure the pose of a quadcopter UAV. However, this CV system contains considerable measurement errors. Since this is a high-dimensional problem, a sophisticated prediction model must be used to estimate the measurement error of the CV system for any given pose measurement vector. This paper attempts to train and validate such a model with the aim of using it to determine the pose error of a quadcopter in a CSP plant setting.

  16. A transfer learning approach for classification of clinical significant prostate cancers from mpMRI scans

    NASA Astrophysics Data System (ADS)

    Chen, Quan; Xu, Xiang; Hu, Shiliang; Li, Xiao; Zou, Qing; Li, Yunpeng

    2017-03-01

    Deep learning has shown great potential in computer-aided diagnosis. However, in many applications a large dataset is not available, which makes the training of a sophisticated deep learning neural network (DNN) difficult. In this study, we demonstrated that with transfer learning we can quickly retrain state-of-the-art DNN models with the limited data provided by the prostateX challenge. The training data consist of 330 lesions, of which only 78 were clinically significant. Efforts were made to balance the data during training. We used ImageNet pre-trained InceptionV3 and VGG-16 models and obtained AUCs of 0.81 and 0.83, respectively, on the prostateX test data, good for a 4th place finish. We noticed that models trained for different prostate zones have different sensitivities. Applying scaling factors before merging the results improves the AUC of the final result.
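
    The transfer-learning recipe itself is straightforward to sketch. The code below is my own illustration, not the challenge entry: it loads an ImageNet-pretrained VGG-16 from torchvision, freezes the convolutional features, and re-heads the classifier for a binary output; data loading, class balancing, and the training loop are omitted.

      # Minimal sketch (my illustration): transfer learning with an ImageNet-
      # pretrained VGG-16, freezing the convolutional features and replacing the
      # final classification layer for a binary (significant vs. not) output.
      import torch.nn as nn
      from torchvision import models

      def build_transfer_model(num_classes=2):
          # Older torchvision versions use models.vgg16(pretrained=True) instead.
          model = models.vgg16(weights="IMAGENET1K_V1")
          for param in model.features.parameters():
              param.requires_grad = False              # freeze feature extractor
          # Replace the final fully connected layer (4096 -> 1000) with a new head.
          model.classifier[6] = nn.Linear(4096, num_classes)
          return model

      if __name__ == "__main__":
          net = build_transfer_model()
          trainable = sum(p.numel() for p in net.parameters() if p.requires_grad)
          print("trainable parameters:", trainable)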

  17. Kernel methods and flexible inference for complex stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Capobianco, Enrico

    2008-07-01

    Approximation theory suggests that series expansions and projections represent standard tools for random process applications from both numerical and statistical standpoints. Such instruments emphasize the role of both sparsity and smoothness for compression purposes, the decorrelation power achieved in the expansion coefficients space compared to the signal space, and the reproducing kernel property when certain special conditions are met. We consider these three aspects central to the discussion in this paper, and attempt to analyze the characteristics of some known approximation instruments employed in a complex application domain such as financial market time series. Volatility models are often built ad hoc, parametrically and through very sophisticated methodologies. But they can hardly deal with stochastic processes exhibiting non-Gaussianity, covariance non-stationarity or complex dependence without paying a big price in terms of either model mis-specification or computational efficiency. It is thus a good idea to look at other, more flexible inference tools; hence the strategy of combining greedy approximation and space dimensionality reduction techniques, which are less dependent on distributional assumptions and better targeted at achieving computationally efficient performance. Advantages and limitations of their use will be evaluated by looking at algorithmic and model-building strategies, and by reporting statistical diagnostics.

  18. VPython: Writing Real-time 3D Physics Programs

    NASA Astrophysics Data System (ADS)

    Chabay, Ruth

    2001-06-01

    VPython (http://cil.andrew.cmu.edu/projects/visual) combines the Python programming language with an innovative 3D graphics module called Visual, developed by David Scherer. Designed to make 3D physics simulations accessible to novice programmers, VPython allows the programmer to write a purely computational program without any graphics code, and produces an interactive realtime 3D graphical display. In a program 3D objects are created and their positions modified by computational algorithms. Running in a separate thread, the Visual module monitors the positions of these objects and renders them many times per second. Using the mouse, one can zoom and rotate to navigate through the scene. After one hour of instruction, students in an introductory physics course at Carnegie Mellon University, including those who have never programmed before, write programs in VPython to model the behavior of physical systems and to visualize fields in 3D. The Numeric array processing module allows the construction of more sophisticated simulations and models as well. VPython is free and open source. The Visual module is based on OpenGL, and runs on Windows, Linux, and Macintosh.
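
    A minimal example in the spirit of the abstract, written against the current vpython package rather than the original Visual module described here: a ball is created and its position is updated by a purely computational loop, while the graphics are rendered automatically.

      # Minimal sketch using the modern `vpython` package: the program only
      # updates object positions; the 3D rendering happens automatically.
      from vpython import sphere, vector, color, rate

      ball = sphere(pos=vector(-5, 0, 0), radius=0.5, color=color.red, make_trail=True)
      velocity = vector(1, 0, 0)   # m/s
      dt = 0.01

      while ball.pos.x < 5:
          rate(100)                             # limit the loop to ~100 iterations/s
          ball.pos = ball.pos + velocity * dt   # purely computational physics update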

  19. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  20. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  1. REE radiation fault model: a tool for organizing and communicating radiation test data and constructing COTS-based spaceborne computing systems

    NASA Technical Reports Server (NTRS)

    Ferraro, R.; Some, R.

    2002-01-01

    The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation-hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS-based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE) induced faults so that mitigation strategies could be designed to recover system-level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU-induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set, with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single-board computer configuration in several space environments.
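
    The flavor of such a prediction can be conveyed with a toy calculation. The sketch below is my own illustration of the general approach, not the REE tool suite: it combines a device upset cross-section measured in ground testing with an assumed on-orbit particle flux to estimate an upset rate and the Poisson probability of at least one upset; all numerical values are hypothetical.

      # Minimal sketch (my illustration): expected upsets from a ground-measured
      # cross-section and an assumed on-orbit flux, plus the Poisson probability
      # of at least one upset during a mission phase. Values are hypothetical.
      import math

      def expected_upsets(cross_section_cm2, flux_per_cm2_day, days):
          """Expected number of upsets = sigma * flux * duration."""
          return cross_section_cm2 * flux_per_cm2_day * days

      def prob_at_least_one(expected):
          """Poisson probability of one or more upsets."""
          return 1.0 - math.exp(-expected)

      if __name__ == "__main__":
          mu = expected_upsets(cross_section_cm2=1e-7, flux_per_cm2_day=1e5, days=30)
          print("expected upsets: %.3f, P(>=1 upset): %.3f" % (mu, prob_at_least_one(mu)))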

  2. A model of the temporal dynamics of multisensory enhancement

    PubMed Central

    Rowland, Benjamin A.; Stein, Barry E.

    2014-01-01

    The senses transduce different forms of environmental energy, and the brain synthesizes information across them to enhance responses to salient biological events. We hypothesize that the potency of multisensory integration is attributable to the convergence of independent and temporally aligned signals derived from cross-modal stimulus configurations onto multisensory neurons. The temporal profile of multisensory integration in neurons of the deep superior colliculus (SC) is consistent with this hypothesis. The responses of these neurons to visual, auditory, and combinations of visual–auditory stimuli reveal that multisensory integration takes place in real-time; that is, the input signals are integrated as soon as they arrive at the target neuron. Interactions between cross-modal signals may appear to reflect linear or nonlinear computations on a moment-by-moment basis, the aggregate of which determines the net product of multisensory integration. Modeling observations presented here suggest that the early nonlinear components of the temporal profile of multisensory integration can be explained with a simple spiking neuron model, and do not require more sophisticated assumptions about the underlying biology. A transition from nonlinear “super-additive” computation to linear, additive computation can be accomplished via scaled inhibition. The findings provide a set of design constraints for artificial implementations seeking to exploit the basic principles and potency of biological multisensory integration in contexts of sensory substitution or augmentation. PMID:24374382
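
    The claim that super-additivity can emerge from simple spiking dynamics can be illustrated with a toy model. The sketch below is my own, not the paper's fitted model: a leaky integrator is driven by weak visual and auditory inputs that are each subthreshold alone but spike-producing when combined; all parameter values are made up.

      # Minimal sketch (my toy model): a leaky integrator driven by unisensory and
      # combined inputs. Each input alone is subthreshold, so the combined response
      # exceeds the sum of the unisensory responses (a super-additive interaction).
      def spike_count(currents, tau=20.0, threshold=1.0, dt=1.0):
          v, spikes = 0.0, 0
          for i in currents:
              v += dt * (-v / tau + i)          # leaky integration of the input
              if v >= threshold:
                  spikes += 1
                  v = 0.0                        # reset after a spike
          return spikes

      if __name__ == "__main__":
          n = 200
          visual = [0.04] * n                    # weak unisensory drive (made up)
          auditory = [0.04] * n
          combined = [a + b for a, b in zip(visual, auditory)]
          print("visual:", spike_count(visual),
                "auditory:", spike_count(auditory),
                "combined:", spike_count(combined))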

  3. Optimization-Based Inverse Identification of the Parameters of a Concrete Cap Material Model

    NASA Astrophysics Data System (ADS)

    Král, Petr; Hokeš, Filip; Hušek, Martin; Kala, Jiří; Hradil, Petr

    2017-10-01

    Issues concerning the advanced numerical analysis of concrete building structures in sophisticated computing systems currently require the involvement of nonlinear mechanics tools. Efforts to design safer, more durable and, above all, more economically efficient concrete structures are supported by the use of advanced nonlinear concrete material models and the geometrically nonlinear approach. The application of nonlinear mechanics tools undoubtedly presents another step towards approximating the real behaviour of concrete building structures within the framework of computer numerical simulations. However, the success of this application depends on a thorough understanding of the behaviour of the concrete material models used and of the meaning of their parameters. The effective application of nonlinear concrete material models within computer simulations often becomes problematic because these models frequently contain parameters (material constants) whose values are difficult to obtain, yet obtaining correct parameter values is essential to ensure that the material model functions properly. Today, one way to solve this problem is to use optimization algorithms for optimization-based inverse material parameter identification. Parameter identification goes hand in hand with experimental investigation: it seeks parameter values of the material model such that the data obtained from the computer simulation best approximate the experimental data. This paper is focused on the optimization-based inverse identification of the parameters of a concrete cap material model known as the Continuous Surface Cap Model. Within this paper, the material parameters of the model are identified on the basis of the interaction between nonlinear computer simulations, gradient-based and nature-inspired optimization algorithms, and experimental data, the latter of which take the form of a load-extension curve obtained from the evaluation of uniaxial tensile test results. The aim of this research was to obtain material model parameters corresponding to quasi-static tensile loading, which may be further used in research involving dynamic and high-speed tensile loading. Based on the obtained results, it can be concluded that the set goal has been reached.
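
    The inverse identification loop can be sketched in a few lines. The code below is my own illustration, not the Continuous Surface Cap Model identification itself: it fits two parameters of a hypothetical softening curve to synthetic "experimental" load-extension data by least squares, standing in for the interaction between simulation, optimizer, and test data described above.

      # Minimal sketch (my illustration): optimization-based inverse identification
      # of two parameters of a toy softening model, sigma(x) = E * x * exp(-x/x0),
      # by least-squares fitting of a simulated curve to synthetic test data.
      import numpy as np
      from scipy.optimize import least_squares

      def forward_model(params, extension):
          stiffness, x_soft = params
          return stiffness * extension * np.exp(-extension / x_soft)

      def residuals(params, extension, measured):
          return forward_model(params, extension) - measured

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          extension = np.linspace(0.0, 1.0, 50)
          true_params = (120.0, 0.2)                        # hypothetical "truth"
          measured = forward_model(true_params, extension) + rng.normal(0, 0.5, extension.size)
          fit = least_squares(residuals, x0=[50.0, 0.5], args=(extension, measured))
          print("identified parameters:", fit.x)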

  4. A HyperCard Program for Business German.

    ERIC Educational Resources Information Center

    Paulsell, Patricia R.

    Although the use of computer-assisted language instruction software has been mainly limited to grammatical/syntactical drills, the increasing number of language professionals with programming skills is leading to the development of more sophisticated language education programs. This report describes the generation of such a program using the…

  5. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programming software, to create their own inventions. One…

  6. Education in the Information Age.

    ERIC Educational Resources Information Center

    Hay, Lee

    1983-01-01

    This essay considers how education might be revolutionized in a projected future of cheap and sophisticated technology. Predictions include a redefinition of literacy and basic skills and a restructuring of educational delivery employing computers to dispense information in order to free teachers to work directly with students on cognitive development.…

  7. Research and Development in Natural Language Understanding as Part of the Strategic Computing Program.

    DTIC Science & Technology

    1987-04-01

    BBN is developing a series of increasingly sophisticated natural language understanding systems which will serve as an integrated interface…

  8. 12 Math Rules That Expire in the Middle Grades

    ERIC Educational Resources Information Center

    Karp, Karen S.; Bush, Sarah B.; Dougherty, Barbara J.

    2015-01-01

    Many rules taught in mathematics classrooms "expire" when students develop knowledge that is more sophisticated, such as using new number systems. For example, in elementary grades, students are sometimes taught that "addition makes bigger" or "subtraction makes smaller" when learning to compute with whole numbers,…

  9. Symbionic Technology and Education. Report 83-02.

    ERIC Educational Resources Information Center

    Cartwright, Glenn F.

    Research findings indicate that major breakthroughs in education will have to occur through direct cortical intervention, using either chemical or electronic means. It will eventually be possible to build sophisticated intelligence amplifiers that will be internal extensions of our brains, significantly more powerful than present day computers,…

  10. Assessing Higher Order Thinking in Video Games

    ERIC Educational Resources Information Center

    Rice, John

    2007-01-01

    Computer video games have become highly interesting to educators and researchers since their sophistication has improved considerably over the last decade. Studies indicate simple video games touting educational benefits are common in classrooms. However, a need for identifying truly useful games for educational purposes exists. This article…

  11. STAF: A Powerful and Sophisticated CAI System.

    ERIC Educational Resources Information Center

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  12. Interactive Video-Based Industrial Training in Basic Electronics.

    ERIC Educational Resources Information Center

    Mirkin, Barry

    The Wisconsin Foundation for Vocational, Technical, and Adult Education is currently involved in the development, implementation, and distribution of a sophisticated interactive computer and video learning system. Designed to offer trainees an open entry and open exit opportunity to pace themselves through a comprehensive competency-based,…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Wilcox, S.

    This report was part of a multiyear collaboration with the University of Wisconsin and the National Oceanic and Atmospheric Administration (NOAA) to produce high-quality, satellite-based, solar resource datasets for the United States. High-quality solar resource assessment accelerates technology deployment by making a positive impact on decision making and reducing uncertainty in investment decisions. Satellite-based solar resource datasets are used as a primary source in solar resource assessment. This is mainly because satellites provide larger areal coverage and longer periods of record than ground-based measurements. With the advent of newer satellites with increased information content and faster computers that can process increasingly higher data volumes, methods that were considered too computationally intensive are now feasible. One class of sophisticated methods for retrieving solar resource information from satellites is a two-step, physics-based method that computes cloud properties and uses the information in a radiative transfer model to compute solar radiation. This method has the advantage of adding additional information as satellites with newer channels come on board. This report evaluates the two-step method developed at NOAA and adapted for solar resource assessment for renewable energy with the goal of identifying areas that can be improved in the future.

  14. Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris

    NASA Technical Reports Server (NTRS)

    Hill, Nicole M.

    2009-01-01

    There are over 13,000 catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p12]. More than 6000 of these objects are fragments from explosions and collisions. As the count of Earth-orbiting objects increases, debris-generating collisions in the future become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate the quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics of debris creation as a result of a collision. The measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing aids in understanding the sizes and shapes of debris produced by potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.

  15. Error Modelling for Multi-Sensor Measurements in Infrastructure-Free Indoor Navigation

    PubMed Central

    Ruotsalainen, Laura; Kirkko-Jaakkola, Martti; Rantanen, Jesperi; Mäkelä, Maija

    2018-01-01

    The long-term objective of our research is to develop a method for infrastructure-free simultaneous localization and mapping (SLAM) and context recognition for tactical situational awareness. Localization will be realized by propagating motion measurements obtained using a monocular camera, a foot-mounted Inertial Measurement Unit (IMU), sonar, and a barometer. Due to the size and weight requirements set by tactical applications, Micro-Electro-Mechanical (MEMS) sensors will be used. However, MEMS sensors suffer from biases and drift errors that may substantially decrease the position accuracy. Therefore, sophisticated error modelling and implementation of integration algorithms are key for providing a viable result. Algorithms used for multi-sensor fusion have traditionally been different versions of Kalman filters. However, Kalman filters are based on the assumptions that the state propagation and measurement models are linear with additive Gaussian noise. Neither of the assumptions is correct for tactical applications, especially for dismounted soldiers, or rescue personnel. Therefore, error modelling and implementation of advanced fusion algorithms are essential for providing a viable result. Our approach is to use particle filtering (PF), which is a sophisticated option for integrating measurements emerging from pedestrian motion having non-Gaussian error characteristics. This paper discusses the statistical modelling of the measurement errors from inertial sensors and vision based heading and translation measurements to include the correct error probability density functions (pdf) in the particle filter implementation. Then, model fitting is used to verify the pdfs of the measurement errors. Based on the deduced error models of the measurements, particle filtering method is developed to fuse all this information, where the weights of each particle are computed based on the specific models derived. The performance of the developed method is tested via two experiments, one at a university’s premises and another in realistic tactical conditions. The results show significant improvement on the horizontal localization when the measurement errors are carefully modelled and their inclusion into the particle filtering implementation correctly realized. PMID:29443918
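
    The weighting step that carries the non-Gaussian error model can be sketched simply. The code below is my own one-dimensional illustration, not the paper's implementation: particle weights are computed from a heavy-tailed (Laplace) measurement-error pdf instead of a Gaussian, and the motion model, measurement model, and noise scales are hypothetical.

      # Minimal sketch (my illustration): a 1D particle filter whose measurement
      # weights come from a heavy-tailed (Laplace) error pdf instead of a Gaussian.
      import math
      import random

      def laplace_pdf(error, scale):
          return math.exp(-abs(error) / scale) / (2.0 * scale)

      def particle_filter_step(particles, measurement, rng, step_std=0.5, meas_scale=0.3):
          # 1) Propagate each particle with a trivial motion model plus noise.
          particles = [p + rng.gauss(0.0, step_std) for p in particles]
          # 2) Weight each particle by the non-Gaussian measurement-error pdf.
          weights = [laplace_pdf(measurement - p, meas_scale) for p in particles]
          total = sum(weights) or 1.0
          weights = [w / total for w in weights]
          # 3) Multinomial resampling proportional to the weights.
          return rng.choices(particles, weights=weights, k=len(particles))

      if __name__ == "__main__":
          rng = random.Random(42)
          truth, particles = 0.0, [rng.uniform(-5, 5) for _ in range(500)]
          for _ in range(50):
              truth += rng.gauss(0.0, 0.5)          # simulated pedestrian drift
              z = truth + rng.gauss(0.0, 0.3)       # noisy position measurement
              particles = particle_filter_step(particles, z, rng)
          estimate = sum(particles) / len(particles)
          print("truth %.2f, estimate %.2f" % (truth, estimate))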

  16. A non-local mixing-length theory able to compute core overshooting

    NASA Astrophysics Data System (ADS)

    Gabriel, M.; Belkacem, K.

    2018-04-01

    Turbulent convection is certainly one of the most important and thorny issues in stellar physics. Our deficient knowledge of this crucial physical process introduces a fairly large uncertainty concerning the internal structure and evolution of stars. A striking example is overshoot at the edge of convective cores. Indeed, nearly all stellar evolutionary codes treat the overshooting zones in a very approximative way that considers both their extent and the profile of the temperature gradient as free parameters. There are only a few sophisticated theories of stellar convection, such as Reynolds stress approaches, but they also require the adjustment of a non-negligible number of free parameters. We present here a theory, based on the plume theory as well as on the mean-field equations, but without relying on the usual Taylor's closure hypothesis. It leads us to a set of eight differential equations plus a few algebraic ones. Our theory is essentially a non-mixing length theory. It enables us to compute the temperature gradient in a shrinking convective core and its overshooting zone. The case of an expanding convective core is also discussed, though more briefly. Numerical simulations have improved rapidly in recent years, enabling us to foresee that they will probably soon provide a model of convection adapted to the computation of 1D stellar models.

  17. Physical and numerical sources of computational inefficiency in integration of chemical kinetic rate equations: Etiology, treatment and prognosis

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.; Radhakrishnan, K.

    1986-01-01

    The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
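
    A concrete example of the stiffness issue, using SciPy's implicit BDF integrator as a stand-in for solvers such as LSODE or EPISODE mentioned above (my own example, not the report's code): the classic Robertson kinetics problem is integrated efficiently with BDF, while an explicit Runge-Kutta method needs far more steps even on a much shorter interval.

      # Minimal sketch (my example): the stiff Robertson kinetics problem solved
      # with an implicit, adaptive-step BDF method versus an explicit RK45 attempt.
      from scipy.integrate import solve_ivp

      def robertson(t, y):
          y1, y2, y3 = y
          return [-0.04 * y1 + 1e4 * y2 * y3,
                  0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
                  3e7 * y2 ** 2]

      if __name__ == "__main__":
          y0 = [1.0, 0.0, 0.0]
          stiff = solve_ivp(robertson, (0.0, 1e5), y0, method="BDF",
                            rtol=1e-6, atol=1e-10)
          print("BDF steps:", stiff.t.size, "final state:", stiff.y[:, -1])
          # An explicit solver needs many tiny steps to stay stable, even over
          # a much shorter interval.
          explicit = solve_ivp(robertson, (0.0, 1.0), y0, method="RK45")
          print("RK45 steps over a much shorter interval:", explicit.t.size)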

  18. Theoretical Characterization of Visual Signatures

    NASA Astrophysics Data System (ADS)

    Kashinski, D. O.; Chase, G. M.; di Nallo, O. E.; Scales, A. N.; Vanderley, D. L.; Byrd, E. F. C.

    2015-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled vibrational frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A full statistical analysis and reliability assessment of computational results is currently underway. A comparison of theoretical results to experimental values found in the literature is used to assess any effects of functional choice and basis set on calculation accuracy. The status of this work will be presented at the conference. Work supported by the ARL, DoD HPCMP, and USMA.

  19. Using Multi-modal Sensing for Human Activity Modeling in the Real World

    NASA Astrophysics Data System (ADS)

    Harrison, Beverly L.; Consolvo, Sunny; Choudhury, Tanzeem

    Traditionally, smart environments have been understood to represent those (often physical) spaces where computation is embedded into the users' surrounding infrastructure, buildings, homes, and workplaces. Users of this "smartness" move in and out of these spaces. Ambient intelligence assumes that users are automatically and seamlessly provided with context-aware, adaptive information, applications and even sensing, though this remains a significant challenge even when limited to these specialized, instrumented locales. Since not all environments are "smart", the experience is not a pervasive one; rather, users move between these intelligent islands of computationally enhanced space while we still aspire to achieve a more ideal anytime, anywhere experience. Two key technological trends are helping to bridge the gap between these smart environments and make the associated experience more persistent and pervasive. Smaller and more computationally sophisticated mobile devices allow sensing, communication, and services to be experienced more directly and continuously by the user. Improved infrastructure and the availability of uninterrupted data streams, for instance location-based data, enable new services and applications that persist across environments.

  20. Numerical simulation of the interaction of biological cells with an ice front during freezing

    NASA Astrophysics Data System (ADS)

    Carin, M.; Jaeger, M.

    2001-12-01

    The goal of this study is a better understanding of the interaction between cells and a solidification front during a cryopreservation process. This freezing technique is commonly used to conserve biological material for long periods at low temperatures. However, the biophysical mechanisms of cell injury during freezing are difficult to understand because a cell is a very sophisticated microstructure interacting with its environment. We have developed a finite element model to simulate the response of cells to an advancing solidification front. A special front-tracking technique is used to compute the motion of the cell membrane and the ice front during freezing. The model solves the conductive heat transfer equation and the diffusion equation of a solute on a domain containing three phases: one or more cells, the extra-cellular solution and the growing ice. The solid phase growing from a binary salt solution rejects the solute into the liquid phase and increases the solute gradient around the cell. This induces the shrinkage of the cell. The model is used to simulate the engulfment of one cell, modelling a red blood cell, by an advancing solidification front that is initially planar or non-planar. We compare the incorporation of a cell with that of a solid particle.

  1. RF Wave Simulation Using the MFEM Open Source FEM Package

    NASA Astrophysics Data System (ADS)

    Stillerman, J.; Shiraiwa, S.; Bonoli, P. T.; Wright, J. C.; Green, D. L.; Kolev, T.

    2016-10-01

    A new plasma wave simulation environment based on the finite element method is presented. MFEM, a scalable open-source FEM library, is used as the basis for this capability. MFEM allows for assembling an FEM matrix of arbitrarily high order in a parallel computing environment. A 3D frequency domain RF physics layer was implemented using a python wrapper for MFEM and a cold collisional plasma model was ported. This physics layer allows for defining the plasma RF wave simulation model without user knowledge of the FEM weak-form formulation. A graphical user interface is built on πScope, a python-based scientific workbench, such that a user can build a model definition file interactively. Benchmark cases have been ported to this new environment, with results being consistent with those obtained using COMSOL multiphysics, GENRAY, and TORIC/TORLH spectral solvers. This work is a first step in bringing to bear the sophisticated computational tool suite that MFEM provides (e.g., adaptive mesh refinement, solver suite, element types) to the linear plasma-wave interaction problem, and within more complicated integrated workflows, such as coupling with core spectral solver, or incorporating additional physics such as an RF sheath potential model or kinetic effects. USDoE Awards DE-FC02-99ER54512, DE-FC02-01ER54648.

  2. Self-tuning bistable parametric feedback oscillator: Near-optimal amplitude maximization without model information

    NASA Astrophysics Data System (ADS)

    Braun, David J.; Sutas, Andrius; Vijayakumar, Sethu

    2017-01-01

    Theory predicts that parametrically excited oscillators, tuned to operate under resonant conditions, are capable of large-amplitude oscillation useful in diverse applications, such as signal amplification, communication, and analog computation. However, due to amplitude saturation caused by nonlinearity, lack of robustness to model uncertainty, and limited sensitivity to parameter modulation, these oscillators require fine-tuning and strong modulation to generate robust large-amplitude oscillation. Here we present a principle of self-tuning parametric feedback excitation that alleviates the above-mentioned limitations. This is achieved using a minimalistic control implementation that performs (i) self-tuning (slow parameter adaptation) and (ii) feedback pumping (fast parameter modulation), without sophisticated signal processing of past observations. The proposed approach provides near-optimal amplitude maximization without requiring model-based control computation, previously perceived as inevitable when implementing optimal control principles in practical applications. Experimental implementation of the theory shows that the oscillator tunes itself near the onset of dynamic bifurcation to achieve extreme sensitivity to small resonant parametric perturbations. As a result, it achieves large-amplitude oscillations by capitalizing on the effect of nonlinearity, despite substantial model uncertainties and strong unforeseen external perturbations. We envision the present finding to provide an effective and robust approach to parametric excitation when it comes to real-world applications.

  3. Simulation of Combustion Systems with Realistic g-jitter

    NASA Technical Reports Server (NTRS)

    Mell, William E.; McGrattan, Kevin B.; Baum, Howard R.

    2003-01-01

    In this project a transient, fully three-dimensional computer simulation code was developed to simulate the effects of realistic g-jitter on a number of combustion systems. The simulation code is capable of simulating flame spread on a solid and nonpremixed or premixed gaseous combustion in nonturbulent flow with simple combustion models. Simple combustion models were used to preserve computational efficiency, since this is meant to be an engineering code. The use of sophisticated turbulence models was not pursued (a simple Smagorinsky-type model can be implemented if deemed appropriate), because if flow velocities are large enough for turbulence to develop in a reduced-gravity combustion scenario, it is unlikely that g-jitter disturbances (in NASA's reduced-gravity facilities) will play an important role in the flame dynamics. Acceleration disturbances of realistic orientation, magnitude, and time dependence can easily be included in the simulation. The simulation algorithm was based on techniques used in an existing large eddy simulation code which has successfully simulated fire dynamics in complex domains. A series of simulations with measured and predicted acceleration disturbances on the International Space Station (ISS) is presented. The results of this series of simulations suggested that a passive isolation system and appropriate scheduling of crew activity would provide a sufficiently "quiet" acceleration environment for spherical diffusion flames.

  4. Entropy generation across Earth's collisionless bow shock.

    PubMed

    Parks, G K; Lee, E; McCarthy, M; Goldstein, M; Fu, S Y; Cao, J B; Canu, P; Lin, N; Wilber, M; Dandouras, I; Réme, H; Fazakerley, A

    2012-02-10

    Earth's bow shock is a collisionless shock wave but entropy has never been directly measured across it. The plasma experiments on Cluster and Double Star measure 3D plasma distributions upstream and downstream of the bow shock allowing calculation of Boltzmann's entropy function H and his famous H theorem, dH/dt≤0. The collisionless Boltzmann (Vlasov) equation predicts that the total entropy does not change if the distribution function across the shock becomes nonthermal, but it allows changes in the entropy density. Here, we present the first direct measurements of entropy density changes across Earth's bow shock and show that the results generally support the model of the Vlasov analysis. These observations are a starting point for a more sophisticated analysis that includes 3D computer modeling of collisionless shocks with input from observed particles, waves, and turbulences.
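
    The entropy function itself is straightforward to evaluate numerically. The sketch below is my own illustration, not the Cluster analysis pipeline: it computes Boltzmann's H = ∫ f ln f d³v for an isotropic Maxwellian on a velocity grid, showing that a hotter distribution of the same density has a lower H (higher entropy density); all plasma parameters are in arbitrary, hypothetical units.

      # Minimal sketch (my illustration): Boltzmann's H function for an isotropic
      # Maxwellian, evaluated by a simple Riemann sum over a velocity grid.
      import numpy as np

      def maxwellian(v, density, thermal_speed):
          norm = density / (np.pi ** 1.5 * thermal_speed ** 3)
          return norm * np.exp(-(v ** 2) / thermal_speed ** 2)

      def h_function(density, thermal_speed, vmax=6.0, n=400):
          v = np.linspace(1e-6, vmax * thermal_speed, n)
          f = maxwellian(v, density, thermal_speed)
          integrand = f * np.log(f) * 4.0 * np.pi * v ** 2   # isotropic 3D integral
          return float(np.sum(integrand) * (v[1] - v[0]))

      if __name__ == "__main__":
          upstream = h_function(density=1.0, thermal_speed=1.0)
          downstream = h_function(density=1.0, thermal_speed=2.0)   # heated plasma
          print("H upstream %.3f, H downstream %.3f" % (upstream, downstream))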

  5. Electric Fields and Enzyme Catalysis

    PubMed Central

    Fried, Stephen D.; Boxer, Steven G.

    2017-01-01

    What happens inside an enzyme’s active site to allow slow and difficult chemical reactions to occur so rapidly? This question has occupied biochemists’ attention for a long time. Computer models of increasing sophistication have predicted an important role for electrostatic interactions in enzymatic reactions, yet this hypothesis has proved vexingly difficult to test experimentally. Recent experiments utilizing the vibrational Stark effect make it possible to measure the electric field a substrate molecule experiences when bound inside its enzyme’s active site. These experiments have provided compelling evidence supporting a major electrostatic contribution to enzymatic catalysis. Here, we review these results and develop a simple model for electrostatic catalysis that enables us to incorporate disparate concepts introduced by many investigators to describe how enzymes work into a more unified framework stressing the importance of electric fields at the active site. PMID:28375745

  6. MHOST: An efficient finite element program for inelastic analysis of solids and structures

    NASA Technical Reports Server (NTRS)

    Nakazawa, S.

    1988-01-01

    An efficient finite element program for 3-D inelastic analysis of gas turbine hot section components was constructed and validated. A novel mixed iterative solution strategy is derived from the augmented Hu-Washizu variational principle in order to nodally interpolate coordinates, displacements, deformation, strains, stresses and material properties. A series of increasingly sophisticated material models incorporated in MHOST include elasticity, secant plasticity, infinitesimal and finite deformation plasticity, creep and unified viscoplastic constitutive model proposed by Walker. A library of high performance elements is built into this computer program utilizing the concepts of selective reduced integrations and independent strain interpolations. A family of efficient solution algorithms is implemented in MHOST for linear and nonlinear equation solution including the classical Newton-Raphson, modified, quasi and secant Newton methods with optional line search and the conjugate gradient method.

  7. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  8. Predicting Second Language Writing Proficiency: The Roles of Cohesion and Linguistic Sophistication

    ERIC Educational Resources Information Center

    Crossley, Scott A.; McNamara, Danielle S.

    2012-01-01

    This study addresses research gaps in predicting second language (L2) writing proficiency using linguistic features. Key to this analysis is the inclusion of linguistic measures at the surface, textbase and situation model level that assess text cohesion and linguistic sophistication. The results of this study demonstrate that five variables…

  9. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) decrease the time to compute comparison statistics and plots from minutes to seconds; (2) allow for interactive exploration of time-series properties over seasons and years; (3) decrease the time for satellite data ingestion into RCMES to hours; (4) allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) move RCMES into a near-real-time decision-making platform. We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory and disk usage.
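
    The map-reduce style of computing comparison statistics can be sketched with plain PySpark. The code below is my own toy, not SciSpark itself: it computes bias and RMSE over hypothetical paired observation/model values held in an in-memory RDD, with HDF/netCDF ingest and the partitioning scheme omitted.

      # Minimal sketch (my toy, not SciSpark): bias and RMSE between paired
      # observation/model values computed with PySpark's map/reduce primitives.
      import math
      from pyspark import SparkContext

      if __name__ == "__main__":
          sc = SparkContext(appName="toy-model-evaluation")
          # Hypothetical (observation, model) pairs; in practice these would be
          # loaded and sharded from satellite and model files.
          pairs = sc.parallelize([(290.1, 289.6), (288.4, 288.9), (291.0, 290.2),
                                  (287.2, 287.8), (289.5, 289.1)])
          diffs = pairs.map(lambda p: (p[1] - p[0], (p[1] - p[0]) ** 2, 1))
          sum_d, sum_d2, n = diffs.reduce(
              lambda a, b: (a[0] + b[0], a[1] + b[1], a[2] + b[2]))
          print("bias: %.3f  RMSE: %.3f" % (sum_d / n, math.sqrt(sum_d2 / n)))
          sc.stop()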

  10. Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.

    2014-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for robust, automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, material information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both point-wise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system to enable the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, it describes NASA Glenn's efforts towards establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials.

  11. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.

    PubMed

    Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.
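
    A minimal sketch of the per-subject parallelism that clusters, grids and clouds all exploit is shown below; Python's concurrent.futures stands in for a scheduler's job array, and process_subject is a placeholder for a real preprocessing or analysis pipeline.

    ```python
    # One task per subject: the same pattern scales from a workstation pool
    # to a cluster job array or a fleet of cloud instances.
    from concurrent.futures import ProcessPoolExecutor
    import hashlib

    SUBJECTS = [f"sub-{i:03d}" for i in range(1, 21)]

    def process_subject(subject_id: str) -> tuple:
        """Placeholder for a real pipeline (motion correction, registration, GLM, ...)."""
        fake_metric = int(hashlib.md5(subject_id.encode()).hexdigest(), 16) % 1000 / 1000.0
        return subject_id, fake_metric

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=4) as pool:
            for subject, metric in pool.map(process_subject, SUBJECTS):
                print(subject, metric)
    ```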

  12. Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing

    PubMed Central

    Shatil, Anwar S.; Younas, Sohail; Pourreza, Hossein; Figley, Chase R.

    2015-01-01

    With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications. PMID:27279746

  13. One Dimensional Turing-Like Handshake Test for Motor Intelligence

    PubMed Central

    Karniel, Amir; Avraham, Guy; Peles, Bat-Chen; Levy-Tzedek, Shelly; Nisky, Ilana

    2010-01-01

    In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator is engaged in a task of holding a robotic stylus and interacting with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG. (i) By calculating the proportion of subjects' answers indicating that the model is more human-like than the human; (ii) By comparing two weighted sums of human and model handshakes, we fit a psychometric curve and extract the point of subjective equality (PSE); (iii) By comparing a given model with a weighted sum of human and random signal, we fit a psychometric curve to the answers of the interrogator and extract the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake. PMID:21206462
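
    A minimal sketch of the curve-fitting step behind methods (ii) and (iii) is shown below: a logistic psychometric function is fit to the proportion of "more human-like" answers as a function of the human weight in the blended stimulus, and the PSE is read off as the 50% point. The response data are made up for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(w, pse, width):
        """Probability of judging the blend more human-like, given human weight w."""
        return 1.0 / (1.0 + np.exp(-(w - pse) / width))

    # Illustrative (made-up) data: human weight in the blended handshake vs. proportion
    # of interrogator answers favouring it as more human-like.
    w = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
    p = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97])

    (pse, width), _ = curve_fit(psychometric, w, p, p0=[0.5, 0.1])
    print(f"PSE = {pse:.3f}, width = {width:.3f}")
    ```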

  14. Extending the Stabilized Supralinear Network model for binocular image processing.

    PubMed

    Selby, Ben; Tripp, Bryan

    2017-06-01

    The visual cortex is both extensive and intricate. Computational models are needed to clarify the relationships between its local mechanisms and high-level functions. The Stabilized Supralinear Network (SSN) model was recently shown to account for many receptive field phenomena in V1, and also to predict subtle receptive field properties that were subsequently confirmed in vivo. In this study, we performed a preliminary exploration of whether the SSN is suitable for incorporation into large, functional models of the visual cortex, considering both its extensibility and computational tractability. First, whereas the SSN receives abstract orientation signals as input, we extended it to receive images (through a linear-nonlinear stage), and found that the extended version behaved similarly. Secondly, whereas the SSN had previously been studied in a monocular context, we found that it could also reproduce data on interocular transfer of surround suppression. Finally, we reformulated the SSN as a convolutional neural network, and found that it scaled well on parallel hardware. These results provide additional support for the plausibility of the SSN as a model of lateral interactions in V1, and suggest that the SSN is well suited as a component of complex vision models. Future work will use the SSN to explore relationships between local network interactions and sophisticated vision processes in large networks. Copyright © 2017 Elsevier Ltd. All rights reserved.
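
    The sketch below illustrates, under simplifying assumptions, the two stages described here: a linear-nonlinear front end (an oriented Gabor filter plus rectification) converting an image patch into feedforward drive, and a small rate network with a supralinear power-law transfer function in the spirit of the SSN. Parameters and connectivity are illustrative, not those of the published model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def gabor(size=21, theta=0.0, freq=0.15, sigma=4.0):
        """Oriented Gabor filter used as the linear stage."""
        y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

    # Linear-nonlinear stage: filter an image patch, then rectify (absolute value here,
    # so this toy always receives some feedforward drive).
    image = rng.normal(size=(21, 21))
    drive = abs(float(np.sum(gabor(theta=0.3) * image)))

    # Small E/I rate network with a supralinear (power-law) transfer function,
    # tau dr/dt = -r + k * [W r + input]_+^n, integrated with forward Euler.
    N, k, n, tau, dt = 20, 0.04, 2.0, 0.01, 0.0005
    W = np.abs(rng.normal(0.0, 0.5, (N, N))) / np.sqrt(N)
    W[:, N // 2:] *= -1.5                       # second half of the units is inhibitory
    ff_drive = drive * np.abs(rng.normal(1.0, 0.2, N))
    r = np.zeros(N)
    for _ in range(2000):
        net = np.clip(W @ r + ff_drive, 0.0, 50.0)   # rectified input (capped for safety in this toy)
        r += dt / tau * (-r + k * net**n)
    print("mean steady-state rate:", r.mean())
    ```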

  15. Improving Access to Data While Protecting Confidentiality: Prospects for the Future.

    ERIC Educational Resources Information Center

    Duncan, George T.; Pearson, Robert W.

    Providing researchers, especially those in the social sciences, with access to publicly collected microdata furthers research while advancing public policy goals in a democratic society. However, while technological improvements have eased remote access to these databases and enabled computer using researchers to perform sophisticated statistical…

  16. Analysis of an Anti-Phishing Lab Activity

    ERIC Educational Resources Information Center

    Werner, Laurie A.; Courte, Jill

    2010-01-01

    Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…

  17. Artificial Intelligence Applications in Special Education: How Feasible? Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.; Ferrara, Joseph M.

    The research project investigated whether expert system tools have become sophisticated enough to be applied efficiently to problems in special education. (Expert systems are a development of artificial intelligence that combines the computer's capacity for storing specialized knowledge with a general set of rules intended to replicate the…

  18. Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.

    ERIC Educational Resources Information Center

    Ng, Raymond

    Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…

  19. CBES--An Efficient Implementation of the Coursewriter Language.

    ERIC Educational Resources Information Center

    Franks, Edward W.

    An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…

  20. Detecting Satisficing in Online Surveys

    ERIC Educational Resources Information Center

    Salifu, Shani

    2012-01-01

    The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…

  1. A Performance Support Tool for Cisco Training Program Managers

    ERIC Educational Resources Information Center

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  2. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  3. Robotics for Computer Scientists: What's the Big Idea?

    ERIC Educational Resources Information Center

    Touretzky, David S.

    2013-01-01

    Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…

  4. Teaching Conversations with the XDS Sigma 7. Systems Description.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Mosmann, Charles

    Some computers permit conventional programing languages to be extended by the use of macro-instructions, a sophisticated programing tool which is especially useful in writing instructional dialogs. Macro-instructions (or "macro's") are complex commands defined in terms of the machine language or other macro-instructions. Like terms in…

  5. PREFACE: Theory, Modelling and Computational methods for Semiconductors

    NASA Astrophysics Data System (ADS)

    Migliorato, Max; Probert, Matt

    2010-04-01

    These conference proceedings contain the written papers of the contributions presented at the 2nd International Conference on: Theory, Modelling and Computational methods for Semiconductors. The conference was held at the St Williams College, York, UK on 13th-15th Jan 2010. The previous conference in this series took place in 2008 at the University of Manchester, UK. The scope of this conference embraces modelling, theory and the use of sophisticated computational tools in Semiconductor science and technology, where there is a substantial potential for time saving in R&D. The development of high speed computer architectures is finally allowing the routine use of accurate methods for calculating the structural, thermodynamic, vibrational and electronic properties of semiconductors and their heterostructures. This workshop ran for three days, with the objective of bringing together UK and international leading experts in the field of theory of group IV, III-V and II-VI semiconductors together with postdocs and students in the early stages of their careers. The first day focused on providing an introduction and overview of this vast field, aimed particularly at students at this influential point in their careers. We would like to thank all participants for their contribution to the conference programme and these proceedings. We would also like to acknowledge the financial support from the Institute of Physics (Computational Physics group and Semiconductor Physics group), the UK Car-Parrinello Consortium, Accelrys (distributors of Materials Studio) and Quantumwise (distributors of Atomistix). The Editors Acknowledgements Conference Organising Committee: Dr Matt Probert (University of York) and Dr Max Migliorato (University of Manchester) Programme Committee: Dr Marco Califano (University of Leeds), Dr Jacob Gavartin (Accelrys Ltd, Cambridge), Dr Stanko Tomic (STFC Daresbury Laboratory), Dr Gabi Slavcheva (Imperial College London) Proceedings edited and compiled by Dr Max Migliorato and Dr Matt Probert

  6. A finite area scheme for shallow granular flows on three-dimensional surfaces

    NASA Astrophysics Data System (ADS)

    Rauter, Matthias

    2017-04-01

    Shallow granular flow models have become a popular tool for the estimation of natural hazards, such as landslides, debris flows and avalanches. The shallowness of the flow makes it possible to reduce the three-dimensional governing equations to a quasi two-dimensional system. Three-dimensional flow fields are replaced by their depth-integrated two-dimensional counterparts, which yields a robust and fast method [1]. A solution for a simple shallow granular flow model, based on the so-called finite area method [3], is presented. The finite area method is an adaptation of the finite volume method [2] to two-dimensional curved surfaces in three-dimensional space. This method handles the three-dimensional basal topography in a simple way, making the model suitable for arbitrary (but mildly curved) topography, such as natural terrain. Furthermore, the implementation in the open source software OpenFOAM [4] is shown. OpenFOAM is a popular computational fluid dynamics application, designed so that the top-level code mimics the mathematical governing equations. This makes the code easy to read and extendable to more sophisticated models. Finally, some hints on how to get started with the code and how to extend the basic model will be given. I gratefully acknowledge the financial support by the OEAW project "beyond dense flow avalanches". [1] Savage, S. B. & Hutter, K. 1989 The motion of a finite mass of granular material down a rough incline. Journal of Fluid Mechanics 199, 177-215. [2] Ferziger, J. & Peric, M. 2002 Computational methods for fluid dynamics, 3rd edn. Springer. [3] Tukovic, Z. & Jasak, H. 2012 A moving mesh finite volume interface tracking method for surface tension dominated interfacial fluid flow. Computers & Fluids 55, 70-84. [4] Weller, H. G., Tabor, G., Jasak, H. & Fureby, C. 1998 A tensorial approach to computational continuum mechanics using object-oriented techniques. Computers in Physics 12(6), 620-631.
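
    As a much-simplified illustration of the finite-volume idea that the finite area method generalizes to curved surfaces, the sketch below solves a one-dimensional, frictionless shallow flow (a Savage-Hutter-type model without basal friction reduces to the shallow-water equations) with a Rusanov flux. It is didactic only and unrelated to the OpenFOAM finite-area implementation presented here.

    ```python
    import numpy as np

    g, nx, L, t_end = 9.81, 200, 10.0, 0.5
    dx = L / nx
    x = (np.arange(nx) + 0.5) * dx

    h = np.where(x < 0.4 * L, 1.0, 0.2)        # dam-break style initial depth
    hu = np.zeros(nx)                           # initially at rest

    def flux(h, hu):
        u = hu / np.maximum(h, 1e-8)
        return np.array([hu, hu * u + 0.5 * g * h**2])

    t = 0.0
    while t < t_end:
        u = hu / np.maximum(h, 1e-8)
        c = np.abs(u) + np.sqrt(g * h)
        dt = 0.4 * dx / c.max()                 # CFL condition

        U = np.array([h, hu])
        F = flux(h, hu)
        # Rusanov (local Lax-Friedrichs) flux at interior interfaces
        a = np.maximum(c[:-1], c[1:])
        F_iface = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])

        # Update interior cells; simple copy (non-reflecting) boundaries at the ends
        U[:, 1:-1] -= dt / dx * (F_iface[:, 1:] - F_iface[:, :-1])
        U[:, 0], U[:, -1] = U[:, 1], U[:, -2]
        h, hu = U
        t += dt

    print("max depth:", round(h.max(), 3), "front position ~", round(x[h > 0.25].max(), 2), "m")
    ```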

  7. Natural three-qubit interactions in one-way quantum computing

    NASA Astrophysics Data System (ADS)

    Tame, M. S.; Paternostro, M.; Kim, M. S.; Vedral, V.

    2006-02-01

    We address the effects of natural three-qubit interactions on the computational power of one-way quantum computation. A benefit of using more sophisticated entanglement structures is the ability to construct compact and economic simulations of quantum algorithms with limited resources. We show that the features of our study are embodied by suitably prepared optical lattices, where effective three-spin interactions have been theoretically demonstrated. We use this to provide a compact construction for the Toffoli gate. Information flow and two-qubit interactions are also outlined, together with a brief analysis of relevant sources of imperfection.

  8. pyblocxs: Bayesian Low-Counts X-ray Spectral Analysis in Sherpa

    NASA Astrophysics Data System (ADS)

    Siemiginowska, A.; Kashyap, V.; Refsdal, B.; van Dyk, D.; Connors, A.; Park, T.

    2011-07-01

    Typical X-ray spectra have low counts and should be modeled using the Poisson distribution. However, the χ² statistic is often applied as an alternative, and the data are assumed to follow a Gaussian distribution. A variety of weightings of the statistic, or a binning of the data, is used to overcome the low-count issues; however, such modifications introduce biases and/or a loss of information. Standard modeling packages such as XSPEC and Sherpa provide the Poisson likelihood and allow computation of rudimentary MCMC chains, but so far do not allow a full Bayesian model to be set up. We have implemented a sophisticated Bayesian MCMC-based algorithm to carry out spectral fitting of low-count sources in the Sherpa environment. The code is a Python extension to Sherpa and allows a predefined Sherpa model to be fit to high-energy X-ray spectral data and other generic data. We present the algorithm and discuss several issues related to the implementation, including the flexible definition of priors and allowance for variations in the calibration information.
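
    A standalone sketch of the core idea, not pyblocxs or Sherpa code: Metropolis-Hastings sampling of a two-parameter power-law spectrum under the Poisson likelihood appropriate for low-count data. The model, flat priors and proposal widths are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    energies = np.linspace(0.5, 7.0, 60)          # keV bin centers
    true_norm, true_index = 20.0, 1.7
    counts = rng.poisson(true_norm * energies**(-true_index))

    def log_post(norm, index):
        if norm <= 0 or not (0 < index < 5):      # flat priors with simple bounds
            return -np.inf
        mu = norm * energies**(-index)
        return np.sum(counts * np.log(mu) - mu)   # Poisson log-likelihood (up to a constant)

    samples, state = [], np.array([10.0, 1.0])
    lp = log_post(*state)
    for _ in range(20000):
        prop = state + rng.normal(0, [0.5, 0.03])
        lp_prop = log_post(*prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance rule
            state, lp = prop, lp_prop
        samples.append(state.copy())

    burn = np.array(samples[5000:])
    print("posterior mean norm, index:", burn.mean(axis=0))
    ```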

  9. GIS Data Based Automatic High-Fidelity 3D Road Network Modeling

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong

    2011-01-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models were generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially those existing in the real world. This paper presents a novel approach that can automatically produce high-fidelity 3D road network models from real 2D road GIS data that mainly contain road centerline information. The proposed method first builds parametric representations of the road centerlines through segmentation and fitting. A basic set of civil engineering rules (e.g., cross slope, superelevation, grade) for road design is then selected in order to generate realistic road surfaces in compliance with these rules. While the proposed method applies to any type of road, this paper mainly addresses the automatic generation of complex traffic interchanges and intersections, which are the most sophisticated elements in road networks.
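
    A hedged sketch of the centerline-fitting step only: a smoothing parametric spline (SciPy) through noisy 2D centerline points, from which tangents and headings can be evaluated for the subsequent road-design rules. The sample points are made up; this is not the paper's full pipeline.

    ```python
    import numpy as np
    from scipy.interpolate import splprep, splev

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 40)
    x = 1000 * t + rng.normal(0, 2.0, t.size)                       # easting (m), noisy
    y = 150 * np.sin(2 * np.pi * t) + rng.normal(0, 2.0, t.size)    # northing (m), noisy

    tck, u = splprep([x, y], s=len(t) * 4.0)    # smoothing parametric spline fit
    uu = np.linspace(0, 1, 400)
    cx, cy = splev(uu, tck)                     # smoothed centerline
    dx, dy = splev(uu, tck, der=1)              # tangents along the road
    heading = np.degrees(np.arctan2(dy, dx))
    print("centerline samples:", len(cx), "heading range:",
          round(heading.min(), 1), "to", round(heading.max(), 1), "deg")
    ```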

  10. Investigation of different simulation approaches on a high-head Francis turbine and comparison with model test data: Francis-99

    NASA Astrophysics Data System (ADS)

    Mössinger, Peter; Jester-Zürker, Roland; Jung, Alexander

    2015-01-01

    Numerical investigations of hydraulic turbomachines under steady-state conditions are state of the art in current product development processes. Nevertheless, increasing computational resources allow refined discretization methods, more sophisticated turbulence models, and therefore better prediction of results as well as quantification of the existing uncertainties. Single-stage investigations are performed using in-house tools for meshing and case set-up. Besides different model domains and a mesh study to reduce mesh dependencies, several eddy-viscosity and Reynolds-stress turbulence models are investigated. All obtained results are compared with available model test data. In addition to global values, measured magnitudes in the vaneless space and at runner blade and draft tube positions, in terms of pressure and velocity, are considered. From these it is possible to estimate the influence and relevance of the various model domains for different operating points and numerical variations. Good agreement is found for pressure and velocity measurements with all model configurations and, except for the BSL-RSM model, with all turbulence models. At part load, deviations in hydraulic efficiency are large, whereas at the best-efficiency and high-load operating points the computed efficiencies are close to the measurement. Considering the runner side-gap geometry as well as a refined mesh improves the results with respect to either hydraulic efficiency or velocity distribution, at the cost of less stable numerics and increased computational time.

  11. Artificial Exo-Society Modeling: a New Tool for SETI Research

    NASA Astrophysics Data System (ADS)

    Gardner, James N.

    2002-01-01

    One of the newest fields of complexity research is artificial society modeling. Methodologically related to artificial life research, artificial society modeling utilizes agent-based computer simulation tools like SWARM and SUGARSCAPE developed by the Santa Fe Institute, Los Alamos National Laboratory and the Brookings Institution in an effort to introduce an unprecedented degree of rigor and quantitative sophistication into social science research. The broad aim of artificial society modeling is to begin the development of a more unified social science that embeds cultural evolutionary processes in a computational environment that simulates demographics, the transmission of culture, conflict, economics, disease, the emergence of groups and coadaptation with an environment in a bottom-up fashion. When an artificial society computer model is run, artificial societal patterns emerge from the interaction of autonomous software agents (the "inhabitants" of the artificial society). Artificial society modeling invites the interpretation of society as a distributed computational system and the interpretation of social dynamics as a specialized category of computation. Artificial society modeling techniques offer the potential of computational simulation of hypothetical alien societies in much the same way that artificial life modeling techniques offer the potential to model hypothetical exobiological phenomena. NASA recently announced its intention to begin exploring the possibility of including artificial life research within the broad portfolio of scientific fields encompassed by the interdisciplinary astrobiology research endeavor. It may be appropriate for SETI researchers to likewise commence an exploration of the possible inclusion of artificial exo-society modeling within the SETI research endeavor. Artificial exo-society modeling might be particularly useful in a post-detection environment by (1) coherently organizing the set of data points derived from a detected ETI signal, (2) mapping trends in the data points over time (assuming receipt of an extended ETI signal), and (3) projecting such trends forward to derive alternative cultural evolutionary scenarios for the exo-society under analysis. The latter exercise might be particularly useful to compensate for the inevitable time lag between generation of an ETI signal and receipt of an ETI signal on Earth. For this reason, such an exercise might be a helpful adjunct to the decisional process contemplated by Paragraph 9 of the Declaration of Principles Concerning Activities Following the Detection of Extraterrestrial Intelligence.

  12. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Hammer, John M.; Wan, C. Yoon; Vasandani, Vijay

    1987-01-01

    The current research is focused on the detection of human error and protection from its consequences. A program for monitoring pilot error by comparing pilot actions to a script was described. It dealt primarily with routine errors (slips) that occurred during checklist activity. The model to which operator actions were compared was a script. Current research is an extension along these two dimensions. The ORS fault detection aid uses a sophisticated device model rather than a script. The newer initiative, the model-based and constraint-based warning system, uses an even more sophisticated device model and is intended to prevent all types of error, not just slips or bad decisions.

  13. PROcess Based Diagnostics PROBE

    NASA Technical Reports Server (NTRS)

    Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.

    2013-01-01

    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
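
    The filter-then-diagnose pattern that PROBE is built around can be sketched in a few lines on synthetic data: select only the samples where a process criterion holds, then compute a conditional diagnostic on that subset alone. Variable names and thresholds below are illustrative assumptions, not PROBE code.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    ntime, nlat, nlon = 365, 40, 80
    omega = rng.normal(0.0, 0.1, (ntime, nlat, nlon))             # vertical-velocity proxy
    precip = np.maximum(0.0, -5.0 * omega + rng.normal(0.0, 0.5, omega.shape))

    # Process filter: keep only strong-ascent samples
    mask = omega < -0.15

    # Conditional diagnostic: precipitation statistics within the filtered events only
    event_precip = precip[mask]
    print(f"{mask.mean():.1%} of samples pass the filter")
    print("conditional mean precip:", round(event_precip.mean(), 3),
          "95th percentile:", round(np.percentile(event_precip, 95), 3))
    ```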

  14. Reinventing Solutions to Systems of Linear Differential Equations: A Case of Emergent Models Involving Analytic Expressions

    ERIC Educational Resources Information Center

    Rasmussen, Chris; Blumenfeld, Howard

    2007-01-01

    An enduring challenge in mathematics education is to create learning environments in which students generate, refine, and extend their intuitive and informal ways of reasoning to more sophisticated and formal ways of reasoning. Pressing concerns for research, therefore, are to detail students' progressively sophisticated ways of reasoning and…

  15. A Low Cost Remote Sensing System Using PC and Stereo Equipment

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Flood, Michael A.; Prasad, Narasimha S.; Hodson, Wade D.

    2011-01-01

    A system using a personal computer, speaker, and a microphone is used to detect objects, and make crude measurements using a carrier modulated by a pseudorandom noise (PN) code. This system can be constructed using a personal computer and audio equipment commonly found in the laboratory or at home, or more sophisticated equipment that can be purchased at reasonable cost. We demonstrate its value as an instructional tool for teaching concepts of remote sensing and digital signal processing.
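
    A sketch of the signal-processing core under stated assumptions: a seeded bipolar pseudorandom code (standing in for an m-sequence), a delayed and noisy "echo", and cross-correlation to recover the delay. Sound-card input/output and carrier modulation are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs = 44100                                    # audio sample rate (Hz)
    code = rng.choice([-1.0, 1.0], size=4096)     # PN code, seeded so it is repeatable

    true_delay = 513                              # samples (~11.6 ms)
    echo = np.zeros(3 * code.size)
    echo[true_delay:true_delay + code.size] += 0.2 * code
    echo += rng.normal(0, 0.5, echo.size)         # strong background noise

    # Cross-correlate the received signal with the known code; the peak marks the delay.
    corr = np.correlate(echo, code, mode="valid")
    est_delay = int(np.argmax(corr))
    # Two-way acoustic path at ~343 m/s gives a crude range estimate.
    print("true delay:", true_delay, "estimated delay:", est_delay,
          "range estimate:", round(est_delay / fs * 343.0 / 2.0, 2), "m")
    ```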

  16. TRENT2D WG: a smart web infrastructure for debris-flow modelling and hazard assessment

    NASA Astrophysics Data System (ADS)

    Zorzi, Nadia; Rosatti, Giorgio; Zugliani, Daniel; Rizzi, Alessandro; Piffer, Stefano

    2016-04-01

    Mountain regions are naturally exposed to geomorphic flows, which involve large amounts of sediments and induce significant morphological modifications. The physical complexity of this class of phenomena represents a challenging issue for modelling, leading to elaborate theoretical frameworks and sophisticated numerical techniques. In general, geomorphic-flow models have proved to be valid tools in hazard assessment and management. However, model complexity seems to represent one of the main obstacles to the diffusion of advanced modelling tools among practitioners and stakeholders, although the EU Flood Directive (2007/60/EC) requires risk management and assessment to be based on "best practices and best available technologies". Furthermore, several cutting-edge models are not particularly user-friendly and multiple stand-alone software packages are needed to pre- and post-process modelling data. For all these reasons, users often resort to quicker and rougher approaches, possibly leading to unreliable results. Therefore, some effort seems to be necessary to overcome these drawbacks, with the purpose of supporting and encouraging a widespread diffusion of the most reliable, although sophisticated, modelling tools. With this aim, this work presents TRENT2D WG, a new smart modelling solution for the state-of-the-art model TRENT2D (Armanini et al., 2009, Rosatti and Begnudelli, 2013), which simulates debris flows and hyperconcentrated flows adopting a two-phase description over a mobile bed. TRENT2D WG is a web infrastructure joining the advantages offered by the SaaS (Software as a Service) delivery model and by WebGIS technology, and hosting a complete and user-friendly working environment for modelling. In order to develop TRENT2D WG, the model TRENT2D was converted into a service and exposed on a cloud server, transferring computational burdens from the user hardware to a high-performing server and reducing computational time. Then, the system was equipped with an interface supporting Web-based GIS functionalities, making the model accessible through the World Wide Web. Furthermore, WebGIS technology allows georeferenced model input data and simulation results to be produced, managed, displayed and processed in a unique and intuitive working environment. Thanks to its large flexibility, TRENT2D WG was also equipped with a BUWAL-type procedure (Heinimann et al., 1998) to assess and map debris-flow hazard. In this way, model results can be used straightforwardly as input data of the hazard-mapping procedure, avoiding work fragmentation and taking wide advantage of the functionalities offered by WebGIS technology. TRENT2D WG is intended to become a reliable tool for researchers, practitioners and stakeholders, supporting modelling and hazard mapping effectively and encouraging connections between the research field and professional needs at a working scale.

  17. Scaffolding Students' Online Critiquing of Expert- and Peer-generated Molecular Models of Chemical Reactions

    NASA Astrophysics Data System (ADS)

    Chang, Hsin-Yi; Chang, Hsiang-Chi

    2013-08-01

    In this study, we developed online critiquing activities using an open-source computer learning environment. We investigated how well the activities scaffolded students to critique molecular models of chemical reactions made by scientists, peers, and a fictitious peer, and whether the activities enhanced the students' understanding of science models and chemical reactions. The activities were implemented in an eighth-grade class with 28 students in a public junior high school in southern Taiwan. The study employed mixed research methods. Data collected included pre- and post-instructional assessments, post-instructional interviews, and students' electronic written responses and oral discussions during the critiquing activities. The results indicated that these activities guided the students to produce overall quality critiques. Also, the students developed a more sophisticated understanding of chemical reactions and scientific models as a result of the intervention. Design considerations for effective model critiquing activities are discussed based on observational results, including the use of peer-generated artefacts for critiquing to promote motivation and collaboration, coupled with critiques of scientific models to enhance students' epistemological understanding of model purpose and communication.

  18. The Matrix Element Method: Past, Present, and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.

    2013-07-12

    The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods -- already common -- will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.

  19. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    PubMed Central

    2011-01-01

    Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters, including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
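
    A much-reduced sketch of the stochastic compartmental idea underlying GLEaM: a single-population binomial-chain SIR model with no demography or mobility coupling. Parameter values are illustrative only, not GLEaMviz settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    N, I0 = 1_000_000, 10
    beta, gamma, dt = 0.35, 1.0 / 4.0, 1.0        # per-day rates, daily steps

    S, I, R = N - I0, I0, 0
    history = []
    for day in range(300):
        p_inf = 1.0 - np.exp(-beta * I / N * dt)  # per-susceptible infection probability
        p_rec = 1.0 - np.exp(-gamma * dt)         # per-infectious recovery probability
        new_inf = rng.binomial(S, p_inf)
        new_rec = rng.binomial(I, p_rec)
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        history.append((day, S, I, R))
        if I == 0:
            break

    peak = max(history, key=lambda rec: rec[2])
    print("peak infectious:", peak[2], "on day", peak[0], "| final attack rate:", R / N)
    ```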

  20. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as the Large Synoptic Survey Telescope. The CANFAR+Skytree system is open for use by any interested member of the astronomical community.
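
    The perturb-and-predict pattern described here can be sketched as follows: perturb a galaxy's photometry within its errors, push every instance through a regressor, and build a nonparametric redshift PDF from the spread of predictions. The regressor below is a toy stand-in, not Skytree's machine learning, and the magnitudes and errors are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def toy_photoz_regressor(mags):
        """Stand-in for a trained photo-z model: maps 5-band magnitudes to a redshift."""
        colors = np.diff(mags, axis=-1)
        return 0.4 + 0.6 * np.tanh(colors.sum(axis=-1) / 2.0)

    # One galaxy: measured ugriz magnitudes and their 1-sigma errors (made-up numbers)
    mags = np.array([23.1, 22.4, 21.9, 21.6, 21.4])
    errs = np.array([0.20, 0.08, 0.05, 0.05, 0.07])

    # 100 instances perturbed within the photometric errors, as in the abstract
    instances = mags + rng.normal(0.0, errs, size=(100, mags.size))
    z_samples = toy_photoz_regressor(instances)

    # Nonparametric PDF: a normalized histogram over redshift bins
    pdf, edges = np.histogram(z_samples, bins=np.linspace(0.0, 1.5, 31), density=True)
    peak_bin = np.argmax(pdf)
    print("median z:", round(float(np.median(z_samples)), 3),
          "68% interval:", np.percentile(z_samples, [16, 84]).round(3),
          "PDF peak near z =", round(0.5 * (edges[peak_bin] + edges[peak_bin + 1]), 3))
    ```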

  1. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  2. A Developing Market for Continuing Higher Education: The Reserve Components.

    ERIC Educational Resources Information Center

    Watt, David M.

    Due to increasingly sophisticated military equipment, the Reserve Components of the armed forces need to raise the educational standards for recruits. A number of U.S. educational institutions have responded to their needs for continuing higher education in the areas of job skill enhancement (such as computer operation), regular courses directly…

  3. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    First-in/first-out (FIFO) temporarily stores short surges of data generated by data-acquisition system at excessively high rate and releases data at lower rate suitable for processing by computer. Size and complexity reduced while capacity enhanced by use of newly developed, sophisticated integrated circuits and by "byte-folding" scheme doubling effective depth and data rate.

  4. Let's Dance the "Robot Hokey-Pokey!": Children's Programming Approaches and Achievement throughout Early Cognitive Development

    ERIC Educational Resources Information Center

    Flannery, Louise P.; Bers, Marina Umaschi

    2013-01-01

    Young learners today generate, express, and interact with sophisticated ideas using a range of digital tools to explore interactive stories, animations, computer games, and robotics. In recent years, new developmentally appropriate robotics kits have been entering early childhood classrooms. This paper presents a retrospective analysis of one…

  5. Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.

    2013-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…

  6. ALFIL: A Crowd Simulation Serious Game for Massive Evacuation Training and Awareness

    ERIC Educational Resources Information Center

    García-García, César; Fernández-Robles, José Luis; Larios-Rosillo, Victor; Luga, Hervé

    2012-01-01

    This article presents the current development of a serious game for the simulation of massive evacuations. The purpose of this project is to promote self-protection through awareness of the procedures and different possible scenarios during the evacuation of a massive event. Sophisticated behaviors require massive computational power and it has…

  7. Children's Behavior toward and Understanding of Robotic and Living Dogs

    ERIC Educational Resources Information Center

    Melson, Gail F.; Kahn, Peter H., Jr.; Beck, Alan; Friedman, Batya; Roberts, Trace; Garrett, Erik; Gill, Brian T.

    2009-01-01

    This study investigated children's reasoning about and behavioral interactions with a computationally sophisticated robotic dog (Sony's AIBO) compared to a live dog (an Australian Shepherd). Seventy-two children from three age groups (7-9 years, 10-12 years, and 13-15 years) participated in this study. Results showed that more children…

  8. Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.; Kizirian, Tim

    2009-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…
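
    Outside Excel, the same reciprocal allocation reduces to a single linear solve, x = (I - P)^-1 c, where P holds the service-to-service usage shares and c the direct costs. The sketch below works a small two-service-department example with made-up numbers.

    ```python
    import numpy as np

    direct = np.array([100_000.0, 60_000.0])        # S1 (maintenance), S2 (IT) direct costs
    # P[i, j] = share of department j's services consumed by service department i
    P = np.array([[0.00, 0.10],
                  [0.20, 0.00]])
    total = np.linalg.solve(np.eye(2) - P, direct)  # reciprocal total costs of S1, S2

    # Shares of each service department's output consumed by producing departments P1, P2
    to_producing = np.array([[0.50, 0.30],          # S1 -> P1, P2
                             [0.40, 0.50]])         # S2 -> P1, P2
    allocated = to_producing.T @ total
    print("reciprocal service costs:", total.round(2))
    print("allocated to P1, P2:", allocated.round(2), "check total:", allocated.sum())
    ```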

  9. DOE/ NREL Build One of the World's Most Energy Efficient Office Spaces

    ScienceCinema

    Radocy, Rachel; Livingston, Brian; von Luhrte, Rich

    2018-05-18

    Technology — from sophisticated computer modeling to advanced windows that actually open — will help the newest building at the U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) be one of the world's most energy efficient offices. Scheduled to open this summer, the 222,000 square-foot RSF will house more than 800 staff and an energy efficient information technology data center. Because 19 percent of the country's energy is used by commercial buildings, DOE plans to make this facility a showcase for energy efficiency. DOE hopes the design of the RSF will be replicated by the building industry and help reduce the nation's energy consumption by changing the way commercial buildings are designed and built.

  10. The neural code of thoughts and feelings. Comment on "Topodynamics of metastable brains" by Arturo Tozzi et al.

    NASA Astrophysics Data System (ADS)

    Jaušovec, Norbert

    2017-07-01

    Recently, the number of theories trying to explain the brain-cognition-behavior relation has increased, promoted on the one hand by the development of sophisticated brain imaging techniques, such as functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI), and on the other by complex computational models based on chaos and graph theory. But has this really advanced our understanding of the brain-behavior relation beyond Descartes's dualistic mind-body division? One could critically argue that replacing the pineal body with extracellular electric fields represented in the electroencephalogram (EEG) as rapid transitional processes (RTS), combined with algebraic topology and dubbed brain topodynamics [1], is just putting lipstick on an outmoded evergreen.

  11. Faster PET reconstruction with a stochastic primal-dual hybrid gradient method

    NASA Astrophysics Data System (ADS)

    Ehrhardt, Matthias J.; Markiewicz, Pawel; Chambolle, Antonin; Richtárik, Peter; Schott, Jonathan; Schönlieb, Carola-Bibiane

    2017-08-01

    Image reconstruction in positron emission tomography (PET) is computationally challenging due to Poisson noise, constraints and potentially non-smooth priors, let alone the sheer size of the problem. An algorithm that can cope well with the first three of these challenges is the primal-dual hybrid gradient algorithm (PDHG) studied by Chambolle and Pock in 2011. However, PDHG updates all variables in parallel and is therefore computationally demanding on the large problem sizes encountered with modern PET scanners, where the number of dual variables easily exceeds 100 million. In this work, we numerically study the use of SPDHG, a stochastic extension of PDHG that is still guaranteed to converge to a solution of the deterministic optimization problem at rates similar to PDHG. Numerical results on a clinical data set show that, by introducing randomization into PDHG, results similar to those of the deterministic algorithm can be achieved using only around 10% of the operator evaluations, making significant progress towards the feasibility of sophisticated mathematical models in a clinical setting.
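
    A toy sketch of the deterministic PDHG iteration on a small L1-regularized least-squares problem is given below; it is not the PET reconstruction itself. SPDHG randomizes these updates over blocks of the dual variable, touching only part of the operator per iteration. Problem sizes and step sizes are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    m, n, lam = 80, 200, 0.1
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, 8, replace=False)] = rng.normal(0, 3, 8)
    b = A @ x_true + 0.01 * rng.normal(size=m)

    L = np.linalg.norm(A, 2)                 # operator norm
    tau = sigma = 0.9 / L                    # step sizes satisfying sigma*tau*L**2 < 1
    theta = 1.0

    x = np.zeros(n); x_bar = x.copy(); y = np.zeros(m)
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    # Minimize lam*||x||_1 + 0.5*||A x - b||**2 as a saddle-point problem.
    for _ in range(500):
        y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)   # prox of the dual term
        x_new = soft(x - tau * (A.T @ y), tau * lam)                # prox of lam*||.||_1
        x_bar = x_new + theta * (x_new - x)                         # extrapolation step
        x = x_new

    print("support recovered:", int(np.sum(np.abs(x) > 1e-3)), "of", int(np.sum(x_true != 0)))
    ```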

  12. The bacteriorhodopsin model membrane system as a prototype molecular computing element.

    PubMed

    Hong, F T

    1986-01-01

    The quest for more sophisticated integrated circuits to overcome the limitation of currently available silicon integrated circuits has led to the proposal of using biological molecules as computational elements by computer scientists and engineers. While the theoretical aspect of this possibility has been pursued by computer scientists, the research and development of experimental prototypes have not been pursued with an equal intensity. In this survey, we make an attempt to examine model membrane systems that incorporate the protein pigment bacteriorhodopsin which is found in Halobacterium halobium. This system was chosen for several reasons. The pigment/membrane system is sufficiently simple and stable for rigorous quantitative study, yet at the same time sufficiently complex in molecular structure to permit alteration of this structure in an attempt to manipulate the photosignal. Several methods of forming the pigment/membrane assembly are described and the potential application to biochip design is discussed. Experimental data using these membranes and measured by a tunable voltage clamp method are presented along with a theoretical analysis based on the Gouy-Chapman diffuse double layer theory to illustrate the usefulness of this approach. It is shown that detailed layouts of the pigment/membrane assembly as well as external loading conditions can modify the time course of the photosignal in a predictable manner. Some problems that may arise in the actual implementation and manufacturing, as well as the use of existing technology in protein chemistry, immunology, and recombinant DNA technology are discussed.

  13. HO + CO reaction rates and H/D kinetic isotope effects: master equation models with ab initio SCTST rate constants.

    PubMed

    Weston, Ralph E; Nguyen, Thanh Lam; Stanton, John F; Barker, John R

    2013-02-07

    Ab initio microcanonical rate constants were computed using Semi-Classical Transition State Theory (SCTST) and used in two master equation formulations (1D, depending on active energy with centrifugal corrections, and 2D, depending on total energy and angular momentum) to compute temperature-dependent rate constants for the title reactions using a potential energy surface obtained by sophisticated ab initio calculations. The 2D master equation was used at the P = 0 and P = ∞ limits, while the 1D master equation with centrifugal corrections and an empirical energy transfer parameter could be used over the entire pressure range. Rate constants were computed for 75 K ≤ T ≤ 2500 K and 0 ≤ [He] ≤ 10²³ cm⁻³. For all temperatures and pressures important for combustion and for the terrestrial atmosphere, the agreement with the experimental rate constants is very good, but at very high pressures and T ≤ 200 K, the theoretical rate constants are significantly smaller than the experimental values. This effect is possibly due to the presence in the experiments of dimers and prereactive complexes, which were not included in the model calculations. The computed H/D kinetic isotope effects are in acceptable agreement with experimental data, which show considerable scatter. Overall, the agreement between experimental and theoretical H/D kinetic isotope effects is much better than in previous work, and an assumption of non-RRKM behavior does not appear to be needed to reproduce experimental observations.

  14. The Social Bayesian Brain: Does Mentalizing Make a Difference When We Learn?

    PubMed Central

    Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean

    2014-01-01

    When it comes to interpreting others' behaviour, we almost irrepressibly engage in the attribution of mental states (beliefs, emotions…). Such "mentalizing" can become very sophisticated, eventually endowing us with highly adaptive skills such as convincing, teaching or deceiving. Here, sophistication can be captured in terms of the depth of our recursive beliefs, as in "I think that you think that I think…" In this work, we test whether such sophisticated recursive beliefs subtend learning in the context of social interaction. We asked participants to play repeated games against artificial (Bayesian) mentalizing agents, which differ in their sophistication. Critically, we made people believe either that they were playing against each other, or that they were gambling like in a casino. Although both framings are similarly deceiving, participants win against the artificial (sophisticated) mentalizing agents in the social framing of the task, and lose in the non-social framing. Moreover, we find that participants' choice sequences are best explained by sophisticated mentalizing Bayesian learning models only in the social framing. This study is the first demonstration of the added-value of mentalizing on learning in the context of repeated social interactions. Importantly, our results show that we would not be able to decipher intentional behaviour without a priori attributing mental states to others. PMID:25474637

  15. Proceedings Second Annual Cyber Security and Information Infrastructure Research Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheldon, Frederick T; Krings, Axel; Yoo, Seong-Moo

    2006-01-01

    The workshop theme is Cyber Security: Beyond the Maginot Line. Recently the FBI reported that computer crime has skyrocketed, costing over $67 billion in 2005 alone and affecting 2.8M+ businesses and organizations. Attack sophistication is unprecedented, along with the availability of open-source concomitant tools. Private, academic, and public sectors invest significant resources in cyber security. Industry primarily performs cyber security research as an investment in future products and services. While the public sector also funds cyber security R&D, the majority of this activity focuses on the specific mission(s) of the funding agency. Thus, broad areas of cyber security remain neglected or underdeveloped. Consequently, this workshop endeavors to explore issues involving cyber security and related technologies toward strengthening such areas and enabling the development of new tools and methods for securing our information infrastructure critical assets. We aim to assemble new ideas and proposals about robust models on which we can build the architecture of a secure cyberspace, including but not limited to: * Knowledge discovery and management * Critical infrastructure protection * De-obfuscating tools for the validation and verification of tamper-proofed software * Computer network defense technologies * Scalable information assurance strategies * Assessment-driven design for trust * Security metrics and testing methodologies * Validation of security and survivability properties * Threat assessment and risk analysis * Early accurate detection of the insider threat * Security hardened sensor networks and ubiquitous computing environments * Mobile software authentication protocols * A new "model" of the threat to replace the "Maginot Line" model and more . . .

  16. Consequence modeling using the fire dynamics simulator.

    PubMed

    Ryder, Noah L; Sutula, Jason A; Schemel, Christopher F; Hamer, Andrew J; Van Brunt, Vincent

    2004-11-11

    The use of Computational Fluid Dynamics (CFD) and in particular Large Eddy Simulation (LES) codes to model fires provides an efficient tool for the prediction of large-scale effects that include plume characteristics, combustion product dispersion, and heat effects to adjacent objects. This paper illustrates the strengths of the Fire Dynamics Simulator (FDS), an LES code developed by the National Institute of Standards and Technology (NIST), through several small and large-scale validation runs and process safety applications. The paper presents two fire experiments--a small room fire and a large (15 m diameter) pool fire. The model results are compared to experimental data and demonstrate good agreement between the models and data. The validation work is then extended to demonstrate applicability to process safety concerns by detailing a model of a tank farm fire and a model of the ignition of a gaseous fuel in a confined space. In this simulation, a room was filled with propane, given time to disperse, and was then ignited. The model yields accurate results of the dispersion of the gas throughout the space. This information can be used to determine flammability and explosive limits in a space and can be used in subsequent models to determine the pressure and temperature waves that would result from an explosion. The model dispersion results were compared to an experiment performed by Factory Mutual. Using the above examples, this paper will demonstrate that FDS is ideally suited to build realistic models of process geometries in which large scale explosion and fire failure risks can be evaluated with several distinct advantages over more traditional CFD codes. Namely, transient solutions to fire and explosion growth can be produced with less sophisticated hardware (lower cost) than needed for traditional CFD codes (PC type computer versus UNIX workstation) and can be solved for longer time histories (on the order of hundreds of seconds of computed time) with minimal computer resources and length of model run. Additionally, results that are produced can be analyzed, viewed, and tabulated during and following a model run within a PC environment. There are some tradeoffs, however, as rapid computations in PCs may require a sacrifice in the grid resolution or in the sub-grid modeling, depending on the size of the geometry modeled.

  17. Near-optimal protocols in complex nonequilibrium transformations

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...

    2016-08-29

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.

  18. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  19. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues.

    PubMed

    Farisco, Michele; Kotaleski, Jeanette H; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain's operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs.

  20. Engineering a Functional Small RNA Negative Autoregulation Network with Model-Guided Design.

    PubMed

    Hu, Chelsea Y; Takahashi, Melissa K; Zhang, Yan; Lucks, Julius B

    2018-05-22

    RNA regulators are powerful components of the synthetic biology toolbox. Here, we expand the repertoire of synthetic gene networks built from these regulators by constructing a transcriptional negative autoregulation (NAR) network out of small RNAs (sRNAs). NAR network motifs are core motifs of natural genetic networks, and are known for reducing network response time and steady state signal. Here we use cell-free transcription-translation (TX-TL) reactions and a computational model to design and prototype sRNA NAR constructs. Using parameter sensitivity analysis, we design a simple set of experiments that allow us to accurately predict NAR function in TX-TL. We transfer successful network designs into Escherichia coli and show that our sRNA transcriptional network reduces both network response time and steady-state gene expression. This work broadens our ability to construct increasingly sophisticated RNA genetic networks with predictable function.
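
    As a back-of-the-envelope companion to the modeling described above, the sketch below integrates a generic transcriptional negative-autoregulation ODE and compares it with an unregulated gene; the Hill-type production term and all parameter values are illustrative assumptions, not the TX-TL model fit in the paper.

      # Toy ODE comparison of negatively autoregulated vs. unregulated expression.
      # Generic NAR model with made-up parameters, not the paper's TX-TL model.
      import numpy as np
      from scipy.integrate import solve_ivp

      beta, gamma, K, n = 10.0, 0.5, 1.0, 2.0   # max rate, decay, threshold, Hill coefficient

      def unregulated(t, x):
          return beta - gamma * x[0]

      def autoregulated(t, x):
          return beta / (1.0 + (x[0] / K) ** n) - gamma * x[0]

      t_eval = np.linspace(0.0, 20.0, 200)
      sol_u = solve_ivp(unregulated, (0.0, 20.0), [0.0], t_eval=t_eval)
      sol_n = solve_ivp(autoregulated, (0.0, 20.0), [0.0], t_eval=t_eval)
      print("steady state, unregulated  :", sol_u.y[0, -1])
      print("steady state, autoregulated:", sol_n.y[0, -1])

    Comparing the two trajectories shows the qualitative NAR signatures reported above: a lower steady-state level and a faster approach to it.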

  1. Electric Motor Thermal Management R&D; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennion, Kevin

    2015-06-09

    Thermal constraints place significant limitations on how electric motors ultimately perform. Without the ability to remove heat, the motor cannot operate without sacrificing performance, efficiency, and reliability. Finite element analysis and computational fluid dynamics modeling approaches are being increasingly utilized in the design and analysis of electric motors. As the models become more sophisticated, it is important to have detailed and accurate knowledge of both the passive thermal performance and the active cooling performance. In this work, we provide an overview of research characterizing both passive and active thermal elements related to electric motor thermal management. To better characterize the passive thermal performance, work is being performed to measure motor material thermal properties and thermal contact resistances. The active cooling performance of automatic transmission fluid (ATF) jets is also being measured to better understand the heat transfer coefficients of ATF impinging on motor copper windings.

  2. JSPAM: A restricted three-body code for simulating interacting galaxies

    NASA Astrophysics Data System (ADS)

    Wallin, J. F.; Holincheck, A. J.; Harvey, A.

    2016-07-01

    Restricted three-body codes have a proven ability to recreate much of the disturbed morphology of actual interacting galaxies. As more sophisticated n-body models were developed and computer speed increased, restricted three-body codes fell out of favor. However, their supporting role for performing wide searches of parameter space when fitting orbits to real systems demonstrates a continuing need for their use. Here we present the model and algorithm used in the JSPAM code. A precursor of this code was originally described in 1990, and was called SPAM. We have recently updated the software with an alternate potential and a treatment of dynamical friction to more closely mimic the results from n-body tree codes. The code is released publicly for use under the terms of the Academic Free License ("AFL") v. 3.0 and has been added to the Astrophysics Source Code Library.
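
    The restricted three-body idea is simple enough to sketch: massless test particles respond to the (softened) gravity of two point masses representing the galaxies, while the companion follows a prescribed path. The toy below uses a straight-line flyby and made-up units; it is not the JSPAM algorithm, which adds an alternate potential and dynamical friction.

      # Minimal restricted three-body sketch: a ring of massless test particles
      # around a primary is perturbed by a companion on a prescribed flyby.
      # Units, softening, and the leapfrog step are illustrative choices.
      import numpy as np

      G, eps = 1.0, 0.1          # gravitational constant and Plummer softening
      m1, m2 = 1.0, 0.3          # primary and companion masses
      dt, n_steps = 0.01, 2000

      def accel(pos, centres, masses):
          a = np.zeros_like(pos)
          for c, m in zip(centres, masses):
              d = c - pos
              r2 = np.sum(d * d, axis=1) + eps ** 2
              a += G * m * d / r2[:, None] ** 1.5
          return a

      def companion(t):
          # Straight-line flyby as a crude stand-in for the companion's orbit.
          return np.array([-4.0 + 1.0 * t, 2.0, 0.0])

      phi = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
      r0 = 1.0
      pos = np.column_stack([r0 * np.cos(phi), r0 * np.sin(phi), np.zeros_like(phi)])
      v_circ = np.sqrt(G * m1 / r0)
      vel = np.column_stack([-v_circ * np.sin(phi), v_circ * np.cos(phi), np.zeros_like(phi)])

      t = 0.0
      for _ in range(n_steps):                       # leapfrog (kick-drift-kick)
          a = accel(pos, [np.zeros(3), companion(t)], [m1, m2])
          vel += 0.5 * dt * a
          pos += dt * vel
          t += dt
          a = accel(pos, [np.zeros(3), companion(t)], [m1, m2])
          vel += 0.5 * dt * a

      print("final particle spread (x, y, z):", pos.std(axis=0))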

  3. Large-Scale Brain Simulation and Disorders of Consciousness. Mapping Technical and Conceptual Issues

    PubMed Central

    Farisco, Michele; Kotaleski, Jeanette H.; Evers, Kathinka

    2018-01-01

    Modeling and simulations have gained a leading position in contemporary attempts to describe, explain, and quantitatively predict the human brain’s operations. Computer models are highly sophisticated tools developed to achieve an integrated knowledge of the brain with the aim of overcoming the actual fragmentation resulting from different neuroscientific approaches. In this paper we investigate the plausibility of simulation technologies for emulation of consciousness and the potential clinical impact of large-scale brain simulation on the assessment and care of disorders of consciousness (DOCs), e.g., Coma, Vegetative State/Unresponsive Wakefulness Syndrome, Minimally Conscious State. Notwithstanding their technical limitations, we suggest that simulation technologies may offer new solutions to old practical problems, particularly in clinical contexts. We take DOCs as an illustrative case, arguing that the simulation of neural correlates of consciousness is potentially useful for improving treatments of patients with DOCs. PMID:29740372

  4. Debating complexity in modeling

    USGS Publications Warehouse

    Hunt, Randall J.; Zheng, Chunmiao

    1999-01-01

    As scientists trying to understand the natural world, how should our effort be apportioned? We know that the natural world is characterized by complex and interrelated processes. Yet do we need to explicitly incorporate these intricacies to perform the tasks we are charged with? In this era of expanding computer power and development of sophisticated preprocessors and postprocessors, are bigger machines making better models? Put another way, do we understand the natural world better now with all these advancements in our simulation ability? Today the public's patience for long-term projects producing indeterminate results is wearing thin. This increases pressure on the investigator to use the appropriate technology efficiently. On the other hand, bringing scientific results into the legal arena opens up a new dimension to the issue: to the layperson, a tool that includes more of the complexity known to exist in the real world is expected to provide the more scientifically valid answer.

  5. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    DOE PAGES

    Green, Martin L.; Choi, C. L.; Hattrick-Simpers, J. R.; ...

    2017-03-28

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. As a result, a major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  6. PFAAT version 2.0: a tool for editing, annotating, and analyzing multiple sequence alignments.

    PubMed

    Caffrey, Daniel R; Dana, Paul H; Mathur, Vidhya; Ocano, Marco; Hong, Eun-Jong; Wang, Yaoyu E; Somaroo, Shyamal; Caffrey, Brian E; Potluri, Shobha; Huang, Enoch S

    2007-10-11

    By virtue of their shared ancestry, homologous sequences are similar in their structure and function. Consequently, multiple sequence alignments are routinely used to identify trends that relate to function. This type of analysis is particularly productive when it is combined with structural and phylogenetic analysis. Here we describe the release of PFAAT version 2.0, a tool for editing, analyzing, and annotating multiple sequence alignments. Support for multiple annotations is a key component of this release as it provides a framework for most of the new functionalities. The sequence annotations are accessible from the alignment and tree, where they are typically used to label sequences or hyperlink them to related databases. Sequence annotations can be created manually or extracted automatically from UniProt entries. Once a multiple sequence alignment is populated with sequence annotations, sequences can be easily selected and sorted through a sophisticated search dialog. The selected sequences can be further analyzed using statistical methods that explicitly model relationships between the sequence annotations and residue properties. Residue annotations are accessible from the alignment viewer and are typically used to designate binding sites or properties for a particular residue. Residue annotations are also searchable, and allow one to quickly select alignment columns for further sequence analysis, e.g. computing percent identities. Other features include: novel algorithms to compute sequence conservation, mapping conservation scores to a 3D structure in Jmol, displaying secondary structure elements, and sorting sequences by residue composition. PFAAT provides a framework whereby end-users can specify knowledge for a protein family in the form of annotation. The annotations can be combined with sophisticated analysis to test hypotheses that relate to sequence, structure and function.
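
    As a tiny illustration of the column-restricted analyses mentioned above (e.g. computing percent identities over selected alignment columns), the sketch below is a generic helper written for this summary, not PFAAT code.

      # Toy percent-identity calculation over selected alignment columns.
      # Column positions and sequences are illustrative; this is not PFAAT code.
      def percent_identity(seq_a, seq_b, columns=None):
          """Percent identity between two aligned sequences over the given columns.
          Columns where both sequences have a gap are ignored."""
          if columns is None:
              columns = range(len(seq_a))
          pairs = [(seq_a[i], seq_b[i]) for i in columns
                   if not (seq_a[i] == "-" and seq_b[i] == "-")]
          matches = sum(a == b for a, b in pairs)
          return 100.0 * matches / len(pairs) if pairs else 0.0

      print(percent_identity("MK-TAYIAK", "MKLTA-IAK"))   # identity over all columns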

  7. Game Theory of Mind

    PubMed Central

    Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.

    2008-01-01

    This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchal game theory and coevolution. PMID:19112488
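
    A toy flavor of bounded recursion can be sketched in a few lines: a level-0 player acts at random and a level-k player best-responds to a level-(k-1) opponent in a one-shot stag hunt. The payoff matrix and the uniform level-0 prior are assumptions made here for illustration; the paper's generative models of sequential play and sophistication inference are considerably richer.

      # Toy bounded-recursion ("level-k") reasoning on a one-shot stag hunt.
      # Payoffs and the level-0 prior are illustrative assumptions.
      import numpy as np

      # Row player's payoffs: rows = my action (stag, hare), cols = opponent's action.
      payoff = np.array([[4.0, 0.0],
                         [3.0, 3.0]])

      def best_response(opponent_probs):
          """One-hot distribution over (stag, hare) maximizing expected payoff."""
          expected = payoff @ opponent_probs
          probs = np.zeros(2)
          probs[np.argmax(expected)] = 1.0
          return probs

      def level_k_policy(k):
          """Level-0 plays uniformly; level-k best-responds to a level-(k-1) opponent."""
          policy = np.array([0.5, 0.5])
          for _ in range(k):
              policy = best_response(policy)
          return policy

      for k in range(4):
          print(f"level-{k} policy (stag, hare):", level_k_policy(k))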

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoehler, M; McCallen, D; Noble, C

    The analysis, and subsequent retrofit, of concrete arch bridges during recent years has relied heavily on the use of computational simulation. For seismic analysis in particular, computer simulation, typically utilizing linear approximations of structural behavior, has become standard practice. This report presents the results of a comprehensive study of the significance of model sophistication (i.e. linear vs. nonlinear) and pertinent modeling assumptions on the dynamic response of concrete arch bridges. The study uses the Bixby Creek Bridge, located in California, as a case study. In addition to presenting general recommendations for analysis of this class of structures, this report provides an independent evaluation of the proposed seismic retrofit for the Bixby Creek Bridge. Results from the study clearly illustrate a reduction of displacement drifts and redistribution of member forces brought on by the inclusion of material nonlinearity. The analyses demonstrate that accurate modeling of expansion joints, for the Bixby Creek Bridge in particular, is critical to achieve representative modal and transient behavior. The inclusion of near-field displacement pulses in ground motion records was shown to significantly increase demand on the relatively softer, longer period Bixby Creek Bridge arch. Stiffer, shorter period arches, however, are more likely susceptible to variable support motions arising from the canyon topography typical for this class of bridges.

  9. Applying Computerized-Scoring Models of Written Biological Explanations across Courses and Colleges: Prospects and Limitations

    PubMed Central

    Ha, Minsu; Nehm, Ross H.; Urban-Lurain, Mark; Merrill, John E.

    2011-01-01

    Our study explored the prospects and limitations of using machine-learning software to score introductory biology students’ written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors’ and nonmajors’ written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of “near-perfect” agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations. PMID:22135372

  10. Applying computerized-scoring models of written biological explanations across courses and colleges: prospects and limitations.

    PubMed

    Ha, Minsu; Nehm, Ross H; Urban-Lurain, Mark; Merrill, John E

    2011-01-01

    Our study explored the prospects and limitations of using machine-learning software to score introductory biology students' written explanations of evolutionary change. We investigated three research questions: 1) Do scoring models built using student responses at one university function effectively at another university? 2) How many human-scored student responses are needed to build scoring models suitable for cross-institutional application? 3) What factors limit computer-scoring efficacy, and how can these factors be mitigated? To answer these questions, two biology experts scored a corpus of 2556 short-answer explanations (from biology majors and nonmajors) at two universities for the presence or absence of five key concepts of evolution. Human- and computer-generated scores were compared using kappa agreement statistics. We found that machine-learning software was capable in most cases of accurately evaluating the degree of scientific sophistication in undergraduate majors' and nonmajors' written explanations of evolutionary change. In cases in which the software did not perform at the benchmark of "near-perfect" agreement (kappa > 0.80), we located the causes of poor performance and identified a series of strategies for their mitigation. Machine-learning software holds promise as an assessment tool for use in undergraduate biology education, but like most assessment tools, it is also characterized by limitations.
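
    The agreement benchmark used above is Cohen's kappa; a minimal sketch of that computation with scikit-learn is shown below, using made-up human and machine ratings for a single concept.

      # Human/machine agreement for one concept, measured with Cohen's kappa
      # (the study's "near-perfect" benchmark is kappa > 0.80).  Ratings are made up.
      from sklearn.metrics import cohen_kappa_score

      human   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]   # concept present (1) / absent (0)
      machine = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1]

      kappa = cohen_kappa_score(human, machine)
      print(f"kappa = {kappa:.2f}",
            "(near-perfect)" if kappa > 0.80 else "(below benchmark)")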

  11. Integrative multicellular biological modeling: a case study of 3D epidermal development using GPU algorithms

    PubMed Central

    2010-01-01

    Background Simulation of sophisticated biological models requires considerable computational power. These models typically integrate together numerous biological phenomena such as spatially-explicit heterogeneous cells, cell-cell interactions, cell-environment interactions and intracellular gene networks. The recent advent of programming for graphical processing units (GPU) opens up the possibility of developing more integrative, detailed and predictive biological models while at the same time decreasing the computational cost to simulate those models. Results We construct a 3D model of epidermal development and provide a set of GPU algorithms that executes significantly faster than sequential central processing unit (CPU) code. We provide a parallel implementation of the subcellular element method for individual cells residing in a lattice-free spatial environment. Each cell in our epidermal model includes an internal gene network, which integrates cellular interaction of Notch signaling together with environmental interaction of basement membrane adhesion, to specify cellular state and behaviors such as growth and division. We take a pedagogical approach to describing how modeling methods are efficiently implemented on the GPU including memory layout of data structures and functional decomposition. We discuss various programmatic issues and provide a set of design guidelines for GPU programming that are instructive to avoid common pitfalls as well as to extract performance from the GPU architecture. Conclusions We demonstrate that GPU algorithms represent a significant technological advance for the simulation of complex biological models. We further demonstrate with our epidermal model that the integration of multiple complex modeling methods for heterogeneous multicellular biological processes is both feasible and computationally tractable using this new technology. We hope that the provided algorithms and source code will be a starting point for modelers to develop their own GPU implementations, and encourage others to implement their modeling methods on the GPU and to make that code available to the wider community. PMID:20696053

  12. Landslides Identification Using Airborne Laser Scanning Data Derived Topographic Terrain Attributes and Support Vector Machine Classification

    NASA Astrophysics Data System (ADS)

    Pawłuszek, Kamila; Borkowski, Andrzej

    2016-06-01

    Since the availability of high-resolution Airborne Laser Scanning (ALS) data, substantial progress has been made in geomorphological research, especially in landslide analysis. First and second order derivatives of the Digital Terrain Model (DTM) have become a popular and powerful tool in landslide inventory mapping. Nevertheless, automatic landslide mapping based on sophisticated classifiers, including Support Vector Machines (SVM), Artificial Neural Networks or Random Forests, is often computationally time consuming. The objective of this research is to deeply explore the topographic information provided by ALS data and overcome the computational time limitation. For this reason, an extended set of topographic features and Principal Component Analysis (PCA) were used to reduce redundant information. The proposed novel approach was tested on a susceptible area affected by more than 50 landslides located on Rożnów Lake in the Carpathian Mountains, Poland. The initial seven PCA components, carrying 90% of the total variability of the original topographic attributes, were used for SVM classification. Comparing the results with the landslide inventory map, the average user's accuracy (UA), producer's accuracy (PA), and overall accuracy (OA) were calculated for the two models. For the PCA-feature-reduced model, UA, PA, and OA were found to be 72%, 76%, and 72%, respectively. Similarly, UA, PA, and OA for the non-reduced original topographic model were 74%, 77%, and 74%, respectively. Using the initial seven PCA components instead of the twenty original topographic attributes does not significantly change the identification accuracy but reduces the computational time.
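
    A compact sketch of the PCA-then-SVM pipeline follows, with random stand-in data in place of the ALS-derived topographic attributes and landslide labels; only the structure (standardize, keep seven components, classify with an RBF SVM) mirrors the study.

      # Sketch of the PCA-feature-reduced SVM classification described above,
      # using random stand-in data instead of real topographic attributes.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(2000, 20))      # 20 topographic attributes (stand-in)
      y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=2000) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # Keep the first seven principal components, as in the study, then classify.
      model = make_pipeline(StandardScaler(), PCA(n_components=7), SVC(kernel="rbf"))
      model.fit(X_tr, y_tr)
      print("overall accuracy:", accuracy_score(y_te, model.predict(X_te)))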

  13. Comparison of stresses on homogeneous spheroids in the optical stretcher computed with geometrical optics and generalized Lorenz-Mie theory.

    PubMed

    Boyde, Lars; Ekpenyong, Andrew; Whyte, Graeme; Guck, Jochen

    2012-11-20

    We present two electromagnetic frameworks to compare the surface stresses on spheroidal particles in the optical stretcher (a dual-beam laser trap that can be used to capture and deform biological cells). The first model is based on geometrical optics (GO) and limited in its applicability to particles that are much greater than the incident wavelength. The second framework is more sophisticated and hinges on the generalized Lorenz-Mie theory (GLMT). Despite the difference in complexity between both theories, the stress profiles computed with GO and GLMT are in good agreement with each other (relative errors are on the order of 1-10%). Both models predict a diminishing of the stresses for larger wavelengths and a strong increase of the stresses for shorter laser-cell distances. Results indicate that surface stresses on a spheroid with an aspect ratio of 1.2 hardly differ from the stresses on a sphere of similar size. Knowledge of the surface stresses and whether or not they redistribute during the stretching process is of crucial importance in real-time applications of the stretcher that aim to discern the viscoelastic properties of cells for purposes of cell characterization, sorting, and medical diagnostics.

  14. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  15. Design approach of an aquaculture cage system for deployment in the constructed channel flow environments of a power plant

    PubMed Central

    Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.

    2018-01-01

    This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is supposedly clean; however, it includes elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities. PMID:29897954

  16. Recent advances in mathematical modeling of developmental abnormalities using mechanistic information.

    PubMed

    Kavlock, R J

    1997-01-01

    During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.
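
    To make the benchmark dose idea concrete, the toy below fits a simple log-logistic dose-response curve to made-up incidence data and solves for the dose giving 10% extra risk; the functional form, the fixed background, and all numbers are illustrative assumptions rather than a regulatory BMD workflow.

      # Toy benchmark-dose (BMD) calculation: fit a two-parameter log-logistic
      # dose-response to made-up incidence data, then find the dose that gives
      # 10% extra risk over background.  Purely illustrative of the BMD idea.
      import numpy as np
      from scipy.optimize import curve_fit, brentq

      doses     = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
      incidence = np.array([0.02, 0.05, 0.10, 0.28, 0.55, 0.85])   # fractions (made up)

      def log_logistic(d, b, e):
          background = 0.02                          # fixed at the control incidence
          z = np.clip(b * (np.log(e) - np.log(np.maximum(d, 1e-9))), -50.0, 50.0)
          return background + (1.0 - background) / (1.0 + np.exp(z))

      (b_hat, e_hat), _ = curve_fit(log_logistic, doses, incidence,
                                    p0=[1.0, 30.0], bounds=([0.1, 1.0], [10.0, 1000.0]))

      def extra_risk(d):
          p0 = log_logistic(0.0, b_hat, e_hat)
          return (log_logistic(d, b_hat, e_hat) - p0) / (1.0 - p0)

      bmd10 = brentq(lambda d: extra_risk(d) - 0.10, 1e-6, doses.max())
      print(f"fitted slope={b_hat:.2f}, ED50={e_hat:.1f}, BMD10={bmd10:.1f}")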

  17. Fast Dating Using Least-Squares Criteria and Algorithms.

    PubMed

    To, Thu-Hien; Jung, Matthieu; Lycett, Samantha; Gascuel, Olivier

    2016-01-01

    Phylogenies provide a useful way to understand the evolutionary history of genetic samples, and data sets with more than a thousand taxa are becoming increasingly common, notably with viruses (e.g., human immunodeficiency virus (HIV)). Dating ancestral events is one of the first, essential goals with such data. However, current sophisticated probabilistic approaches struggle to handle data sets of this size. Here, we present very fast dating algorithms, based on a Gaussian model closely related to the Langley-Fitch molecular-clock model. We show that this model is robust to uncorrelated violations of the molecular clock. Our algorithms apply to serial data, where the tips of the tree have been sampled through time. They estimate the substitution rate and the dates of all ancestral nodes. When the input tree is unrooted, they can provide an estimate for the root position, thus representing a new, practical alternative to the standard rooting methods (e.g., midpoint). Our algorithms exploit the tree (recursive) structure of the problem at hand, and the close relationships between least-squares and linear algebra. We distinguish between an unconstrained setting and the case where the temporal precedence constraint (i.e., an ancestral node must be older than its daughter nodes) is accounted for. With rooted trees, the former is solved using linear algebra in linear computing time (i.e., proportional to the number of taxa), while the resolution of the latter, constrained setting, is based on an active-set method that runs in nearly linear time. With unrooted trees the computing time becomes (nearly) quadratic (i.e., proportional to the square of the number of taxa). In all cases, very large input trees (>10,000 taxa) can easily be processed and transformed into time-scaled trees. We compare these algorithms to standard methods (root-to-tip, r8s version of Langley-Fitch method, and BEAST). Using simulated data, we show that their estimation accuracy is similar to that of the most sophisticated methods, while their computing time is much faster. We apply these algorithms to a large data set comprising 1194 strains of Influenza virus from the pdm09 H1N1 Human pandemic. Again, the results show that these algorithms provide a very fast alternative with results similar to those of other computer programs. These algorithms are implemented in the LSD software (least-squares dating), which can be downloaded from http://www.atgc-montpellier.fr/LSD/, along with all our data sets and detailed results. An Online Appendix, providing additional algorithm descriptions, tables, and figures can be found in the Supplementary Material available on Dryad at http://dx.doi.org/10.5061/dryad.968t3. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
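
    For orientation, the simplest of the standard baselines mentioned above (root-to-tip regression) can be written in a few lines: regress root-to-tip divergence on sampling date, read the substitution rate off the slope and the root date off the x-intercept. The distances and dates below are made up; this is not the LSD least-squares algorithm itself.

      # Toy root-to-tip regression dating (a standard baseline, not LSD itself).
      # Distances and sampling dates are made up for illustration.
      import numpy as np

      dates = np.array([2009.2, 2009.5, 2009.9, 2010.3, 2010.8, 2011.1])        # tip dates
      root_to_tip = np.array([0.0031, 0.0042, 0.0055, 0.0071, 0.0088, 0.0100])  # subst./site

      rate, intercept = np.polyfit(dates, root_to_tip, 1)   # least-squares line
      t_root = -intercept / rate                            # x-intercept: root date estimate
      print(f"substitution rate ~ {rate:.4f} subst./site/year, root date ~ {t_root:.1f}")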

  18. Fast Dating Using Least-Squares Criteria and Algorithms

    PubMed Central

    To, Thu-Hien; Jung, Matthieu; Lycett, Samantha; Gascuel, Olivier

    2016-01-01

    Phylogenies provide a useful way to understand the evolutionary history of genetic samples, and data sets with more than a thousand taxa are becoming increasingly common, notably with viruses (e.g., human immunodeficiency virus (HIV)). Dating ancestral events is one of the first, essential goals with such data. However, current sophisticated probabilistic approaches struggle to handle data sets of this size. Here, we present very fast dating algorithms, based on a Gaussian model closely related to the Langley–Fitch molecular-clock model. We show that this model is robust to uncorrelated violations of the molecular clock. Our algorithms apply to serial data, where the tips of the tree have been sampled through time. They estimate the substitution rate and the dates of all ancestral nodes. When the input tree is unrooted, they can provide an estimate for the root position, thus representing a new, practical alternative to the standard rooting methods (e.g., midpoint). Our algorithms exploit the tree (recursive) structure of the problem at hand, and the close relationships between least-squares and linear algebra. We distinguish between an unconstrained setting and the case where the temporal precedence constraint (i.e., an ancestral node must be older than its daughter nodes) is accounted for. With rooted trees, the former is solved using linear algebra in linear computing time (i.e., proportional to the number of taxa), while the resolution of the latter, constrained setting, is based on an active-set method that runs in nearly linear time. With unrooted trees the computing time becomes (nearly) quadratic (i.e., proportional to the square of the number of taxa). In all cases, very large input trees (>10,000 taxa) can easily be processed and transformed into time-scaled trees. We compare these algorithms to standard methods (root-to-tip, r8s version of Langley–Fitch method, and BEAST). Using simulated data, we show that their estimation accuracy is similar to that of the most sophisticated methods, while their computing time is much faster. We apply these algorithms to a large data set comprising 1194 strains of Influenza virus from the pdm09 H1N1 Human pandemic. Again, the results show that these algorithms provide a very fast alternative with results similar to those of other computer programs. These algorithms are implemented in the LSD software (least-squares dating), which can be downloaded from http://www.atgc-montpellier.fr/LSD/, along with all our data sets and detailed results. An Online Appendix, providing additional algorithm descriptions, tables, and figures can be found in the Supplementary Material available on Dryad at http://dx.doi.org/10.5061/dryad.968t3. PMID:26424727

  19. The CSIRO Mk3L climate system model v1.0 coupled to the CABLE land surface scheme v1.4b: evaluation of the control climatology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, Jiafu; Phipps, S.J.; Pitman, A.J.

    The CSIRO Mk3L climate system model, a reduced-resolution coupled general circulation model, has previously been described in this journal. The model is configured for millennium scale or multiple century scale simulations. This paper reports the impact of replacing the relatively simple land surface scheme that is the default parameterisation in Mk3L with a sophisticated land surface model that simulates the terrestrial energy, water and carbon balance in a physically and biologically consistent way. An evaluation of the new model's near-surface climatology highlights strengths and weaknesses, but overall the atmospheric variables, including the near-surface air temperature and precipitation, are simulated well. The impact of the more sophisticated land surface model on existing variables is relatively small, but generally positive. More significantly, the new land surface scheme allows an examination of surface carbon-related quantities including net primary productivity which adds significantly to the capacity of Mk3L. Overall, results demonstrate that this reduced-resolution climate model is a good foundation for exploring long time scale phenomena. The addition of the more sophisticated land surface model enables an exploration of important Earth System questions including land cover change and abrupt changes in terrestrial carbon storage.

  20. Computational Models Used to Assess US Tobacco Control Policies.

    PubMed

    Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C

    2017-11-01

    Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and maximize overall population-level health benefits by considering the real-world context in which tobacco control interventions are implemented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".

  2. Computer Assisted Navigation in Knee Arthroplasty

    PubMed Central

    Bae, Dae Kyung

    2011-01-01

    Computer assisted surgery (CAS) was used to improve the positioning of implants during total knee arthroplasty (TKA). Most studies have reported that computer assisted navigation reduced the outliers of alignment and component malpositioning. However, additional sophisticated studies are necessary to determine if the improvement of alignment will improve long-term clinical results and increase the survival rate of the implant. Knowledge of CAS-TKA technology and understanding the advantages and limitations of navigation are crucial to the successful application of the CAS technique in TKA. In this article, we review the components of navigation, classification of the system, surgical method, potential error, clinical results, advantages, and disadvantages. PMID:22162787

  3. Sydney Observatory and astronomy teaching in the 90s

    NASA Astrophysics Data System (ADS)

    Lomb, N.

    1996-05-01

    Computers and the Internet have created a revolution in the way astronomy can be communicated to the public. At Sydney Observatory we make full use of these recent developments. In our lecture room a variety of sophisticated computer programs can show, with the help of a projection TV system, the appearance and motion of the sky at any place, date or time. The latest HST images obtained from the Internet can be shown, as can images taken through our own Meade 16 inch telescope. This recently installed computer-controlled telescope with its accurate pointing is an ideal instrument for a light-polluted site such as ours.

  4. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
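
    A minimal sketch of the cumulative sum idea mentioned above is given below as a tabular CUSUM on daily control-serum results; the target value, standard deviation, slack and decision limit are illustrative choices, not the parameters of the original system.

      # Tabular CUSUM check on daily control-serum results, in the spirit of the
      # cumulative sum methods mentioned above.  Target, SD, slack (k) and
      # decision limit (h) are illustrative choices.
      target, sd = 5.0, 0.1          # assigned value and between-run SD of the control serum
      k, h = 0.5, 4.0                # slack and decision limit, in SD units

      results = [5.02, 4.97, 5.05, 5.11, 5.14, 5.09, 5.16, 5.18]   # mmol/L (made up)

      c_plus = c_minus = 0.0
      for day, x in enumerate(results, start=1):
          z = (x - target) / sd
          c_plus  = max(0.0, c_plus + z - k)     # accumulates upward drift
          c_minus = min(0.0, c_minus + z + k)    # accumulates downward drift
          flag = "OUT OF CONTROL" if c_plus > h or c_minus < -h else "ok"
          print(f"day {day}: result={x:.2f}  C+={c_plus:.2f}  C-={c_minus:.2f}  {flag}")

    With these numbers the upward sum crosses the decision limit on the last day, which is exactly the slow-drift behaviour a CUSUM scheme is designed to catch earlier than a fixed-level rule.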

  5. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 collinear tunable acousto-optic filter, a 61 inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 inch spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.

  6. SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF) making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e. using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dim. array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems. The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
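
    The partition-then-aggregate pattern described above can be sketched with plain PySpark RDDs: each time step's observation/model grid pair is scored in parallel and the per-step biases are averaged. The grids below are random stand-ins generated on the workers; this is not SciSpark's sRDD or its HDF/netCDF ingest.

      # Minimal PySpark sketch of the partition-then-aggregate pattern described
      # above, using random stand-in grids rather than real satellite/model data.
      import numpy as np
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("toy-model-evaluation").getOrCreate()
      sc = spark.sparkContext

      def load_pair(t):
          """Stand-in for reading the observation and model grids for time step t."""
          rng = np.random.default_rng(t)
          obs = rng.normal(loc=290.0, scale=5.0, size=(90, 180))      # e.g. temperature field
          model = obs + rng.normal(loc=0.3, scale=1.0, size=obs.shape)
          return t, obs, model

      def step_bias(record):
          t, obs, model = record
          return float(np.mean(model - obs))

      biases = sc.parallelize(range(100), numSlices=10).map(load_pair).map(step_bias)
      print("mean model bias over 100 time steps:", biases.mean())
      spark.stop()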

  7. 1D-3D hybrid modeling-from multi-compartment models to full resolution models in space and time.

    PubMed

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail, in order to cope with the attached computational cost. Hence for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cell or networks with relatively small numbers of neurons, general purpose simulators allow for space and time-dependent simulations of electrical signal processing, based on the cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space and time-dependent, simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches, depending on the network size, could establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D-models using, e.g., the NEURON simulator, which couples to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics.
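
    To fix ideas about the reduced 1D side of such a hybrid scheme, the sketch below integrates the passive cable equation on an unbranched chain of compartments with an explicit finite-difference step; the membrane parameters and injected current are generic textbook-style values, not taken from the paper, and the 1D/3D mapping itself is not shown.

      # Explicit finite-difference sketch of the passive cable equation on an
      # unbranched compartment chain -- the kind of reduced 1D description that a
      # hybrid approach couples to full 3D simulations.  Parameters are generic.
      import numpy as np

      n, dx = 200, 1e-4          # compartments, compartment length [m]
      a = 1e-6                   # cable radius [m]
      cm = 1e-2                  # membrane capacitance [F/m^2]
      gm = 1.0 / 3.0             # membrane conductance [S/m^2]
      ri = 1.0                   # axial resistivity [ohm*m]
      dt, steps = 5e-6, 20000    # explicit step well below the stability limit

      v = np.zeros(n)            # membrane potential relative to rest [V]
      coupling = a / (2.0 * ri)  # axial coupling coefficient
      inj = np.zeros(n)
      inj[0] = 1e-1              # injected membrane current density at one end [A/m^2]

      for _ in range(steps):
          d2v = np.zeros(n)
          d2v[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / dx ** 2
          d2v[0] = (v[1] - v[0]) / dx ** 2               # sealed-end boundaries
          d2v[-1] = (v[-2] - v[-1]) / dx ** 2
          v += dt * (coupling * d2v - gm * v + inj) / cm

      print("voltage at injection site after 100 ms: %.2f mV" % (v[0] * 1e3))
      print("attenuation 50 compartments away: %.3f" % (v[50] / v[0]))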

  8. Numerical simulation of the modulation transfer function (MTF) in infrared focal plane arrays: simulation methodology and MTF optimization

    NASA Astrophysics Data System (ADS)

    Schuster, J.

    2018-02-01

    Military requirements demand single- and dual-color infrared (IR) imaging systems with both high resolution and sharp contrast. To quantify the performance of these imaging systems, a key measure of performance, the modulation transfer function (MTF), describes how well an optical system reproduces an object's contrast in the image plane at different spatial frequencies. At the center of an IR imaging system is the focal plane array (FPA). IR FPAs are hybrid structures consisting of a semiconductor detector pixel array, typically fabricated from HgCdTe, InGaAs or III-V superlattice materials, hybridized with heat/pressure to a silicon read-out integrated circuit (ROIC) with indium bumps on each pixel providing the mechanical and electrical connection. Due to the growing sophistication of the pixel arrays in these FPAs, sophisticated modeling techniques are required to predict, understand, and benchmark the pixel array MTF that contributes to the total imaging system MTF. To model the pixel array MTF, computationally exhaustive 2D and 3D numerical simulation approaches are required to correctly account for complex architectures and effects such as lateral diffusion from the pixel corners. It is paramount to accurately model the lateral diffusion (pixel crosstalk) as it can become the dominant mechanism limiting the detector MTF if not properly mitigated. Once the detector MTF has been simulated, it is directly decomposed into its constituent contributions to reveal exactly what is limiting the total detector MTF, providing a path for optimization. An overview of the MTF will be given and the simulation approach will be discussed in detail, along with how different simulation parameters affect the MTF calculation. Finally, MTF optimization strategies (crosstalk mitigation) will be discussed.
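
    The MTF itself is straightforward to compute once a line spread function (LSF) is available: it is the normalized magnitude of the LSF's Fourier transform. The sketch below uses a Gaussian LSF and a 15-micron pixel pitch as stand-ins for a simulated detector response.

      # MTF as the normalized magnitude of the Fourier transform of a line-spread
      # function.  The Gaussian LSF and 15-micron pitch are illustrative stand-ins.
      import numpy as np

      pitch = 15e-6                                   # assumed pixel pitch [m]
      x = (np.arange(-128, 128) + 0.5) * pitch / 8.0  # oversampled spatial axis
      sigma = 0.6 * pitch                             # assumed blur (diffusion + aperture)
      lsf = np.exp(-0.5 * (x / sigma) ** 2)

      mtf = np.abs(np.fft.rfft(lsf))
      mtf /= mtf[0]                                   # unity at zero spatial frequency
      freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])  # spatial frequency [cycles/m]

      nyquist = 1.0 / (2.0 * pitch)                   # detector Nyquist frequency
      idx = np.argmin(np.abs(freqs - nyquist))
      print(f"MTF at Nyquist ({nyquist / 1e3:.1f} cyc/mm): {mtf[idx]:.2f}")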

  9. ballaxy: web services for structural bioinformatics.

    PubMed

    Hildebrandt, Anna Katharina; Stöckel, Daniel; Fischer, Nina M; de la Garza, Luis; Krüger, Jens; Nickels, Stefan; Röttig, Marc; Schärfe, Charlotta; Schumann, Marcel; Thiel, Philipp; Lenhof, Hans-Peter; Kohlbacher, Oliver; Hildebrandt, Andreas

    2015-01-01

    Web-based workflow systems have gained considerable momentum in sequence-oriented bioinformatics. In structural bioinformatics, however, such systems are still relatively rare; while commercial stand-alone workflow applications are common in the pharmaceutical industry, academic researchers often still rely on command-line scripting to glue individual tools together. In this work, we address the problem of building a web-based system for workflows in structural bioinformatics. For the underlying molecular modelling engine, we opted for the BALL framework because of its extensive and well-tested functionality in the field of structural bioinformatics. The large number of molecular data structures and algorithms implemented in BALL allows for elegant and sophisticated development of new approaches in the field. We hence connected the versatile BALL library and its visualization and editing front end BALLView with the Galaxy workflow framework. The result, which we call ballaxy, enables the user to simply and intuitively create sophisticated pipelines for applications in structure-based computational biology, integrated into a standard tool for molecular modelling.  ballaxy consists of three parts: some minor modifications to the Galaxy system, a collection of tools and an integration into the BALL framework and the BALLView application for molecular modelling. Modifications to Galaxy will be submitted to the Galaxy project, and the BALL and BALLView integrations will be integrated in the next major BALL release. After acceptance of the modifications into the Galaxy project, we will publish all ballaxy tools via the Galaxy toolshed. In the meantime, all three components are available from http://www.ball-project.org/ballaxy. Also, docker images for ballaxy are available at https://registry.hub.docker.com/u/anhi/ballaxy/dockerfile/. ballaxy is licensed under the terms of the GPL. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Image segmentation using hidden Markov Gauss mixture models.

    PubMed

    Pyun, Kyungsuk; Lim, Johan; Won, Chee Sun; Gray, Robert M

    2007-07-01

    Image segmentation is an important tool in image processing and can serve as an efficient front end to sophisticated algorithms and thereby simplify subsequent processing. We develop a multiclass image segmentation method using hidden Markov Gauss mixture models (HMGMMs) and provide examples of segmentation of aerial images and textures. HMGMMs incorporate supervised learning, fitting the observation probability distribution given each class by a Gauss mixture estimated using vector quantization with a minimum discrimination information (MDI) distortion. We formulate the image segmentation problem using a maximum a posteriori criterion and find the hidden states that maximize the posterior density given the observation. We estimate both the hidden Markov parameters and hidden states using a stochastic expectation-maximization algorithm. Our results demonstrate that HMGMM provides better classification in terms of Bayes risk and spatial homogeneity of the classified objects than do several popular methods, including classification and regression trees, learning vector quantization, causal hidden Markov models (HMMs), and multiresolution HMMs. The computational load of HMGMM is similar to that of the causal HMM.
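
    A deliberately simplified sketch of the Gauss-mixture half of such a classifier is shown below: one mixture is fit per class on labelled training features and each test vector is assigned to the class with the highest likelihood. The hidden Markov spatial coupling and the MDI/vector-quantization estimation used in the paper are omitted, and the two-class feature data are synthetic.

      # Simplified per-class Gauss-mixture classification (no hidden Markov part).
      # The "texture features" for the two classes are synthetic stand-ins.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      train = {
          "water": rng.normal(loc=[0.0, 0.0], scale=0.4, size=(300, 2)),
          "urban": rng.normal(loc=[2.0, 1.5], scale=0.8, size=(300, 2)),
      }

      models = {cls: GaussianMixture(n_components=3, random_state=0).fit(X)
                for cls, X in train.items()}

      def classify(features):
          """Maximum-likelihood class label for each feature vector."""
          scores = np.column_stack([models[cls].score_samples(features) for cls in models])
          labels = list(models)
          return [labels[i] for i in np.argmax(scores, axis=1)]

      test = np.array([[0.1, -0.2], [1.8, 1.4], [1.0, 0.8]])
      print(classify(test))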

  11. Optimizing LX-17 Thermal Decomposition Model Parameters with Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Moore, Jason; McClelland, Matthew; Tarver, Craig; Hsu, Peter; Springer, H. Keo

    2017-06-01

    We investigate and model the cook-off behavior of LX-17 because this knowledge is critical to understanding system response in abnormal thermal environments. Thermal decomposition of LX-17 has been explored in conventional ODTX (One-Dimensional Time-to-eXplosion), PODTX (ODTX with pressure-measurement), TGA (thermogravimetric analysis), and DSC (differential scanning calorimetry) experiments using varied temperature profiles. These experimental data are the basis for developing multiple reaction schemes with coupled mechanics in LLNL's multi-physics hydrocode, ALE3D (Arbitrary Lagrangian-Eulerian code in 2D and 3D). We employ evolutionary algorithms to optimize reaction rate parameters on high performance computing clusters. Once experimentally validated, this model will be scalable to a number of applications involving LX-17 and can be used to develop more sophisticated experimental methods. Furthermore, the optimization methodology developed herein should be applicable to other high explosive materials. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC.
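
    As a toy stand-in for the parameter-optimization step, the sketch below uses an evolutionary optimizer (SciPy's differential evolution) to recover Arrhenius rate parameters from synthetic time-to-event data; the one-step rate law and all values are assumptions, far simpler than the multi-step ALE3D calibration described above.

      # Toy evolutionary optimization of Arrhenius parameters (ln A, Ea) against
      # synthetic time-to-event data; illustrative only, not the ALE3D workflow.
      import numpy as np
      from scipy.optimize import differential_evolution

      R = 8.314                                    # J/mol/K
      true_lnA, true_Ea = 30.0, 1.5e5              # "unknown" truth used to make the data

      temps = np.linspace(450.0, 550.0, 8)         # K
      def time_to_event(lnA, Ea, T):
          k = np.exp(lnA - Ea / (R * T))           # first-order rate constant
          return 1.0 / k                           # crude induction-time surrogate

      data = time_to_event(true_lnA, true_Ea, temps) * \
             np.exp(np.random.default_rng(0).normal(0.0, 0.05, temps.size))

      def objective(params):
          lnA, Ea = params
          pred = time_to_event(lnA, Ea, temps)
          return np.sum((np.log(pred) - np.log(data)) ** 2)   # fit in log space

      result = differential_evolution(objective, bounds=[(10.0, 50.0), (5e4, 3e5)], seed=1)
      print("recovered lnA, Ea:", result.x)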

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, Jason; Annoni, Jennifer; Hayman, Greg

    This paper presents the development of FAST.Farm, a new multiphysics tool applicable to engineering problems in research and industry involving wind farm performance and cost optimization that is needed to address the current underperformance, failures, and expenses plaguing the wind industry. Achieving wind cost-of-energy targets - which requires improvements in wind farm performance and reliability, together with reduced uncertainty and expenditures - has been impeded by the complicated nature of the wind farm design problem, especially the sophisticated interaction among atmospheric phenomena, wake dynamics, and array effects. FAST.Farm aims to balance the need for accurate modeling of the relevant physics for predicting power performance and loads while maintaining low computational cost to support a highly iterative and probabilistic design process and system-wide optimization. FAST.Farm makes use of FAST to model the aero-hydro-servo-elastics of distinct turbines in the wind farm, and it is based on some of the principles of the Dynamic Wake Meandering (DWM) model, but avoids many of the limitations of existing DWM implementations.

  13. Global Ocean Prediction with the HYbrid Coordinate Ocean Model, HYCOM

    NASA Astrophysics Data System (ADS)

    Chassignet, E.

    A broad partnership of institutions is collaborating in developing and demonstrating the performance and application of eddy-resolving, real-time global and Atlantic ocean prediction systems using the HYbrid Coordinate Ocean Model (HYCOM). These systems will be transitioned for operational use by both the U.S. Navy at the Naval Oceanographic Office (NAVOCEANO), Stennis Space Center, MS, and the Fleet Numerical Meteorology and Oceanography Center (FNMOC), Monterey, CA, and by NOAA at the National Centers for Environmental Prediction (NCEP), Washington, D.C. These systems will run efficiently on a variety of massively parallel computers and will include sophisticated data assimilation techniques for satellite altimeter sea surface height and sea surface temperature as well as for in situ temperature, salinity, and float displacement. The Partnership addresses the Global Ocean Data Assimilation Experiment (GODAE) goals of three-dimensional (3D) depiction of the ocean state at fine resolution in real time and provision of boundary conditions for coastal and regional models. An overview of the effort will be presented.

  14. Flexural-torsional vibration of simply supported open cross-section steel beams under moving loads

    NASA Astrophysics Data System (ADS)

    Michaltsos, G. T.; Sarantithou, E.; Sophianopoulos, D. S.

    2005-02-01

    The present work deals with linearized modal analysis of the combined flexural-torsional vibration of simply supported steel beams with open monosymmetric cross-sections, acted upon by a load of constant magnitude, traversing its span eccentrically with constant velocity. After thoroughly investigating the free vibrations of the structure, which simulates a commonly used highway bridge, its forced motions under the aforementioned loading type are investigated. Utilizing the capabilities of symbolic computations within modern mathematical software, the effects of the most significant geometrical and cross-sectional beam properties on the free vibration characteristics of the beam are established and presented in tabular and graphical form. Moreover, adopting realistic values for the simplified vehicle model, the effects of eccentricity, load magnitude and corresponding velocity are assessed and interesting conclusions for structural design purposes are drawn. The proposed methodology may serve as a starting point for further in-depth study of the whole scientific subject, in which sophisticated vehicle models, energy dissipation and more complicated bridge models may be used.
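
    For orientation, the uncoupled reference case under simplifying assumptions (uniform, simply supported Euler-Bernoulli beam, bending only, no torsion, warping or eccentricity) admits classical closed forms for the natural frequencies and for the modal force produced by a constant load P crossing the span at speed v:

      \omega_n \;=\; \left(\frac{n\pi}{L}\right)^{2}\sqrt{\frac{EI}{\rho A}},
      \qquad
      Q_n(t) \;\propto\; P\,\sin\!\left(\frac{n\pi v t}{L}\right), \quad 0 \le t \le L/v .

    The coupled flexural-torsional problem treated in the paper generalizes this with the torsional and warping stiffness of the monosymmetric section and with the load eccentricity.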

  15. Nonlocal kinetic energy functional from the jellium-with-gap model: Applications to orbital-free density functional theory

    NASA Astrophysics Data System (ADS)

    Constantin, Lucian A.; Fabiano, Eduardo; Della Sala, Fabio

    2018-05-01

    Orbital-free density functional theory (OF-DFT) promises to describe the electronic structure of very large quantum systems, as its computational cost scales linearly with the system size. However, the OF-DFT accuracy strongly depends on the approximation made for the kinetic energy (KE) functional. To date, the most accurate KE functionals are nonlocal functionals based on the linear-response kernel of the homogeneous electron gas, i.e., the jellium model. Here, we use the linear-response kernel of the jellium-with-gap model to construct a simple nonlocal KE functional (named KGAP) which depends on the band-gap energy. In the limit of vanishing energy gap (i.e., in the case of metals), the KGAP is equivalent to the Smargiassi-Madden (SM) functional, which is accurate for metals. For a series of semiconductors (with different energy gaps), the KGAP performs much better than SM, and results are close to the state-of-the-art functionals with sophisticated density-dependent kernels.
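
    As a rough sketch of the construction (the generic form of this functional family; the exact exponents and the gap-dependent kernel of KGAP are given in the paper and are not reproduced here), nonlocal KE functionals of this type are commonly written as

      T_s[n] \;\approx\; T_{\mathrm{TF}}[n] \;+\; T_{\mathrm{vW}}[n]
      \;+\; \iint n^{\alpha}(\mathbf{r})\, K\!\left(|\mathbf{r}-\mathbf{r}'|\right)\, n^{\beta}(\mathbf{r}')\, d\mathbf{r}\, d\mathbf{r}' ,

    where T_TF and T_vW are the Thomas-Fermi and von Weizsäcker terms and the kernel K is fixed by requiring the functional to reproduce a chosen linear-response function: the Lindhard response of jellium in the standard case, and the jellium-with-gap response (hence the band-gap dependence) for KGAP.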

  16. Ab initio modeling of CW-ESR spectra of the double spin labeled peptide Fmoc-(Aib-Aib-TOAC)2-Aib-OMe in acetonitrile.

    PubMed

    Zerbetto, Mirco; Carlotto, Silvia; Polimeno, Antonino; Corvaja, Carlo; Franco, Lorenzo; Toniolo, Claudio; Formaggio, Fernando; Barone, Vincenzo; Cimino, Paola

    2007-03-15

    In this work we address the interpretation, via an ab initio integrated computational approach, of the CW-ESR spectra of the double spin-labeled, 3₁₀-helical peptide Fmoc-(Aib-Aib-TOAC)2-Aib-OMe dissolved in acetonitrile. Our approach is based on the determination of geometric and local magnetic parameters of the heptapeptide by quantum mechanical density functional calculations taking into account solvent and, when needed, vibrational averaging contributions. The system is then described by a stochastic Liouville equation for the two electron spins interacting with each other and with two 14N nuclear spins, in the presence of diffusive rotational dynamics. Parametrization of the rotational diffusion tensor is provided by a hydrodynamic model. CW-ESR spectra are simulated with minimal resort to fitting procedures, proving that the combination of sensitive ESR spectroscopy and sophisticated modeling can be highly helpful in providing 3D structural and dynamic information on molecular systems.

  17. Modeling Flight: The Role of Dynamically Scaled Free-Flight Models in Support of NASA's Aerospace Programs

    NASA Technical Reports Server (NTRS)

    Chambers, Joseph

    2010-01-01

    The state of the art in aeronautical engineering has been continually accelerated by the development of advanced analysis and design tools. Used in the early design stages for aircraft and spacecraft, these methods have provided a fundamental understanding of physical phenomena and enabled designers to predict and analyze critical characteristics of new vehicles, including the capability to control or modify unsatisfactory behavior. For example, the relatively recent emergence and routine use of extremely powerful digital computer hardware and software has had a major impact on design capabilities and procedures. Sophisticated new airflow measurement and visualization systems permit the analyst to conduct micro- and macro-studies of properties within flow fields on and off the surfaces of models in advanced wind tunnels. Trade studies of the most efficient geometrical shapes for aircraft can be conducted with blazing speed within a broad scope of integrated technical disciplines, and the use of sophisticated piloted simulators in the vehicle development process permits the most important segment of operations, the human pilot, to make early assessments of the acceptability of the vehicle for its intended mission. Knowledgeable applications of these tools of the trade dramatically reduce risk and redesign, and increase the marketability and safety of new aerospace vehicles. Arguably, one of the more viable and valuable design tools since the advent of flight has been testing of subscale models. As used herein, the term "model" refers to a physical article used in experimental analyses of a larger full-scale vehicle. The reader is probably aware that many other forms of mathematical and computer-based models are also used in aerospace design; however, such topics are beyond the intended scope of this document. Model aircraft have always been a source of fascination, inspiration, and recreation for humans since the earliest days of flight. Within the scientific community, Leonardo da Vinci, George Cayley, and the Wright brothers are examples of early aviation pioneers who frequently used models during their scientific efforts to understand and develop flying machines. Progress in the technology associated with model testing in worldwide applications has firmly established model aircraft as a key element in new aerospace research and development programs. Models are now routinely used in many applications and roles, including aerodynamic data gathering in wind tunnel investigations for the analysis of full-scale aircraft designs, proof-of-concept demonstrators for radical aeronautical concepts, and problem-solving exercises for vehicles already in production. The most critical contributions of aerospace models are to provide confidence and risk reduction for new designs and to enhance the safety and efficiency of existing configurations.

  18. Implementing multiresolution models and families of models: from entity-level simulation to desktop stochastic models and "repro" models

    NASA Astrophysics Data System (ADS)

    McEver, Jimmie; Davis, Paul K.; Bigelow, James H.

    2000-06-01

    We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated AnalyticaTM environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.

  19. Semi-Empirical Model to Estimate the Solubility of CO2 in NaCl Brine in Conditions Representative of CO2 Sequestration

    NASA Astrophysics Data System (ADS)

    Mohammadian, E.; Hamidi, H.; Azdarpour, A.

    2018-05-01

    CO2 sequestration is considered one of the most anticipated methods to mitigate CO2 concentration in the atmosphere. The solubility mechanism is one of the most important and sophisticated mechanisms by which CO2 is rendered immobile while it is being injected into aquifers. A semi-empirical, easy-to-use model was developed to calculate the solubility of CO2 in NaCl brines with thermodynamic conditions (pressure, temperature) and salinity gradients representative of CO2 sequestration in the Malay Basin. The model was compared with previous, more sophisticated models, and good consistency was found between the data obtained using the two approaches. A sensitivity analysis was also conducted on the model to test its performance beyond its limits.

  20. Numerical evidences of universal trap-like aging dynamics

    NASA Astrophysics Data System (ADS)

    Cammarota, Chiara; Marinari, Enzo

    2018-04-01

    Trap models were initially proposed as toy models for dynamical relaxation in extremely simplified rough potential energy landscapes. Their importance has recently grown considerably thanks to the discovery that the trap-like aging mechanism directly controls the out-of-equilibrium relaxation processes of more sophisticated spin models, which are considered the solvable counterpart of real disordered systems. Further establishing the connection between these spin models, their out-of-equilibrium behavior, and the trap-like aging mechanism could shed new light on the still largely mysterious properties of the activated out-of-equilibrium dynamics of disordered systems. In this work we discuss numerical evidence, based on computations of permanence times, of an emergent trap-like aging behavior in a variety of very simple disordered models developed from the trap-model paradigm. Our numerical results are backed by analytic derivations and heuristic discussions. This exploration reveals some of the tricks needed to expose the trap behavior in spite of the occurrence of secondary processes, the existence of dynamical correlations, and strong finite-size effects.
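
    As a concrete illustration of the trap-like mechanism (a minimal sketch of the standard Bouchaud trap model, not of the specific disordered models studied in the paper): trap depths are exponentially distributed, residence times grow as exp(E/T), and below the glass temperature T = 1 the mean permanence time diverges, which is the aging signature that measured permanence times are meant to expose.

      # Minimal Bouchaud trap model: N traps with depths E_i ~ Exp(1); at temperature T the
      # mean residence time in trap i is tau_i = exp(E_i / T); after each residence the walker
      # jumps to a uniformly chosen trap. For T < 1 the mean of tau diverges and aging appears.
      import numpy as np

      def simulate_permanence_times(n_traps=10_000, temperature=0.7, n_jumps=100_000, seed=0):
          rng = np.random.default_rng(seed)
          depths = rng.exponential(1.0, size=n_traps)
          tau = np.exp(depths / temperature)            # mean residence time per trap
          visited = rng.integers(0, n_traps, size=n_jumps)
          # Permanence time in each visited trap, drawn exponentially around its mean tau
          return rng.exponential(tau[visited])

      times = simulate_permanence_times()
      print("median permanence time:", np.median(times))
      print("mean permanence time  :", times.mean())    # dominated by rare deep traps when T < 1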

  1. Machine learning of frustrated classical spin models. I. Principal component analysis

    NASA Astrophysics Data System (ADS)

    Wang, Ce; Zhai, Hui

    2017-10-01

    This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model on frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
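
    A minimal sketch of the analysis step (toy data only; the paper uses Monte Carlo configurations of the XY model on frustrated triangular and union jack lattices): planar spins are encoded as (cos θ, sin θ) features and fed to principal component analysis, whose leading components separate ordered from disordered samples.

      # Minimal sketch: PCA on planar (XY-like) spin configurations. Toy data: "ordered"
      # samples fluctuate weakly around a common direction, "disordered" samples have
      # uniformly random angles; real inputs would be Monte Carlo spin configurations.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      n_sites, n_samples = 24 * 24, 400
      theta_ord = 0.2 * rng.normal(size=(n_samples, n_sites))            # near-aligned spins
      theta_dis = rng.uniform(0, 2 * np.pi, size=(n_samples, n_sites))   # random spins

      def to_features(theta):
          # Encode each spin as (cos, sin) so the representation is periodic in the angle.
          return np.hstack([np.cos(theta), np.sin(theta)])

      X = np.vstack([to_features(theta_ord), to_features(theta_dis)])
      pca = PCA(n_components=2).fit(X)
      proj = pca.transform(X)
      print("explained variance ratios:", pca.explained_variance_ratio_)
      print("mean |PC1|, ordered vs disordered:",
            np.abs(proj[:n_samples, 0]).mean(), np.abs(proj[n_samples:, 0]).mean())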

  2. Path Models of Vocal Emotion Communication

    PubMed Central

    Bänziger, Tanja; Hosoya, Georg; Scherer, Klaus R.

    2015-01-01

    We propose to use a comprehensive path model of vocal emotion communication, encompassing encoding, transmission, and decoding processes, to empirically model data sets on emotion expression and recognition. The utility of the approach is demonstrated for two data sets from two different cultures and languages, based on corpora of vocal emotion enactment by professional actors and emotion inference by naïve listeners. Lens model equations, hierarchical regression, and multivariate path analysis are used to compare the relative contributions of objectively measured acoustic cues in the enacted expressions and subjective voice cues as perceived by listeners to the variance in emotion inference from vocal expressions for four emotion families (fear, anger, happiness, and sadness). While the results confirm the central role of arousal in vocal emotion communication, the utility of applying an extended path modeling framework is demonstrated by the identification of unique combinations of distal cues and proximal percepts carrying information about specific emotion families, independent of arousal. The statistical models generated show that more sophisticated acoustic parameters need to be developed to explain the distal underpinnings of subjective voice quality percepts that account for much of the variance in emotion inference, in particular voice instability and roughness. The general approach advocated here, as well as the specific results, open up new research strategies for work in psychology (specifically emotion and social perception research) and engineering and computer science (specifically research and development in the domain of affective computing, particularly on automatic emotion detection and synthetic emotion expression in avatars). PMID:26325076

  3. Symmetrically private information retrieval based on blind quantum computing

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling

    2015-05-01

    Universal blind quantum computation (UBQC) is a new secure quantum computing protocol which allows a user Alice, who does not have any sophisticated quantum technology, to delegate her computing to a server Bob without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum-limited Alice to query an item from Bob, who possesses a fully fledged quantum computer; meanwhile, the privacy of both parties is preserved. The security of our protocol is based on the assumption that a malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. An honest Alice is almost classical and only requires minimal quantum resources to carry out the proposed protocol. Therefore, she does not need any expensive laboratory which can maintain the coherence of complicated quantum experimental setups.

  4. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN 4 and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  5. Learning and evolution in bacterial taxis: an operational amplifier circuit modeling the computational dynamics of the prokaryotic 'two component system' protein network.

    PubMed

    Di Paola, Vieri; Marijuán, Pedro C; Lahoz-Beltra, Rafael

    2004-01-01

    Adaptive behavior in unicellular organisms (i.e., bacteria) depends on highly organized networks of proteins governing purposefully the myriad of molecular processes occurring within the cellular system. For instance, bacteria are able to explore the environment within which they develop by utilizing the motility of their flagellar system as well as a sophisticated biochemical navigation system that samples the environmental conditions surrounding the cell, searching for nutrients or moving away from toxic substances or dangerous physical conditions. In this paper we discuss how proteins of the intervening signal transduction network could be modeled as artificial neurons, simulating the dynamical aspects of the bacterial taxis. The model is based on the assumption that, in some important aspects, proteins can be considered as processing elements or McCulloch-Pitts artificial neurons that transfer and process information from the bacterium's membrane surface to the flagellar motor. This simulation of bacterial taxis has been carried out on a hardware realization of a McCulloch-Pitts artificial neuron using an operational amplifier. Based on the behavior of the operational amplifier we produce a model of the interaction between CheY and FliM, elements of the prokaryotic two component system controlling chemotaxis, as well as a simulation of learning and evolution processes in bacterial taxis. On the one side, our simulation results indicate that, computationally, these protein 'switches' are similar to McCulloch-Pitts artificial neurons, suggesting a bridge between evolution and learning in dynamical systems at cellular and molecular levels and the evolutive hardware approach. On the other side, important protein 'tactilizing' properties are not tapped by the model, and this suggests further complexity steps to explore in the approach to biological molecular computing.
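
    A minimal sketch of the threshold-unit abstraction used here (the paper realizes it in hardware with an operational amplifier); the inputs, weights, and threshold below are hypothetical and only illustrate how a protein "switch" such as the CheY-P/FliM interaction can be treated as a McCulloch-Pitts neuron.

      # Minimal McCulloch-Pitts threshold unit, in the spirit of treating a signalling protein
      # interaction (e.g. CheY-P binding FliM) as a binary switch. Weights, threshold and the
      # meaning of the inputs are hypothetical choices for this sketch.
      import numpy as np

      def mcculloch_pitts(inputs, weights, threshold):
          """Fire (1, e.g. switch flagellar rotation) iff the weighted input sum reaches the threshold."""
          return int(np.dot(inputs, weights) >= threshold)

      # Hypothetical inputs: [attractant-bound receptor activity, repellent signal, CheY-P level]
      weights = np.array([-1.0, 1.0, 1.5])
      threshold = 1.0
      for x in ([1, 0, 0], [0, 1, 0], [0, 1, 1], [1, 0, 1]):
          print(x, "->", mcculloch_pitts(np.array(x, dtype=float), weights, threshold))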

  6. Bayesian depth estimation from monocular natural images.

    PubMed

    Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C

    2017-05-01

    Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.
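
    A simplified sketch of the described predictor (toy dictionary, toy features): a small set of canonical local depth patterns, a Gaussian likelihood linking image features to each pattern, and a MAP choice of pattern per patch. The NSS features and the learned multivariate Gaussian mixture of the paper are not reproduced here.

      # Simplified Bayesian depth-pattern selection: a prior over canonical depth patches plus a
      # Gaussian feature likelihood per patch, combined into a MAP decision. All values are toy.
      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(2)
      n_patterns, feat_dim = 5, 6
      dictionary = rng.normal(size=(n_patterns, 8, 8))           # canonical 8x8 depth patches (toy)
      feat_means = rng.normal(size=(n_patterns, feat_dim))       # per-pattern feature means (toy)
      feat_cov = np.eye(feat_dim)
      log_prior = np.log(np.full(n_patterns, 1.0 / n_patterns))  # uniform prior over patterns

      def map_depth_patch(features):
          """Return the dictionary depth patch maximizing log prior + Gaussian log-likelihood."""
          log_post = np.array([log_prior[k] +
                               multivariate_normal.logpdf(features, feat_means[k], feat_cov)
                               for k in range(n_patterns)])
          return dictionary[int(np.argmax(log_post))]

      patch_features = rng.normal(size=feat_dim)                 # features of one image patch (toy)
      print(map_depth_patch(patch_features).shape)               # -> (8, 8) estimated depth patch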

  7. Parallel algorithm for solving Kepler’s equation on Graphics Processing Units: Application to analysis of Doppler exoplanet searches

    NASA Astrophysics Data System (ADS)

    Ford, Eric B.

    2009-05-01

    We present the results of a highly parallel Kepler equation solver using the Graphics Processing Unit (GPU) on a commercial nVidia GeForce 280GTX and the "Compute Unified Device Architecture" (CUDA) programming environment. We apply this to evaluate a goodness-of-fit statistic (e.g., χ²) for Doppler observations of stars potentially harboring multiple planetary companions (assuming negligible planet-planet interactions). Given the high dimensionality of the model parameter space (at least five dimensions per planet), a global search is extremely computationally demanding. We expect that the underlying Kepler solver and model evaluator will be combined with a wide variety of more sophisticated algorithms to provide efficient global search, parameter estimation, model comparison, and adaptive experimental design for radial velocity and/or astrometric planet searches. We tested multiple implementations using single precision, double precision, pairs of single precision, and mixed precision arithmetic. We find that the vast majority of computations can be performed using single precision arithmetic, with selective use of compensated summation for increased precision. However, standard single precision is not adequate for calculating the mean anomaly from the time of observation and orbital period when evaluating the goodness-of-fit for real planetary systems and observational data sets. Using all double precision, our GPU code outperforms a similar code using a modern CPU by a factor of over 60. Using mixed precision, our GPU code provides a speed-up factor of over 600, when evaluating n_sys > 1024 model planetary systems, each containing n_pl = 4 planets, and assuming n_obs = 256 observations of each system. We conclude that modern GPUs also offer a powerful tool for repeatedly evaluating Kepler's equation and a goodness-of-fit statistic for orbital models when presented with a large parameter space.
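
    The numerical kernel at the heart of such a solver is a Newton iteration for Kepler's equation M = E - e sin E. A plain double-precision NumPy sketch is shown below; the paper's implementation runs this, with mixed precision and compensated summation, on the GPU under CUDA.

      # Vectorized Newton iteration for Kepler's equation  M = E - e*sin(E).
      import numpy as np

      def solve_kepler(M, e, tol=1e-12, max_iter=50):
          """Return the eccentric anomaly E for mean anomaly M (radians) and eccentricity e."""
          M = np.mod(np.asarray(M, dtype=float), 2 * np.pi)
          E = M + e * np.sin(M)                       # standard starting guess
          for _ in range(max_iter):
              f = E - e * np.sin(E) - M
              E -= f / (1.0 - e * np.cos(E))          # Newton step
              if np.max(np.abs(f)) < tol:
                  break
          return E

      M = np.linspace(0.0, 2 * np.pi, 1_000_000, endpoint=False)   # e.g. many epochs and periods
      E = solve_kepler(M, e=0.3)
      print("max residual:", np.max(np.abs(E - 0.3 * np.sin(E) - M)))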

  8. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is a process to extract regions of interest and to divide an image into its individual meaningful, homogeneous components. These components have a strong relationship with the objects of interest in an image. For computer-aided diagnosis and therapy, medical image segmentation is a mandatory initial step. It is a sophisticated and challenging task because of the complex nature of medical images, and successful medical image analysis is heavily dependent on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, and 2D texture features alone yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
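
    A toy stand-in for the described pipeline (not the paper's 3D texture descriptors or MRI data): simple first-order statistics computed over volumetric patches serve as "3D texture features" and are classified with a support vector machine.

      # Toy pipeline: first-order statistics of 3D patches as texture features + an SVM classifier.
      import numpy as np
      from scipy.stats import skew, kurtosis
      from sklearn.svm import SVC

      def patch_features(patch3d):
          """First-order 3D texture features of one volumetric patch (toy feature set)."""
          v = patch3d.ravel()
          return np.array([v.mean(), v.std(), skew(v), kurtosis(v)])

      rng = np.random.default_rng(3)
      # Synthetic 9x9x9 patches: "lesion-like" patches are brighter and noisier (an assumption).
      normal = rng.normal(0.0, 1.0, size=(100, 9, 9, 9))
      lesion = rng.normal(1.0, 2.0, size=(100, 9, 9, 9))
      X = np.array([patch_features(p) for p in np.vstack([normal, lesion])])
      y = np.array([0] * 100 + [1] * 100)

      clf = SVC(kernel="rbf", C=1.0).fit(X, y)
      print("training accuracy:", clf.score(X, y))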

  9. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    NASA Astrophysics Data System (ADS)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

    The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real-time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets and provides sophisticated visualization and analysis capabilities. The objective for TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real-time for the purpose of analysis - locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc.). It also supports multiple data access mechanisms, including HTTP, FTP, WMS, WCS, and Thredds Data Server (for NetCDF data). For scientific data, TrikeND-iGlobe supports various visualization capabilities, including animations, vector field visualization, etc. TrikeND-iGlobe is a collaborative open-source project; contributors include NASA (ARC-PX), ORNL (Oak Ridge National Laboratory), Unidata, Kansas University, CSIRO CMAR Australia and Geoscience Australia.

  10. Automation Problems of 1968; Papers Presented at the Meeting...October 4-5, 1968.

    ERIC Educational Resources Information Center

    Andrews, Theodora, Ed.

    Librarians and their concerned colleagues met to give, hear and discuss papers on library automation, primarily by computers. Noted at this second meeting on library automation were: (1) considerably more sophistication and casualness about the techniques involved, (2) considerably more assurance of what and where things can be applied and (3)…

  11. The Relationship of Lexical Richness to the Quality of ESL Learners' Oral Narratives

    ERIC Educational Resources Information Center

    Lu, Xiaofei

    2012-01-01

    This study was an examination of the relationship of lexical richness to the quality of English as a second language (ESL) learners' oral narratives. A computational system was designed to automate the measurement of 3 dimensions of lexical richness, that is, lexical density, sophistication, and variation, using 25 different metrics proposed in…
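
    For illustration, two of the simplest lexical richness measures, lexical density (the share of content words) and the type-token ratio (a variation measure), can be computed as below; the study's system covers 25 metrics across density, sophistication, and variation, and the tiny function-word list here is only a stand-in for real part-of-speech tagging.

      # Two simple lexical richness measures: lexical density and type-token ratio.
      # The function-word list is a toy stand-in for proper part-of-speech tagging.
      FUNCTION_WORDS = {"the", "a", "an", "of", "to", "in", "and", "or", "is", "was",
                        "it", "that", "on", "for", "with", "as", "at", "by", "be"}

      def lexical_density(tokens):
          content = [t for t in tokens if t.lower() not in FUNCTION_WORDS]
          return len(content) / len(tokens) if tokens else 0.0

      def type_token_ratio(tokens):
          return len({t.lower() for t in tokens}) / len(tokens) if tokens else 0.0

      sample = "the quick brown fox jumps over the lazy dog near the river bank".split()
      print("lexical density :", round(lexical_density(sample), 3))
      print("type-token ratio:", round(type_token_ratio(sample), 3))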

  12. Requirements for SPIRES II. An External Specification for the Stanford Public Information Retrieval System.

    ERIC Educational Resources Information Center

    Parker, Edwin B.

    SPIRES (Stanford Public Information Retrieval System) is a computerized information storage and retrieval system intended for use by students and faculty members who have little knowledge of computers but who need rapid and sophisticated retrieval and analysis. The functions and capabilities of the system from the user's point of view are…

  13. Science of Security Lablet - Scalability and Usability

    DTIC Science & Technology

    2014-12-16

    mobile computing [19]. However, the high-level infrastructure design and our own implementation (both described throughout this paper) can easily...critical and infrastructural systems demands high levels of sophistication in the technical aspects of cybersecurity, software and hardware design...Forget, S. Komanduri, Alessandro Acquisti, Nicolas Christin, Lorrie Cranor, Rahul Telang. "Security Behavior Observatory: Infrastructure for Long-term

  14. An introduction to UGRS: the ultimate grading and remanufacturing system

    Treesearch

    John Moody; Charles J. Gatchell; Elizabeth S. Walker; Powsiri Klinkhachorn

    1998-01-01

    The Ultimate Grading and Remanufacturing System (UGRS) is an advanced computer program for grading and remanufacturing lumber. It is an interactive program that will both grade lumber according to NHLA rules and remanufacture it for maximum value. UGRS is written to run under Microsoft Windows 3.0 or later updates and provides a sophisticated graphical user interface....

  15. Program Integrity, Controlled Growth Spell Success for Roots of Empathy

    ERIC Educational Resources Information Center

    Gordon, Mary; Letchford, Donna

    2009-01-01

    Childhood is a universal aspect of the human condition. Yet the landscape of childhood is changing rapidly. On playgrounds young children carry cell phones, and in classrooms children are more sophisticated in their use of computers and digital media than the adults in their lives. Most young adolescents are prolific communicators via text and…

  16. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from the large amount of data and build these complex models. Compounding this problem is the difficulty in knowledge transfer and retention, and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore, RAVEN, which was initially conceived as the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. In order to accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user to identify the behavioral patterns of nuclear power plants (NPPs). In this paper we will present the GUI implementation and its current capability status. We will also introduce the support vector machine algorithms and show our evaluation of their potential for increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we will refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactors Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.
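
    A generic sketch of the surrogate idea evaluated here (the toy response function below merely stands in for expensive RELAP-7 runs; kernel and hyperparameters are illustrative): a support-vector regressor is trained on a limited number of simulator runs and then queried cheaply inside the probabilistic sampling loop.

      # Support-vector surrogate: train on a few expensive "simulator" runs, then use the cheap
      # surrogate for massive sampling. The toy response stands in for real RELAP-7 calculations.
      import numpy as np
      from sklearn.svm import SVR
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(4)

      def expensive_simulation(x):
          """Hypothetical figure of merit (e.g. a peak temperature) vs. two scenario parameters."""
          return 600.0 + 80.0 * np.sin(3.0 * x[:, 0]) + 50.0 * x[:, 1] ** 2

      X_train = rng.uniform(-1.0, 1.0, size=(200, 2))       # a few hundred "real" runs
      y_train = expensive_simulation(X_train)

      surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
      surrogate.fit(X_train, y_train)

      X_query = rng.uniform(-1.0, 1.0, size=(100_000, 2))   # cheap surrogate evaluations
      y_query = surrogate.predict(X_query)
      print("surrogate-estimated exceedance probability:", float(np.mean(y_query > 700.0)))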

  17. Machine learning patterns for neuroimaging-genetic studies in the cloud.

    PubMed

    Da Mota, Benoit; Tudoran, Radu; Costan, Alexandru; Varoquaux, Gaël; Brasche, Goetz; Conrod, Patricia; Lemaitre, Herve; Paus, Tomas; Rietschel, Marcella; Frouin, Vincent; Poline, Jean-Baptiste; Antoniu, Gabriel; Thirion, Bertrand

    2014-01-01

    Brain imaging is a natural intermediate phenotype to understand the link between genetic information and behavior or brain pathologies risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed, if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and the reliability of our framework in the cloud with a 2 weeks deployment on hundreds of virtual machines.

  18. OPserver: opacities and radiative accelerations on demand

    NASA Astrophysics Data System (ADS)

    Mendoza, C.; González, J.; Seaton, M. J.; Buerger, P.; Bellorín, A.; Meléndez, M.; Rodríguez, L. S.; Delahaye, F.; Zeippen, C. J.; Palacios, E.; Pradhan, A. K.

    2009-05-01

    We report on developments carried out within the Opacity Project (OP) to upgrade atomic database services to comply with e-infrastructure requirements. We give a detailed description of an interactive, online server for astrophysical opacities, referred to as OPserver, to be used in sophisticated stellar modelling where Rosseland mean opacities and radiative accelerations are computed at every depth point and each evolution cycle. This is crucial, for instance, in chemically peculiar stars and in the exploitation of the new asteroseismological data. OPserver, downloadable with the new OPCD_3.0 release from the Centre de Données Astronomiques de Strasbourg, France, computes mean opacities and radiative data for arbitrary chemical mixtures from the OP monochromatic opacities. It is essentially a client-server network restructuring and optimization of the suite of codes included in the earlier OPCD_2.0 release. The server can be installed locally or, alternatively, accessed remotely from the Ohio Supercomputer Center, Columbus, Ohio, USA. The client is an interactive web page or a subroutine library that can be linked to the user code. The suitability of this scheme in grid computing environments is emphasized, and its extension to other atomic database services for astrophysical purposes is discussed.

  19. 1D-3D hybrid modeling—from multi-compartment models to full resolution models in space and time

    PubMed Central

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M.; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail, in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on the cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent, simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches could, depending on the network size, establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) coupled to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics. PMID:25120463
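
    For reference, the 1D side of such a hybrid scheme rests on the cable equation; a minimal explicit finite-difference sketch of the passive case is given below (parameters are illustrative, and production 1D simulators such as NEURON use implicit schemes and active membrane channels).

      # Explicit finite-difference sketch of the passive cable equation
      #   c_m dV/dt = (d / (4 R_a)) d2V/dx2 - g_m (V - E_rest) + i_inj
      # with sealed (zero-flux) ends. Parameters are illustrative only.
      import numpy as np

      c_m, g_m, E_rest = 1e-6, 1e-4, -65e-3         # F/cm^2, S/cm^2, V
      d, R_a = 2e-4, 150.0                          # cm (2 um diameter), ohm*cm
      L, nx = 0.1, 201                              # cm, number of grid points
      dx = L / (nx - 1)
      dt = 2e-7                                     # s, small enough for explicit stability
      V = np.full(nx, E_rest)
      i_inj = np.zeros(nx)
      i_inj[0] = 1e-4                               # A/cm^2 injected at one end (toy stimulus)

      coeff = d / (4.0 * R_a)
      for _ in range(100_000):                      # 20 ms of simulated time
          lap = np.zeros_like(V)
          lap[1:-1] = (V[2:] - 2 * V[1:-1] + V[:-2]) / dx**2
          lap[0] = (V[1] - V[0]) / dx**2            # first-order zero-flux boundary
          lap[-1] = (V[-2] - V[-1]) / dx**2
          V += dt / c_m * (coeff * lap - g_m * (V - E_rest) + i_inj)

      print("depolarization at the injection site after 20 ms (mV):", (V[0] - E_rest) * 1e3)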

  20. Modeling the Effects of Beam Size and Flaw Morphology on Ultrasonic Pulse/Echo Sizing of Delaminations in Carbon Composites

    NASA Technical Reports Server (NTRS)

    Margetan, Frank J.; Leckey, Cara A.; Barnard, Dan

    2012-01-01

    The size and shape of a delamination in a multi-layered structure can be estimated in various ways from an ultrasonic pulse/echo image. For example, the -6 dB contours of measured response provide one simple estimate of the boundary. More sophisticated approaches can be imagined where one adjusts the proposed boundary to bring measured and predicted UT images into optimal agreement. Such approaches require suitable models of the inspection process. In this paper we explore issues pertaining to model-based size estimation for delaminations in carbon fiber reinforced laminates. In particular we consider the influence on sizing when the delamination is non-planar or partially transmitting in certain regions. Two models for predicting broadband sonic time-domain responses are considered: (1) a fast "simple" model using paraxial beam expansions and Kirchhoff and phase-screen approximations; and (2) the more exact (but computationally intensive) 3D elastodynamic finite integration technique (EFIT). Model-to-model and model-to-experiment comparisons are made for delaminations in uniaxial composite plates, and the simple model is then used to critique the -6 dB rule for delamination sizing.
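
    In its simplest form, the -6 dB rule estimates the flaw boundary as the set of C-scan pixels whose peak amplitude is at least half (-6 dB) of the maximum flaw response; the toy synthetic image below illustrates just that thresholding step, without the beam-size and flaw-morphology corrections studied in the paper.

      # Simplest -6 dB sizing: threshold a (synthetic) C-scan amplitude image at half its peak
      # and report the extent of the resulting region. Real data and the model-based corrections
      # from the paper are not included.
      import numpy as np

      x = np.linspace(-10, 10, 201)                        # mm
      y = np.linspace(-10, 10, 201)                        # mm
      X, Y = np.meshgrid(x, y)
      cscan = np.exp(-((X / 4.0) ** 2 + (Y / 2.5) ** 2))   # toy amplitude image, peak = 1

      mask = cscan >= 0.5 * cscan.max()                    # -6 dB = amplitude halved
      dx, dy = x[1] - x[0], y[1] - y[0]
      area = mask.sum() * dx * dy
      extent_x = X[mask].max() - X[mask].min()
      extent_y = Y[mask].max() - Y[mask].min()
      print(f"-6 dB area: {area:.1f} mm^2, extents: {extent_x:.1f} x {extent_y:.1f} mm")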
