Sample records for computer simulated person

  1. A demonstrative model of a lunar base simulation on a personal computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed on the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to base the demonstration model on a personal computer with the MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.

  2. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configuration, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on the heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
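
    The heavy operation being parallelized is a dense matrix chain. A minimal sketch of the three-phase computation (illuminance i = V T D s; the matrix dimensions below are illustrative assumptions, not Radiance defaults):

        import numpy as np

        # V: view matrix (sensor points x window directions)
        # T: BSDF transmission matrix of the fenestration system
        # D: daylight matrix (window directions x sky patches)
        # S: sky vectors, one column per time step
        n_sensors, n_win, n_sky, n_steps = 100, 145, 146, 8760
        rng = np.random.default_rng(0)
        V = rng.random((n_sensors, n_win))
        T = rng.random((n_win, n_win))
        D = rng.random((n_win, n_sky))
        S = rng.random((n_sky, n_steps))

        # Precompute V @ T @ D once, then reuse it for every time step;
        # this dense multiply is the hot spot the paper offloads to OpenCL.
        VTD = V @ T @ D
        illuminance = VTD @ S    # (n_sensors x n_steps) annual result
        print(illuminance.shape)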

  3. Development of qualification guidelines for personal computer-based aviation training devices.

    DOT National Transportation Integrated Search

    1995-02-01

    Recent advances in the capabilities of personal computers have resulted in an increase in the number of flight simulation programs made available as Personal Computer-Based Aviation Training Devices (PCATDs). The potential benefits of PCATDs have been...

  4. When Everybody Anticipates in a Different Way …

    NASA Astrophysics Data System (ADS)

    Kindler, Eugene

    2002-09-01

    The paper concerns the computer modeling of anticipatory systems that contain more than one anticipating individual, where the anticipations of the individuals can mutually differ. Four main cases arise: (1) the anticipating persons engage in a dialogue to reach some agreement and can thereby optimize the anticipation; (2) one of the anticipating persons is a teacher of the others and can show them where their anticipation should improve; (3) the anticipating persons compete, each expecting to make the best anticipation and wishing to apply it to weaken the others; (4) the anticipating persons do not communicate at all. A human often anticipates by imagining possible future processes, performing a kind of "mental simulation", but nowadays a human can use computer simulation to replace that (insufficient) mental simulation. All the variants were simulated by transferring the human imagining to a computer simulation; thus systems that themselves contain several simulating elements were simulated. Experiences with this "nested" simulation and its applications are described.

  5. Interactional Personality, Mathematical Simulation, and Prediction of Interpersonal Compatability.

    ERIC Educational Resources Information Center

    Kunce, Joseph T.; And Others

    1981-01-01

    Used a mathematical simulation procedure adaptable to an interactional concept of personality to predict the interpersonal compatibility of couples. Strife scores derived from computer simulation of interactional personality data correlated significantly with partner ratings for the quality and the stability of their relationship. Significance…

  6. Development of a personal computer-based secondary task procedure as a surrogate for a driving simulator

    DOT National Transportation Integrated Search

    2007-08-01

    This research was conducted to develop and test a personal computer-based study procedure (PCSP) with secondary task loading for use in human factors laboratory experiments in lieu of a driving simulator to test reading time and understanding of traf...

  7. Computer Models of Personality: Implications for Measurement

    ERIC Educational Resources Information Center

    Cranton, P. A.

    1976-01-01

    Current research on computer models of personality is reviewed and categorized under five headings: (1) models of belief systems; (2) models of interpersonal behavior; (3) models of decision-making processes; (4) prediction models; and (5) theory-based simulations of specific processes. The use of computer models in personality measurement is…

  8. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  9. BeeSim: Leveraging Wearable Computers in Participatory Simulations with Young Children

    ERIC Educational Resources Information Center

    Peppler, Kylie; Danish, Joshua; Zaitlen, Benjamin; Glosson, Diane; Jacobs, Alexander; Phelps, David

    2010-01-01

    New technologies have enabled students to become active participants in computational simulations of dynamic and complex systems (called Participatory Simulations), providing a "first-person" perspective on complex systems. However, most existing Participatory Simulations have targeted older children, teens, and adults, assuming that such concepts…

  10. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    PubMed

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.

  11. Teaching by Simulation with Personal Computers.

    ERIC Educational Resources Information Center

    Randall, James E.

    1978-01-01

    Describes the use of a small digital computer to simulate a peripheral nerve demonstration in which the action potential responses to pairs of stimuli are used to illustrate the properties of excitable membranes. (Author/MA)

  12. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    PubMed

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
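
    A minimal sketch of the load-prediction dynamic scheduling idea used in setting (c): split each step's work between CPU and GPU in proportion to the throughput each device achieved on the previous step. The per-item costs, names, and proportional rule here are illustrative assumptions, not the paper's exact algorithm, and a real implementation runs the two chunks concurrently:

        import time

        def run_chunk(device, n_items):
            """Stand-in for simulating n_items heart-model elements on a device."""
            cost = {"cpu": 2e-6, "gpu": 1e-7}[device]   # assumed per-item cost (s)
            time.sleep(n_items * cost)

        def simulate(n_items=100_000, n_steps=10):
            share = {"cpu": 0.5, "gpu": 0.5}            # start with an even split
            for _ in range(n_steps):
                speed = {}
                for dev in ("cpu", "gpu"):
                    n = max(1, int(share[dev] * n_items))
                    t0 = time.perf_counter()
                    run_chunk(dev, n)                   # concurrent in a real code
                    speed[dev] = n / (time.perf_counter() - t0)
                total = sum(speed.values())
                # Predict the next step's load split from observed throughput.
                share = {dev: s / total for dev, s in speed.items()}
            return share

        print(simulate())   # share converges toward each device's throughput ratio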

  13. ROMI-RIP: Rough Mill RIP-first simulator user's guide

    Treesearch

    R. Edward Thomas

    1995-01-01

    The ROugh Mill RIP-first simulator (ROMI-RIP) is a computer software package for IBM-compatible personal computers that simulates current industrial practices for gang-ripping lumber. This guide shows the user how to set up simulations and examine their results regarding current or proposed mill practices. ROMI-RIP accepts cutting bills with up to 300 different part...

  14. Tissue-scale, personalized modeling and simulation of prostate cancer growth

    NASA Astrophysics Data System (ADS)

    Lorenzo, Guillermo; Scott, Michael A.; Tew, Kevin; Hughes, Thomas J. R.; Zhang, Yongjie Jessica; Liu, Lei; Vilanova, Guillermo; Gomez, Hector

    2016-11-01

    Recently, mathematical modeling and simulation of diseases and their treatments have enabled the prediction of clinical outcomes and the design of optimal therapies on a personalized (i.e., patient-specific) basis. This new trend in medical research has been termed “predictive medicine.” Prostate cancer (PCa) is a major health problem and an ideal candidate to explore tissue-scale, personalized modeling of cancer growth for two main reasons: First, it is a small organ, and, second, tumor growth can be estimated by measuring serum prostate-specific antigen (PSA, a PCa biomarker in blood), which may enable in vivo validation. In this paper, we present a simple continuous model that reproduces the growth patterns of PCa. We use the phase-field method to account for the transformation of healthy cells to cancer cells and use diffusion-reaction equations to compute nutrient consumption and PSA production. To accurately and efficiently compute tumor growth, our simulations leverage isogeometric analysis (IGA). Our model is shown to reproduce a known shape instability from a spheroidal pattern to fingered growth. Results of our computations indicate that such shift is a tumor response to escape starvation, hypoxia, and, eventually, necrosis. Thus, branching enables the tumor to minimize the distance from inner cells to external nutrients, contributing to cancer survival and further development. We have also used our model to perform tissue-scale, personalized simulation of a PCa patient, based on prostatic anatomy extracted from computed tomography images. This simulation shows tumor progression similar to that seen in clinical practice.
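
    A schematic of the model class described, assuming a generic phase-field equation coupled to nutrient and PSA reaction-diffusion equations (the right-hand sides below are illustrative, not the authors' exact system):

        \begin{aligned}
        \frac{\partial \phi}{\partial t} &= \lambda \Delta \phi - \frac{1}{\tau}\,F'(\phi) + \chi\,\sigma\,\phi
          && \text{(tumor phase field)}\\
        \frac{\partial \sigma}{\partial t} &= D_\sigma \Delta \sigma + s_h - \gamma_\sigma\,\sigma\,\phi
          && \text{(nutrient)}\\
        \frac{\partial p}{\partial t} &= D_p \Delta p + \alpha_h + \alpha_c\,\phi - \gamma_p\,p
          && \text{(tissue PSA)}
        \end{aligned}

    Here \phi marks cancerous tissue, F is a double-well potential separating the healthy and tumoral phases, \sigma is the nutrient concentration, and p is the tissue PSA concentration; serum PSA would be obtained by integrating p over the prostate.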

  15. Symplectic molecular dynamics simulations on specially designed parallel computers.

    PubMed

    Borstnik, Urban; Janezic, Dusanka

    2005-01-01

    We have developed a computer program for molecular dynamics (MD) simulation that implements the Split Integration Symplectic Method (SISM) and is designed to run on specialized parallel computers. The MD integration is performed by the SISM, which analytically treats high-frequency vibrational motion and thus enables the use of longer simulation time steps. The low-frequency motion is treated numerically on specially designed parallel computers, which decreases the computational time of each simulation time step. The combination of these approaches means that less time is required and fewer steps are needed and so enables fast MD simulations. We study the computational performance of MD simulation of molecular systems on specialized computers and provide a comparison to standard personal computers. The combination of the SISM with two specialized parallel computers is an effective way to increase the speed of MD simulations up to 16-fold over a single PC processor.
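
    A one-degree-of-freedom sketch of the split idea behind the SISM (not the SISM itself): propagate the stiff harmonic part analytically and apply the remaining slow force as numeric half-kicks, which is what permits the longer time step:

        import math

        def split_step(q, p, dt, m, k, slow_force):
            """One split step: numeric half-kick with the slow force, exact
            analytic evolution of the fast harmonic part, numeric half-kick."""
            p += 0.5 * dt * slow_force(q)
            w = math.sqrt(k / m)                        # fast harmonic frequency
            c, s = math.cos(w * dt), math.sin(w * dt)
            q, p = q * c + (p / (m * w)) * s, p * c - m * w * q * s
            p += 0.5 * dt * slow_force(q)
            return q, p

        # Example: stiff harmonic bond plus a weak anharmonic "slow" force.
        q, p, m, k, dt = 1.0, 0.0, 1.0, 100.0, 0.05    # w*dt = 0.5, still stable
        slow = lambda x: -0.1 * x**3
        for _ in range(1000):
            q, p = split_step(q, p, dt, m, k, slow)
        print(q, p)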

  16. ROMI 4.0: Rough mill simulator 4.0 users manual

    Treesearch

    R. Edward Thomas; Timo Grueneberg; Urs Buehlmann

    2015-01-01

    The Rough Mill simulator (ROMI Version 4.0) is a computer software package for personal computers (PCs) that simulates current industrial practices for rip-first, chop-first, and rip-and-chop-first lumber processing. This guide shows how to set up the software; design, implement, and execute simulations; and examine the results. ROMI 4.0 accepts cutting bills with as...

  17. Computer Simulations: An Integrating Tool.

    ERIC Educational Resources Information Center

    Bilan, Bohdan J.

    This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…

  18. Application of Computer Simulation to Teach ATM Access to Individuals with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.

    2003-01-01

    This study investigates use of computer simulation for teaching ATM use to adults with intellectual disabilities. ATM-SIM is a computer-based trainer used for teaching individuals with intellectual disabilities how to use an automated teller machine (ATM) to access their personal bank accounts. In the pilot evaluation, a prototype system was…

  19. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions, and the computational time required is quite modest. This simulation exercise demonstrates that simple computer simulations using particles may be implemented successfully on the PCs available today, and hopefully this will provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
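
    The law being reproduced has the closed form J = (4*eps0/9) * sqrt(2e/m) * V^(3/2) / d^2 for a planar vacuum diode; a quick check with standard SI constants (the gap and voltage values are arbitrary examples):

        import math

        EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
        E_CHARGE = 1.602176634e-19   # elementary charge, C
        M_E = 9.1093837015e-31       # electron mass, kg

        def child_langmuir_j(V, d):
            """Space-charge-limited current density (A/m^2) for a planar
            diode with gap d (m) and anode voltage V (V)."""
            return (4 * EPS0 / 9) * math.sqrt(2 * E_CHARGE / M_E) * V**1.5 / d**2

        # e.g. 1 kV across a 1 cm gap:
        print(child_langmuir_j(1e3, 1e-2))   # ~7.4e2 A/m^2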

  20. Study on Thermal Conductivity of Personal Computer Aluminum-Magnesium Alloy Casing

    NASA Astrophysics Data System (ADS)

    Liao, MeiHong

    With the rapid development of computer technology, simulating the movement of micro-state atoms in a material to analyze its macro-state properties has become an important subject. Aluminium-magnesium alloy materials are often used in personal computer casings; this article puts forward a heat conduction model for the material and numerical methods for evaluating its heat transfer performance.

  1. Computers with Wings: Flight Simulation and Personalized Landscapes

    ERIC Educational Resources Information Center

    Oss, Stefano

    2005-01-01

    We propose, as a special way to explore the physics of flying objects, to use a flight simulator with a personalized scenery to reproduce the territory where students live. This approach increases the participation and attention of students to physics classes but also creates several opportunities for addressing side activities and arguments of…

  2. Physician Utilization of a Hospital Information System: A Computer Simulation Model

    PubMed Central

    Anderson, James G.; Jay, Stephen J.; Clevenger, Stephen J.; Kassing, David R.; Perry, Jane; Anderson, Marilyn M.

    1988-01-01

    The purpose of this research was to develop a computer simulation model that represents the process through which physicians enter orders into a hospital information system (HIS). Computer simulation experiments were performed to estimate the effects of two methods of order entry on outcome variables. The results of the computer simulation experiments were used to perform a cost-benefit analysis to compare the two different means of entering medical orders into the HIS. The results indicate that the use of personal order sets to enter orders into the HIS will result in a significant reduction in manpower, salaries and fringe benefits, and errors in order entry.

  3. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  4. Computer Series, 97.

    ERIC Educational Resources Information Center

    Kay, Jack G.; And Others

    1988-01-01

    Describes two applications of the microcomputer for laboratory exercises. Explores radioactive decay using the Batemen equations on a Macintosh computer. Provides examples and screen dumps of data. Investigates polymer configurations using a Monte Carlo simulation on an IBM personal computer. (MVL)

  5. REFERENCE MANUAL FOR RASSMIT VERSION 2.1: SUB-SLAB DEPRESSURIZATION SYSTEM DESIGN PERFORMANCE SIMULATION PROGRAM

    EPA Science Inventory

    The report is a reference manual for RASSMIT Version 2.1, a computer program that was developed to simulate and aid in the design of sub-slab depressurization systems used for indoor radon mitigation. The program was designed to run on DOS-compatible personal computers to ensure ...

  6. Training Effectiveness Evaluation (TEE) of the Advanced Fire Fighting Training System. Focus on the Trained Person.

    ERIC Educational Resources Information Center

    Cordell, Curtis C.; And Others

    A training effectiveness evaluation of the Navy Advanced Fire Fighting Training System was conducted. This system incorporates simulated fires as well as curriculum materials and instruction. The fires are non-pollutant, computer controlled, and installed in a simulated shipboard environment. Two teams of 15 to 16 persons, with varying amounts of…

  7. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
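
    A minimal sketch of the inexact Newton idea the simulator relies on: the linear system J(x) dx = -F(x) is solved only to a loose relative tolerance eta (the forcing term), since an exact inner solve is wasted effort far from the root. The crude Jacobi inner loop below stands in for the preconditioned Krylov and multigrid solvers named in the abstract, and the test problem is an invented stand-in:

        import numpy as np

        def inexact_newton(F, J, x, eta=0.1, tol=1e-8, max_outer=50):
            """Inexact Newton: solve J(x) dx = -F(x) only to relative
            accuracy eta, here with a crude Jacobi inner iteration."""
            for _ in range(max_outer):
                f = F(x)
                if np.linalg.norm(f) < tol:
                    return x
                M, b = J(x), -f
                dx, diag = np.zeros_like(b), np.diag(M)
                while np.linalg.norm(M @ dx - b) > eta * np.linalg.norm(b):
                    dx += (b - M @ dx) / diag       # one Jacobi sweep
                x = x + dx
            return x

        # Tiny stand-in problem: F(x) = A x + x^3 - 1 with a diagonally
        # dominant A, so the Jacobi inner iteration converges.
        A = np.array([[4.0, -1.0], [-1.0, 4.0]])
        F = lambda x: A @ x + x**3 - 1.0
        J = lambda x: A + np.diag(3.0 * x**2)
        print(inexact_newton(F, J, np.zeros(2)))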

  8. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  9. Personal supercomputing by using transputer and Intel 80860 in plasma engineering

    NASA Astrophysics Data System (ADS)

    Ido, S.; Aoki, K.; Ishine, M.; Kubota, M.

    1992-09-01

    A transputer (T800) or a 64-bit RISC Intel 80860 (i860) added to a personal computer can be used as an accelerator. When 32-bit T800s in a parallel system or 64-bit i860s are used, scientific calculations are carried out several tens of times as fast as on commonly used 32-bit personal computers or UNIX workstations. Benchmark tests and examples of physical simulations using T800s and the i860 are reported.

  10. An Amphibious Ship-To-Shore Simulation for Use on an IBM PC (Personal Computer)

    DTIC Science & Technology

    1984-09-01

    [OCR-garbled report front matter omitted.] The research is geared toward a technically oriented person who is familiar with computers, programming, and the associated logic. ... a problem, often vaguely stated by the decision maker, is translated into precise and operational terms [Ref., p. 51]. The analysis begins with specification of the

  11. Neural simulations on multi-core architectures.

    PubMed

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.

  12. Neural Simulations on Multi-Core Architectures

    PubMed Central

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing. PMID:19636393

  13. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  14. Teaching Ionic Solvation Structure with a Monte Carlo Liquid Simulation Program

    ERIC Educational Resources Information Center

    Serrano, Agostinho; Santos, Flavia M. T.; Greca, Ileana M.

    2004-01-01

    The use of molecular dynamics and Monte Carlo methods has provided efficient means to simulate the behavior of molecular liquids and solutions. A Monte Carlo simulation program is used to compute the structure of liquid water and of water as a solvent to Na+, Cl-, and Ar on a personal computer to show that it is easily feasible to…
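
    A minimal Metropolis Monte Carlo sketch of this kind of liquid simulation (Lennard-Jones particles in reduced units rather than the article's water/ion potentials; all parameter values are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        N, L, T, step = 30, 5.0, 1.0, 0.2   # particles, box edge, temperature, max move
        g = np.linspace(0.0, L, 4, endpoint=False)
        pos = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T[:N].copy()  # lattice start

        def energy_of(i, xi, pos):
            """Lennard-Jones energy of particle i at position xi (reduced units)."""
            d = pos - xi
            d -= L * np.round(d / L)        # minimum-image convention
            r2 = (d * d).sum(axis=1)
            r2[i] = np.inf                  # exclude self-interaction
            inv6 = 1.0 / r2**3
            return float(np.sum(4.0 * (inv6**2 - inv6)))

        accepted, n_trials = 0, 5000
        for _ in range(n_trials):
            i = int(rng.integers(N))
            trial = (pos[i] + step * rng.uniform(-1.0, 1.0, 3)) % L
            dE = energy_of(i, trial, pos) - energy_of(i, pos[i], pos)
            if dE <= 0.0 or rng.random() < np.exp(-dE / T):   # Metropolis criterion
                pos[i] = trial
                accepted += 1
        print("acceptance ratio:", accepted / n_trials)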

  15. Visualizing ultrasound through computational modeling

    NASA Technical Reports Server (NTRS)

    Guo, Theresa W.

    2004-01-01

    The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate the characteristic of hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on earth for trauma patients where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results, thus helping the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project needs. The criteria of evaluation that will be used are: 1) the program must be able to specify transducer properties and specify transmitting and receiving signals, 2) the program must be able to simulate ultrasound signals through different attenuating mediums, 3) the program must be able to process moving targets in order to simulate the Doppler effects that are associated with blood flow, 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.

  16. Particle-In-Cell simulations of high pressure plasmas using graphics processing units

    NASA Astrophysics Data System (ADS)

    Gebhardt, Markus; Atteln, Frank; Brinkmann, Ralf Peter; Mussenbrock, Thomas; Mertmann, Philipp; Awakowicz, Peter

    2009-10-01

    Particle-In-Cell (PIC) simulations are widely used to understand the fundamental phenomena in low-temperature plasmas. Particularly, plasmas at very low gas pressures are studied using PIC methods. The inherent drawback of these methods is that they are very time consuming, since certain stability conditions have to be satisfied. This holds even more for the PIC simulation of high pressure plasmas due to the very high collision rates. The simulations take a great deal of time to run on standard computers and require the help of computer clusters or supercomputers. Recent advances in the field of graphics processing units (GPUs) provide every personal computer with a highly parallel multiprocessor architecture for very little money. This architecture is freely programmable and can be used to implement a wide class of problems. In this paper we present the concepts of a fully parallel PIC simulation of high pressure plasmas using the benefits of GPU programming.
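
    A minimal 1D electrostatic PIC step in normalized units, showing the per-step work (charge deposition, field solve, gather, push) that a GPU parallelizes over particles and cells; the collisions that dominate at high pressure and motivate the paper are omitted from this sketch:

        import numpy as np

        ng, n_part, L, dt = 64, 10_000, 2 * np.pi, 0.1
        dx = L / ng
        rng = np.random.default_rng(2)
        x = rng.random(n_part) * L              # electron positions
        v = rng.normal(0.0, 1.0, n_part)        # electron velocities

        def pic_step(x, v):
            # 1) charge deposition onto the grid (cloud-in-cell weighting)
            xi = x / dx
            il = np.floor(xi).astype(int) % ng
            w = xi - np.floor(xi)
            rho = np.bincount(il, 1.0 - w, ng) + np.bincount((il + 1) % ng, w, ng)
            rho = rho * ng / n_part - 1.0       # add neutralizing ion background
            # 2) field solve: Poisson equation in Fourier space, E = -dphi/dx
            k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
            k[0] = 1.0                          # dummy value; mean mode zeroed below
            phi_k = np.fft.fft(rho) / k**2
            phi_k[0] = 0.0
            E = np.fft.ifft(-1j * k * phi_k).real
            # 3) gather the field to particles, leapfrog push (charge -1, mass 1)
            Ep = E[il] * (1.0 - w) + E[(il + 1) % ng] * w
            v = v - Ep * dt
            x = (x + v * dt) % L
            return x, v

        for _ in range(100):
            x, v = pic_step(x, v)
        print("thermal spread:", v.std())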

  17. Developing Educational Computer Animation Based on Human Personality Types

    ERIC Educational Resources Information Center

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  18. The Personal Motion Platform

    NASA Technical Reports Server (NTRS)

    Park, Brian Vandellyn

    1993-01-01

    The Neutral Body Posture experienced in microgravity creates a biomechanical equilibrium by enabling the internal forces within the body to find their own balance. A patented reclining chair based on this posture provides a minimal stress environment for interfacing with computer systems for extended periods. When the chair is mounted on a 3 or 6 axis motion platform, a generic motion simulator for simulated digital environments is created. The Personal Motion Platform provides motional feedback to the occupant in synchronization with their movements inside the digital world which enhances the simulation experience. Existing HMD based simulation systems can be integrated to the turnkey system. Future developments are discussed.

  19. Pediatric personalized CT-dosimetry Monte Carlo simulations, using computational phantoms

    NASA Astrophysics Data System (ADS)

    Papadimitroulas, P.; Kagadis, G. C.; Ploussi, A.; Kordolaimi, S.; Papamichail, D.; Karavasilis, E.; Syrgiamiotis, V.; Loudos, G.

    2015-09-01

    For the last 40 years, Monte Carlo (MC) simulations have served as a “gold standard” tool for a wide range of applications in the field of medical physics and tend to be essential in daily clinical practice. Regarding diagnostic imaging applications, such as computed tomography (CT), the assessment of deposited energy is of high interest, so as to better analyze the risks and the benefits of the procedure. In the last few years a big effort has been made towards personalized dosimetry, especially in pediatric applications. In the present study the GATE toolkit was used and computational pediatric phantoms have been modeled for the assessment of CT examination dosimetry. The pediatric models used come from the XCAT and IT'IS series. The X-ray spectrum of a Brightspeed CT scanner was simulated and validated with experimental data. Specifically, a DCT-10 ionization chamber was irradiated twice using 120 kVp with 100 mAs and 200 mAs, for 1 s in 1 central axial slice (thickness = 10 mm). The absorbed dose was measured in air, resulting in differences lower than 4% between the experimental and simulated data. The simulations were acquired using approximately 10^10 primaries in order to achieve low statistical uncertainties. Dose maps were also saved for quantification of the absorbed dose in several critical organs of children during CT acquisition.

  20. Modeling the Impact of Motivation, Personality, and Emotion on Social Behavior

    NASA Astrophysics Data System (ADS)

    Miller, Lynn C.; Read, Stephen J.; Zachary, Wayne; Rosoff, Andrew

    Models seeking to predict human social behavior must contend with multiple sources of individual and group variability that underlie social behavior. One set of interrelated factors that strongly contribute to that variability - motivations, personality, and emotions - has been only minimally incorporated in previous computational models of social behavior. The Personality, Affect, Culture (PAC) framework is a theory-based computational model that addresses this gap. PAC is used to simulate social agents whose social behavior varies according to their personalities and emotions, which, in turn, vary according to their motivations and underlying motive control parameters. Examples involving disease spread and counter-insurgency operations show how PAC can be used to study behavioral variability in different social contexts.

  1. Numerical simulation of a mini PEMFC stack

    NASA Astrophysics Data System (ADS)

    Liu, Zhixiang; Mao, Zongqiang; Wang, Cheng; Zhuge, Weilin; Zhang, Yangjun

    Fuel cell modeling and simulation has aroused much attention recently because it can probe transport and reaction mechanisms. In this paper, a computational fuel cell dynamics (CFCD) method was applied to simulate a proton exchange membrane fuel cell (PEMFC) stack for the first time. The air-cooled mini fuel cell stack consisted of six cells, in which the active area was 8 cm^2 (2 cm × 4 cm). With reasonable simplification, the computational elements were effectively reduced, allowing the simulation to be conducted on a personal computer without large-scale parallel computation. The results indicated that the temperature gradient inside the fuel cell stack was determined by the flow rate of the cooling air. If the air flow rate is too low, the stack cannot be effectively cooled and the temperature will rise to a range that might cause unstable stack operation.

  2. GPU-accelerated phase-field simulation of dendritic solidification in a binary alloy

    NASA Astrophysics Data System (ADS)

    Yamanaka, Akinori; Aoki, Takayuki; Ogawa, Satoi; Takaki, Tomohiro

    2011-03-01

    The phase-field simulation of dendritic solidification of a binary alloy has been accelerated by using a graphics processing unit (GPU). To perform the phase-field simulation of alloy solidification on a GPU, a program code was developed with the compute unified device architecture (CUDA). In this paper, the implementation technique of the phase-field model on a GPU is presented. Also, we evaluated the acceleration performance of the three-dimensional solidification simulation by using a single NVIDIA TESLA C1060 GPU and the developed program code. The results showed that the GPU calculation for 576^3 computational grids achieved a performance of 170 GFLOPS by utilizing the shared memory as a software-managed cache. Furthermore, it was demonstrated that the computation on the GPU is 100 times faster than that on a single CPU core. From the obtained results, we confirmed the feasibility of realizing a real-time full three-dimensional phase-field simulation of microstructure evolution on a personal desktop computer.

  3. On the concept of the interactive information and simulation system for gas dynamics and multiphysics problems

    NASA Astrophysics Data System (ADS)

    Bessonov, O.; Silvestrov, P.

    2017-02-01

    This paper describes the general idea and the first implementation of the Interactive information and simulation system - an integrated environment that combines computational modules for modeling the aerodynamics and aerothermodynamics of re-entry space vehicles with a large collection of different information materials on this topic. The internal organization and the composition of the system are described and illustrated. Examples of the computational and information output are presented. The system has a unified implementation for the Windows and Linux operating systems and can be deployed on any modern high-performance personal computer.

  4. The Distributed Diagonal Force Decomposition Method for Parallelizing Molecular Dynamics Simulations

    PubMed Central

    Boršnik, Urban; Miller, Benjamin T.; Brooks, Bernard R.; Janežič, Dušanka

    2011-01-01

    Parallelization is an effective way to reduce the computational time needed for molecular dynamics simulations. We describe a new parallelization method, the distributed-diagonal force decomposition method, with which we extend and improve the existing force decomposition methods. Our new method requires less data communication during molecular dynamics simulations than replicated data and current force decomposition methods, increasing the parallel efficiency. It also dynamically load-balances the processors' computational load throughout the simulation. The method is readily implemented in existing molecular dynamics codes and it has been incorporated into the CHARMM program, allowing its immediate use in conjunction with the many molecular dynamics simulation techniques that are already present in the program. We also present the design of the Force Decomposition Machine, a cluster of personal computers and networks that is tailored to running molecular dynamics simulations using the distributed diagonal force decomposition method. The design is expandable and provides various degrees of fault resilience. This approach is easily adaptable to computers with Graphics Processing Units because it is independent of the processor type being used. PMID:21793007

  5. ROMI-RIP: Rough mill rip-first simulator. Forest Service general technical report (Final)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, R.E.

    1995-07-01

    The ROugh Mill Rip-First Simulator (ROMI-RIP) is a computer software package that simulates the gang-ripping of lumber. ROMI-RIP was designed to closely simulate current machines and industrial practice. This simulator allows the user to perform 'what if' analyses on various gang-rip-first rough mill operations with fixed, floating-outer-blade, and all-movable-blade arbors. ROMI-RIP accepts cutting bills with up to 300 different part sizes. Plots of processed boards are easily viewed or printed. Detailed summaries of processing steps (number of rips and crosscuts) and yields (single boards or entire board files) can also be viewed or printed. ROMI-RIP requires IBM personal computers with 80286 or higher processors.

  6. User's guide to SILVAH

    Treesearch

    Peter D. Knopp; Susan L. Stout

    2014-01-01

    This user's guide for the SILVAH computer program, version 6.2, supersedes the 1992 user's guide (Gen. Tech. Rep. NE-162). Designed for stand-alone Windows-based personal computers, SILVAH recommends a silvicultural prescription for a forest stand based on a summary and analysis of field inventory data. The program also includes a simulator that can be used...

  7. Power and the Power Simulation: Then and Now

    ERIC Educational Resources Information Center

    Bolman, Lee; Deal, Terrence E.

    2017-01-01

    Lee Bolman and Terrence Deal think of how much has changed since the era of innocence when they first published "A Simple But Powerful Power Simulation"--before the advent of cell phones, personal computers, the Internet, e-mail, Facebook, and Twitter. So much has changed, and yet the fundamentals of human behavior, social interaction,…

  8. Personalizing oncology treatments by predicting drug efficacy, side-effects, and improved therapy: mathematics, statistics, and their integration.

    PubMed

    Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri

    2014-01-01

    Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the main stream of clinical oncology. © 2014 Wiley Periodicals, Inc.

  9. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    USGS Publications Warehouse

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

    PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.

  10. A computer simulation approach to measurement of human control strategy

    NASA Technical Reports Server (NTRS)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  11. Using flight simulators aboard ships: human side effects of an optimal scenario with smooth seas.

    PubMed

    Muth, Eric R; Lawson, Ben

    2003-05-01

    The U.S. Navy is considering placing flight simulators aboard ships. It is known that certain types of flight simulators can elicit motion adaptation syndrome (MAS), and also that certain types of ship motion can cause MAS. The goal of this study was to determine if using a flight simulator during ship motion would cause MAS, even when the simulator stimulus and the ship motion were both very mild. All participants in this study completed three conditions. Condition 1 (Sim) entailed "flying" a personal computer-based flight simulator situated on land. Condition 2 (Ship) involved riding aboard a U.S. Navy Yard Patrol boat. Condition 3 (ShipSim) entailed "flying" a personal computer-based flight simulator while riding aboard a Yard Patrol boat. Before and after each condition, participants' balance and dynamic visual acuity were assessed. After each condition, participants filled out the Nausea Profile and the Simulator Sickness Questionnaire. Following exposure to a flight simulator aboard a ship, participants reported negligible symptoms of nausea and simulator sickness. However, participants exhibited a decrease in dynamic visual acuity after exposure to the flight simulator aboard ship (T[25] = 3.61, p < 0.05). Balance results were confounded by significant learning and, therefore, not interpretable. This study suggests that flight simulators can be used aboard ship. As a minimal safety precaution, these simulators should be used according to current safety practices for land-based simulators. Optimally, these simulators should be designed to minimize MAS, located near the ship's center of rotation and used when ship motion is not provocative.

  12. Applications of CFD and visualization techniques

    NASA Technical Reports Server (NTRS)

    Saunders, James H.; Brown, Susan T.; Crisafulli, Jeffrey J.; Southern, Leslie A.

    1992-01-01

    In this paper, three applications are presented to illustrate current techniques for flow calculation and visualization. The first two applications use a commercial computational fluid dynamics (CFD) code, FLUENT, performed on a Cray Y-MP. The results are animated with the aid of data visualization software, apE. The third application simulates a particulate deposition pattern using techniques inspired by developments in nonlinear dynamical systems. These computations were performed on personal computers.

  13. Development of a Computer-Assisted Cranial Nerve Simulation from the Visible Human Dataset

    ERIC Educational Resources Information Center

    Yeung, Jeffrey C.; Fung, Kevin; Wilson, Timothy D.

    2011-01-01

    Advancements in technology and personal computing have allowed for the development of novel teaching modalities such as online web-based modules. These modules are currently being incorporated into medical curricula and, in some paradigms, have been shown to be superior to classroom instruction. We believe that these modules have the potential of…

  14. Markov Chain Monte Carlo Estimation of Item Parameters for the Generalized Graded Unfolding Model

    ERIC Educational Resources Information Center

    de la Torre, Jimmy; Stark, Stephen; Chernyshenko, Oleksandr S.

    2006-01-01

    The authors present a Markov Chain Monte Carlo (MCMC) parameter estimation procedure for the generalized graded unfolding model (GGUM) and compare it to the marginal maximum likelihood (MML) approach implemented in the GGUM2000 computer program, using simulated and real personality data. In the simulation study, test length, number of response…

  15. Implementation of the force decomposition machine for molecular dynamics simulations.

    PubMed

    Borštnik, Urban; Miller, Benjamin T; Brooks, Bernard R; Janežič, Dušanka

    2012-09-01

    We present the design and implementation of the force decomposition machine (FDM), a cluster of personal computers (PCs) that is tailored to running molecular dynamics (MD) simulations using the distributed diagonal force decomposition (DDFD) parallelization method. The cluster interconnect architecture is optimized for the communication pattern of the DDFD method. Our implementation of the FDM relies on standard commodity components even for networking. Although the cluster is meant for DDFD MD simulations, it remains general enough for other parallel computations. An analysis of several MD simulation runs on both the FDM and a standard PC cluster demonstrates that the FDM's interconnect architecture provides a greater performance compared to a more general cluster interconnect. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Relationship between Norm-internalization and Cooperation in N-person Prisoners' Dilemma Games

    NASA Astrophysics Data System (ADS)

    Matsumoto, Mitsutaka

    In this paper, I discuss the problems of "order in social situations" using a computer simulation of an iterated N-person prisoners' dilemma game. It has been claimed that, in the case of the 2-person prisoners' dilemma, repetition of games and the reciprocal use of the "tit-for-tat" strategy promote the possibility of cooperation. However, in cases of the N-person prisoners' dilemma where N is greater than 2, this logic does not work effectively. The most essential problem is the so-called "sanctioning problem". In this paper, I first discuss the sanctioning problems introduced by Axelrod and Keohane in 1986. Based on the model formalized by Axelrod, I propose a new model that adds a mechanism of players' payoff changes to Axelrod's model. I call this mechanism norm-internalization and call the model the "norm-internalization game". Second, using this model, I investigated the relationship between agents' norm-internalization (payoff alternation) and the possibilities of cooperation. The results of computer simulation indicated that unequal distribution of the cooperating norm and uniform distribution of the sanctioning norm are more effective in establishing cooperation. I discuss the mathematical features and the implications of the results for social science.
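
    A miniature N-person prisoners' dilemma illustrating the sanctioning problem (a generic linear public-goods payoff, assumed for illustration; not Axelrod's norms game or the author's norm-internalization model):

        # Each cooperator pays cost c; everyone receives benefit b*(#cooperators)/N.
        # Defection dominates individually (b/N < c) although all-cooperate beats
        # all-defect collectively (b > c) -- the sanctioning problem in miniature.
        N, b, c = 20, 3.0, 1.0

        def payoffs(coop):
            """coop: list of booleans, True = cooperate."""
            share = b * sum(coop) / N
            return [share - (c if ci else 0.0) for ci in coop]

        print(payoffs([True] * N)[0])                    # all cooperate: 2.0 each
        print(payoffs([False] * N)[0])                   # all defect: 0.0 each
        print(payoffs([False] + [True] * (N - 1))[0])    # lone defector earns 2.85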

  17. Expert knowledge elicitation using computer simulation: the organization of frail elderly case management as an illustration.

    PubMed

    Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean

    2014-08-01

    Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can be usefully informed by field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model has been implemented and was used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.

  18. Simulation framework for intelligent transportation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, T.; Doss, E.; Hanebutte, U.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.

  19. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator.

    PubMed

    Chang, Howard H; Fuentes, Montserrat; Frey, H Christopher

    2012-09-01

    This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model reduces the computational burden associated with simulating personal exposures considerably. This allows us to consider personal exposures at a finer spatial resolution to improve exposure assessment and for a longer study period. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM2.5) and daily mortality in the New York City metropolitan area during the period 2001-2005. Personal PM2.5 exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per 10 μg/m^3 increase in personal exposure to PM2.5 from outdoor sources on the previous day. The corresponding estimate per 10 μg/m^3 increase in PM2.5 ambient concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM2.5 were also higher during the summer months.
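
    The reported percentages follow the usual log-linear risk model, in which a 10 μg/m^3 increase in exposure multiplies the mortality rate by exp(10*beta); a quick consistency check on the 2.32% estimate:

        import math

        # Log-linear risk model: rate = exp(alpha + beta * x), so a 10-unit
        # increase in exposure multiplies the rate by exp(10 * beta).
        beta = math.log(1.0232) / 10.0        # beta implied by the 2.32% estimate
        pct_per_10 = (math.exp(10.0 * beta) - 1.0) * 100.0
        print(round(pct_per_10, 2))           # 2.32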

  20. Adaptive screening for depression--recalibration of an item bank for the assessment of depression in persons with mental and somatic diseases and evaluation in a simulated computer-adaptive test environment.

    PubMed

    Forkmann, Thomas; Kroehne, Ulf; Wirtz, Markus; Norra, Christine; Baumeister, Harald; Gauggel, Siegfried; Elhan, Atilla Halil; Tennant, Alan; Boecker, Maren

    2013-11-01

    This study conducted a simulation of computer-adaptive testing based on the Aachen Depression Item Bank (ADIB), which was developed for the assessment of depression in persons with somatic diseases. Prior to the computer-adaptive test simulation, the ADIB was newly calibrated. Recalibration was performed in a sample of 161 patients treated for a depressive syndrome, 103 patients from cardiology, and 103 patients from otorhinolaryngology (mean age 44.1, SD=14.0; 44.7% female) and was cross-validated in a sample of 117 patients undergoing rehabilitation for cardiac diseases (mean age 58.4, SD=10.5; 24.8% women). Unidimensionality of the item bank was checked, and a Rasch analysis was performed that evaluated local dependency (LD), differential item functioning (DIF), item fit, and reliability. CAT simulation was conducted with the total sample and additional simulated data. Recalibration resulted in a strictly unidimensional item bank with 36 items, showing good Rasch model fit (item fit residuals < |2.5|) and no DIF or LD. CAT simulation revealed that 13 items on average were necessary to estimate depression in the range of -2 to +2 logits when terminating at SE ≤ 0.32, and 4 items when using SE ≤ 0.50. Receiver Operating Characteristic analysis showed that θ estimates based on the CAT algorithm have good criterion validity with regard to depression diagnoses (area under the curve ≥ .78 for all cut-off criteria). The recalibration of the ADIB succeeded, and the simulation studies conducted suggest that it has good screening performance in the samples investigated and may reasonably add to the improvement of depression assessment.
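
    A minimal sketch of such a CAT simulation, assuming a dichotomous Rasch model, maximum-information item selection, Newton-Raphson maximum-likelihood scoring, and an SE-based stopping rule; the 36 item difficulties are invented, not the recalibrated ADIB parameters. Note that the ADIB items are polytomous and therefore more informative per item, which is why the study reaches SE ≤ 0.32 with about 13 items while a dichotomous toy bank needs more.

```python
import numpy as np

rng = np.random.default_rng(0)

def rasch_p(theta, b):
    """Probability of an endorsed response under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - np.asarray(b, dtype=float))))

def simulate_cat(true_theta, bank_b, se_stop=0.50, max_items=36):
    administered, responses = [], []
    theta = 0.0
    for _ in range(max_items):
        # choose the unused item with maximum Fisher information p(1-p) at theta
        p_all = rasch_p(theta, bank_b)
        info = p_all * (1.0 - p_all)
        info[administered] = -np.inf
        j = int(np.argmax(info))
        administered.append(j)
        responses.append(rng.random() < rasch_p(true_theta, bank_b[j]))
        # Newton-Raphson maximum-likelihood update of theta
        for _ in range(20):
            p = rasch_p(theta, bank_b[administered])
            theta -= np.sum(np.array(responses) - p) / -np.sum(p * (1.0 - p))
            theta = float(np.clip(theta, -4.0, 4.0))  # guard all-correct runs
        se = 1.0 / np.sqrt(np.sum(p * (1.0 - p)))
        if se <= se_stop:
            break
    return theta, se, len(administered)

bank = np.linspace(-3, 3, 36)  # hypothetical difficulties, not the ADIB values
print(simulate_cat(true_theta=1.0, bank_b=bank))
```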

  1. Virtual reality in the assessment of selected cognitive function after brain injury.

    PubMed

    Zhang, L; Abreu, B C; Masel, B; Scheibel, R S; Christiansen, C H; Huddleston, N; Ottenbacher, K J

    2001-08-01

    To assess selected cognitive functions of persons with traumatic brain injury using a computer-simulated virtual reality environment. A computer-simulated virtual kitchen was used to assess the ability of 30 patients with brain injury and 30 volunteers without brain injury to process and sequence information. The overall assessment score was based on the number of correct responses and the time needed to complete daily living tasks. Identical daily living tasks were tested and scored in participants with and without brain injury. Each subject was evaluated twice within 7 to 10 days. A total of 30 tasks were categorized as follows: information processing, problem solving, logical sequencing, and speed of responding. Persons with brain injuries consistently demonstrated a significant decrease in the ability to process information (P = 0.04-0.01), identify logical sequencing (P = 0.04-0.01), and complete the overall assessment (P < 0.01), compared with volunteers without brain injury. The time needed to process tasks, representing speed of cognitive responding, was also significantly different between the two groups (P < 0.01). A computer-generated virtual reality environment represents a reproducible tool to assess selected cognitive functions and can be used as a supplement to traditional rehabilitation assessment in persons with acquired brain injury.

  2. Model for disease dynamics of a waterborne pathogen on a random network.

    PubMed

    Li, Meili; Ma, Junling; van den Driessche, P

    2015-10-01

    A network epidemic SIWR model for cholera and other diseases that can be transmitted via the environment is developed and analyzed. The person-to-person contacts are modeled by a random contact network, and the contagious environment is modeled by an external node that connects to every individual. The model is adapted from the Miller network SIR model and, in the homogeneous mixing limit, becomes the Tien and Earn deterministic cholera model without births and deaths. The dynamics of our model shows excellent agreement with stochastic simulations. The basic reproduction number R0 is computed and, on a Poisson network, shown to be the sum of the basic reproduction numbers of the person-to-person and person-to-water-to-person transmission pathways. However, on other networks, R0 depends nonlinearly on the transmission along the two pathways. Type reproduction numbers are computed and quantify measures to control the disease. Equations giving the final epidemic size are obtained.
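
    The following Python sketch illustrates the model class rather than the authors' exact formulation: a discrete-time stochastic SIWR simulation on an Erdős-Rényi (Poisson-degree) network in which infection arrives either through network contacts or through a shared water compartment. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete-time stochastic SIWR sketch on an Erdos-Renyi (Poisson-degree)
# network; all parameter values are invented for illustration.
n, mean_degree = 500, 8
M = np.triu(rng.random((n, n)) < mean_degree / (n - 1), 1)
A = (M | M.T).astype(np.int32)                # symmetric adjacency, no self-loops

beta_p, beta_w = 0.05, 0.05                   # person-to-person, water-to-person
gamma, xi, delta = 0.2, 1.0, 0.3              # recovery, shedding, water decay

S = np.ones(n, bool); I = np.zeros(n, bool); R = np.zeros(n, bool)
seeds = rng.choice(n, 5, replace=False); S[seeds] = False; I[seeds] = True
W = 0.0                                       # water-compartment contamination
for t in range(200):
    k_inf = A @ I.astype(np.int32)                     # infectious contacts
    p_net = 1.0 - (1.0 - beta_p) ** k_inf              # person-to-person pathway
    p_tot = 1.0 - (1.0 - p_net) * np.exp(-beta_w * W)  # plus water pathway
    new_I = S & (rng.random(n) < p_tot)
    new_R = I & (rng.random(n) < gamma)
    W += xi * I.sum() / n - delta * W                  # water compartment update
    S &= ~new_I; I = (I | new_I) & ~new_R; R |= new_R

print("final epidemic size:", int(R.sum() + I.sum()))
```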

  3. Developing a Learning Algorithm-Generated Empirical Relaxer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Wayne; Kallman, Josh; Toreja, Allen

    2016-03-30

    One of the main difficulties when running Arbitrary Lagrangian-Eulerian (ALE) simulations is determining how much to relax the mesh during the Eulerian step. This determination is currently made by the user on a simulation-by-simulation basis. We present a Learning Algorithm-Generated Empirical Relaxer (LAGER) which uses a regressive random forest algorithm to automate this decision process. We also demonstrate that LAGER successfully relaxes a variety of test problems, maintains simulation accuracy, and has the potential to significantly decrease both the person-hours and computational hours needed to run a successful ALE simulation.
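
    A hedged sketch of the general idea: train a random forest regressor on features of past runs to predict the relaxation amount, then query it at run time. The feature set, synthetic data, and scikit-learn usage below are illustrative stand-ins, not LAGER's actual implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Synthetic stand-in for logged mesh-quality features from past ALE runs
# (e.g. aspect ratio, skewness, compression) and the relaxation a user chose.
X = rng.random((5000, 3))
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.1 * rng.normal(size=5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out runs:", round(model.score(X_te, y_te), 3))

# At run time the prediction replaces the per-simulation user decision:
relax_amount = model.predict([[0.8, 0.4, 0.1]])[0]
print("suggested relaxation:", round(relax_amount, 3))
```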

  4. Computational fluid dynamics uses in fluid dynamics/aerodynamics education

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.

    1994-01-01

    The field of computational fluid dynamics (CFD) has advanced to the point where it can now be used for the purpose of fluid dynamics physics education. Because of the tremendous wealth of information available from numerical simulation, certain fundamental concepts can be efficiently communicated using an interactive graphical interrogation of the appropriate numerical simulation data base. In other situations, a large amount of aerodynamic information can be communicated to the student by interactive use of simple CFD tools on a workstation or even in a personal computer environment. The emphasis in this presentation is to discuss ideas for how this process might be implemented. Specific examples, taken from previous publications, will be used to highlight the presentation.

  5. Fast neural net simulation with a DSP processor array.

    PubMed

    Muller, U A; Gunzinger, A; Guggenbuhl, W

    1995-01-01

    This paper describes the implementation of a fast neural net simulator on a novel parallel distributed-memory computer. A 60-processor system, named MUSIC (multiprocessor system with intelligent communication), is operational and runs the backpropagation algorithm at a speed of 330 million connection updates per second (continuous weight update) using 32-b floating-point precision. This is equal to 1.4 Gflops sustained performance. The complete system with 3.8 Gflops peak performance consumes less than 800 W of electrical power and fits into a 19-in rack. While reaching the speed of modern supercomputers, MUSIC can still be used as a personal desktop computer at a researcher's own disposal. In neural net simulation, this gives a single user a computing performance that was previously unthinkable. The system's real-time interfaces make it especially useful for embedded applications.

  6. Computer simulation of the activity of the elderly person living independently in a Health Smart Home.

    PubMed

    Noury, N; Hadidi, T

    2012-12-01

    We propose a simulator of human activities collected with presence sensors in our experimental Health Smart Home "Habitat Intelligent pour la Santé" (HIS). We recorded 1492 days of data on several experimental HIS installations during the French national project "AILISA". On these real data, we built a mathematical model of the behavior of the data series, based on Hidden Markov Models (HMM). The model is then run on a computer to produce simulated data series, with added flexibility to adjust the parameters in various scenarios. We also tested several methods to measure the similarity between our real and simulated data. Our simulator can produce large databases which can further be used to evaluate algorithms that raise an alarm in case of loss of autonomy.
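
    As an illustration of how such a simulator generates data, the fragment below samples room-level presence observations from a hand-specified HMM. The activities, rooms, and probabilities are invented; the paper's model was instead fitted to the 1492 days of real HIS data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hand-specified toy HMM for presence-sensor data; all probabilities are
# invented for illustration.
rooms = ["bedroom", "kitchen", "living room", "bathroom"]
T = np.array([[0.90, 0.04, 0.03, 0.03],   # activity-to-activity transitions
              [0.02, 0.80, 0.15, 0.03],   # (sleeping, cooking, resting, hygiene)
              [0.05, 0.10, 0.80, 0.05],
              [0.10, 0.10, 0.20, 0.60]])
E = np.array([[0.95, 0.01, 0.02, 0.02],   # P(sensor fires in room | activity)
              [0.01, 0.95, 0.02, 0.02],
              [0.02, 0.03, 0.90, 0.05],
              [0.02, 0.02, 0.06, 0.90]])

def simulate_day(n_steps=96):             # one observation per 15 minutes
    s = 0                                 # start the day asleep
    trace = []
    for _ in range(n_steps):
        s = rng.choice(4, p=T[s])                     # hidden activity
        trace.append(rooms[rng.choice(4, p=E[s])])    # observed sensor room
    return trace

print(simulate_day()[:8])
```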

  7. Simulation and analysis of three congested weigh stations using Westa

    DOT National Transportation Integrated Search

    2001-01-01

    A user-friendly model for personal computers, "Vehicle/Highway Performance Predictor," was developed to estimate fuel consumption and exhaust emissions related to modes of vehicle operations on highways of various configurations and traffic controls ...

  8. Simulation of keratoconus observation in photorefraction

    NASA Astrophysics Data System (ADS)

    Chen, Ying-Ling; Tan, B.; Baker, K.; Lewis, J. W. L.; Swartz, T.; Jiang, Y.; Wang, M.

    2006-11-01

    In recent years, keratoconus (KC) has gained increasing attention due to its treatment options and to the popularity of keratorefractive surgery. This paper investigates the potential of identifying KC using photorefraction (PR), an optical technique that is similar to objective retinoscopy and is commonly used for large-scale ocular screening. Using personalized eye models of both KC and pre-LASIK patients, computer simulations were performed to achieve visualization of this ophthalmic measurement. The simulations are validated by comparing results to two sets of experimental measurements. These PR images show distinguishable differences between KC eyes and eyes that are either normal or ametropic. The simulation technique with personalized modeling can be extended to the development of other ophthalmic instruments, making investigation possible with a minimal number of real human subjects. The application is also of great interest in medical training.

  9. The visible ear simulator: a public PC application for GPU-accelerated haptic 3D simulation of ear surgery based on the visible ear data.

    PubMed

    Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter

    2009-06-01

    Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum approximately 50 voxels/mm³), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and real-time volume rendered as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm³) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8000 voxels/mm³) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomography- or magnetic resonance imaging-based systems.

  10. ECHO: A Computer Based Test for the Measurement of Individualistic, Cooperative, Defensive, and Aggressive Models of Behavior. Occasional Paper No. 30.

    ERIC Educational Resources Information Center

    Krus, David J.; And Others

    This paper describes a test which attempts to measure a group of personality traits by analyzing the actual behavior of the participant in a computer-simulated game. ECHO evolved from an extension and computerization of Hornstein and Deutsch's allocation game. The computerized version of ECHO requires subjects to make decisions about the allocation…

  11. Development of the KOSMS management simulation training system and its application

    NASA Astrophysics Data System (ADS)

    Takatsu, Yoshiki

    Games which simulate actual corporate management have recently become more common and are now utilized in various ways for in-house corporate training courses. KOSMS (Kobe Steel Management Simulation System), a training system designed to help improve the management skills of senior management staff, is a unique management simulation training system in which the participants, using personal computers, must make decisions concerning a variety of management activities in simulated competition with other corporations. This report outlines the KOSMS system, describes the basic structure and detailed contents of the management simulation models, and discusses actual applications of KOSMS management simulation training.

  12. Using Open Source Software in Visual Simulation Development

    DTIC Science & Technology

    2005-09-01

    increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a

  13. Simulation with Python on transverse modes of the symmetric confocal resonator

    NASA Astrophysics Data System (ADS)

    Wang, Qing Hua; Qi, Jing; Ji, Yun Jing; Song, Yang; Li, Zhenhua

    2017-08-01

    Python is a popular open-source programming language that can be used to simulate various optical phenomena. We have developed a suite of programs to support the teaching of a course on laser principles. The complicated transverse modes of the symmetric confocal resonator can be visualized on personal computers, which helps students understand the pattern distribution of a laser resonator.

  14. Assessing Functional Performance Using Computer-Based Simulations of Everyday Activities

    PubMed Central

    Czaja, Sara J.; Loewenstein, David A.; Lee, Chin Chin; Fu, Shih Hua; Harvey, Philip D.

    2016-01-01

    Current functional capacity (FC) measures for patients with schizophrenia typically involve informant assessments or are in paper-and-pencil format, requiring in-person administration by a skilled assessor. This approach presents logistic problems and limits the possibilities for remote assessment, an important issue for these patients. This study evaluated the feasibility of using a computer-based assessment battery, including simulations of everyday activities. The battery was compared to in-person standard assessments of cognition and FC with respect to baseline convergence and sensitivity to group differences. The battery, administered on a touch-screen computer, included measures of critical everyday activities: ATM Banking/Financial Management, Prescription Refill via Telephone/Voice Menu System, and Forms Completion (simulating a clinic and patient history form). The sample included 77 older adult patients with schizophrenia and 24 older adult healthy controls, who were administered the battery at two time points. The results indicated that the battery was sensitive to group differences in FC. Performance on the battery was also moderately correlated with standard measures of cognitive abilities and showed convergence with standard measures of FC, while demonstrating good test-retest reliability. Our results show that it is feasible to use technology-based assessment protocols with older adults and patients with schizophrenia. The battery overcomes logistic constraints associated with current FC assessment protocols, as it is computer-based, can be delivered remotely, and does not require a healthcare professional for administration. PMID:27913159

  15. Comparison of Different Methods of Grading a Level Turn Task on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Heath, Bruce E.; Crier, Tomyka

    2003-01-01

    With the advancements in the computing power of personal computers, PC-based flight simulators and trainers have opened new avenues in the training of airplane pilots. It may be desirable to have the flight simulator make a quantitative evaluation of the progress of a pilot's training, thereby reducing the physical requirement of the flight instructor who must, in turn, watch every flight. In an experiment, university students conducted six different flights, each consisting of two level turns. The flights were three minutes in duration. By evaluating videotapes, two certified flight instructors provided separate letter grades for each turn. These level turns were also evaluated using two computer-based grading methods. One method determined automated grades based on prescribed tolerances in bank angle, airspeed, and altitude. The other method used deviations in altitude and bank angle to compute a performance index and performance grades.
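
    A minimal sketch of the first computer-based method, tolerance-band grading; the tolerance values and the score-to-letter mapping are hypothetical, not those used in the study.

```python
# Tolerance-band grading sketch for a level turn: count the fraction of
# samples within all bands, then map that fraction to a letter grade.
# The tolerances and mapping are hypothetical.
def grade_turn(samples, target_alt_ft, target_bank_deg, target_ias_kt):
    """samples: list of (altitude_ft, bank_deg, airspeed_kt) tuples."""
    tol = {"alt": 100.0, "bank": 5.0, "ias": 10.0}
    within = sum(
        abs(alt - target_alt_ft) <= tol["alt"]
        and abs(bank - target_bank_deg) <= tol["bank"]
        and abs(ias - target_ias_kt) <= tol["ias"]
        for alt, bank, ias in samples
    )
    score = within / len(samples)                   # fraction of time in band
    return "ABCDF"[min(4, int((1.0 - score) * 5))]  # fraction -> letter grade

print(grade_turn([(3010, 28, 102), (3150, 35, 95)], 3000, 30, 100))  # -> "C"
```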

  16. Linear Scaling Density Functional Calculations with Gaussian Orbitals

    NASA Technical Reports Server (NTRS)

    Scuseria, Gustavo E.

    1999-01-01

    Recent advances in linear scaling algorithms that circumvent the computational bottlenecks of large-scale electronic structure simulations make it possible to carry out density functional calculations with Gaussian orbitals on molecules containing more than 1000 atoms and 15000 basis functions using current workstations and personal computers. This paper discusses the recent theoretical developments that have led to these advances and demonstrates in a series of benchmark calculations the present capabilities of state-of-the-art computational quantum chemistry programs for the prediction of molecular structure and properties.

  17. Modeling and simulation in biomedicine.

    PubMed Central

    Aarts, J.; Möller, D.; van Wijk van Brievingh, R.

    1991-01-01

    A group of researchers and educators in The Netherlands, Germany and Czechoslovakia have developed and adapted mathematical computer models of phenomena in the field of physiology and biomedicine for use in higher education. The models are graphical and highly interactive, and are all written in TurboPascal or the mathematical simulation language PSI. An educational shell has been developed to launch the models. The shell allows students to interact with the models and teachers to edit the models, to add new models and to monitor the achievements of the students. The models and the shell have been implemented on an MS-DOS personal computer. This paper describes the features of the modeling package and presents the modeling and simulation of the heart muscle as an example. PMID:1807745

  18. A PC-based simulation of the National Transonic Facility's safety microprocessor

    NASA Technical Reports Server (NTRS)

    Thibodeaux, J. J.; Kilgore, W. A.; Balakrishna, S.

    1993-01-01

    A brief study was undertaken to demonstrate the feasibility of using a state-of-the-art, off-the-shelf, high-speed personal computer to simulate a microprocessor presently used for wind tunnel safety purposes at Langley Research Center's National Transonic Facility (NTF). Currently, there is no active display of tunnel alarm/alert safety information provided to the tunnel operators; rather, such information is periodically recorded on a process-monitoring computer printout. This neither provides on-line situational information nor permits rapid identification of safety violations that can halt tunnel operations. It was therefore decided to simulate the existing algorithms and briefly evaluate a real-time display which could provide both position and troubleshooting information.

  19. Progress and supercomputing in computational fluid dynamics; Proceedings of U.S.-Israel Workshop, Jerusalem, Israel, December 1984

    NASA Technical Reports Server (NTRS)

    Murman, E. M. (Editor); Abarbanel, S. S. (Editor)

    1985-01-01

    Current developments and future trends in the application of supercomputers to computational fluid dynamics are discussed in reviews and reports. Topics examined include algorithm development for personal-size supercomputers, a multiblock three-dimensional Euler code for out-of-core and multiprocessor calculations, simulation of compressible inviscid and viscous flow, high-resolution solutions of the Euler equations for vortex flows, algorithms for the Navier-Stokes equations, and viscous-flow simulation by FEM and related techniques. Consideration is given to marching iterative methods for the parabolized and thin-layer Navier-Stokes equations, multigrid solutions to quasi-elliptic schemes, secondary instability of free shear flows, simulation of turbulent flow, and problems connected with weather prediction.

  20. Analysis and Simulation of Narrowband GPS Jamming Using Digital Excision Temporal Filtering.

    DTIC Science & Technology

    1994-12-01

    the sequence of stored values from the P-code sampled at a 20 MHz rate. When correlated with a reference vector of the same length to simulate a GPS ...rate required for the GPS signals (20 MHz sampling rate for the P-code signal), the personal computer (PC) used to run the simulation could not perform...This subroutine is used to perform a fast FFT-based biased cross-correlation. Written by Capt Gerry Falen, USAF, 16 AUG 94

  1. Grace: A cross-platform micromagnetic simulator on graphics processing units

    NASA Astrophysics Data System (ADS)

    Zhu, Ru

    2015-12-01

    A micromagnetic simulator running on graphics processing units (GPUs) is presented. Unlike the GPU implementations of other research groups, which predominantly run on NVidia's CUDA platform, this simulator is developed with C++ Accelerated Massive Parallelism (C++ AMP) and is hardware-platform independent. It runs on GPUs from vendors including NVidia, AMD, and Intel, and achieves a significant performance boost compared to previous central processing unit (CPU) simulators, up to two orders of magnitude. The simulator paves the way for running large micromagnetic simulations on both high-end workstations with dedicated graphics cards and low-end personal computers with integrated graphics cards, and it is freely available to download.

  2. Goal Orientation Framing and Its Influence on Performance

    DTIC Science & Technology

    2012-12-01

    first-person shooter computer games Call of Duty: Modern Warfare 2 and Call of Duty: Modern Warfare 3. During the simulation, participants were...working understanding of social expectations and norms (Duda & Nicholls, 1992). It is true that obese people, drug addicts and abusive parents exist in...Monterey Student Activity Center. B. MEASURES: Performance was assessed in two tests, a math test and a first-person shooter game. It was the intent of

  3. Communication interval selection in distributed heterogeneous simulation of large-scale dynamical systems

    NASA Astrophysics Data System (ADS)

    Lucas, Charles E.; Walters, Eric A.; Jatskevich, Juri; Wasynczuk, Oleg; Lamm, Peter T.

    2003-09-01

    In this paper, a new technique useful for the numerical simulation of large-scale systems is presented. This approach enables the overall system simulation to be formed by the dynamic interconnection of the various interdependent simulations, each representing a specific component or subsystem such as control, electrical, mechanical, hydraulic, or thermal. Each simulation may be developed separately using possibly different commercial-off-the-shelf simulation programs thereby allowing the most suitable language or tool to be used based on the design/analysis needs. These subsystems communicate the required interface variables at specific time intervals. A discussion concerning the selection of appropriate communication intervals is presented herein. For the purpose of demonstration, this technique is applied to a detailed simulation of a representative aircraft power system, such as that found on the Joint Strike Fighter (JSF). This system is comprised of ten component models each developed using MATLAB/Simulink, EASY5, or ACSL. When the ten component simulations were distributed across just four personal computers (PCs), a greater than 15-fold improvement in simulation speed (compared to the single-computer implementation) was achieved.
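
    The core mechanism can be sketched under simplifying assumptions: two coupled first-order subsystems integrate independently and exchange interface variables only at the communication interval, holding the other side's last value (zero-order hold) in between. Running the sketch with coarser intervals shows the accuracy-versus-speed trade-off that the interval selection discussion addresses.

```python
# Loosely coupled co-simulation sketch: two first-order subsystems integrate
# independently and exchange interface variables only every `comm_interval`
# seconds. The subsystem dynamics are invented for illustration.
def cosimulate(comm_interval, t_end=5.0, dt=0.001):
    x1, x2 = 1.0, 0.0               # subsystem states
    u1, u2 = x2, x1                 # frozen (zero-order-hold) interface values
    t, next_exchange = 0.0, 0.0
    while t < t_end:
        if t >= next_exchange:      # the message exchange between simulations
            u1, u2 = x2, x1
            next_exchange += comm_interval
        x1 += dt * (-2.0 * x1 + u1)  # each side integrates with held inputs
        x2 += dt * (-1.0 * x2 + u2)
        t += dt
    return x1, x2

for h in (0.001, 0.05, 0.5):        # coarser intervals trade accuracy for speed
    x1, x2 = cosimulate(h)
    print(f"comm interval {h:5.3f}: x1 = {x1:.4f}, x2 = {x2:.4f}")
```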

  4. Rotordynamics on the PC: Transient Analysis With ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

    Personal computers can now do many jobs that formerly required a large mainframe computer. An example is NASA Lewis Research Center's program Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze the dynamic motion of up to five rotating shafts. As originally written in the early 1980's, this program was considered large for the mainframe computers of the time. ARDS, which was written in Fortran 77, has been successfully ported to a 486 personal computer. Plots appear on the computer monitor via calls programmed for the original CALCOMP plotter; plots can also be output on a standard laser printer. The executable code, which uses the full array sizes of the mainframe version, easily fits on a high-density floppy disk. The program runs under DOS with an extended memory manager. In addition to transient analysis of blade loss, step turns, and base acceleration, with simulation of squeeze-film dampers and rubs, ARDS calculates natural frequencies and unbalance response.

  5. Simulating smokers' acceptance of modifications in a cessation program.

    PubMed Central

    Spoth, R

    1992-01-01

    Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed. PMID:1738813

  6. Simulating smokers' acceptance of modifications in a cessation program.

    PubMed

    Spoth, R

    1992-01-01

    Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed.
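
    A first-choice conjoint simulation of the kind described in the records above can be sketched as follows: each respondent's utility for a program version is the sum of the part-worth utilities of its features, and the version with the higher total utility wins that respondent. The part-worths and feature bundles below are invented, not the study's 30-feature data.

```python
import numpy as np

rng = np.random.default_rng(4)

# First-choice conjoint simulation sketch; all numbers are invented.
n = 218                                       # matches the primary sample size
partworths = {                                # per-respondent part-worth utilities
    "cost_low":      rng.normal( 1.0, 0.5, n),
    "cost_high":     rng.normal(-1.0, 0.5, n),
    "support_group": rng.normal( 0.4, 0.6, n),
    "self_help":     rng.normal( 0.6, 0.4, n),
}
programs = {                                  # hypothetical feature bundles
    "baseline": ["cost_low", "self_help"],
    "modified": ["cost_high", "self_help", "support_group"],
}
utility = {name: sum(partworths[f] for f in feats)
           for name, feats in programs.items()}
choices = np.where(utility["baseline"] >= utility["modified"],
                   "baseline", "modified")
for name in programs:
    print(f"{name} share of preference: {np.mean(choices == name):.2f}")
```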

  7. Virtual rounds: simulation-based education in procedural medicine

    NASA Astrophysics Data System (ADS)

    Shaffer, David W.; Meglan, Dwight A.; Ferrell, Margaret; Dawson, Steven L.

    1999-07-01

    Computer-based simulation is a goal for training physicians in specialties where traditional training puts patients at risk. Intuitively, interactive simulation of anatomy, pathology, and therapeutic actions should lead to shortening of the learning curve for novice or inexperienced physicians. Effective transfer of knowledge acquired in simulators must be shown for such devices to be widely accepted in the medical community. We have developed an Interventional Cardiology Training Simulator which incorporates real-time graphic interactivity coupled with haptic response, and an embedded curriculum permitting rehearsal, hypertext links, personal archiving and instructor review and testing capabilities. This linking of purely technical simulation with educational content creates a more robust educational purpose for procedural simulators.

  8. Road sign recognition during computer testing versus driving simulator performance for stroke and stroke+aphasia groups.

    DOT National Transportation Integrated Search

    2015-07-01

    Driving is essential to maintaining independence. For most Americans preserving personal mobility is a : key element to retaining jobs, friends, activities and the basic necessities to maintain a household. This : is particularly true for older peopl...

  9. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances

    PubMed Central

    Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268

  10. Closed-loop controller for chest compressions based on coronary perfusion pressure: a computer simulation study.

    PubMed

    Wang, Chunfei; Zhang, Guang; Wu, Taihu; Zhan, Ningbo; Wang, Yaling

    2016-03-01

    High-quality cardiopulmonary resuscitation contributes to cardiac arrest survival. The traditional chest compression (CC) standard, which neglects individual differences, uses unified standards for compression depth and compression rate in practice. In this study, an effective and personalized CC method for automatic mechanical compression devices is provided. We rebuild Charles F. Babbs' human circulation model with a coronary perfusion pressure (CPP) simulation module and propose a closed-loop controller based on a fuzzy control algorithm for CCs, which adjusts the CC depth according to the CPP. The performance of the fuzzy controller is evaluated in computer simulation studies against a traditional proportional-integral-derivative (PID) controller. The simulation results demonstrate that the fuzzy closed-loop controller achieves shorter regulation time, fewer oscillations, and smaller overshoot than the traditional PID controller and outperforms it for CPP regulation and maintenance.
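
    A toy illustration of the closed-loop idea, assuming triangular membership functions, a three-rule table, centroid defuzzification, and a crude monotone depth-to-CPP response in place of the Babbs circulation model; none of the numbers are from the paper.

```python
# Toy fuzzy controller: adjust chest-compression depth from the coronary
# perfusion pressure (CPP) error. Memberships, rules, and the "patient"
# response are invented stand-ins for the Babbs circulation model.
def tri(x, a, b, c):
    """Triangular membership function with corners (a, b, c)."""
    return max(0.0, min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)))

def fuzzy_depth_step(cpp_error):
    # error = target CPP - measured CPP, in mmHg
    mu = {"neg": tri(cpp_error, -30, -15, 0),
          "zero": tri(cpp_error, -10, 0, 10),
          "pos": tri(cpp_error, 0, 15, 30)}
    action = {"neg": -0.2, "zero": 0.0, "pos": 0.2}   # depth change, cm
    return sum(mu[k] * action[k] for k in mu) / (sum(mu.values()) + 1e-9)

depth, target_cpp = 5.0, 20.0
for _ in range(30):
    cpp = 4.0 * depth - 4.0                  # toy monotone depth-to-CPP map
    depth += fuzzy_depth_step(target_cpp - cpp)
    depth = max(3.8, min(6.0, depth))        # keep depth in guideline bounds
print(f"settled at depth {depth:.2f} cm, CPP {4.0 * depth - 4.0:.1f} mmHg")
```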

  11. Implementation of the EM Algorithm in the Estimation of Item Parameters: The BILOG Computer Program.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Bock, R. Darrell

    This paper reviews the basic elements of the EM approach to estimating item parameters and illustrates its use with one simulated and one real data set. In order to illustrate the use of the BILOG computer program, runs for 1-, 2-, and 3-parameter models are presented for the two sets of data. First is a set of responses from 1,000 persons to five…

  12. Computer program for the reservoir model of metabolic crossroads.

    PubMed

    Ribeiro, J M; Juzgado, D; Crespo, E; Sillero, A

    1990-01-01

    A program containing 344 statements, written in BASIC and adapted to run on personal computers (PCs), has been developed to simulate the reservoir model of metabolic crossroads. The program draws the holes of the reservoir with shapes reflecting the Vmax, Km (S0.5), and cooperativity coefficients (n) of the enzymes, and it calculates both the actual velocities and the percentage contribution of every enzyme to the overall removal of their common substrate.
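
    The calculation the program performs can be sketched in a few lines: each enzyme's velocity follows a Hill equation v = Vmax·S^n / (S0.5^n + S^n), and each enzyme's share of total substrate removal is reported as a percentage. The parameter values below are invented.

```python
# Metabolic-crossroads sketch: several enzymes compete for one substrate S;
# each velocity follows a Hill equation, and contributions are reported as
# percentages of the summed velocities. Parameters are invented.
enzymes = {
    #  name        Vmax   S0.5   n (cooperativity)
    "enzyme A": (10.0, 0.5, 1.0),
    "enzyme B": ( 4.0, 2.0, 2.5),
    "enzyme C": ( 7.0, 1.0, 1.0),
}

def velocity(S, Vmax, S05, n):
    return Vmax * S**n / (S05**n + S**n)

S = 1.0                                   # common substrate concentration
v = {name: velocity(S, *p) for name, p in enzymes.items()}
total = sum(v.values())
for name, vi in v.items():
    print(f"{name}: v = {vi:5.2f}, {100 * vi / total:5.1f}% of total removal")
```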

  13. From good intentions to healthy habits: towards integrated computational models of goal striving and habit formation.

    PubMed

    Pirolli, Peter

    2016-08-01

    Computational models were developed in the ACT-R neurocognitive architecture to address some aspects of the dynamics of behavior change. The simulations aim to address the day-to-day goal achievement data available from mobile health systems. The models refine current psychological theories of self-efficacy, intended effort, and habit formation, and provide an account for the mechanisms by which goal personalization, implementation intentions, and remindings work.

  14. Optimizing Adversary Training and the Structure of the Navy Adversary Fleet

    DTIC Science & Technology

    2013-09-01

    OPNAV N98, 2000 Navy Pentagon, Room 5C469, Washington DC 20350...an overhaul of existing computers and encryption in the range operations centers (CDR R. Van Diepen, OPNAV Simulator Requirements Officer, personal...Using a simulated annealing heuristic algorithm in conjunction with the utility assignments, CNA found, in order of priority, that the following

  15. Integrative approaches to computational biomedicine

    PubMed Central

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.

  16. A Placer-Gold Evaluation Exercise.

    ERIC Educational Resources Information Center

    Tunley, A. Tom

    1984-01-01

    A laboratory exercise allowing students to use drillhole data to simulate the process of locating a placer gold paystreak is presented. As part of the activity students arithmetically compute the value of their gold, mining costs, and personal profits or losses, and decide on development plans for the claim. (BC)

  17. THREE-DIMENSIONAL COMPUTATIONAL FLUID DYNAMICS SIMULATIONS OF LOCAL PARTICLE DEPOSITION PATTERNS IN LUNG AIRWAYS

    EPA Science Inventory

    EPA has identified respirable particulate matter (PM) as a significant threat to human health, particularly in the elderly, in children, and in persons with respiratory disease. However, deposition of PM in the respiratory system is highly variable, depending upon particle chara...

  18. Field Systems Research: Sport Pedagogy Perspectives.

    ERIC Educational Resources Information Center

    Locke, Lawrence F.; And Others

    1992-01-01

    These articles contain responses from several scholars on the issue of field systems analysis (FSA). The scholars offer critiques from their sport pedagogy perspectives, a reaction relating FSA to personal examinations of teaching expertise, and a discussion of how computer simulation informs the study of expert teachers. (SM)

  19. Assessing Teaching Skills with a Mobile Simulation

    ERIC Educational Resources Information Center

    Gibson, David

    2013-01-01

    Because mobile technologies are overtaking personal computers as the primary tools of Internet access, and cloud-based resources are fundamentally transforming the world's knowledge, new forms of teaching and assessment are required to foster 21st century literacies, including those needed by K-12 teachers. A key feature of mobile technology…

  20. The Influence of Simulated Home and Neighbourhood Densification on Perceived Liveability

    ERIC Educational Resources Information Center

    Thomas, J. A.; Walton, D.; Lamb, S.

    2011-01-01

    This study experimentally manipulated neighbourhood density and home location to reveal the effect of these changes on perceived liveability. Two hypothetical scenarios were provided to 106 households using a Computer-Aided Personal Interview (CAPI). The first scenario examined a densification of the participant's current property, and the second…

  1. Argonne simulation framework for intelligent transportation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, T.; Doss, E.; Hanebutte, U.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a standalone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe-vehicle tracking capabilities (display position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that incorporate human-factors studies to support safety and operational research. Realistic modeling of variations in the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
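
    A toy sketch of the vehicles-as-autonomous-processes design, using asyncio tasks and message queues in place of true distributed processes; the vehicle behavior, message format, and congestion rule are invented for illustration.

```python
import asyncio
import random

# Vehicles are independent tasks that drive a route, report their position to
# a Traffic Management Center (TMC), and react to advisories sent back over
# per-vehicle message queues. All names and numbers are illustrative.
async def vehicle(vid, inbox, tmc_inbox):
    position, speed = 0.0, random.uniform(25.0, 35.0)
    for _ in range(20):
        try:                                   # react to TMC advisories
            advisory = inbox.get_nowait()
            speed = min(speed, advisory["speed_limit"])
        except asyncio.QueueEmpty:
            pass
        position += speed
        await tmc_inbox.put({"id": vid, "pos": position})  # probe report
        await asyncio.sleep(0)                 # yield to the other tasks

async def tmc(vehicle_inboxes, tmc_inbox, n_msgs):
    for _ in range(n_msgs):
        report = await tmc_inbox.get()
        if report["pos"] > 300:                # congestion ahead: advise slowdown
            await vehicle_inboxes[report["id"]].put({"speed_limit": 15.0})

async def main():
    tmc_inbox = asyncio.Queue()
    inboxes = [asyncio.Queue() for _ in range(3)]
    cars = [vehicle(i, inboxes[i], tmc_inbox) for i in range(3)]
    await asyncio.gather(tmc(inboxes, tmc_inbox, 60), *cars)

asyncio.run(main())
```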

  2. Neural networks as a control methodology

    NASA Technical Reports Server (NTRS)

    Mccullough, Claire L.

    1990-01-01

    While conventional computers must be programmed in a logical fashion by a person who thoroughly understands the task to be performed, the motivation behind neural networks is to develop machines which can train themselves to perform tasks, using available information about desired system behavior and learning from experience. There are three goals of this fellowship program: (1) to evaluate various neural net methods and generate computer software to implement those deemed most promising on a personal computer equipped with Matlab; (2) to evaluate methods currently in the professional literature for system control using neural nets to choose those most applicable to control of flexible structures; and (3) to apply the control strategies chosen in (2) to a computer simulation of a test article, the Control Structures Interaction Suitcase Demonstrator, which is a portable system consisting of a small flexible beam driven by a torque motor and mounted on springs tuned to the first flexible mode of the beam. Results of each are discussed.

  3. A new approach for data acquisition at the JPL space simulators

    NASA Technical Reports Server (NTRS)

    Fisher, Terry C.

    1992-01-01

    In 1990, a personal computer based data acquisition system was put into service for the Space Simulators and Environmental Test Laboratory at the Jet Propulsion Laboratory (JPL) in Pasadena, California. The new system replaced an outdated minicomputer system which had been in use since 1980. This new data acquisition system was designed and built by JPL for the specific task of acquiring thermal test data in support of space simulation and thermal vacuum testing at JPL. The data acquisition system was designed using powerful personal computers and local-area-network (LAN) technology. Reliability, expandability, and maintainability were some of the most important criteria in the design of the data system and in the selection of hardware and software components. The data acquisition system is used to record both test chamber operational data and thermal data from the unit under test. Tests are conducted in numerous small thermal vacuum chambers and in the large solar simulator and range in size from individual components using only 2 or 3 thermocouples to entire planetary spacecraft requiring in excess of 1200 channels of test data. The system supports several of these tests running concurrently. The previous data system is described along with reasons for its replacement, the types of data acquired, the new data system, and the benefits obtained from the new system including information on tests performed to date.

  4. Evolution of computational chemistry, the "launch pad" to scientific computational models: The early days from a personal account, the present status from the TACC-2012 congress, and eventual future applications from the global simulation approach

    NASA Astrophysics Data System (ADS)

    Clementi, Enrico

    2012-06-01

    This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium", where we discuss the evolution of "computational chemistry". Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12 by recalling some of the computational chemistry contributions by the author and his collaborators (from the late 1950s to the mid 1990s); perturbation techniques are not considered in this already extended work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.

  5. Loci-STREAM Version 0.9

    NASA Technical Reports Server (NTRS)

    Wright, Jeffrey; Thakur, Siddharth

    2006-01-01

    Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.

  6. Using Computer Simulations for Investigating a Sex Education Intervention: An Exploratory Study

    PubMed Central

    Bullock, Seth; Graham, Cynthia A; Ingham, Roger

    2017-01-01

    Background: Sexually transmitted infections (STIs) are an ongoing concern. The best method for preventing the transmission of these infections is the correct and consistent use of condoms. Few studies have explored the use of games in interventions for increasing condom use by challenging the false sense of security associated with judging the presence of an STI based on attractiveness. Objectives: The primary purpose of this study was to explore the potential use of computer simulation as a serious game for sex education. Specific aims were to (1) study the influence of a newly designed serious game on self-rated confidence for assessing STI risk and (2) examine whether this varied by gender, age, and scores on sexuality-related personality trait measures. Methods: This Web-based questionnaire study employed between- and within-subject analyses. A Web-based platform hosted in the United Kingdom was used to deliver male and female stimuli (facial photographs) and collect data. A convenience sample of 66 participants (64% male, 42/66; mean age 22.5 years) completed the Term on the Tides, a computer simulation developed for this study. Participants also completed questionnaires on demographics, sexual preferences, sexual risk evaluations, the Sexual Sensation Seeking Scale (SSS), and the Sexual Inhibition Subscale 2 (SIS2) of the Sexual Inhibition/Sexual Excitation Scales-Short Form (SIS/SES-SF). Results: The overall confidence of participants to evaluate sexual risks was reduced after playing the game (P<.005). Age and personality trait measures did not predict the change in confidence in evaluating risk. Women demonstrated larger shifts in confidence than did men (P=.03). Conclusions: This study extends the literature by investigating the potential of computer simulations as a serious game for sex education. Engaging in the Term on the Tides game had an impact on participants' confidence in evaluating sexual risks. PMID:28468747

  7. Development and Assessment of a Novel Training Package for Basic Maneuvering Tasks on a Flight Simulator Using Self Instruction Methods and Above Real Time Training (ARTT)

    NASA Technical Reports Server (NTRS)

    Ali, Syed Firasat; Khan, M. Javed; Rossi, Marcia J.; Heath, Bruce e.; Crane, Peter; Ward, Marcus; Crier, Tomyka; Knighten, Tremaine; Culpepper, Christi

    2007-01-01

    One result of the relatively recent advances in computing technology has been the decreasing cost of computers and increasing computational power. This has allowed high-fidelity airplane simulations to be run on personal computers (PCs). Thus, simulators are now used routinely by pilots to substitute simulated flight hours for real flight hours when training for an aircraft type rating, thereby reducing the cost of flight training. However, FAA regulations require that such substitution training be supervised by Certified Flight Instructors (CFI). If the CFI presence could be reduced or eliminated for certain tasks, this would mean a further cost savings to the pilot. This would require the flight simulator to have a certain level of 'intelligence' in order to provide feedback on pilot performance similar to that of a CFI. The 'intelligent' flight simulator would have at least the capability to use data gathered from the flight to create a measure of the student pilot's performance. Also, to fully utilize the advances in computational power, the simulator would be capable of interacting with the student pilot using the best possible training interventions. This thesis reports on two studies conducted at Tuskegee University investigating the effects of interventions on the learning of two flight maneuvers on a flight simulator, and the robustness and accuracy of calculated performance indices as compared to CFI evaluations of performance. The intent of these studies is to take a step toward creating an 'intelligent' flight simulator. The first study compares the performance of novice pilots trained at different levels of above-real-time simulation to execute a level S-turn. The second study examined the effect of out-of-the-window (OTW) visual cues, in the form of hoops, on the performance of novice pilots learning to fly a landing approach on the flight simulator. The reliability/robustness of the computed performance metrics was assessed by comparing them with evaluations of the landing approach maneuver by a number of CFIs.

  8. Simulation of Dual Firing of Hydrogen and JP-8 in a Swirling Combustor

    DTIC Science & Technology

    2012-06-14

    completed using the Ansys CFX computational fluid dynamics software. The total Lower Heating Value of the fuel mixture is maintained at a constant 6 kW...

  9. RFA Guardian: Comprehensive Simulation of Radiofrequency Ablation Treatment of Liver Tumors.

    PubMed

    Voglreiter, Philip; Mariappan, Panchatcharam; Pollari, Mika; Flanagan, Ronan; Blanco Sequeiros, Roberto; Portugaller, Rupert Horst; Fütterer, Jurgen; Schmalstieg, Dieter; Kolesnik, Marina; Moche, Michael

    2018-01-15

    The RFA Guardian is a comprehensive application for high-performance patient-specific simulation of radiofrequency ablation of liver tumors. We address a wide range of usage scenarios. These include pre-interventional planning, sampling of the parameter space for uncertainty estimation, treatment evaluation and, in the worst case, failure analysis. The RFA Guardian is the first of its kind that exhibits sufficient performance for simulating treatment outcomes during the intervention. We achieve this by combining a large number of high-performance image processing, biomechanical simulation and visualization techniques into a generalized technical workflow. Further, we wrap the feature set into a single, integrated application, which exploits all available resources of standard consumer hardware, including massively parallel computing on graphics processing units. This allows us to predict or reproduce treatment outcomes on a single personal computer with high computational performance and high accuracy. The resulting low demand for infrastructure enables easy and cost-efficient integration into the clinical routine. We present a number of evaluation cases from the clinical practice where users performed the whole technical workflow from patient-specific modeling to final validation and highlight the opportunities arising from our fast, accurate prediction techniques.

  10. Designing and using computer simulations in medical education and training: an introduction.

    PubMed

    Friedl, Karl E; O'Neil, Harold F

    2013-10-01

    Computer-based technologies informed by the science of learning are becoming increasingly prevalent in education and training. For the Department of Defense (DoD), this presents a great potential advantage to the effective preparation of a new generation of technologically enabled service members. Military medicine has broad education and training challenges, ranging from first aid and personal protective skills for every service member to specialized combat medic training; many of these challenges can be met with gaming and simulation technologies that this new generation has embraced. However, comprehensive use of medical games and simulation to augment expert mentorship is still limited to elite medical provider training programs, but can be expected to become broadly used in the training of first responders and allied health care providers. The purpose of this supplement is to review the use of computer games and simulation to teach and assess medical knowledge and skills. This review and other DoD research policy sources will form the basis for development of a research and development road map and guidelines for use of this technology in military medicine.

  11. Automation of a Wave-Optics Simulation and Image Post-Processing Package on Riptide

    NASA Astrophysics Data System (ADS)

    Werth, M.; Lucas, J.; Thompson, D.; Abercrombie, M.; Holmes, R.; Roggemann, M.

    Detailed wave-optics simulations and image post-processing algorithms are computationally expensive and benefit from the massively parallel hardware available at supercomputing facilities. We created an automated system that interfaces with the Maui High Performance Computing Center (MHPCC) Distributed MATLAB® Portal interface to submit massively parallel wave-optics simulations to the IBM iDataPlex (Riptide) supercomputer. This system subsequently post-processes the output images with an improved version of physically constrained iterative deconvolution (PCID) and analyzes the results using a series of modular algorithms written in Python. With this architecture, a single person can simulate thousands of unique scenarios and produce analyzed, archived, and briefing-compatible output products with very little effort. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

  12. Computational Modeling of Aerosol Hazard Arising from the Opening of an Anthrax Letter in an Open-Office Complex

    NASA Astrophysics Data System (ADS)

    Lien, F. S.; Ji, H.; Yee, E.

    Early experimental work, conducted at Defence R&D Canada — Suffield, measured and characterized the personal and environmental contamination associated with the simulated opening of anthrax-tainted letters under a number of different scenarios. A better understanding of the physical and biological processes involved is of considerable significance for detecting, assessing, and formulating potential mitigation strategies for managing these risks. These preliminary experimental investigations have been extended to simulate the contamination from the opening of anthrax-tainted letters in an open-office environment using Computational Fluid Dynamics (CFD). Bacillus globigii (BG) was used as a biological simulant for anthrax, with 0.1 gram of the simulant released from opened letters in the experiments conducted. The accuracy of the model for prediction of the spatial distribution of BG spores in the office is first assessed quantitatively by comparison with measured SF6 concentrations (the baseline experiment), and then qualitatively by comparison with measured BG concentrations obtained under a number of scenarios, some involving people moving within various offices.

  13. Visuospatial skills and computer game experience influence the performance of virtual endoscopy.

    PubMed

    Enochsson, Lars; Isaksson, Bengt; Tour, René; Kjellin, Ann; Hedman, Leif; Wredmark, Torsten; Tsai-Felländer, Li

    2004-11-01

    Advanced medical simulators have been introduced to facilitate surgical and endoscopic training and thereby improve patient safety. Residents trained in the Procedicus Minimally Invasive Surgical Trainer-Virtual Reality (MIST-VR) laparoscopic simulator perform laparoscopic cholecystectomy more safely and quickly than a control group. Little has been reported regarding whether factors like gender, computer experience, and visuospatial tests can predict performance with a medical simulator. Our aim was to investigate whether such factors influence the performance of simulated gastroscopy. Seventeen medical students were asked about computer gaming experience. Before virtual endoscopy, they performed the visuospatial test PicSOr, which discriminates the ability of the tested person to create a three-dimensional image from a two-dimensional presentation. Each student performed one gastroscopy (level 1, case 1) in the GI Mentor II, Simbionix, and several variables related to performance were registered. Percentage of time spent with a clear view in the endoscope correlated well with performance on the PicSOr test (r = 0.56, P < 0.001). Efficiency of screening also correlated with PicSOr (r = 0.23, P < 0.05). In students with computer gaming experience, the efficiency of screening was higher (33.6% +/- 3.1% versus 22.6% +/- 2.8%, P < 0.05) and the duration of the examination was 1.5 minutes shorter (P < 0.05). A similar trend was seen in men compared with women. The visuospatial test PicSOr predicts the results with the endoscopic simulator GI Mentor II. Two-dimensional image experience, as in computer games, also seems to affect the outcome.

  14. A modular finite-element model (MODFE) for areal and axisymmetric ground-water-flow problems, Part 3: Design philosophy and programming details

    USGS Publications Warehouse

    Torak, L.J.

    1993-01-01

    A MODular Finite-Element, digital-computer program (MODFE) was developed to simulate steady or unsteady-state, two-dimensional or axisymmetric ground-water flow. The modular structure of MODFE places the computationally independent tasks that are performed routinely by digital-computer programs simulating ground-water flow into separate subroutines, which are executed from the main program by control statements. Each subroutine consists of complete sets of computations, or modules, which are identified by comment statements, and can be modified by the user without affecting unrelated computations elsewhere in the program. Simulation capabilities can be added or modified by either adding or modifying subroutines that perform specific computational tasks, and the modular-program structure allows the user to create versions of MODFE that contain only the simulation capabilities that pertain to the ground-water problem of interest. MODFE is written in a Fortran programming language that makes it virtually device independent and compatible with desk-top personal computers and large mainframes. MODFE uses computer storage and execution time efficiently by taking advantage of symmetry and sparseness within the coefficient matrices of the finite-element equations. Parts of the matrix coefficients are computed and stored as single-subscripted variables, which are assembled into a complete coefficient just prior to solution. Computer storage is reused during simulation to decrease storage requirements. Descriptions of subroutines that execute the computational steps of the modular-program structure are given in tables that cross reference the subroutines with particular versions of MODFE. Programming details of linear and nonlinear hydrologic terms are provided. Structure diagrams for the main programs show the order in which subroutines are executed for each version and illustrate some of the linear and nonlinear versions of MODFE that are possible. Computational aspects of changing stresses and boundary conditions with time and of mass-balance and error terms are given for each hydrologic feature. Program variables are listed and defined according to their occurrence in the main programs and in subroutines. Listings of the main programs and subroutines are given.
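
    The storage scheme described above can be illustrated with a small sketch. This is a schematic Python illustration, not MODFE's Fortran: symmetric coefficients are held as single-subscripted (1-D) arrays and expanded into the full matrix only just before solution; the band values and right-hand side are made up.

    ```python
    import numpy as np

    # 1-D ("single-subscripted") storage of a symmetric banded coefficient matrix
    diag = np.array([4.0, 4.0, 4.0, 4.0])   # diagonal coefficients
    off = np.array([-1.0, -1.0, -1.0])      # sub-diagonal band (mirrored by symmetry)

    def assemble(diag, off):
        """Expand the 1-D stored parts into the full matrix just prior to solution."""
        A = np.diag(diag)
        k = np.arange(len(off))
        A[k + 1, k] = off                   # lower band
        A[k, k + 1] = off                   # upper band, by symmetry
        return A

    b = np.ones(len(diag))                  # source/boundary terms (illustrative)
    heads = np.linalg.solve(assemble(diag, off), b)
    print(heads)
    ```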

  15. Surgical simulation software for insertion of pedicle screws.

    PubMed

    Eftekhar, Behzad; Ghodsi, Mohammad; Ketabchi, Ebrahim; Rasaee, Saman

    2002-01-01

    As the first step toward finding noninvasive alternatives to the traditional methods of surgical training, we have developed a small, stand-alone computer program that simulates insertion of pedicle screws in different spinal vertebrae (T10-L5). We used Delphi 5.0 and the DirectX 7.0 extension for Microsoft Windows. This is a stand-alone and portable program. The program can run on most personal computers. It provides the trainee with visual feedback during practice of the technique. At present, it uses predefined three-dimensional images of the vertebrae, but we are attempting to adapt the program to three-dimensional objects based on real computed tomographic scans of the patients. The program can be downloaded at no cost from the web site www.tums.ac.ir/downloads. As a preliminary work, it requires further development, particularly toward better visual, auditory, and even proprioceptive feedback and use of the individual patient's data.

  16. Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems

    NASA Astrophysics Data System (ADS)

    Dogan, Firat; Atilgan, Yasemin

    Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capabilities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and all the other strengths of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade now serve individual needs inside “tiny smart boxes” at reasonable prices. However, resistance to learning new computerized environments, insufficient training, and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, the former and current developments in surgery planning and simulation tools are presented, and future directions and expectations for better electronic health care systems are investigated.

  17. Virtual reality for dermatologic surgery: virtually a reality in the 21st century.

    PubMed

    Gladstone, H B; Raugi, G J; Berg, D; Berkley, J; Weghorst, S; Ganter, M

    2000-01-01

    In the 20th century, virtual reality has predominantly played a role in training pilots and in the entertainment industry. Despite much publicity, virtual reality did not live up to its perceived potential. During the past decade, it has also been applied for medical uses, particularly as training simulators, for minimally invasive surgery. Because of advances in computer technology, virtual reality is on the cusp of becoming an effective medical educational tool. At the University of Washington, we are developing a virtual reality soft tissue surgery simulator. Based on fast finite element modeling and using a personal computer, this device can simulate three-dimensional human skin deformations with real-time tactile feedback. Although there are many cutaneous biomechanical challenges to solve, it will eventually provide more realistic dermatologic surgery training for medical students and residents than the currently used models.

  18. Independent component analysis-based source-level hyperlink analysis for two-person neuroscience studies

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Dai, Rui-Na; Xiao, Xiang; Zhang, Zong; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe

    2017-02-01

    Two-person neuroscience, a perspective in understanding human social cognition and interaction, involves designing immersive social interaction experiments as well as simultaneously recording brain activity of two or more subjects, a process termed "hyperscanning." Using newly developed imaging techniques, the interbrain connectivity or hyperlink of various types of social interaction has been revealed. Functional near-infrared spectroscopy (fNIRS)-hyperscanning provides a more naturalistic environment for experimental paradigms of social interaction and has recently drawn much attention. However, most fNIRS-hyperscanning studies have computed hyperlinks using sensor data directly while ignoring the fact that the sensor-level signals contain confounding noises, which may lead to a loss of sensitivity and specificity in hyperlink analysis. In this study, on the basis of independent component analysis (ICA), a source-level analysis framework is proposed to investigate the hyperlinks in a fNIRS two-person neuroscience study. The performance of five widely used ICA algorithms in extracting sources of interaction was compared in simulative datasets, and increased sensitivity and specificity of hyperlink analysis by our proposed method were demonstrated in both simulative and real two-person experiments.
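
    As a rough illustration of the source-level idea, the sketch below (Python, with invented channel counts, mixing matrices, and a shared "interaction" source) unmixes each simulated subject's recordings with FastICA and computes the hyperlink as a correlation between recovered sources rather than between raw channels.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 60, 3000)
    shared = np.sin(2 * np.pi * 0.1 * t)          # hypothetical interaction signal

    def simulate_subject(n_channels=8):
        """Mix one shared source with channel noise into sensor-level data."""
        sources = np.column_stack([shared, rng.normal(size=(len(shared), n_channels - 1))])
        mixing = rng.normal(size=(n_channels, n_channels))
        return sources @ mixing.T

    def best_source(recordings):
        """Unmix, then pick the component matching the known simulated source
        (possible here only because this is a simulation-validation setting)."""
        sources = FastICA(n_components=recordings.shape[1], random_state=0).fit_transform(recordings)
        corrs = [abs(np.corrcoef(s, shared)[0, 1]) for s in sources.T]
        return sources[:, int(np.argmax(corrs))]

    s1, s2 = best_source(simulate_subject()), best_source(simulate_subject())
    print("source-level hyperlink:", np.corrcoef(s1, s2)[0, 1])
    ```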

  19. Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norris, Edward T.; Liu, Xin, E-mail: xinliu@mst.edu; Hsieh, Jiang

    Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered gold-standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference in the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., low dose region). Simulations of the quadrature set 8 and the first order of the Legendre polynomial expansions proved to be the most efficient computation method in the authors’ study. The single-thread computation time of the deterministic simulation of the quadrature set 8 and the first order of the Legendre polynomial expansions was 21 min on a personal computer. Conclusions: The simulation results showed that the deterministic method can be effectively used to estimate the absorbed dose in a CTDI phantom. The accuracy of the discrete ordinates method was close to that of a Monte Carlo simulation, and the primary benefit of the discrete ordinates method lies in its rapid computation speed. It is expected that further optimization of this method in routine clinical CT dose estimation will improve its accuracy and speed.
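
    For reference, the comparison metric quoted above (a root-mean-square difference of around 2.4%) can be computed as below; this is a minimal sketch with made-up dose values, not the study's data.

    ```python
    import numpy as np

    dose_sn = np.array([10.2, 9.8, 9.5, 9.9, 10.1])   # discrete-ordinates estimates (made up)
    dose_mc = np.array([10.0, 10.0, 9.7, 9.8, 10.3])  # Monte Carlo benchmark (made up)

    rel_diff = (dose_sn - dose_mc) / dose_mc           # relative difference per voxel/region
    rms_percent = 100.0 * np.sqrt(np.mean(rel_diff ** 2))
    print(f"RMS difference: {rms_percent:.1f}%")
    ```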

  20. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.

  1. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.

  2. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    PubMed

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency R-E models are ideally suitable for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
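
    The eikonal ingredient of an R-E model can be illustrated with a toy solver. The sketch below is not the authors' implementation: it computes activation times T satisfying |∇T| = 1/v on a 2-D grid by simple Gauss-Seidel sweeping from a stimulus site, with grid size, spacing, and conduction velocity chosen arbitrarily.

    ```python
    import numpy as np

    n, h, v = 50, 0.5, 0.6                      # grid points, spacing (mm), velocity (mm/ms)
    f = h / v                                   # local travel time per grid step (ms)
    T = np.full((n, n), np.inf)
    T[n // 2, n // 2] = 0.0                     # stimulus site (arbitrary)

    def sweep(T):
        for i in range(n):
            for j in range(n):
                a = min(T[max(i - 1, 0), j], T[min(i + 1, n - 1), j])
                b = min(T[i, max(j - 1, 0)], T[i, min(j + 1, n - 1)])
                if np.isinf(a) and np.isinf(b):
                    continue                    # no upwind information yet
                if abs(a - b) >= f:
                    t_new = min(a, b) + f       # one-sided update
                else:
                    t_new = 0.5 * (a + b + np.sqrt(2 * f * f - (a - b) ** 2))
                T[i, j] = min(T[i, j], t_new)

    for _ in range(2 * n):                      # repeated sweeps until converged
        sweep(T)
    print("latest activation time (ms):", T.max())
    ```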

  3. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model

    NASA Astrophysics Data System (ADS)

    Neic, Aurel; Campos, Fernando O.; Prassl, Anton J.; Niederer, Steven A.; Bishop, Martin J.; Vigmond, Edward J.; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency R-E models are ideally suitable for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  4. Molecular Dynamics: New Frontier in Personalized Medicine.

    PubMed

    Sneha, P; Doss, C George Priya

    2016-01-01

    The field of drug discovery has witnessed immense development over the last decade, driven by the demand for novel, efficient lead compounds. Although the development of novel compounds in this field has seen many failures, a breakthrough in this area might be the establishment of personalized medicine. Personalized medicine has shown stupendous growth, becoming a hot topic after the successful completion of the Human Genome Project and the 1000 Genomes pilot project. Genomic variants such as SNPs play a vital role in inter-individual differences in disease susceptibility and drug response. Hence, identification of such genetic variants has to be performed before administration of a drug. This process requires high-end techniques that capture the complexity of the molecules involved and provide insight at the molecular level. To support this, the field of bioinformatics plays a crucial role in revealing the molecular mechanism of a mutation and thereby designing a drug for an individual in a fast and affordable manner. High-end computational methods, such as molecular dynamics (MD) simulation, have proved to be a constructive approach to detecting the minor changes associated with an SNP and to better understanding the structure-function relationship. The parameters used in MD simulation elucidate different properties of a macromolecule, such as protein stability and flexibility. MD along with docking analysis can reveal the synergistic effect of an SNP on protein-ligand interaction and provides a foundation for designing a particular drug molecule for an individual. This compelling application of computational power, together with the advent of other technologies, has paved a promising way toward personalized medicine. In this in-depth review, we highlight the different wings of MD toward personalized medicine. © 2016 Elsevier Inc. All rights reserved.

  5. Cognitive simulation as a tool for cognitive task analysis.

    PubMed

    Roth, E M; Woods, D D; Pople, H E

    1992-10-01

    Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.

  6. An Evaluation of Training Interventions and Computed Scoring Techniques for Grading a Level Turn Task and a Straight In Landing Approach on a PC-Based Flight Simulator

    NASA Technical Reports Server (NTRS)

    Heath, Bruce E.

    2007-01-01

    One result of the relatively recent advances in computing technology has been the decreasing cost of computers and increasing computational power. This has allowed high fidelity airplane simulations to be run on personal computers (PC). Thus, simulators are now used routinely by pilots to substitute real flight hours for simulated flight hours for training for an aircraft type rating, thereby reducing the cost of flight training. However, FAA regulations require that such substitution training must be supervised by Certified Flight Instructors (CFI). If the CFI presence could be reduced or eliminated for certain tasks, this would mean a further cost savings to the pilot. This would require that the flight simulator have a certain level of 'intelligence' in order to provide feedback on pilot performance similar to that of a CFI. The 'intelligent' flight simulator would have at least the capability to use data gathered from the flight to create a measure for the performance of the student pilot. Also, to fully utilize the advances in computational power, the simulator would be capable of interacting with the student pilot using the best possible training interventions. This thesis reports on two studies conducted at Tuskegee University investigating the effects of interventions on the learning of two flight maneuvers on a flight simulator and the robustness and accuracy of calculated performance indices as compared to CFI evaluations of performance. The intent of these studies is to take a step in the direction of creating an 'intelligent' flight simulator. The first study compares the performance of novice pilots trained at different levels of above-real-time simulation to execute a level S-turn. The second study examined the effect of out-of-the-window (OTW) visual cues in the form of hoops on the performance of novice pilots learning to fly a landing approach on the flight simulator. The reliability/robustness of the computed performance metrics was assessed by comparing them with the evaluations of the landing approach maneuver by a number of CFIs.

  7. Generalization of Clustering Coefficients to Signed Correlation Networks

    PubMed Central

    Costantini, Giulio; Perugini, Marco

    2014-01-01

    The recent interest in network analysis applications in personality psychology and psychopathology has put forward new methodological challenges. Personality and psychopathology networks are typically based on correlation matrices and therefore include both positive and negative edge signs. However, some applications of network analysis disregard negative edges, such as computing clustering coefficients. In this contribution, we illustrate the importance of the distinction between positive and negative edges in networks based on correlation matrices. The clustering coefficient is generalized to signed correlation networks: three new indices are introduced that take edge signs into account, each derived from an existing and widely used formula. The performances of the new indices are illustrated and compared with the performances of the unsigned indices, both on a signed simulated network and on a signed network based on actual personality psychology data. The results show that the new indices are more resistant to sample variations in correlation networks and therefore have higher convergence compared with the unsigned indices both in simulated networks and with real data. PMID:24586367
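
    One way such a signed generalization can work is sketched below in Python. It follows an Onnela-style geometric-mean form with signed cube roots, which is only in the spirit of the indices introduced in the paper, not their exact formulas; a triangle whose edge weights multiply to a negative value then lowers the coefficient instead of raising it.

    ```python
    import numpy as np

    def signed_cbrt(x):
        return np.sign(x) * np.abs(x) ** (1.0 / 3.0)

    def signed_clustering(W, i):
        """Clustering of node i in a signed weighted adjacency matrix W (zero diagonal)."""
        nbrs = np.flatnonzero(W[i])
        k = len(nbrs)
        if k < 2:
            return 0.0
        total = 0.0
        for a in range(k):
            for b in range(a + 1, k):
                j, m = nbrs[a], nbrs[b]
                total += signed_cbrt(W[i, j]) * signed_cbrt(W[j, m]) * signed_cbrt(W[i, m])
        return 2.0 * total / (k * (k - 1))

    # Correlation-style network: a triangle with one negative edge ("unbalanced")
    W = np.array([[0.0, 0.5, 0.4],
                  [0.5, 0.0, -0.3],
                  [0.4, -0.3, 0.0]])
    print(signed_clustering(W, 0))   # negative value flags the unbalanced triangle
    ```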

  8. Unity Power Factor Operated PFC Converter Based Power Supply for Computers

    NASA Astrophysics Data System (ADS)

    Singh, Shikha; Singh, Bhim; Bhuvaneswari, G.; Bist, Vashist

    2017-11-01

    Power Supplies (PSs) employed in personal computers pollute the single-phase ac mains by drawing distorted current at a substandard Power Factor (PF). The harmonic distortion of the supply current in these personal computers is observed to be 75% to 90%, with a very high Crest Factor (CF), which escalates losses in the distribution system. To find a tangible solution to these issues, a non-isolated PFC converter is employed at the input of the isolated converter; it is capable of improving the input power quality as well as regulating the dc voltage at its output. This feeds the isolated stage, which yields completely isolated and stiffly regulated multiple output voltages, the prime requirement of a computer PS. The operation of the proposed PS is evaluated under various operating conditions and the results show improved performance, depicting nearly unity PF and low input current harmonics. A prototype of this PS is developed in a laboratory environment and test results are recorded which corroborate the power quality improvement observed in the simulation results under various operating conditions.
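
    The power-quality figures discussed above are straightforward to compute from sampled waveforms. The sketch below uses a crude synthetic rectifier-style current, not measured data, to show how a narrow, peaky line current yields a substandard power factor and an elevated crest factor.

    ```python
    import numpy as np

    t = np.linspace(0, 0.02, 2000, endpoint=False)       # one 50 Hz mains cycle
    s = np.sin(2 * np.pi * 50 * t)
    v = 325.0 * s                                        # mains voltage (V)
    i = np.where(np.abs(s) > 0.9, np.sign(s) * 5.0, 0.0) # narrow conduction pulses near
                                                         # the peaks, a crude rectifier model

    p = np.mean(v * i)                                   # real power (W)
    v_rms, i_rms = np.sqrt(np.mean(v ** 2)), np.sqrt(np.mean(i ** 2))
    print("power factor:", p / (v_rms * i_rms))          # well below unity
    print("crest factor:", np.max(np.abs(i)) / i_rms)    # well above sqrt(2)
    ```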

  9. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    PubMed

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within the clinical setting for each new case. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results obtained in an acute respiratory distress syndrome patient show the potential of this approach for personalized, computationally guided optimization of mechanical ventilation in the future. Copyright © 2017 the American Physiological Society.

  10. Software for Acoustic Rendering

    NASA Technical Reports Server (NTRS)

    Miller, Joel D.

    2003-01-01

    SLAB is a software system that can be run on a personal computer to simulate an acoustic environment in real time. SLAB was developed to enable computational experimentation in which one can exert low-level control over a variety of signal-processing parameters, related to spatialization, for conducting psychoacoustic studies. Among the parameters that can be manipulated are the number and position of reflections, the fidelity (that is, the number of taps in finite-impulse-response filters), the system latency, and the update rate of the filters. Another goal in the development of SLAB was to provide an inexpensive means of dynamic synthesis of virtual audio over headphones, without need for special-purpose signal-processing hardware. SLAB has a modular, object-oriented design that affords the flexibility and extensibility needed to accommodate a variety of computational experiments and signal-flow structures. SLAB's spatial renderer has a fixed signal-flow architecture corresponding to a set of parallel signal paths from each source to a listener. This fixed architecture can be regarded as a compromise that optimizes efficiency at the expense of complete flexibility. Such a compromise is necessary, given the design goal of enabling computational psychoacoustic experimentation on inexpensive personal computers.
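
    The fixed signal-flow architecture described above can be caricatured in a few lines: each source-to-listener path applies a delay, a gain, and a short FIR filter, and the paths (direct sound plus reflections) are summed at the listener. This is an illustrative sketch; the tap values and path parameters are placeholders, not SLAB's.

    ```python
    import numpy as np

    fs = 44100
    x = np.random.randn(fs)                                  # 1 s of source signal

    def render_path(x, delay_samples, gain, fir_taps):
        """One parallel path: FIR filter, attenuate, then delay."""
        y = np.convolve(x, fir_taps)[: len(x)] * gain
        return np.concatenate([np.zeros(delay_samples), y])[: len(x)]

    paths = [
        (30,  1.0,  np.array([1.0])),                        # direct path
        (90,  0.5,  np.array([0.6, 0.3, 0.1])),              # one wall reflection
        (150, 0.25, np.array([0.4, 0.3, 0.2, 0.1])),         # a later reflection
    ]
    out = sum(render_path(x, d, g, taps) for d, g, taps in paths)
    print(out[:5])                                           # signal at the listener
    ```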

  11. Ground Test and Computation of Boundary Layer Transition on the Hypersonic International Flight Research and Experimentation (HIFiRE)-5 Vehicle

    DTIC Science & Technology

    2011-02-01

    ...transition characteristics as well as the effectiveness of 2-D strip trips to simulate the joint between the nosecap and body of the vehicle and 3-D...diamond shaped trips, to simulate the fasteners on a closeout panel that will be on one side of the flight vehicle. In order to accomplish this, global...

  12. Computational modeling in melanoma for novel drug discovery.

    PubMed

    Pennisi, Marzio; Russo, Giulia; Di Salvatore, Valentina; Candido, Saverio; Libra, Massimo; Pappalardo, Francesco

    2016-06-01

    There is a growing body of evidence highlighting the applications of computational modeling in the field of biomedicine. It has recently been applied to the in silico analysis of cancer dynamics. In the era of precision medicine, this analysis may allow the discovery of new molecular targets useful for the design of novel therapies and for overcoming resistance to anticancer drugs. According to its molecular behavior, melanoma represents an interesting tumor model in which computational modeling can be applied. Melanoma is an aggressive tumor of the skin with a poor prognosis for patients with advanced disease as it is resistant to current therapeutic approaches. This review discusses the basics of computational modeling in melanoma drug discovery and development. Discussion includes the in silico discovery of novel molecular drug targets, the optimization of immunotherapies and personalized medicine trials. Mathematical and computational models are gradually being used to help understand biomedical data produced by high-throughput analysis. The use of advanced computer models allowing the simulation of complex biological processes provides hypotheses and supports experimental design. The research in fighting aggressive cancers, such as melanoma, is making great strides. Computational models represent the key component to complement these efforts. Due to the combinatorial complexity of new drug discovery, a systematic approach based only on experimentation is not possible. Computational and mathematical models are necessary for bringing cancer drug discovery into the era of omics, big data and personalized medicine.

  13. On the Use of Computers for Teaching Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.

    1994-01-01

    Several approaches for improving the teaching of basic fluid mechanics using computers are presented. There are two objectives to these approaches: to increase the involvement of the student in the learning process and to present information to the student in a variety of forms. Items discussed include: the preparation of educational videos using the results of computational fluid dynamics (CFD) calculations, the analysis of CFD flow solutions using workstation based post-processing graphics packages, and the development of workstation or personal computer based simulators which behave like desk top wind tunnels. Examples of these approaches are presented along with observations from working with undergraduate co-ops. Possible problems in the implementation of these approaches as well as solutions to these problems are also discussed.

  14. Development of a low-cost virtual reality workstation for training and education

    NASA Technical Reports Server (NTRS)

    Phillips, James A.

    1996-01-01

    Virtual Reality (VR) is a set of breakthrough technologies that allow a human being to enter and fully experience a 3-dimensional, computer simulated environment. A true virtual reality experience meets three criteria: (1) it involves 3-dimensional computer graphics; (2) it includes real-time feedback and response to user actions; and (3) it must provide a sense of immersion. Good examples of a virtual reality simulator are the flight simulators used by all branches of the military to train pilots for combat in high performance jet fighters. The fidelity of such simulators is extremely high -- but so is the price tag, typically millions of dollars. Virtual reality teaching and training methods are manifestly effective, but the high cost of VR technology has limited its practical application to fields with big budgets, such as military combat simulation, commercial pilot training, and certain projects within the space program. However, in the last year there has been a revolution in the cost of VR technology. The speed of inexpensive personal computers has increased dramatically, especially with the introduction of the Pentium processor and the PCI bus for IBM-compatibles, and the cost of high-quality virtual reality peripherals has plummeted. The result is that many public schools, colleges, and universities can afford a PC-based workstation capable of running immersive virtual reality applications. My goal this summer was to assemble and evaluate such a system.

  15. A flexible and accurate digital volume correlation method applicable to high-resolution volumetric images

    NASA Astrophysics Data System (ADS)

    Pan, Bing; Wang, Bo

    2017-10-01

    Digital volume correlation (DVC) is a powerful technique for quantifying interior deformation within solid opaque materials and biological tissues. In the last two decades, great efforts have been made to improve the accuracy and efficiency of the DVC algorithm. However, there is still a lack of a flexible, robust and accurate version that can be efficiently implemented in personal computers with limited RAM. This paper proposes an advanced DVC method that can realize accurate full-field internal deformation measurement applicable to high-resolution volume images with up to billions of voxels. Specifically, a novel layer-wise reliability-guided displacement tracking strategy combined with dynamic data management is presented to guide the DVC computation from slice to slice. The displacements at specified calculation points in each layer are computed using the advanced 3D inverse-compositional Gauss-Newton algorithm with the complete initial guess of the deformation vector accurately predicted from the computed calculation points. Since only limited slices of interest in the reference and deformed volume images rather than the whole volume images are required, the DVC calculation can thus be efficiently implemented on personal computers. The flexibility, accuracy and efficiency of the presented DVC approach are demonstrated by analyzing computer-simulated and experimentally obtained high-resolution volume images.
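
    At the core of any DVC method is a subvolume-matching criterion. The sketch below illustrates integer-voxel matching with the zero-normalized cross-correlation (ZNCC) on a synthetic rigidly shifted volume; the full method described above refines such matches to subvoxel accuracy with the 3D inverse-compositional Gauss-Newton algorithm and the layer-wise guided scan.

    ```python
    import numpy as np

    def zncc(f, g):
        """Zero-normalized cross-correlation of two same-shape subvolumes."""
        fz, gz = f - f.mean(), g - g.mean()
        return float(np.sum(fz * gz) / np.sqrt(np.sum(fz ** 2) * np.sum(gz ** 2)))

    rng = np.random.default_rng(1)
    ref_vol = rng.random((64, 64, 64))
    def_vol = np.roll(ref_vol, 2, axis=0)    # rigid 2-voxel shift as the "deformation"

    ref = ref_vol[10:25, 10:25, 10:25]       # reference subvolume
    scores = {dz: zncc(ref, def_vol[10 + dz:25 + dz, 10:25, 10:25])
              for dz in range(-3, 4)}        # integer search along one axis
    print("recovered displacement:", max(scores, key=scores.get), "voxels")
    ```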

  16. Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization.

    PubMed

    Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Wong, Wai Peng; Chen, Chun-Hung

    2017-04-01

    Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, successfully translating the natural phenomenon to the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to equally allocate computational effort among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort.
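
    The flavor of OCBA can be shown with the classic allocation rule for selecting the best of a few noisy designs; the sketch below uses made-up statistics and the textbook rule, not the PSO-specific allocation derived in the paper. Non-best designs receive budget in proportion to (sigma_i / delta_i)^2, and the best design receives sigma_b times the root of the summed squared allocation-to-noise ratios.

    ```python
    import numpy as np

    means = np.array([1.0, 1.2, 1.5, 2.0])   # estimated fitness values (minimization)
    stds = np.array([0.3, 0.4, 0.2, 0.5])    # estimated noise std devs
    budget = 100                              # replications to distribute

    b = int(np.argmin(means))                 # current best design
    delta = means - means[b]                  # optimality gaps
    others = [i for i in range(len(means)) if i != b]

    ratios = np.zeros_like(means)
    ref = others[0]                           # express ratios relative to one non-best design
    for i in others:
        ratios[i] = (stds[i] / delta[i]) ** 2 / (stds[ref] / delta[ref]) ** 2
    # best design: N_b = sigma_b * sqrt(sum over others of (N_i / sigma_i)^2)
    ratios[b] = stds[b] * np.sqrt(np.sum((ratios[others] / stds[others]) ** 2))

    alloc = budget * ratios / ratios.sum()
    print(np.round(alloc).astype(int))        # most budget goes to the top contenders
    ```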

  17. Optimal Computing Budget Allocation for Particle Swarm Optimization in Stochastic Optimization

    PubMed Central

    Zhang, Si; Xu, Jie; Lee, Loo Hay; Chew, Ek Peng; Chen, Chun-Hung

    2017-01-01

    Particle Swarm Optimization (PSO) is a popular metaheuristic for deterministic optimization. Originating in interpretations of the movement of individuals in a bird flock or fish school, PSO introduces the concepts of personal best and global best to simulate the pattern of searching for food by flocking, successfully translating the natural phenomenon to the optimization of complex functions. Many real-life applications of PSO cope with stochastic problems. To solve a stochastic problem using PSO, a straightforward approach is to equally allocate computational effort among all particles and obtain the same number of samples of fitness values. This is not an efficient use of the computational budget and leaves considerable room for improvement. This paper proposes a seamless integration of the concept of optimal computing budget allocation (OCBA) into PSO to improve the computational efficiency of PSO for stochastic optimization problems. We derive an asymptotically optimal allocation rule to intelligently determine the number of samples for all particles such that the PSO algorithm can efficiently select the personal best and global best when there is stochastic estimation noise in fitness values. We also propose an easy-to-implement sequential procedure. Numerical tests show that our new approach can obtain much better results using the same amount of computational effort. PMID:29170617

  18. Evaluation of a micro-scale wind model's performance over realistic building clusters using wind tunnel experiments

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Du, Yunsong; Miao, Shiguang; Fang, Xiaoyi

    2016-08-01

    The simulation performance over complex building clusters of a wind simulation model (Wind Information Field Fast Analysis model, WIFFA) in a micro-scale air pollutant dispersion model system (Urban Microscale Air Pollution dispersion Simulation model, UMAPS) is evaluated using various wind tunnel experimental data including the CEDVAL (Compilation of Experimental Data for Validation of Micro-Scale Dispersion Models) wind tunnel experiment data and the NJU-FZ experiment data (Nanjing University-Fang Zhuang neighborhood wind tunnel experiment data). The results show that the wind model can reproduce the vortexes triggered by urban buildings well, and the flow patterns in urban street canyons and building clusters can also be represented. Due to the complex shapes of buildings and their distributions, the simulation deviations/discrepancies from the measurements are usually caused by the simplification of the building shapes and the determination of the key zone sizes. The computational efficiencies of different cases are also discussed in this paper. The model has a high computational efficiency compared to traditional numerical models that solve the Navier-Stokes equations, and can produce very high-resolution (1-5 m) wind fields of a complex neighborhood-scale urban building canopy (~1 km × 1 km) in less than 3 min when run on a personal computer.

  19. Transforming Clinical Imaging and 3D Data for Virtual Reality Learning Objects: HTML5 and Mobile Devices Implementation

    ERIC Educational Resources Information Center

    Trelease, Robert B.; Nieder, Gary L.

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android…

  20. A general theoretical framework for interpreting patient-reported outcomes estimated from ordinally scaled item responses.

    PubMed

    Massof, Robert W

    2014-10-01

    A simple theoretical framework explains patient responses to items in rating scale questionnaires. Fixed latent variables position each patient and each item on the same linear scale. Item responses are governed by a set of fixed category thresholds, one for each ordinal response category. A patient's item responses are magnitude estimates of the difference between the patient variable and the patient's estimate of the item variable, relative to his/her personally defined response category thresholds. Differences between patients in their personal estimates of the item variable and in their personal choices of category thresholds are represented by random variables added to the corresponding fixed variables. Effects of intervention correspond to changes in the patient variable, the patient's response bias, and/or latent item variables for a subset of items. Intervention effects on patients' item responses were simulated by assuming the random variables are normally distributed with a constant scalar covariance matrix. Rasch analysis was used to estimate latent variables from the simulated responses. The simulations demonstrate that changes in the patient variable and changes in response bias produce indistinguishable effects on item responses and manifest as changes only in the estimated patient variable. Changes in a subset of item variables manifest as intervention-specific differential item functioning and as changes in the estimated person variable that equals the average of changes in the item variables. Simulations demonstrate that intervention-specific differential item functioning produces inefficiencies and inaccuracies in computer adaptive testing. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
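
    A minimal simulation in the spirit of this framework is sketched below: a response is the ordinal category whose thresholds bracket the noisy difference between the patient variable and the patient's perceived item variable. All parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    theta = 1.0                                    # patient variable
    items = np.array([-1.0, 0.0, 0.5, 1.5])        # fixed item variables
    thresholds = np.array([-1.5, -0.5, 0.5, 1.5])  # 4 category cuts -> 5 ordinal categories

    def respond(theta, items, thresholds, noise_sd=0.3):
        # noisy personal estimate of each item, then threshold the difference
        perceived = items + rng.normal(0.0, noise_sd, size=items.shape)
        return np.searchsorted(thresholds, theta - perceived)   # categories 0..4

    print(respond(theta, items, thresholds))
    ```

    Note that adding a constant to theta or subtracting it from every perceived item value produces identical responses, which is the response-bias indistinguishability the simulations above demonstrate.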

  1. An efficient and scalable deformable model for virtual reality-based medical applications.

    PubMed

    Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann

    2004-09-01

    Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
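
    The BFS-governed force transmittal can be sketched compactly: a force applied at a contact node propagates breadth-first through the mesh connectivity with per-layer attenuation, and the traversal is cut off at the chosen penetration depth, the scalability knob mentioned above. The mesh, attenuation factor, and depth below are illustrative.

    ```python
    from collections import deque

    adjacency = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

    def propagate(adjacency, contact_node, force, depth, attenuation=0.5):
        """BFS force transmittal, truncated at the given penetration depth."""
        disp = {contact_node: force}
        frontier, visited = deque([(contact_node, 0)]), {contact_node}
        while frontier:
            node, d = frontier.popleft()
            if d == depth:                    # stop expanding at this layer
                continue
            for nb in adjacency[node]:
                if nb not in visited:
                    visited.add(nb)
                    disp[nb] = disp[node] * attenuation
                    frontier.append((nb, d + 1))
        return disp

    print(propagate(adjacency, contact_node=0, force=1.0, depth=2))  # node 4 untouched
    ```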

  2. The Individual Virtual Eye: a Computer Model for Advanced Intraocular Lens Calculation

    PubMed Central

    Einighammer, Jens; Oltrup, Theo; Bende, Thomas; Jean, Benedikt

    2010-01-01

    Purpose To describe the individual virtual eye, a computer model of a human eye with respect to its optical properties. It is based on measurements of an individual person, and one of its major applications is calculating intraocular lenses (IOLs) for cataract surgery. Methods The model is constructed from an eye's geometry, including axial length and topographic measurements of the anterior corneal surface. All optical components of a pseudophakic eye are modeled with computer scientific methods. A spline-based interpolation method efficiently includes data from corneal topographic measurements. The geometrical optical properties, such as the wavefront aberration, are simulated with real ray-tracing using Snell's law. Optical components can be calculated using computer scientific optimization procedures. The geometry of customized aspheric IOLs was calculated for 32 eyes and the resulting wavefront aberration was investigated. Results The more complex the calculated IOL is, the lower the residual wavefront error is. Spherical IOLs are only able to correct for defocus, while toric IOLs also eliminate astigmatism. Spherical aberration is additionally reduced by aspheric and toric aspheric IOLs. The efficient implementation of time-critical numerical ray-tracing and optimization procedures allows for short calculation times, which may lead to a practicable method integrated into some device. Conclusions The individual virtual eye allows for simulations and calculations regarding geometrical optics for individual persons. This leads to clinical applications like IOL calculation, with the potential to overcome the limitations of current calculation methods based on paraxial optics, as exemplified by the calculation of customized aspheric IOLs.
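
    The ray-tracing primitive named above, refraction by Snell's law, has a standard vector form, sketched below; the refractive indices and ray direction are illustrative, and this is not the authors' full eye model.

    ```python
    import numpy as np

    def refract(d, n, n1, n2):
        """Refract unit direction d at a surface with unit normal n (pointing
        against the incoming ray); returns None on total internal reflection."""
        r = n1 / n2
        cos_i = -np.dot(n, d)
        sin2_t = r * r * (1.0 - cos_i * cos_i)
        if sin2_t > 1.0:
            return None                          # total internal reflection
        return r * d + (r * cos_i - np.sqrt(1.0 - sin2_t)) * n

    d = np.array([0.0, np.sin(0.3), -np.cos(0.3)])  # ray tilted 0.3 rad off-axis
    n = np.array([0.0, 0.0, 1.0])                   # surface normal toward the ray
    print(refract(d, n, 1.0, 1.3375))               # air into a cornea-like medium
    ```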

  3. The Data Acquisition and Control Systems of the Jet Noise Laboratory at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Jansen, B. J., Jr.

    1998-01-01

    The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.

  4. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC has been under development for the last three years and is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  5. Neuronify: An Educational Simulator for Neural Circuits.

    PubMed

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Våvang Solbrå, Andreas; Tennøe, Simen; Hafreager, Anders; Malthe-Sørenssen, Anders; Fyhn, Marianne; Hafting, Torkel; Einevoll, Gaute T

    2017-01-01

    Educational software (apps) can improve science education by providing an interactive way of learning about complicated topics that are hard to explain with text and static illustrations. However, few educational apps are available for simulation of neural networks. Here, we describe an educational app, Neuronify, allowing the user to easily create and explore neural networks in a plug-and-play simulation environment. The user can pick network elements with adjustable parameters from a menu, i.e., synaptically connected neurons modelled as integrate-and-fire neurons and various stimulators (current sources, spike generators, visual, and touch) and recording devices (voltmeter, spike detector, and loudspeaker). We aim to provide a low entry point to simulation-based neuroscience by allowing students with no programming experience to create and simulate neural networks. To facilitate the use of Neuronify in teaching, a set of premade common network motifs is provided, performing functions such as input summation, gain control by inhibition, and detection of direction of stimulus movement. Neuronify is developed in C++ and QML using the cross-platform application framework Qt and runs on smart phones (Android, iOS) and tablet computers as well as on personal computers (Windows, Mac, Linux).
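
    The integrate-and-fire neuron that Neuronify wires together can be written in a few lines. The sketch below is a generic leaky integrate-and-fire model in Python with textbook parameter values, for illustration only (Neuronify itself is implemented in C++/QML).

    ```python
    dt, t_end = 0.1, 100.0                 # time step and duration (ms)
    tau, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -65.0   # mV, tau in ms
    r_m, i_inj = 10.0, 2.0                 # membrane resistance (MOhm), current (nA)

    v, spikes = v_rest, []
    for step in range(int(t_end / dt)):
        dv = (-(v - v_rest) + r_m * i_inj) / tau   # leaky integration
        v += dt * dv
        if v >= v_thresh:                  # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset

    print(f"{len(spikes)} spikes, first at {spikes[0]:.1f} ms")
    ```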

  6. Modeling and simulation of magnetic resonance imaging based on intermolecular multiple quantum coherences

    NASA Astrophysics Data System (ADS)

    Cai, Congbo; Dong, Jiyang; Cai, Shuhui; Cheng, En; Chen, Zhong

    2006-11-01

    Intermolecular multiple quantum coherences (iMQCs) have many potential applications since they can provide interaction information between different molecules within the range of the dipolar correlation distance, and can provide new contrast in magnetic resonance imaging (MRI). Because of the non-localized property of the dipolar field, and the non-linear property of the Bloch equations incorporating the dipolar field term, the evolution behavior of iMQCs is difficult to deduce strictly in many cases. In such cases, simulation studies are very important. Simulation results can not only give a guide to optimize experimental conditions, but also help analyze unexpected experimental results. Based on our product operator matrix and the K-space method for dipolar field calculation, MRI simulation software was constructed, running on the Windows operating system. The non-linear Bloch equations are solved by a fifth-order Cash-Karp Runge-Kutta formula. Computational time can be efficiently reduced by separating the effects of chemical shifts and the strong gradient field. Using this software, simulations of different kinds of complex MRI sequences can be done conveniently and quickly on general personal computers. Some examples are given and the results discussed.
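
    The fifth-order Cash-Karp Runge-Kutta scheme cited above is a standard embedded method; a generic single-step implementation is sketched below in Python. The tableau coefficients are the published Cash-Karp values; the demonstration integrates rotating-frame Bloch equations with illustrative relaxation times and, unlike the authors' software, omits the dipolar-field term entirely:

        import numpy as np

        # Cash-Karp embedded Runge-Kutta coefficients (the standard published tableau).
        A = [[], [1/5], [3/40, 9/40], [3/10, -9/10, 6/5],
             [-11/54, 5/2, -70/27, 35/27],
             [1631/55296, 175/512, 575/13824, 44275/110592, 253/4096]]
        C = [0, 1/5, 3/10, 3/5, 1, 7/8]
        B5 = [37/378, 0, 250/621, 125/594, 0, 512/1771]                  # 5th order
        B4 = [2825/27648, 0, 18575/48384, 13525/55296, 277/14336, 1/4]   # 4th order

        def cash_karp_step(f, t, y, h):
            """One embedded step: 5th-order estimate plus a local error estimate."""
            k = []
            for i in range(6):
                yi = y + h * sum(a * kj for a, kj in zip(A[i], k))
                k.append(f(t + C[i] * h, yi))
            y5 = y + h * sum(b * kj for b, kj in zip(B5, k))
            y4 = y + h * sum(b * kj for b, kj in zip(B4, k))
            return y5, np.abs(y5 - y4)     # the error estimate drives step-size control

        def bloch(t, M, omega=2*np.pi*100, T1=1.0, T2=0.1, M0=1.0):
            """Rotating-frame Bloch equations, illustrative values, no dipolar field."""
            Mx, My, Mz = M
            return np.array([omega*My - Mx/T2, -omega*Mx - My/T2, (M0 - Mz)/T1])

        M, t, h = np.array([1.0, 0.0, 0.0]), 0.0, 1e-4
        for _ in range(1000):              # integrate to t = 0.1 s (one T2)
            M, err = cash_karp_step(bloch, t, M, h)
            t += h
        print(f"t={t:.2f} s  |Mxy|={np.hypot(M[0], M[1]):.3f}  max err={err.max():.1e}")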

  7. Neuronify: An Educational Simulator for Neural Circuits

    PubMed Central

    Hafreager, Anders; Malthe-Sørenssen, Anders; Fyhn, Marianne

    2017-01-01

    Educational software (apps) can improve science education by providing an interactive way of learning about complicated topics that are hard to explain with text and static illustrations. However, few educational apps are available for simulation of neural networks. Here, we describe an educational app, Neuronify, allowing the user to easily create and explore neural networks in a plug-and-play simulation environment. The user can pick network elements with adjustable parameters from a menu, i.e., synaptically connected neurons modelled as integrate-and-fire neurons and various stimulators (current sources, spike generators, visual, and touch) and recording devices (voltmeter, spike detector, and loudspeaker). We aim to provide a low entry point to simulation-based neuroscience by allowing students with no programming experience to create and simulate neural networks. To facilitate the use of Neuronify in teaching, a set of premade common network motifs is provided, performing functions such as input summation, gain control by inhibition, and detection of direction of stimulus movement. Neuronify is developed in C++ and QML using the cross-platform application framework Qt and runs on smart phones (Android, iOS) and tablet computers as well as personal computers (Windows, Mac, Linux). PMID:28321440

  8. Simulation techniques in hyperthermia treatment planning

    PubMed Central

    Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC

    2013-01-01

    Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
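
    Thermal simulation in HTP commonly rests on the Pennes bioheat equation, rho*c*dT/dt = k*laplacian(T) - w_b*rho_b*c_b*(T - T_art) + Q. As a hedged illustration of the idea, and not a sketch of any of the reviewed treatment planning systems, the following Python snippet solves the 1D equation with explicit finite differences; all tissue constants are typical literature values chosen for illustration:

        import numpy as np

        # 1D explicit finite-difference Pennes bioheat solver.
        rho, c, k = 1050.0, 3600.0, 0.5            # density, heat capacity, conductivity
        rho_b, c_b, w_b = 1050.0, 3840.0, 0.001    # blood properties, perfusion (1/s)
        T_art = 37.0                               # arterial blood temperature (deg C)

        nx, dx, dt = 101, 1e-3, 0.05               # 10 cm domain; dt within stability limit
        x = np.arange(nx) * dx
        T = np.full(nx, 37.0)
        Q = np.where(np.abs(x - 0.05) < 0.005, 5e4, 0.0)   # 1 cm heated focus (W/m^3)

        for _ in range(int(600 / dt)):             # simulate 10 minutes of heating
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
            T += dt / (rho * c) * (k * lap - w_b * rho_b * c_b * (T - T_art) + Q)
            T[0] = T[-1] = 37.0                    # body-core boundary condition
        print(f"peak tissue temperature after 10 min: {T.max():.1f} deg C")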

  9. Hydrodynamics Analysis and CFD Simulation of Portal Venous System by TIPS and LS.

    PubMed

    Wang, Meng; Zhou, Hongyu; Huang, Yaozhen; Gong, Piyun; Peng, Bing; Zhou, Shichun

    2015-06-01

    In cirrhotic patients, portal hypertension is often associated with hyperdynamic changes. Transjugular Intrahepatic Portosystemic Shunt (TIPS) and laparoscopic splenectomy are both treatments for liver cirrhosis due to portal hypertension. However, the two interventions have different effects on hemodynamics after the operation, and their likelihoods of triggering portal vein thrombosis (PVT) differ. How the hemodynamics of the portal venous system evolve after the two operations remains unknown. Based on ultrasound data and established numerical methods, CFD techniques are applied to analyze hemodynamic changes after TIPS and laparoscopic splenectomy. In this paper, we applied two 3-D flow models to the hemodynamic analysis of two patients who received a TIPS and a laparoscopic splenectomy, respectively, both therapies for treating portal hypertension induced diseases. The current computer simulations give a quantitative analysis of the interplay between hemodynamics and TIPS or splenectomy. In conclusion, the presented computational model can be used for the theoretical analysis of TIPS and laparoscopic splenectomy, and clinical decisions could be informed by simulation results tailored to the individual patient.
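
    The full 3-D CFD models used in the paper cannot be reproduced from the abstract, but the zeroth-order hemodynamic quantities involved follow the Hagen-Poiseuille relation, dP = 8*mu*L*Q/(pi*r^4). The sketch below, with illustrative (not patient-derived) portal-vein values, shows why the portal trunk itself contributes little resistance and why the portal pressure gradient must arise downstream:

        import math

        def poiseuille_dp_mmhg(Q_ml_min, radius_mm, length_cm, mu=3.5e-3):
            """Pressure drop over a straight rigid tube (laminar, Newtonian blood).
            A zeroth-order check only; real portal flow is pulsatile and branched."""
            Q = Q_ml_min * 1e-6 / 60.0             # ml/min -> m^3/s
            r, L = radius_mm * 1e-3, length_cm * 1e-2
            dp_pa = 8.0 * mu * L * Q / (math.pi * r ** 4)
            return dp_pa / 133.322                 # Pa -> mmHg

        # Illustrative values: ~900 ml/min through a 6 mm radius, 8 cm portal trunk.
        # The tiny result shows the trunk is not where portal pressure is lost.
        print(f"{poiseuille_dp_mmhg(900, 6.0, 8.0):.3f} mmHg")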

  10. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

    Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc. to the analyst. We present sample simulations on a national level network with millions of computers.
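
    Two of CyberSim's ingredients, a scale-free contact graph and an infection process spreading over it, can be sketched compactly. The following Python sketch uses networkx; the graph size, attachment parameter, and per-edge infection probability are arbitrary assumptions, and the real tool adds vulnerability lists, packet delay, and user behavior models on top of this basic mechanism:

        import random
        import networkx as nx

        random.seed(1)
        G = nx.barabasi_albert_graph(n=10_000, m=3)     # scale-free contact graph

        # Discrete-time SI spread: each infected host tries to infect each
        # susceptible neighbour with probability beta per step.
        beta, infected = 0.05, {0}
        for step in range(20):
            new = {v for u in infected for v in G[u]
                   if v not in infected and random.random() < beta}
            infected |= new
            print(f"step {step:2d}: {len(infected):5d} hosts infected")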

  11. Performance comparison of attitude determination, attitude estimation, and nonlinear observers algorithms

    NASA Astrophysics Data System (ADS)

    MOHAMMED, M. A. SI; BOUSSADIA, H.; BELLAR, A.; ADNANE, A.

    2017-01-01

    This paper presents a brief synthesis and useful performance analysis of different attitude filtering algorithms (attitude determination algorithms, attitude estimation algorithms, and nonlinear observers) applied to a Low Earth Orbit satellite, in terms of accuracy, convergence time, memory footprint, and computation time. The latter is calculated in two ways: using a personal computer, and using the On-Board Computer 750 (OBC 750) that is flown on many SSTL Earth observation missions. This comparative study can serve as a design aid when choosing among attitude determination, attitude estimation, and nonlinear observer algorithms. The simulation results clearly indicate that the nonlinear observer is the most logical choice.
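
    The abstract does not list the individual algorithms compared, but TRIAD is the classic deterministic attitude determination method such studies usually include. A minimal sketch, assuming two body/reference vector pairs (for example, a sun vector and a magnetic field direction):

        import numpy as np

        def triad(b1, b2, r1, r2):
            """TRIAD attitude determination: returns rotation A with b = A @ r.
            b1/b2 are observations in the body frame, r1/r2 the same vectors in
            the reference frame; the (b1, r1) pair is treated as more accurate."""
            def frame(v1, v2):
                t1 = v1 / np.linalg.norm(v1)
                t2 = np.cross(v1, v2)
                t2 /= np.linalg.norm(t2)
                return np.column_stack([t1, t2, np.cross(t1, t2)])
            return frame(b1, b2) @ frame(r1, r2).T

        # Self-check: recover a known 30-degree rotation from synthetic vectors.
        ang = np.radians(30)
        A_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                           [np.sin(ang),  np.cos(ang), 0.0],
                           [0.0, 0.0, 1.0]])
        r1, r2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.2])
        A_est = triad(A_true @ r1, A_true @ r2, r1, r2)
        print("max recovery error:", np.max(np.abs(A_est - A_true)))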

  12. An application for multi-person task synchronization

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee

    1990-01-01

    Computer applications are studied that will enable a group of people to synchronize their actions when following a predefined task sequence. It is assumed that the people involved only have computer workstations available to them for communication. Hence, the approach is to study how the computer can be used to help a group remain synchronized. A series of applications were designed and developed that can be used as vehicles for experimentation. An example of how this technique can be used for a remote coaching capability is explained in a report describing an experiment that simulated a Life Sciences experiment on-board Space Station Freedom, with a ground based principal investigator providing the expertise by coaching the on-orbit mission specialist.

  13. Experimental Investigation of 60 GHz Transmission Characteristics Between Computers on a Conference Table for WPAN Applications

    NASA Technical Reports Server (NTRS)

    Ponchak, George E.; Amadjikpe, Arnaud L.; Choudhury, Debabani; Papapolymerou, John

    2011-01-01

    In this paper, the first measurements are presented of the received radiated power between antennas located on a conference table, simulating the environment of antennas embedded in laptop computers for 60 GHz Wireless Personal Area Network (WPAN) applications. A high gain horn antenna and a medium gain microstrip patch antenna are compared for two linear polarizations. It is shown that for a typical conference table arrangement with five computers, books, pens, and coffee cups, the antennas should be placed a minimum of 5 cm above the table, but that a height of greater than 20 cm may be required to maximize the received power in all cases.

  14. Emerging CAE technologies and their role in Future Ambient Intelligence Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.

    2011-03-01

    Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to the developments in a number of leading-edge technologies and their synergistic combinations/convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks, pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers and emerging simulation technologies, and their role in the future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative, and tailored visual learning.

  15. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real time on the web, including the ability to simulate loss/gain of function and to test what-if scenarios. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178
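
    The Cell Collective builds qualitative logical models, so the simulation core reduces to synchronous updates of Boolean rules. A toy Python sketch follows; the node names and rules are invented for illustration and are not taken from any Cell Collective model, but the knockout flag mirrors the loss-of-function capability mentioned above:

        # Synchronous Boolean network with an optional loss-of-function knockout.
        rules = {
            "Ligand":    lambda s: s["Ligand"],                      # held external input
            "Receptor":  lambda s: s["Ligand"],                      # activated by ligand
            "Effector":  lambda s: s["Receptor"] and not s["Inhibitor"],
            "Inhibitor": lambda s: s["Effector"],                    # negative feedback
        }

        def step(state, knockout=None):
            nxt = {name: rule(state) for name, rule in rules.items()}
            if knockout is not None:
                nxt[knockout] = False        # simulate loss of function
            return nxt

        state = {"Ligand": True, "Receptor": False, "Effector": False, "Inhibitor": False}
        for t in range(6):                   # the negative feedback produces oscillation
            print(t, state)
            state = step(state)              # pass knockout="Inhibitor" for a what-if run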

  16. Parallel approach to identifying the well-test interpretation model using a neurocomputer

    NASA Astrophysics Data System (ADS)

    May, Edward A., Jr.; Dagli, Cihan H.

    1996-03-01

    The well test is one of the primary diagnostic and predictive tools used in the analysis of oil and gas wells. In these tests, a pressure recording device is placed in the well and the pressure response is recorded over time under controlled flow conditions. The interpreted results are indicators of the well's ability to flow and the damage done to the formation surrounding the wellbore during drilling and completion. The results are used for many purposes, including reservoir modeling (simulation) and economic forecasting. The first step in the analysis is the identification of the Well-Test Interpretation (WTI) model, which determines the appropriate solution method. Mis-identification of the WTI model occurs due to noise and non-ideal reservoir conditions. Previous studies have shown that a feed-forward neural network using the backpropagation algorithm can be used to identify the WTI model. One of the drawbacks to this approach is, however, training time, which can run into days of CPU time on personal computers. In this paper a similar neural network is applied using both a personal computer and a neurocomputer. Input data processing, network design, and performance are discussed and compared. The results show that the neurocomputer greatly eases the burden of training and allows the network to outperform a similar network running on a personal computer.
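
    The network class referred to above, a one-hidden-layer feed-forward net trained by backpropagation, is small by modern standards and can be written out in a few lines. The sketch below uses synthetic data as a stand-in for the pressure-derivative features of a well test; the architecture and learning rate are illustrative assumptions, not the paper's configuration:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in for well-test features: two classes of noisy samples.
        X = rng.normal(size=(200, 8))
        y = (X[:, :4].sum(axis=1) > 0).astype(float).reshape(-1, 1)

        # One-hidden-layer network trained by plain batch backpropagation.
        W1 = rng.normal(scale=0.5, size=(8, 16)); b1 = np.zeros(16)
        W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        lr = 0.5
        for epoch in range(2000):
            h = np.tanh(X @ W1 + b1)             # forward pass
            p = sigmoid(h @ W2 + b2)
            dz = (p - y) / len(X)                # cross-entropy gradient at the output
            dW2, db2 = h.T @ dz, dz.sum(0)
            dh = dz @ W2.T * (1.0 - h ** 2)      # backpropagate through tanh
            dW1, db1 = X.T @ dh, dh.sum(0)
            for P, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
                P -= lr * g                      # gradient descent update
        print("training accuracy:", ((p > 0.5) == y).mean())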

  17. Experimental and computational investigation of lateral gauge response in polycarbonate

    NASA Astrophysics Data System (ADS)

    Eliot, Jim; Harris, Ernst; Hazell, Paul; Appleby-Thomas, Gareth; Winter, Ronald; Wood, David; Owen, Gareth

    2011-06-01

    Polycarbonate's use in personal armour systems means its high strain-rate response has been extensively studied. Interestingly, embedded lateral manganin stress gauges in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges need to be embedded in a central (typically) epoxy interlayer - an inherently invasive approach. Recently, research on metal systems has suggested that the interlayer/target impedance mismatch may contribute to the observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. This work extends previous efforts to the case where similar impedance exists across the interlayer and matrix (target) interface. Further, experiments and simulations are presented investigating the effects of a ``dry joint'' in polycarbonate, in which no encapsulating medium is employed.

  18. A Collection of Nonlinear Aircraft Simulations in MATLAB

    NASA Technical Reports Server (NTRS)

    Garza, Frederico R.; Morelli, Eugene A.

    2003-01-01

    Nonlinear six degree-of-freedom simulations for a variety of aircraft were created using MATLAB. Data for aircraft geometry, aerodynamic characteristics, mass / inertia properties, and engine characteristics were obtained from open literature publications documenting wind tunnel experiments and flight tests. Each nonlinear simulation was implemented within a common framework in MATLAB, and includes an interface with another commercially-available program to read pilot inputs and produce a three-dimensional (3-D) display of the simulated airplane motion. Aircraft simulations include the General Dynamics F-16 Fighting Falcon, Convair F-106B Delta Dart, Grumman F-14 Tomcat, McDonnell Douglas F-4 Phantom, NASA Langley Free-Flying Aircraft for Sub-scale Experimental Research (FASER), NASA HL-20 Lifting Body, NASA / DARPA X-31 Enhanced Fighter Maneuverability Demonstrator, and the Vought A-7 Corsair II. All nonlinear simulations and 3-D displays run in real time in response to pilot inputs, using contemporary desktop personal computer hardware. The simulations can also be run in batch mode. Each nonlinear simulation includes the full nonlinear dynamics of the bare airframe, with a scaled direct connection from pilot inputs to control surface deflections to provide adequate pilot control. Since all the nonlinear simulations are implemented entirely in MATLAB, user-defined control laws can be added in a straightforward fashion, and the simulations are portable across various computing platforms. Routines for trim, linearization, and numerical integration are included. The general nonlinear simulation framework and the specifics for each particular aircraft are documented.

  19. Analysis of Harmonic Vibration of Cable-Stayed Footbridge under the Influence of Changes of the Cables Tension

    NASA Astrophysics Data System (ADS)

    Pakos, Wojciech

    2015-09-01

    The paper presents a numerical analysis of harmonically excited vibration of a cable-stayed footbridge, caused by a load function simulating crouching (squats), while the static tension in chosen cables is changed. The intentional synchronized motion (e.g., squats) of a single person or group of persons on the footbridge with a frequency close to a natural frequency of the structure may lead to resonant vibrations with large amplitudes. Appropriate tension changes in some cables detune the resonance: the change in structural stiffness shifts the natural frequency away from the excitation frequency. The research was carried out on a 3D computer model of a real structure - a cable-stayed steel footbridge in Leśnica, a quarter of Wrocław, Poland - using standard FEM software (the COSMOS/M system).
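
    The detuning mechanism rests on the taut-string relation between a cable's tension and its natural frequencies, f_n = (n / 2L) * sqrt(T / mu). A small Python sketch with illustrative cable parameters (not the Leśnica footbridge data) shows the tension change needed to shift a cable frequency by 10%; note that the paper's detuning acts on the global FEM model, for which this local relation is only the driving ingredient:

        import math

        def cable_freq(T, L, mu, n=1):
            """n-th natural frequency (Hz) of a taut cable: f_n = n/(2L)*sqrt(T/mu)."""
            return n / (2 * L) * math.sqrt(T / mu)

        def tension_for_freq(f, L, mu, n=1):
            """Tension (N) needed so that the n-th cable frequency equals f."""
            return mu * (2 * L * f / n) ** 2

        L, mu = 25.0, 60.0             # cable length (m) and mass per unit length (kg/m)
        T0 = 4.0e5                     # initial tension (N)
        f0 = cable_freq(T0, L, mu)
        f_target = f0 * 1.10           # detune 10% away from the excitation frequency
        print(f"f0 = {f0:.2f} Hz -> requires T = {tension_for_freq(f_target, L, mu)/1e3:.0f} kN")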

  20. Effects of Energetic Additives on Combustion Dynamics

    DTIC Science & Technology

    2010-04-19

    ...and ethanol drops loaded with nano-Al additives burned differently. An exploratory computational study using Large Eddy Simulation indicated that...

  1. FARSITE: a fire area simulator for fire managers

    Treesearch

    Mark A. Finney

    1995-01-01

    A fire growth model (FARSITE) has been developed for use on personal computers (PC’s). Because PC’s are commonly used by land and fire managers, this portable platform would be an accustomed means to bring fire growth modeling technology to management applications. The FARSITE model is intended for use in projecting the growth of prescribed natural fires for wilderness...

  2. Theoretical analysis and Vsim simulation of a low-voltage high-efficiency 250 GHz gyrotron

    NASA Astrophysics Data System (ADS)

    An, Chenxiang; Zhang, Dian; Zhang, Jun; Zhong, Huihuang

    2018-02-01

    Low-voltage, high-frequency gyrotrons with hundreds of watts of power are useful in radar, magnetic resonance spectroscopy and plasma diagnostic applications. In this paper, a 10 kV, 478 W, 250 GHz gyrotron with an efficiency of nearly 40% and a pitch ratio of 1.5 was designed through linear and nonlinear numerical analyses and Vsim particle-in-cell (PIC) simulation. Vsim is a highly efficient parallel PIC code, but it has seldom been used to carry out electron beam-wave interaction simulations of gyro-devices. The set-up of the parameters required for the Vsim simulations of the gyrotron is presented. The results of the Vsim simulations agree well with those of the nonlinear numerical calculation. The commercial software Vsim 7.2 completed the 3D gyrotron simulation in 80 h on a 20-core, 2.2 GHz personal computer with 256 GB of memory.

  3. An E-learning System based on Affective Computing

    NASA Astrophysics Data System (ADS)

    Duo, Sun; Song, Lu Xue

    In recent years, e-learning systems have become very popular. But current e-learning systems cannot instruct students effectively, since they do not consider the emotional state in the context of instruction. The theory of "affective computing" can address this problem: it extends the computer's intelligence beyond the purely cognitive. In this paper, we construct an emotionally intelligent e-learning system based on affective computing. A dimensional model is put forward to recognize and analyze the student's emotional state, and a virtual teacher avatar is offered to regulate the student's learning psychology, with a teaching style chosen according to the student's personality traits. A "one-to-one" learning environment is built to simulate the traditional classroom's pedagogy in the system.

  4. Monitoring task loading with multivariate EEG measures during complex forms of human-computer interaction

    NASA Technical Reports Server (NTRS)

    Smith, M. E.; Gevins, A.; Brown, H.; Karnik, A.; Du, R.

    2001-01-01

    Electroencephalographic (EEG) recordings were made while 16 participants performed versions of a personal-computer-based flight simulation task of low, moderate, or high difficulty. As task difficulty increased, frontal midline theta EEG activity increased and alpha band activity decreased. A participant-specific function that combined multiple EEG features to create a single load index was derived from a sample of each participant's data and then applied to new test data from that participant. Index values were computed for every 4 s of task data. Across participants, mean task load index values increased systematically with increasing task difficulty and differed significantly between the different task versions. Actual or potential applications of this research include the use of multivariate EEG-based methods to monitor task loading during naturalistic computer-based work.
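
    The published index is a participant-specific multivariate function and is not reproducible from the abstract, but the underlying features are band-limited EEG power estimates. Below is a hedged Python sketch of that feature-extraction step on a synthetic signal, using Welch's method and a simple theta/alpha ratio as a stand-in for the multivariate index; all signal parameters are invented for illustration:

        import numpy as np
        from scipy.signal import welch

        fs = 256                                   # sampling rate (Hz)
        t = np.arange(0, 4, 1 / fs)                # one 4-second analysis window
        rng = np.random.default_rng(0)
        # Synthetic EEG: 6 Hz frontal-midline theta plus 10 Hz alpha plus noise.
        eeg = (8e-6 * np.sin(2 * np.pi * 6 * t)
               + 5e-6 * np.sin(2 * np.pi * 10 * t)
               + 2e-6 * rng.standard_normal(t.size))

        def bandpower(x, fs, lo, hi):
            f, pxx = welch(x, fs=fs, nperseg=2 * fs)
            band = (f >= lo) & (f <= hi)
            return np.trapz(pxx[band], f[band])

        theta = bandpower(eeg, fs, 4, 7)           # rises with task difficulty
        alpha = bandpower(eeg, fs, 8, 12)          # falls with task difficulty
        print(f"theta/alpha load feature: {theta / alpha:.2f}")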

  5. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Results are reported on the development of: (1) a frame system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessments of script norms, distance measures, and Markov models developed from computer aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.

  6. GPU-Based Simulation of Ultrasound Imaging Artifacts for Cryosurgery Training.

    PubMed

    Keelan, Robert; Shimada, Kenji; Rabin, Yoed

    2017-02-01

    This study presents an efficient computational technique for the simulation of ultrasound imaging artifacts associated with cryosurgery based on nonlinear ray tracing. This study is part of an ongoing effort to develop computerized training tools for cryosurgery, with prostate cryosurgery as a development model. The capability of performing virtual cryosurgical procedures on a variety of test cases is essential for effective surgical training. Simulated ultrasound imaging artifacts include reverberation and reflection of the cryoprobes in the unfrozen tissue, reflections caused by the freezing front, shadowing caused by the frozen region, and tissue property changes in repeated freeze-thaw cycles. The simulated artifacts appear to preserve the key features observed in a clinical setting. This study displays an example of how training may benefit from toggling between the undisturbed ultrasound image, the simulated temperature field, the simulated imaging artifacts, and an augmented hybrid presentation of the temperature field superimposed on the ultrasound image. The proposed method is demonstrated on a graphic processing unit at 100 frames per second, on a mid-range personal workstation, at two orders of magnitude faster than a typical cryoprocedure. This performance is based on computation with C++ accelerated massive parallelism and its interoperability with the DirectX-rendering application programming interface.

  7. GPU-Based Simulation of Ultrasound Imaging Artifacts for Cryosurgery Training

    PubMed Central

    Keelan, Robert; Shimada, Kenji

    2016-01-01

    This study presents an efficient computational technique for the simulation of ultrasound imaging artifacts associated with cryosurgery based on nonlinear ray tracing. This study is part of an ongoing effort to develop computerized training tools for cryosurgery, with prostate cryosurgery as a development model. The capability of performing virtual cryosurgical procedures on a variety of test cases is essential for effective surgical training. Simulated ultrasound imaging artifacts include reverberation and reflection of the cryoprobes in the unfrozen tissue, reflections caused by the freezing front, shadowing caused by the frozen region, and tissue property changes in repeated freeze–thaw cycles. The simulated artifacts appear to preserve the key features observed in a clinical setting. This study displays an example of how training may benefit from toggling between the undisturbed ultrasound image, the simulated temperature field, the simulated imaging artifacts, and an augmented hybrid presentation of the temperature field superimposed on the ultrasound image. The proposed method is demonstrated on a graphic processing unit at 100 frames per second, on a mid-range personal workstation, at two orders of magnitude faster than a typical cryoprocedure. This performance is based on computation with C++ accelerated massive parallelism and its interoperability with the DirectX-rendering application programming interface. PMID:26818026

  8. QwikMD — Integrative Molecular Dynamics Toolkit for Novices and Experts

    PubMed Central

    Ribeiro, João V.; Bernardi, Rafael C.; Rudack, Till; Stone, John E.; Phillips, James C.; Freddolino, Peter L.; Schulten, Klaus

    2016-01-01

    The proper functioning of biomolecules in living cells requires them to assume particular structures and to undergo conformational changes. Both biomolecular structure and motion can be studied using a wide variety of techniques, but none offers the level of detail that molecular dynamics (MD) simulations do. Integrating two widely used modeling programs, namely NAMD and VMD, we have created a robust, user-friendly software package, QwikMD, which enables novices and experts alike to address biomedically relevant questions, where often only molecular dynamics simulations can provide answers. Performing both simple and advanced MD simulations interactively, QwikMD automates as many steps as necessary for preparing, carrying out, and analyzing simulations while checking for common errors and enabling reproducibility. QwikMD also meets the needs of experts in the field, increasing the efficiency and quality of their work by carrying out tedious or repetitive tasks while enabling easy control of every step. Whether carrying out simulations within the live view mode on a small laptop or performing complex and large simulations on supercomputers or Cloud computers, QwikMD uses the same steps and user interface. QwikMD is freely available for download and runs on group and personal computers. It is also available on the cloud at Amazon Web Services. PMID:27216779
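
    At the core of any MD engine that QwikMD drives is a symplectic integrator. The following Python sketch implements velocity Verlet for a two-atom Lennard-Jones system in reduced units; it illustrates the scheme that NAMD-class codes build on and is in no way NAMD's actual implementation:

        import numpy as np

        def lj_force(r_vec, eps=1.0, sigma=1.0):
            """Lennard-Jones force on particle 0 from particle 1 (reduced units)."""
            r = np.linalg.norm(r_vec)
            return 24 * eps * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) * r_vec / r ** 2

        def forces(x):
            f01 = lj_force(x[0] - x[1])
            return np.array([f01, -f01])           # Newton's third law

        # Velocity Verlet for a two-atom system; equilibrium separation is 2**(1/6).
        x = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
        v = np.zeros((2, 3))
        m, dt = 1.0, 0.005
        f = forces(x)
        for step in range(2000):
            v += 0.5 * dt * f / m                  # half kick
            x += dt * v                            # drift
            f = forces(x)
            v += 0.5 * dt * f / m                  # half kick
        print(f"dimer separation after {2000 * dt:.0f} time units: "
              f"{np.linalg.norm(x[0] - x[1]):.3f}")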

  9. QwikMD — Integrative Molecular Dynamics Toolkit for Novices and Experts

    NASA Astrophysics Data System (ADS)

    Ribeiro, João V.; Bernardi, Rafael C.; Rudack, Till; Stone, John E.; Phillips, James C.; Freddolino, Peter L.; Schulten, Klaus

    2016-05-01

    The proper functioning of biomolecules in living cells requires them to assume particular structures and to undergo conformational changes. Both biomolecular structure and motion can be studied using a wide variety of techniques, but none offers the level of detail that molecular dynamics (MD) simulations do. Integrating two widely used modeling programs, namely NAMD and VMD, we have created a robust, user-friendly software package, QwikMD, which enables novices and experts alike to address biomedically relevant questions, where often only molecular dynamics simulations can provide answers. Performing both simple and advanced MD simulations interactively, QwikMD automates as many steps as necessary for preparing, carrying out, and analyzing simulations while checking for common errors and enabling reproducibility. QwikMD also meets the needs of experts in the field, increasing the efficiency and quality of their work by carrying out tedious or repetitive tasks while enabling easy control of every step. Whether carrying out simulations within the live view mode on a small laptop or performing complex and large simulations on supercomputers or Cloud computers, QwikMD uses the same steps and user interface. QwikMD is freely available for download and runs on group and personal computers. It is also available on the cloud at Amazon Web Services.

  10. Designing environmental campaigns by using agent-based simulations: strategies for changing environmental attitudes.

    PubMed

    Mosler, Hans-Joachim; Martens, Thomas

    2008-09-01

    Agent-based computer simulation was used to create artificial communities in which each individual was constructed according to the principles of the elaboration likelihood model of Petty and Cacioppo [1986. The elaboration likelihood model of persuasion. In: Berkowitz, L. (Ed.), Advances in Experimental Social Psychology. Academic Press, New York, NY, pp. 123-205]. Campaigning strategies and community characteristics were varied systematically to understand and test their impact on attitudes towards environmental protection. The results show that strong arguments influence a green (environmentally concerned) population with many contacts most effectively, while peripheral cues have the greatest impact on a non-green population with fewer contacts. Overall, deeper information scrutiny increases the impact of strong arguments but is especially important for convincing green populations. Campaigns involving person-to-person communication are superior to mass-media campaigns because they can be adapted to recipients' characteristics.
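
    The key mechanism, agents that process a message centrally (argument strength matters) when elaboration is high and peripherally (cues matter) otherwise, can be caricatured in a few lines of Python. The sketch below is a deliberately simplified stand-in for the authors' model, and all numeric choices are arbitrary assumptions:

        import random

        random.seed(42)

        class Agent:
            """Recipient with an attitude in [0, 1] and an involvement level."""
            def __init__(self, green):
                self.attitude = random.uniform(0.5, 1.0) if green else random.uniform(0.0, 0.5)
                self.involvement = random.uniform(0.6, 1.0) if green else random.uniform(0.0, 0.4)

            def receive(self, argument_strength, peripheral_cue):
                if self.involvement > 0.5:     # central route: scrutinize arguments
                    shift = 0.1 * (argument_strength - 0.5)
                else:                          # peripheral route: respond to cues
                    shift = 0.1 * (peripheral_cue - 0.5)
                self.attitude = min(1.0, max(0.0, self.attitude + shift))

        def campaign(pop, argument_strength, cue, rounds=10):
            for _ in range(rounds):
                for a in pop:
                    a.receive(argument_strength, cue)
            return sum(a.attitude for a in pop) / len(pop)

        green = [Agent(green=True) for _ in range(500)]
        non_green = [Agent(green=False) for _ in range(500)]
        print("green population, strong arguments: ", round(campaign(green, 0.9, 0.2), 3))
        print("non-green population, strong cues:  ", round(campaign(non_green, 0.2, 0.9), 3))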

  11. Agent based simulation on the process of human flesh search-From perspective of knowledge and emotion

    NASA Astrophysics Data System (ADS)

    Zhu, Hou; Hu, Bin

    2017-03-01

    Human flesh search, as a new form of online crowd behavior, can on the one hand help to find specific information, and on the other hand may lead to privacy leaks and violations of human rights. In order to study the mechanism of human flesh search (HFS), this paper proposes a simulation model based on agent-based modeling and complex networks. The computational experiments show some useful results. Discovered information quantity and the involved-person ratio are highly correlated, and participation tends to be all-or-nothing: most net citizens either take part in the search or stay out of it entirely. Knowledge quantity does not influence the involved-person ratio, but it does influence whether HFS can find the target person. When knowledge concentrates on hub nodes, the discovered information quantity is either perfect or almost zero. The emotion of net citizens influences both the discovered information quantity and the involved-person ratio. Concretely, when net citizens are calm about the search topic, the target is hard to find; but when net citizens are agitated, the target is found easily.

  12. "Watts per person" paradigm to design net zero energy buildings: Examining technology interventions and integrating occupant feedback to reduce plug loads in a commercial building

    NASA Astrophysics Data System (ADS)

    Yagi Kim, Mika

    As building envelopes have improved due to more restrictive energy codes, internal loads have increased, largely due to the proliferation of computers, electronics, appliances, and imaging and audio-visual equipment in commercial buildings. As dependency on the internet for information and data transfer increases, the electricity demand will pose a challenge to the design and operation of Net Zero Energy Buildings (NZEBs). Plug loads (PLs) have become the largest non-regulated building energy load and represent the third highest electricity end use in California's commercial office buildings, accounting for 23% of total building electricity consumption (Ecova 2011,2). The Annual Energy Outlook 2008 (AEO2008), prepared by the Energy Information Administration (EIA) to present long-term projections of energy supply and demand through 2030, states that office equipment and personal computers are the "fastest growing electrical end uses" in the commercial sector. This thesis, entitled "Watts Per Person" Paradigm to Design Net Zero Energy Buildings, measures the implementation of advanced controls and behavioral interventions to study the reduction of PL energy use in the commercial sector. By integrating real-world energy use data extracted from an energy efficient commercial building, the results produce a new methodology for estimating PL energy use based on "watts per person", and the thesis analyzes computational simulation methods to design NZEBs.

  13. Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models

    NASA Astrophysics Data System (ADS)

    Pallant, Amy; Lee, Hee-Sun

    2015-04-01

    Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students ( N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus and whether explanations were model based or knowledge based, and we categorized the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.

  14. Programmable personality interface for the dynamic infrared scene generator (IRSG2)

    NASA Astrophysics Data System (ADS)

    Buford, James A., Jr.; Mobley, Scott B.; Mayhall, Anthony J.; Braselton, William J.

    1998-07-01

    As scene generator platforms come to rely on commercial off-the-shelf (COTS) hardware and software components, high speed programmable personality interfaces (PPIs) are required for interfacing to infrared (IR) flight computers/processors and complex IR projectors in hardware-in-the-loop (HWIL) simulation facilities. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost effective PPIs to interface to COTS scene generators. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a PPI to reside between the AMCOM MRDEC IR Scene Generator (IRSG) and either a missile flight computer or the dynamic Laser Diode Array Projector (LDAP). AMCOM MRDEC has developed several PPIs for the first and second generation IRSGs (IRSG1 and IRSG2), which are based on Silicon Graphics Incorporated (SGI) Onyx and Onyx2 computers with Reality Engine 2 (RE2) and Infinite Reality (IR/IR2) graphics engines. This paper provides an overview of PPIs designed, integrated, tested, and verified at AMCOM MRDEC, specifically the IRSG2's PPI.

  15. Inventory of environmental impact models related to energy technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, P.T.; Dailey, N.S.; Johnson, C.A.

    The purpose of this inventory is to identify and collect data on computer simulations and computational models related to the environmental effects of energy source development, energy conversion, or energy utilization. Information for 33 data fields was sought for each model reported. All of the information which could be obtained within the time allotted for completion of the project is presented for each model listed. Efforts will be continued toward acquiring the needed information. Readers who are interested in these particular models are invited to contact ESIC for assistance in locating them. In addition to the standard bibliographic information, other data fields of interest to modelers, such as computer hardware and software requirements, algorithms, applications, and existing model validation information, are included. Indexes are provided for contact person, acronym, keyword, and title. The models are grouped into the following categories: atmospheric transport, air quality, aquatic transport, terrestrial food chains, soil transport, aquatic food chains, water quality, dosimetry and human effects, animal effects, plant effects, and generalized environmental transport. Within these categories, the models are arranged alphabetically by last name of the contact person.

  16. PERSONAL COMPUTERS AND ENVIRONMENTAL ENGINEERING

    EPA Science Inventory

    This article discusses how personal computers can be applied to environmental engineering. After explaining some of the differences between mainframe and personal computers, we will review the development of personal computers and describe the areas of data management, interactive...

  17. Quality of Care as an Emergent Phenomenon out of a Small-World Network of Relational Actors.

    PubMed

    Fiorini, Rodolfo; De Giacomo, Piero; Marconi, Pier Luigi; L'Abate, Luciano

    2014-01-01

    In a healthcare decision support system, the development and evaluation of effective "Quality of Care" (QOC) indicators in simulation-based training are a key feature for developing resilient and antifragile organization scenarios. Is it possible to conceive of QOC not only as the result of a voluntary and rational decision, imposed or not, but also as an overall system "emergent phenomenon" arising out of a small-world network of relational synthetic actors, endowed with their own personality profiles to simulate human behaviour (for short, called "subjects")? To answer this question and to observe phenomena of real emergence, computational models of high complexity are normally needed, with heavy computational load and extensive computational time. Nevertheless, the intrinsic self-reflexive functional logical closure of De Giacomo's Elementary Pragmatic Model (EPM) makes it possible to run simulation examples and to classify the outcomes grown out of a small-world network of relational subjects quickly and effectively. It thus becomes possible to observe and learn how strategic systemic interventions can induce context conditions that facilitate QOC, improving the effectiveness of specific actions which might otherwise be paradoxically counterproductive. Early results are encouraging enough to use EPM as a basic building block in designing a more powerful Evolutive Elementary Pragmatic Model (E2PM) for real-emergence computational modelling, to cope with ontological uncertainty at the system level.

  18. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  19. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  20. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer products...

  1. Issues in implementing a knowledge-based ECG analyzer for personal mobile health monitoring.

    PubMed

    Goh, K W; Kim, E; Lavanya, J; Kim, Y; Soh, C B

    2006-01-01

    Advances in sensor technology, personal mobile devices, and wireless broadband communications are enabling the development of an integrated personal mobile health monitoring system that can provide patients with a useful tool to assess their own health and manage their personal health information anytime and anywhere. Personal mobile devices, such as PDAs and mobile phones, are becoming more powerful integrated information management tools and play a major role in many people's lives. We focus on designing a health-monitoring system for people who suffer from cardiac arrhythmias. We have developed computer simulation models to evaluate the performance of appropriate electrocardiogram (ECG) analysis techniques that can be implemented on personal mobile devices. This paper describes an ECG analyzer to perform ECG beat and episode detection and classification. We have obtained promising preliminary results from our study. Also, we discuss several key considerations when implementing a mobile health monitoring solution. The mobile ECG analyzer would become a front-end patient health data acquisition module, which is connected to the Personal Health Information Management System (PHIMS) for data repository.
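
    Beat detection of the kind evaluated in the paper typically starts with R-peak detection. Below is a hedged Python sketch in the spirit of the Pan-Tompkins pipeline (differentiate, square, moving-window integrate, threshold), run on a synthetic ECG; the filter lengths and threshold are illustrative assumptions, not the paper's algorithm:

        import numpy as np

        fs = 250                                   # sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)

        # Synthetic ECG: narrow Gaussian R-waves every 0.8 s (75 bpm) plus noise.
        ecg = sum(np.exp(-0.5 * ((t - rt) / 0.01) ** 2) for rt in np.arange(0.5, 10, 0.8))
        ecg += 0.02 * rng.standard_normal(t.size)

        # Pan-Tompkins-style stages: differentiate, square, moving-window integrate.
        diff = np.diff(ecg, prepend=ecg[0])
        win = int(0.15 * fs)
        mwi = np.convolve(diff ** 2, np.ones(win) / win, mode="same")

        above = mwi > 0.5 * mwi.max()              # fixed threshold; adaptive in practice
        onsets = np.flatnonzero(above & ~np.roll(above, 1))
        offsets = np.flatnonzero(~above & np.roll(above, 1))
        peaks = [on + np.argmax(ecg[on:off]) for on, off in zip(onsets, offsets)]

        rr = np.diff(np.array(peaks)) / fs         # R-R intervals in seconds
        print(f"{len(peaks)} beats detected, mean heart rate {60 / rr.mean():.0f} bpm")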

  2. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

    Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the outcome of RFA difficult to predict. A monitoring tool that could be personalized for a given patient during the intervention would help achieve complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations were performed on two cadaveric bovine livers; we achieve an average error of 2.2 °C between the computed and thermistor temperatures, and average errors of 1.4 °C and 2.7 °C between the computed temperatures and those monitored by US at two different time points (t = 240 s and t = 900 s).

  3. Deformable torso phantoms of Chinese adults for personalized anatomy modelling.

    PubMed

    Wang, Hongkai; Sun, Xiaobang; Wu, Tongning; Li, Congsheng; Chen, Zhonghua; Liao, Meiying; Li, Mengci; Yan, Wen; Huang, Hui; Yang, Jia; Tan, Ziyu; Hui, Libo; Liu, Yue; Pan, Hang; Qu, Yue; Chen, Zhaofeng; Tan, Liwen; Yu, Lijuan; Shi, Hongcheng; Huo, Li; Zhang, Yanjun; Tang, Xin; Zhang, Shaoxiang; Liu, Changjian

    2018-04-16

    In recent years, there has been increasing demand for personalized anatomy modelling for medical and industrial applications, such as ergonomics device development, clinical radiological exposure simulation, biomechanics analysis, and 3D animation character design. In this study, we constructed deformable torso phantoms that can be deformed to match the personal anatomy of Chinese male and female adults. The phantoms were created based on a training set of 79 trunk computed tomography (CT) images (41 males and 38 females) from normal Chinese subjects. Major torso organs were segmented from the CT images, and the statistical shape model (SSM) approach was used to learn the inter-subject anatomical variations. To match the personal anatomy, the phantoms were registered to individual body surface scans or medical images using the active shape model method. The constructed SSM demonstrated anatomical variations in body height, fat quantity, respiratory status, organ geometry, male muscle size, and female breast size. The masses of the deformed phantom organs were consistent with Chinese population organ mass ranges. To validate the performance of personal anatomy modelling, the phantoms were registered to body surface scans and CT images. The registration accuracy measured on 22 test CT images showed a median Dice coefficient over 0.85, a median volume recovery coefficient (RCvlm) between 0.85 and 1.1, and a median averaged surface distance (ASD) < 1.5 mm. We hope these phantoms can serve as computational tools for personalized anatomy modelling for the research community.
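
    The statistical shape model (SSM) step amounts to principal component analysis of corresponding landmark coordinates. A minimal Python sketch on synthetic 2-D "torso outlines" follows (the real phantoms use 3-D organ meshes segmented from 79 CT scans); personalizing a new subject is then a projection onto the learned modes:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic training set: 79 "torso outlines" of 50 2-D landmarks each,
        # generated from two hidden modes (height and width scaling) plus noise.
        n_sub, n_pts = 79, 50
        ang = np.linspace(0, 2 * np.pi, n_pts, endpoint=False)
        base = np.stack([np.cos(ang), 1.5 * np.sin(ang)], axis=1)
        shapes = np.array([base * (1 + 0.1 * rng.standard_normal(2))
                           + 0.01 * rng.standard_normal((n_pts, 2))
                           for _ in range(n_sub)])
        X = shapes.reshape(n_sub, -1)                  # one row per subject

        # Statistical shape model: mean shape plus principal modes of variation.
        mean = X.mean(axis=0)
        U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
        modes = Vt[:5]                                 # keep five modes

        # "Personalize": project a new subject onto the modes and reconstruct.
        new = X[0] + 0.02 * rng.standard_normal(X.shape[1])
        b = modes @ (new - mean)                       # per-mode shape coefficients
        recon = mean + modes.T @ b
        print("variance explained by 5 modes:", round(float((S[:5]**2).sum() / (S**2).sum()), 3))
        print("reconstruction RMS error:", round(float(np.sqrt(((recon - new)**2).mean())), 4))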

  4. [Measurement of intracranial hematoma volume by personal computer].

    PubMed

    DU, Wanping; Tan, Lihua; Zhai, Ning; Zhou, Shunke; Wang, Rui; Xue, Gongshi; Xiao, An

    2011-01-01

    To explore a method for measuring intracranial hematoma volume with a personal computer. Forty cases of various intracranial hematomas were measured by computed tomography with quantitative software and by a personal computer with Photoshop CS3 software, respectively. The data from the two methods were analyzed and compared. There was no difference between the data from the computed tomography and the personal computer (P>0.05). The personal computer with Photoshop CS3 software can measure the volume of various intracranial hematomas precisely, rapidly, and simply. It can be recommended for clinical medicolegal identification.
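
    Whichever software does the segmentation, the volume arithmetic behind both methods is the same Cavalieri estimate: segmented area per slice times slice spacing. A small Python sketch with an invented ellipsoid-like lesion (all dimensions are illustrative assumptions):

        import numpy as np

        def hematoma_volume_ml(masks, pixel_mm, slice_mm):
            """Cavalieri estimate: sum of segmented slice areas times slice spacing.
            masks: sequence of boolean 2-D segmentation masks, one per CT slice."""
            areas_mm2 = np.array([m.sum() for m in masks]) * pixel_mm ** 2
            return areas_mm2.sum() * slice_mm / 1000.0      # mm^3 -> ml

        # Illustrative input: a roughly ellipsoidal lesion across ten 5 mm slices.
        yy, xx = np.mgrid[:512, :512]
        masks = [((xx - 256) ** 2 + (yy - 256) ** 2) < (40 ** 2 - 60 * (z - 4.5) ** 2)
                 for z in range(10)]
        print(f"{hematoma_volume_ml(masks, pixel_mm=0.45, slice_mm=5.0):.1f} ml")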

  5. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or as small as a few thousand targeted systems. In all botnets, the members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
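
    The compartmental half of the proposed hybrid, the MSEIR model, translates directly into ordinary differential equations. A Python sketch with simple Euler integration follows; the rates and population sizes are illustrative assumptions, and the jump-diffusion component is not modeled here:

        # MSEIR compartments for malware: M (temporarily protected), S (susceptible),
        # E (exposed/latent), I (infectious), R (recovered/patched). Euler integration.
        delta, beta, sigma, gamma = 0.02, 0.4, 0.3, 0.1   # per-day transition rates
        N = 100_000.0
        M, S, E, I, R = 20_000.0, 79_990.0, 0.0, 10.0, 0.0

        dt, days = 0.1, 120
        for _ in range(int(days / dt)):
            dM = -delta * M                       # protection wears off
            dS = delta * M - beta * S * I / N     # newly exposed leave S
            dE = beta * S * I / N - sigma * E     # latency before becoming infectious
            dI = sigma * E - gamma * I
            dR = gamma * I                        # cleanup/patching
            M += dt * dM; S += dt * dS; E += dt * dE; I += dt * dI; R += dt * dR
        print(f"day {days}: {I:,.0f} infectious, {R:,.0f} recovered/patched")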

  6. Virtual Instrument Simulator for CERES

    NASA Technical Reports Server (NTRS)

    Chapman, John J.

    1997-01-01

    A benchtop virtual instrument simulator for CERES (Clouds and the Earth's Radiant Energy System) has been built at NASA Langley Research Center in Hampton, VA. The CERES instruments will fly on several Earth-orbiting platforms, notably NASDA's Tropical Rainfall Measuring Mission (TRMM) and NASA's Earth Observing System (EOS) satellites. CERES measures top-of-the-atmosphere radiative fluxes using microprocessor-controlled scanning radiometers. The CERES Virtual Instrument Simulator consists of electronic circuitry identical to the flight unit's twin microprocessors and telemetry interface to the supporting spacecraft electronics, plus two personal computers (PCs) connected to the I/O ports that control the azimuth and elevation gimbals. Software consists of the unmodified TRW-developed flight code and ground support software, which serves as the instrument monitor, and NASA/TRW-developed engineering models of the scanners. The CERES Instrument Simulator will serve as a testbed for testing custom instrument commands intended to solve in-flight anomalies of the instruments which could arise during the CERES mission. One of the supporting computers hosts the telemetry display, which monitors the simulator microprocessors during the development and testing of custom instrument commands. The CERES engineering development software models have been modified to provide a virtual instrument running on a second supporting computer linked in real time to the instrument flight microprocessor control ports. The CERES Instrument Simulator will be used to verify memory uploads by the CERES Flight Operations Team at NASA. Plots of the virtual scanner models match the actual instrument scan plots. A high speed logic analyzer has been used to track the performance of the flight microprocessor. The concept of using an identical but non-flight-qualified microprocessor and electronics ensemble linked to a virtual instrument with identical system software affords a relatively inexpensive simulation system capable of high fidelity.

  7. Computer-based personality judgments are more accurate than those made by humans

    PubMed Central

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  8. Computer-based personality judgments are more accurate than those made by humans.

    PubMed

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  9. Applications of personal computers in geophysics

    NASA Astrophysics Data System (ADS)

    Lee, W. H. K.; Lahr, J. C.; Habermann, R. E.

    Since 1981, the use of personal computers (PCs) to increase productivity has become widespread. At present, more than 5 million personal computers are in operation for business, education, engineering, and scientific purposes. Activities within AGU reflect this trend: KOSMOS, the AGU electronic network, was introduced this year, and the AGU Committee on Personal Computers, chaired by W. H. K. Lee (U.S. Geological Survey, Menlo Park, Calif.), was recently formed. In addition, in conjunction with the 1986 AGU Fall Meeting, this committee is organizing a personal computer session and hands-on demonstrations to promote applications of personal computers in geophysics.

  10. Sharing One Biographical Detail Elicits Priming between Famous Names: Empirical and Computational Approaches

    PubMed Central

    Ihrke, Matthias; Brennen, Tim

    2011-01-01

    In this paper, three experiments and corresponding model simulations are reported that investigate the priming of famous name recognition in order to explore the structure of the part of the semantic system dealing with people. Consistent with empirical findings, novel computational simulations using Burton et al.’s interactive activation and competition model point to a conceptual distinction between how priming is initiated in single- and double-familiarity tasks, indicating that priming should be weaker or non-existent in the single-familiarity task. Experiment 1 demonstrates that, within a double-familiarity framework using famous names, categorical and associative priming are reliable effects. Pushing the model to the limit, it predicts that pairs of celebrities who are neither associatively nor categorically related but who share a single biographical feature (for example, both died in a car crash) should prime each other. Experiment 2 investigated this in a double-familiarity task, but the effect was not observed. We therefore simulated and realized a pairwise learning task that was conceptually similar to the double-familiarity-decision task but allowed the underlying connections to be strengthened. Priming based on a single biographical feature was found both in the simulations and in the experiment. The effect was not due to visual or name similarity, which were controlled for, and participants did not report using the biographical links between the people to learn the pairs. The results are interpreted as lending further support to structural models of the memory for persons. Furthermore, the results are consistent with the idea that episodic features known about people are stored in semantic memory and are automatically activated when encountering that person. PMID:21687446
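
    For readers unfamiliar with the model class, the following is a minimal sketch of an interactive-activation-style update rule, with a shared biographical feature unit linking two name units; it is a toy simplification, not Burton et al.'s actual network or parameters.

        # Minimal IAC-style sketch (assumption: simplified from the interactive
        # activation and competition family of models, not the authors' network).
        import numpy as np

        def iac_step(a, W, rest=-0.1, amax=1.0, amin=-0.2, decay=0.1, dt=0.1):
            """One McClelland-style IAC update: net input via weights W, growth
            toward amax for excitation and amin for inhibition, plus decay to rest."""
            net = W @ np.clip(a, 0, None)          # only positive activations propagate
            grow = np.where(net > 0, (amax - a) * net, (a - amin) * net)
            return a + dt * (grow - decay * (a - rest))

        # Two "celebrities" sharing one biographical feature (e.g., died in a car
        # crash): units 0,1 = name nodes; unit 2 = the shared semantic unit.
        W = np.array([[0.0, 0.0, 0.5],
                      [0.0, 0.0, 0.5],
                      [0.5, 0.5, 0.0]])            # excitatory name <-> feature links
        a = np.full(3, -0.1)
        a[0] = 0.5                                 # "prime" with the first name
        for _ in range(50):
            a = iac_step(a, W)
        print("activation of related name after priming:", round(a[1], 3))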

  11. Digital analysis of wind tunnel imagery to measure fluid thickness

    NASA Technical Reports Server (NTRS)

    Easton, Roger L., Jr.; Enge, James

    1992-01-01

    Documented here are the procedure and results obtained from the application of digital image processing techniques to the problem of measuring the thickness of a deicing fluid on a model airfoil during simulated takeoffs. The fluid contained a fluorescent dye and the images were recorded under flash illumination on photographic film. The films were digitized and analyzed on a personal computer to obtain maps of the fluid thickness.
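
    The method implies a mapping from corrected fluorescence intensity to fluid depth. A minimal sketch of that mapping follows; the dark/flat-field correction is standard image-processing practice, while the linear scale factor k_mm is a made-up assumption standing in for a real calibration against known fluid depths.

        # Sketch of an intensity-to-thickness mapping (illustrative assumptions,
        # not the documented procedure): correct for dark level and illumination
        # nonuniformity, then scale fluorescence intensity to fluid depth,
        # assuming intensity grows ~linearly with thickness for thin films.
        import numpy as np

        def thickness_map(image, dark, flat, k_mm=0.5):
            corrected = (image - dark) / np.clip(flat - dark, 1e-6, None)
            return k_mm * np.clip(corrected, 0.0, None)   # depth in millimeters

        frame = np.array([[30., 120.], [210., 60.]])      # toy digitized frame
        print(thickness_map(frame, dark=20.0, flat=220.0))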

  12. Integrating in silico prediction methods, molecular docking, and molecular dynamics simulation to predict the impact of ALK missense mutations in structural perspective.

    PubMed

    Doss, C George Priya; Chakraborty, Chiranjib; Chen, Luonan; Zhu, Hailong

    2014-01-01

    Over the past decade, advancements in next-generation sequencing technology have placed personalized genomic medicine on the horizon. Determining whether disease-causing mutations in complex diseases are pathogenic or neutral remains a major task, and is nearly impossible in the structural context because the required experiments are time consuming and expensive. Among the various disease-causing mutations, single nucleotide polymorphisms (SNPs) play a vital role in defining an individual's susceptibility to disease and drug response. Understanding the genotype-phenotype relationship through SNPs is the first and most important step in drug research and development. Detailed understanding of the effect of SNPs on patient drug response is a key factor in the establishment of personalized medicine. In this paper, we present a computational pipeline for SNP-centred study of anaplastic lymphoma kinase (ALK) through the application of in silico prediction methods, molecular docking, and molecular dynamics simulation approaches. This combination of computational methods provides a way to understand the impact of deleterious mutations in altering protein drug targets, eventually leading to variable drug responses among patients. We hope this rapid and cost-effective pipeline will also serve as a bridge connecting clinicians and in silico resources in tailoring treatments to a patient's specific genotype.

  13. An Expanded Multi-scale Monte Carlo Simulation Method for Personalized Radiobiological Effect Estimation in Radiotherapy: a feasibility study

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Feng, Yuanming; Wang, Wei; Yang, Chengwen; Wang, Ping

    2017-03-01

    A novel and versatile “bottom-up” approach is developed to estimate the radiobiological effect of clinical radiotherapy. The model consists of multi-scale Monte Carlo simulations from organ to cell levels. At the cellular level, accumulated damages are computed using a spectrum-based accumulation algorithm and a predefined cellular damage database. The damage repair mechanism is modeled by an expanded reaction-rate two-lesion kinetic model, which was calibrated by replicating a radiobiological experiment. Multi-scale modeling is then performed on a lung cancer patient under conventional fractionated irradiation. The cell killing effects of two representative voxels (the isocenter and a peripheral voxel of the tumor) are computed and compared. At the microscopic level, the nucleus dose and damage yields vary among the nuclei within the voxels. A slightly larger percentage of cDSB yield is observed for the peripheral voxel (55.0%) compared to the isocenter one (52.5%). For the isocenter voxel, the survival fraction increases monotonically as the oxygen level is reduced. Under an extreme anoxic condition (0.001%), the survival fraction is calculated to be 80% and the hypoxia reduction factor reaches a maximum value of 2.24. In conclusion, by capturing biology-related variations, the proposed multi-scale approach is more versatile than existing approaches for evaluating personalized radiobiological effects in radiotherapy.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, La Tonya Nicole; Malczynski, Leonard A.

    DYNAMO is a computer program for building and running 'continuous' simulation models. It was developed by the Industrial Dynamics Group at the Massachusetts Institute of Technology for simulating dynamic feedback models of business, economic, and social systems. The history of the system dynamics method since 1957 includes many classic models built in DYNAMO. It was not until the late 1980s, when software was built to take advantage of the rise of personal computers and graphical user interfaces, that DYNAMO was supplanted. There is much learning and insight to be gained from examining the DYNAMO models and their accompanying research papers. We believe that it is a worthwhile exercise to convert DYNAMO models to more recent software packages. We have made an attempt to make it easier to turn these models into a more current system dynamics software language, Powersim © Studio, produced by Powersim AS of Bergen, Norway. This guide shows how to convert DYNAMO syntax into Studio syntax.
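
    As a flavor of what such a conversion preserves, the sketch below pairs DYNAMO's level/rate syntax (a hypothetical population model, not one of the classic MIT models) with a plain Euler-integration equivalent in Python; Powersim Studio expresses the same stock-and-flow structure graphically.

        # Illustrative sketch of what a DYNAMO -> modern-syntax conversion keeps
        # (hypothetical model, invented constants):
        #
        #   L POP.K = POP.J + DT*(BR.JK - DR.JK)    level (stock)
        #   R BR.KL = POP.K * BRF                   birth rate (flow)
        #   R DR.KL = POP.K * DRF                   death rate (flow)
        #
        # The .J/.K/.JK/.KL time-script suffixes map onto Euler integration:
        DT, BRF, DRF = 0.25, 0.04, 0.02
        pop = 1000.0                         # initial level
        for step in range(int(10 / DT)):     # simulate 10 time units
            births = pop * BRF               # R BR.KL
            deaths = pop * DRF               # R DR.KL
            pop = pop + DT * (births - deaths)   # L POP.K
        print(f"population after 10 time units: {pop:.1f}")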

  15. Modeling of processes of formation of the images in optical-electronic systems

    NASA Astrophysics Data System (ADS)

    Grudin, B. N.; Plotnikov, V. S.; Fischenko, V. K.

    2001-08-01

    The digital model of a multicomponent coherent optical system with an arbitrary layout of optical elements (lasers, lenses, phototransparencies recording the transmission function of a specimen or filters, photoregistrars), constructed using fast algorithms, is considered. The model is realized as a program for personal computers under the Windows 95, 98, and Windows NT operating systems. In a simulation of, for example, a coherent system consisting of twenty elementary optical cascades, the relative error in the output image as a rule does not exceed 0.25% when N >= 256 (N x N is the number of discrete samples in the image), and the time to calculate the output image on a computer (Pentium-2, 300 MHz) for N = 512 does not exceed one minute. The program for simulating coherent optical systems will be used in scientific research and in teaching students at Far East State University.

  16. Multiple wavelength spectral system simulating background light noise environment in satellite laser communications

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Sun, Jianfeng; Hou, Peipei; Xu, Qian; Xi, Yueli; Zhou, Yu; Zhu, Funan; Liu, Liren

    2017-08-01

    The performance of satellite laser communications between GEO and LEO satellites can be degraded by background light noise appearing in the field of view due to sunlight, planets, and some comets. Such influences should be studied on a ground testing platform before space application. In this paper, we introduce a simulator that reproduces the real background light noise of the space environment during laser-beam data transmission between two distant satellites. This simulator can simulate not only the effect of a multi-wavelength spectrum, but also the effects of adjustable field-of-view angles, a large range of adjustable optical power, and adjustable deflection speeds of the light noise in the space environment. We integrate these functions into a small, compact device for easily mobile use. Software control via a personal computer is also provided to adjust these functions arbitrarily.

  17. A haptic interface for virtual simulation of endoscopic surgery.

    PubMed

    Rosenberg, L B; Stredney, D

    1996-01-01

    Virtual reality can be described as a convincingly realistic and naturally interactive simulation in which the user is given a first-person illusion of being immersed within a computer-generated environment. While virtual reality systems offer great potential to reduce the cost and increase the quality of medical training, many technical challenges must be overcome before such simulation platforms offer effective alternatives to more traditional training means. A primary challenge in developing effective virtual reality systems is designing the human interface hardware that allows rich sensory information to be presented to users in natural ways. When simulating a given manual procedure, task-specific human interface requirements dictate task-specific human interface hardware. The following paper explores the design of human interface hardware that satisfies the task-specific requirements of virtual reality simulation of endoscopic surgical procedures. Design parameters were derived through direct cadaver studies and interviews with surgeons. The final hardware design is presented.

  18. Finite element analysis of a bone healing model: 1-year follow-up after internal fixation surgery for femoral fracture.

    PubMed

    Jiang-Jun, Zhou; Min, Zhao; Ya-Bo, Yan; Wei, Lei; Ren-Fa, Lv; Zhi-Yu, Zhu; Rong-Jian, Chen; Wei-Tao, Yu; Cheng-Fei, Du

    2014-03-01

    Finite element analysis was used to compare preoperative and postoperative stress distributions in a bone healing model of femur fracture, to identify whether the broken ends of the fractured bone would refracture after fixation dislodgement one year after intramedullary nailing. Methods: Using fast, personalized imaging, bone healing models of femur fracture were constructed based on data from multi-slice spiral computed tomography using the Mimics, Geomagic Studio, and Abaqus software packages. The intramedullary pin was removed by Boolean operations before fixation was dislodged. Loads were applied on each model to simulate a person standing on one leg. The von Mises stress distribution, the maximum stress, and its location were observed. Results: According to 10 kinds of display groups based on material assignment, the nodes of maximum and minimum von Mises stress were the same before and after dislodgement, and all nodes of maximum von Mises stress were outside the fracture line. The maximum von Mises stress node was situated at the bottom quarter of the femur. The von Mises stress distribution was identical before and after surgery. Conclusion: Fast, personalized model establishment can simulate fixation dislodgement before the operation, and personalized finite element analysis successfully predicted whether nail dislodgement would disrupt the femur fracture.

  19. Transient Heat Conduction Simulation around Microprocessor Die

    NASA Astrophysics Data System (ADS)

    Nishi, Koji

    This paper explains the fundamental formula for calculating the power consumption of CMOS (Complementary Metal-Oxide-Semiconductor) devices and its voltage and temperature dependency, then introduces an equation for estimating the power consumption of a microprocessor for a notebook PC (Personal Computer). The equation is applied to a heat conduction simulation with a simplified thermal model and evaluated with sub-millisecond time-step calculation. In addition, the microprocessor has two major heat conduction paths: one from the top of the silicon die via the thermal solution, and the other from the package substrate and pins via the PGA (Pin Grid Array) socket. Even though the former path is the dominant factor in heat conduction, the latter path plays an important role in transient heat conduction behavior. Therefore, this paper focuses on the path from the package substrate and pins, and investigates a more accurate method of estimating the heat conduction paths of the microprocessor. Also, the cooling performance expression for the heatsink fan is one of the key points in assuring results with practical accuracy, while a finer expression requires more computation resources and hence longer computation times. This paper therefore discusses an expression that minimizes the computational workload while keeping practical accuracy in the result.
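
    The fundamental CMOS relation referred to is the usual dynamic-switching term plus a temperature-sensitive leakage term. The sketch below shows that decomposition with illustrative, made-up coefficients (not the paper's fitted values):

        # Back-of-envelope sketch of the standard CMOS power terms the paper
        # builds on (all numbers are illustrative assumptions):
        import math

        def cmos_power(C_eff, V, f, alpha, I_leak0, T, k_T=0.04, T_ref=300.0):
            """Dynamic switching power (P = a*C*V^2*f) plus a leakage term that
            grows roughly exponentially with die temperature T (kelvin)."""
            p_dynamic = alpha * C_eff * V**2 * f
            p_leak = V * I_leak0 * math.exp(k_T * (T - T_ref))
            return p_dynamic + p_leak

        # Notebook-class CPU, invented numbers: 10 nF effective switched
        # capacitance, 1.2 V, 1.6 GHz, 20% activity, 0.5 A leakage at 300 K
        print(f"{cmos_power(10e-9, 1.2, 1.6e9, 0.2, 0.5, T=330.0):.1f} W")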

  20. How virtual reality may enhance training in obstetrics and gynecology.

    PubMed

    Letterie, Gerard S

    2002-09-01

    Contemporary training in obstetrics and gynecology is aimed at the acquisition of a complex set of skills oriented to both the technical and personal aspects of patient care. The ability to create clinical simulations through virtual reality (VR) may facilitate the accomplishment of these goals. The purpose of this paper is 2-fold: (1) to review the circumstances and equipment in industry, science, and education in which VR has been successfully applied, and (2) to explore the possible role of VR for training in obstetrics and gynecology and to suggest innovative and unique approaches to enhancing this training. Qualitative assessment of the literature describing successful applications of VR in industry, law enforcement, military, and medicine from 1995 to 2000. Articles were identified through a computer-based search using Medline, Current Contents, and cross referencing bibliographies of articles identified through the search. One hundred and fifty-four articles were reviewed. This review of contemporary literature suggests that VR has been successfully used to simulate person-to-person interactions for training in psychiatry and the social sciences in a variety of circumstances by using real-time simulations of personal interactions, and to launch 3-dimensional trainers for surgical simulation. These successful applications and simulations suggest that this technology may be helpful and should be evaluated as an educational modality in obstetrics and gynecology in two areas: (1) counseling in circumstances ranging from routine preoperative informed consent to intervention in more acute circumstances such as domestic violence or rape, and (2) training in basic and advanced surgical skills for both medical students and residents. Virtual reality is an untested, but potentially useful, modality for training in obstetrics and gynecology. On the basis of successful applications in other nonmedical and medical areas, VR may have a role in teaching essential elements of counseling and surgical skill acquisition.

  1. Research use of the AIDA www.2aida.org diabetes software simulation program: a review--part 2. Generating simulated blood glucose data for prototype validation.

    PubMed

    Lehmann, Eldon D

    2003-01-01

    The purpose of this review is to describe research applications of the AIDA diabetes software simulator. AIDA is a computer program that permits the interactive simulation of insulin and glucose profiles for teaching, demonstration, and self-learning purposes. Since March/April 1996 it has been made freely available on the Internet as a noncommercial contribution to continuing diabetes education. Up to May 2003 well over 320,000 visits have been logged at the main AIDA Website--www.2aida.org--and over 65,000 copies of the AIDA program have been downloaded free-of-charge. This review (the second of two parts) overviews research projects and ventures, undertaken for the most part by other research workers in the diabetes computing field, that have made use of the freeware AIDA program. As with Part 1 of the review (Diabetes Technol Ther 2003;5:425-438) relevant research work was identified in three main ways: (i) by personal (e-mail/written) communications from researchers, (ii) via the ISI Web of Science citation database to identify published articles which referred to AIDA-related papers, and (iii) via searches on the Internet. Also, in a number of cases research students who had sought advice about AIDA, and diabetes computing in general, provided copies of their research dissertations/theses upon the completion of their projects. Part 2 of this review highlights some more of the research projects that have made use of the AIDA diabetes simulation program to date. A wide variety of diabetes computing topics are addressed. These range from learning about parameter interactions using simulated blood glucose data, to considerations of dietary assessments, developing new diabetes models, and performance monitoring of closed-loop insulin delivery devices. Other topics include evaluation/validation research usage of such software, applying simulated blood glucose data for prototype training/validation, and other research uses of placing technical information on the Web. This review confirms an unexpected but useful benefit of distributing a medical program, like AIDA, for free via the Internet--demonstrating how it is possible to have a synergistic benefit with other researchers--facilitating their own research projects in related medical fields. A common theme that emerges from the research ventures that have been reviewed is the use of simulated blood glucose data from the AIDA software for preliminary computer lab-based testing of other decision support prototypes. Issues surrounding such use of simulated data for separate computer prototype testing are considered further.

  2. High fidelity computational simulation of thrombus formation in Thoratec HeartMate II continuous flow ventricular assist device

    PubMed Central

    Wu, Wei-Tao; Yang, Fang; Wu, Jingchun; Aubry, Nadine; Massoudi, Mehrdad; Antaki, James F.

    2016-01-01

    Continuous flow ventricular assist devices (cfVADs) provide a life-saving therapy for severe heart failure. However, in recent years, the incidence of device-related thrombosis (resulting in stroke, device-exchange surgery or premature death) has been increasing dramatically, which has alarmed both the medical community and the FDA. The objective of this study was to gain improved understanding of the initiation and progression of thrombosis in one of the most commonly used cfVADs, the Thoratec HeartMate II. A computational fluid dynamics simulation (CFD) was performed using our recently updated mathematical model of thrombosis. The patterns of deposition predicted by simulation agreed well with clinical observations. Furthermore, thrombus accumulation was found to increase with decreased flow rate, and can be completely suppressed by the application of anticoagulants and/or improvement of surface chemistry. To our knowledge, this is the first simulation to explicitly model the processes of platelet deposition and thrombus growth in a continuous flow blood pump and thereby replicate patterns of deposition observed clinically. The use of this simulation tool over a range of hemodynamic, hematological, and anticoagulation conditions could assist physicians to personalize clinical management to mitigate the risk of thrombosis. It may also contribute to the design of future VADs that are less thrombogenic. PMID:27905492

  3. [Simulation of lung lobe resection with personal computer].

    PubMed

    Onuki, T; Murasugi, M; Mae, M; Koyama, K; Ikeda, T; Shimizu, T

    2005-09-01

    Various patterns of branching are seen for the pulmonary arteries and veins in the lung hilum. However, thoracic surgeons usually cannot discern much of this anatomical detail preoperatively. If the surgeon can gain an understanding of the individual pattern preoperatively, the risks inherent in exposing the pulmonary vessels in the hilum can be avoided, reducing invasiveness. This software meets the increasing needs of surgeons performing video-assisted thoracoscopic surgery (VATS), who prefer less dissection of the hilar vessels and bronchus. We have produced free application software with which we can mark the pulmonary arteries, veins, bronchus, and tumor on successive computed tomography (CT) images. After receiving a compact disk containing 60 images of 2-mm CT slices, from tumor to hilum, in DICOM format, we required only 1 hour to obtain 3-dimensional images for a patient with other free software (Metasequoia LE). Furthermore, with Metasequoia LE, we can simulate cutting the vessels and change their 3-dimensional configuration. Although the picture image leaves much room for improvement, we believe it is very attractive for residents because they can simulate operations.

  4. Myocardial Infarct Segmentation from Magnetic Resonance Images for Personalized Modeling of Cardiac Electrophysiology

    PubMed Central

    Ukwatta, Eranga; Arevalo, Hermenegild; Li, Kristina; Yuan, Jing; Qiu, Wu; Malamas, Peter; Wu, Katherine C.

    2016-01-01

    Accurate representation of myocardial infarct geometry is crucial to patient-specific computational modeling of the heart in ischemic cardiomyopathy. We have developed a methodology for segmentation of left ventricular (LV) infarct from clinically acquired, two-dimensional (2D), late-gadolinium enhanced cardiac magnetic resonance (LGE-CMR) images, for personalized modeling of ventricular electrophysiology. The infarct segmentation was expressed as a continuous min-cut optimization problem, which was solved using its dual formulation, the continuous max-flow (CMF). The optimization objective comprised a smoothness term and a data term that quantified the similarity between the image intensity histograms of the segmented regions and those of a set of training images. A manual segmentation of the LV myocardium was used to initialize and constrain the developed method. The three-dimensional geometry of the infarct was reconstructed from its segmentation using an implicit, shape-based interpolation method. The proposed methodology was extensively evaluated using metrics based on geometry and on outcomes of individualized electrophysiological simulations of cardiac dys(function). Several existing LV infarct segmentation approaches were implemented and compared with the proposed method. Our results demonstrated that the CMF method was more accurate than the existing approaches in reproducing expert manual LV infarct segmentations, and in electrophysiological simulations. The infarct segmentation method we have developed and comprehensively evaluated in this study constitutes an important step in advancing clinical applications of personalized simulations of cardiac electrophysiology. PMID:26731693

  5. EVALUATING THE SENSITIVITY OF RADIONUCLIDE DETECTORS FOR CONDUCTING A MARITIME ON-BOARD SEARCH USING MONTE CARLO SIMULATION IMPLEMENTED IN AVERT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S; Dave Dunn, D

    The sensitivity of two specific types of radionuclide detectors for conducting an on-board search in the maritime environment was evaluated using Monte Carlo simulation implemented in AVERT®. AVERT®, short for Automated Vulnerability Evaluation for Risk of Terrorism, is personal computer-based vulnerability assessment software developed by the ARES Corporation. The detectors, a RadPack and a Personal Radiation Detector (PRD), were chosen from the class of Human Portable Radiation Detection Systems (HPRDS). HPRDS serve multiple purposes. In the maritime environment, there is a need to detect, localize, characterize, and identify radiological/nuclear (RN) material or weapons. The RadPack is a commercially available broad-area search device used for both gamma and neutron detection. The PRD is chiefly used as a personal radiation protection device. It is also used to detect contraband radionuclides and to localize radionuclide sources. Neither device has the capacity to characterize or identify radionuclides. The principal aim of this study was to investigate the sensitivity of both the RadPack and the PRD while being used under controlled conditions in a simulated maritime environment for detecting hidden RN contraband. The detection distance varies with the source strength and the shielding present. The characterization parameters of the source are not indicated in this report, so the summarized results are relative. The Monte Carlo simulation results indicate the probability of detecting the RN source at given distances from the detector, which is a function of transverse speed and instrument sensitivity for the specified RN source.
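
    The shape of the reported result (detection probability falling with standoff distance, for a given transverse speed and sensitivity) can be illustrated with a toy Monte Carlo like the one below; all rates and thresholds are invented assumptions, and AVERT's actual models are not reproduced here.

        # Conceptual sketch of detection probability versus standoff distance
        # (invented source/detector parameters; not AVERT's models):
        import numpy as np

        rng = np.random.default_rng(1)

        def detection_probability(standoff_m, speed_ms, eff_cps_at_1m=500.0,
                                  background_cps=30.0, window_s=1.0,
                                  threshold_sigma=3.0, trials=2000):
            """Walk the detector past the source; in each window draw Poisson
            counts from background + inverse-square source term, and flag
            detection when any window exceeds background by threshold_sigma."""
            hits = 0
            times = np.arange(-10.0, 10.0, window_s)      # 20 s transit
            alarm = (background_cps * window_s
                     + threshold_sigma * np.sqrt(background_cps * window_s))
            for _ in range(trials):
                x = speed_ms * times                      # along-track position
                rate = background_cps + eff_cps_at_1m / (x**2 + standoff_m**2)
                if np.any(rng.poisson(rate * window_s) > alarm):
                    hits += 1
            return hits / trials

        for d in (1, 2, 4, 8):
            print(f"standoff {d} m: P(detect) = {detection_probability(d, 1.0):.2f}")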

  6. PC-CUBE: A Personal Computer Based Hypercube

    NASA Technical Reports Server (NTRS)

    Ho, Alex; Fox, Geoffrey; Walker, David; Snyder, Scott; Chang, Douglas; Chen, Stanley; Breaden, Matt; Cole, Terry

    1988-01-01

    PC-CUBE is an ensemble of IBM PCs or close compatibles connected in the hypercube topology with ordinary computer cables. Communication occurs at the rate of 115.2 kbaud via the RS-232 serial links. Available for PC-CUBE are the Crystalline Operating System III (CrOS III), the Mercury Operating System, and CUBIX and PLOTIX, which are parallel I/O and graphics libraries. A CrOS performance monitor was developed to facilitate the measurement of the communication and computation time of a program and their effects on performance. Also available are CXLISP, a parallel version of the XLISP interpreter; GRAFIX, some graphics routines for the EGA and CGA; and a general execution profiler for determining the execution time spent in program subroutines. PC-CUBE provides a programming environment similar to all hypercube systems running CrOS III, Mercury, and CUBIX. In addition, every node (personal computer) has its own graphics display monitor and storage devices. These allow data to be displayed or stored at every processor, which has much instructional value and enables easier debugging of applications. Some application programs taken from the book Solving Problems on Concurrent Processors (Fox 88) were implemented with graphics enhancement on PC-CUBE. The applications range from computing the Mandelbrot set, solving the Laplace equation and the wave equation, and simulating long-range force interactions, to WaTor, an ecological simulation.
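
    The hypercube topology underlying PC-CUBE has a convenient binary-addressing property: node addresses are d-bit numbers, and neighbors differ in exactly one bit. The sketch below illustrates neighbor enumeration and dimension-order (e-cube) routing; it is illustrative, not code from CrOS III or the PC-CUBE libraries.

        # Hypercube addressing sketch (illustrative, not PC-CUBE library code):
        def neighbors(node, dim):
            """Each node links to the dim addresses differing in exactly one bit."""
            return [node ^ (1 << i) for i in range(dim)]

        def route(src, dst):
            """e-cube routing: fix differing address bits one at a time."""
            path, cur, diff, bit = [src], src, src ^ dst, 0
            while diff:
                if diff & 1:
                    cur ^= (1 << bit)
                    path.append(cur)
                diff >>= 1
                bit += 1
            return path

        print(neighbors(0b101, 3))   # [4, 7, 1] -> nodes 100, 111, 001
        print(route(0b000, 0b101))   # [0, 1, 5]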

  7. Investigation of a computer virus outbreak in the pharmacy of a tertiary care teaching hospital.

    PubMed

    Bailey, T C; Reichley, R M

    1992-10-01

    A computer virus outbreak was recognized, verified, defined, investigated, and controlled using an infection control approach. The pathogenesis and epidemiology of computer virus infection are reviewed. Case-control study. Pharmacy of a tertiary care teaching institution. On October 28, 1991, 2 personal computers in the drug information center manifested symptoms consistent with the "Jerusalem" virus infection. The same day, a departmental personal computer began playing "Yankee Doodle," a sign of "Doodle" virus infection. An investigation of all departmental personal computers identified the "Stoned" virus in an additional personal computer. Controls were functioning virus-free personal computers within the department. Cases were associated with users who brought diskettes from outside the department (5/5 cases versus 5/13 controls, p = .04) and with College of Pharmacy student users (3/5 cases versus 0/13 controls, p = .012). The detection of a virus-infected diskette or personal computer was associated with the number of 5 1/4-inch diskettes in the files of personal computers, a surrogate for rate of media exchange (mean = 17.4 versus 152.5, p = .018, Wilcoxon rank sum test). After education of departmental personal computer users regarding appropriate computer hygiene and installation of virus protection software, no further spread of personal computer viruses occurred, although 2 additional Stoned-infected and 1 Jerusalem-infected diskettes were detected. We recommend that virus detection software be installed on personal computers where the interchange of diskettes among computers is necessary, that write-protect tabs be placed on all program master diskettes and data diskettes where data are being read and not written, that in the event of a computer virus outbreak, all available diskettes be quarantined and scanned by virus detection software, and to facilitate quarantine and scanning in an outbreak, that diskettes be stored in organized files.
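
    The reported association between outside diskettes and infection (5/5 cases versus 5/13 controls, p = .04) is consistent with a Fisher exact test on the implied 2x2 table, as the quick check below shows (our reconstruction, not the authors' analysis code):

        # Reproducing the reported p-value from the stated counts:
        from scipy.stats import fisher_exact

        #                 exposed  not exposed
        table = [[5, 0],   # cases (virus-infected PCs): 5/5 had outside diskettes
                 [5, 8]]   # controls (clean PCs):       5/13
        odds_ratio, p = fisher_exact(table, alternative="two-sided")
        print(f"two-sided Fisher exact p = {p:.3f}")   # ~0.036, reported as p = .04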

  8. Thermo-Physiological Responses of Sailors in a Disabled Submarine with Interior Cabin Temperature and Humidity Slowly Rising as Predicted by Computer Simulation Techniques

    DTIC Science & Technology

    2013-09-01

    No abstract survives in this DTIC record; in its place the source contains fragments of the report's thermoregulatory model C code. The recoverable portion, cleaned up below, defines skin-blood-flow constants and the warm/cold control law for skin blood flow (Skbf):

        #define CSTR    .5    // 1/°C
        #define SKBFN   6.3   // liters/(h m^2)
        #define Skbfmax 90.   // conservative; could be higher for a fit person
        #define ...           // [further constants lost in extraction]
        WarmC = 0;
        if (Tsk < TTSK) Colds = TTSK - Tsk;
        if (Tc > TTCR)  WarmC = Tc - TTCR;
        Skbf = (SKBFN + CDIL*WarmC)/(1 + CSTR*Colds);   // liters/(h m^2)
        if (Skbf ...  // [listing truncated in the source record]

  9. Theoretical Design Study of a 2-18 GHz Bandwidth Helix TWT (Traveling Wave Tube) Amplifier

    DTIC Science & Technology

    1987-02-01

    No abstract survives in this DTIC record; the source contains only report-documentation-page and table-of-contents fragments. The recoverable information: the report, by Michael A. Frisoni, is a theoretical design study of a 2-18 GHz bandwidth helix TWT amplifier, in which a nondispersive helix circuit is used in a traveling-wave tube (TWT) output circuit to realize the 2-18 GHz frequency bandwidth, including ultra-broadband theory based on TWT computer simulation.

  10. High Performance Computing Contributions to DoD Mission Success 2002

    DTIC Science & Technology

    2003-03-01

    No abstract survives in this DTIC record; the source contains only report-documentation-page fragments (unclassified, 194 pages) and scattered body text. The recoverable text, under the heading "Protect Bases of Operation", notes that further applications of the pore-scale simulation have been identified, including molecular diffusion and a flow application truncated in the record.

  11. Spontaneous Ad Hoc Mobile Cloud Computing Network

    PubMed Central

    Lacuesta, Raquel; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming some of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create one and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To achieve this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes. PMID:25202715

  12. Spontaneous ad hoc mobile cloud computing network.

    PubMed

    Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes

    2014-01-01

    Cloud computing helps users and companies share computing resources instead of having local servers or personal devices to handle the applications. Smart devices are becoming some of the main information processing devices. Their computing features are reaching levels that let them create a mobile cloud computing network. But sometimes they are not able to create one and collaborate actively in the cloud because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To achieve this, we have developed a trusted algorithm that is able to manage the activity of the nodes when they join and leave the network. The paper shows the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal presents good efficiency and network performance even with a high number of nodes.

  13. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  14. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  15. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  16. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a) Definitions...

  17. Establishing a communications link between two different, incompatible, personal computers: with practical examples and illustrations and program code.

    PubMed

    Davidson, R W

    1985-01-01

    The increasing need to exchange data can be handled by personal microcomputers. The need to transfer information stored in one type of personal computer to another, incompatible type is often encountered when integrating multiple sources of information in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PC jr. and the Apple IIe. The basic input/output (I/O) interface chips for serial communication in each computer are joined together using a null-modem connector and cable to form a communications link. Using the BASIC (Beginner's All-purpose Symbolic Instruction Code) computer language and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC dialects used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
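
    A modern analogue of the article's chunk-and-acknowledge transfer over a serial link can be sketched with the pyserial package, as below; the port name, chunk size, and ACK/EOT byte protocol are assumptions for illustration, not the article's BASIC listing.

        # Serial file-transfer sketch over a null-modem link (pyserial assumed
        # installed; protocol details are invented for illustration):
        import serial

        def send_file(port_name, path, chunk=64):
            with serial.Serial(port_name, baudrate=9600, timeout=5) as link, \
                 open(path, "rb") as f:
                while True:
                    data = f.read(chunk)
                    if not data:
                        link.write(b"\x04")        # EOT: end of transfer
                        return
                    link.write(data)
                    if link.read(1) != b"\x06":    # wait for ACK before next chunk
                        raise IOError("handshake failed: no ACK from receiver")

        # send_file("/dev/ttyUSB0", "results.dat")   # e.g., on the sending machine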

  18. Design of a numerical model of lung by means of a special boundary condition in the truncated branches.

    PubMed

    Tena, Ana F; Fernández, Joaquín; Álvarez, Eduardo; Casan, Pere; Walters, D Keith

    2017-06-01

    The need for a better understanding of pulmonary diseases has led to increased interest in the development of realistic computational models of the human lung. To minimize computational cost, a reduced-geometry model is used for the lung airway geometry up to generation 16. Truncated airway branches require physiologically realistic boundary conditions to accurately represent the effect of the removed airway sections. A user-defined function has been developed that applies velocities mapped from similar locations in fully resolved airway sections. The methodology can be applied in any general-purpose computational fluid dynamics code, with the only limitation that the lung model must be symmetrical in each truncated branch. Unsteady simulations have been performed to verify the operation of the model. The test case simulates spirometry, because the lung is obliged to rapidly perform both inspiration and expiration. Once the simulation was completed, the pressure obtained in the lower level of the lung was used as a boundary condition. The output velocity, which constitutes a numerical spirometry, was compared with experimental spirometry for validation purposes. This model can be applied over a wide range of patient-specific resolution levels. If the upper airway generations have been constructed from a computed tomography scan, it would be possible to quickly obtain a complete reconstruction of the lung specific to an individual person, which would allow individualized therapies. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Computational Modeling of Tissue Self-Assembly

    NASA Astrophysics Data System (ADS)

    Neagu, Adrian; Kosztin, Ioan; Jakab, Karoly; Barz, Bogdan; Neagu, Monica; Jamison, Richard; Forgacs, Gabor

    As a theoretical framework for understanding the self-assembly of living cells into tissues, Steinberg proposed the differential adhesion hypothesis (DAH), according to which a specific cell type possesses a specific adhesion apparatus that, combined with cell motility, leads to cell assemblies of various cell types in the lowest adhesive energy state. Experimental and theoretical efforts of four decades turned the DAH into a fundamental principle of developmental biology that has been validated both in vitro and in vivo. Based on computational models of cell sorting, we have developed a DAH-based lattice model for tissues in interaction with their environment and simulated biological self-assembly using the Monte Carlo method. The present brief review highlights results on specific morphogenetic processes with relevance to tissue engineering applications. Our own work is presented against the background of several decades of theoretical efforts aimed at modeling morphogenesis in living tissues. Simulations of systems involving about 10^5 cells have been performed on high-end personal computers with CPU times of the order of days. Studied processes include cell sorting, cell sheet formation, and the development of endothelialized tubes from rings made of spheroids of two randomly intermixed cell types, when the medium in the interior of the tube was different from the external one. We conclude by noting that computer simulations based on mathematical models of living tissues yield useful guidelines for laboratory work and can catalyze the emergence of innovative technologies in tissue engineering.
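
    DAH-based lattice simulations of this kind are typically Metropolis Monte Carlo on a grid of cell types with type-dependent adhesion energies. The toy sketch below (two cell types on a small periodic grid, invented energies) shows the basic swap-and-accept step; the authors' models are far larger and richer.

        # Toy DAH-style lattice Monte Carlo (illustrative, not the review's models):
        # neighbor swaps are accepted when they lower total adhesive energy,
        # or with Boltzmann probability otherwise; like cells gradually sort.
        import numpy as np

        rng = np.random.default_rng(2)
        N = 32
        grid = rng.integers(1, 3, size=(N, N))         # cell types 1 and 2
        J = {(1, 1): -1.0, (2, 2): -1.0,               # invented adhesion energies
             (1, 2): -0.4, (2, 1): -0.4}

        def local_energy(g, i, j):
            e = 0.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                e += J[(g[i, j], g[(i + di) % N, (j + dj) % N])]
            return e

        def metropolis_step(g, beta=2.0):
            i, j = rng.integers(0, N, 2)
            di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(0, 4)]
            k, l = (i + di) % N, (j + dj) % N
            before = local_energy(g, i, j) + local_energy(g, k, l)
            g[i, j], g[k, l] = g[k, l], g[i, j]
            delta = local_energy(g, i, j) + local_energy(g, k, l) - before
            if delta > 0 and rng.random() >= np.exp(-beta * delta):
                g[i, j], g[k, l] = g[k, l], g[i, j]    # reject: swap back

        for _ in range(200000):
            metropolis_step(grid)                      # like-cell clusters emerge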

  20. Secure and Privacy-Preserving Body Sensor Data Collection and Query Scheme.

    PubMed

    Zhu, Hui; Gao, Lijuan; Li, Hui

    2016-02-01

    With the development of body sensor networks and the pervasiveness of smart phones, different types of personal data can be collected in real time by body sensors, and the potential value of massive personal data has attracted considerable interest recently. However, the privacy issues of sensitive personal data are still challenging today. Aiming at these challenges, in this paper we focus on the threats from the telemetry interface and present a secure and privacy-preserving body sensor data collection and query scheme, named SPCQ, for outsourced computing. In the proposed SPCQ scheme, users' personal information is collected by body sensors as different data types and converted into multi-dimensional data, with each dimension converted into numeric form and uploaded to the cloud server, which provides a secure, efficient and accurate data query service while guaranteeing the privacy of sensitive personal information and users' query data. Specifically, based on an improved homomorphic encryption technology over a composite-order group, we propose a special weighted Euclidean distance contrast algorithm (WEDC) for multi-dimensional vectors over encrypted data. With the SPCQ scheme, the confidentiality of sensitive personal data, the privacy of data users' queries, and an accurate query service can be achieved in the cloud server. Detailed analysis shows that SPCQ can resist various security threats from the telemetry interface. In addition, we implement SPCQ on an embedded device, a smart phone, and a laptop with a real medical database, and extensive simulation results demonstrate that our proposed SPCQ scheme is highly efficient in terms of computation and communication costs.

  1. Secure and Privacy-Preserving Body Sensor Data Collection and Query Scheme

    PubMed Central

    Zhu, Hui; Gao, Lijuan; Li, Hui

    2016-01-01

    With the development of body sensor networks and the pervasiveness of smart phones, different types of personal data can be collected in real time by body sensors, and the potential value of massive personal data has attracted considerable interest recently. However, the privacy issues of sensitive personal data are still challenging today. Aiming at these challenges, in this paper we focus on the threats from the telemetry interface and present a secure and privacy-preserving body sensor data collection and query scheme, named SPCQ, for outsourced computing. In the proposed SPCQ scheme, users' personal information is collected by body sensors as different data types and converted into multi-dimensional data, with each dimension converted into numeric form and uploaded to the cloud server, which provides a secure, efficient and accurate data query service while guaranteeing the privacy of sensitive personal information and users' query data. Specifically, based on an improved homomorphic encryption technology over a composite-order group, we propose a special weighted Euclidean distance contrast algorithm (WEDC) for multi-dimensional vectors over encrypted data. With the SPCQ scheme, the confidentiality of sensitive personal data, the privacy of data users' queries, and an accurate query service can be achieved in the cloud server. Detailed analysis shows that SPCQ can resist various security threats from the telemetry interface. In addition, we implement SPCQ on an embedded device, a smart phone, and a laptop with a real medical database, and extensive simulation results demonstrate that our proposed SPCQ scheme is highly efficient in terms of computation and communication costs. PMID:26840319

  2. Program Predicts Time Courses of Human/Computer Interactions

    NASA Technical Reports Server (NTRS)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
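
    The PERT-chart output implies a critical-path computation over operators with durations and precedence constraints. A minimal sketch of that scheduling step is below, with hypothetical operators and durations rather than CPM X's actual constraint encoding:

        # PERT-style critical-path sketch (hypothetical operators/durations):
        import graphlib  # Python 3.9+ stdlib topological sorter

        # operator -> (duration_ms, predecessors)
        ops = {
            "perceive-target": (100, []),
            "decide-response": ( 50, ["perceive-target"]),
            "move-hand":       (200, ["decide-response"]),
            "verify-display":  (150, ["perceive-target"]),
            "press-key":       ( 70, ["move-hand", "verify-display"]),
        }

        start = {}
        order = graphlib.TopologicalSorter(
            {k: set(v[1]) for k, v in ops.items()}).static_order()
        for op in order:
            preds = ops[op][1]
            # earliest start = latest finish among all predecessors
            start[op] = max((start[p] + ops[p][0] for p in preds), default=0)

        finish = max(start[o] + ops[o][0] for o in ops)
        print(f"predicted task time: {finish} ms")   # critical path: 420 ms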

  3. Wealth distribution, Pareto law, and stretched exponential decay of money: Computer simulations analysis of agent-based models

    NASA Astrophysics Data System (ADS)

    Aydiner, Ekrem; Cherstvy, Andrey G.; Metzler, Ralf

    2018-01-01

    We study by Monte Carlo simulations a kinetic exchange trading model for both fixed and distributed saving propensities of the agents and rationalize the money and wealth distributions. We show that the newly introduced wealth distribution - which may be more amenable in certain situations - features a different power-law exponent, particularly for distributed saving propensities of the agents. For open agent-based systems, we analyze the money and wealth distributions and find that the presence of trap agents alters their amplitude, leaving the scaling exponents nearly unaffected. For an open system, we show that the total wealth - for different trap agent densities and saving propensities of the agents - decreases in time according to the classical Kohlrausch-Williams-Watts stretched exponential law. Interestingly, this decay does not depend on the trap agent density, but rather on the saving propensities. The system relaxations for fixed and distributed saving schemes are found to be different.
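
    The model class described is a kinetic exchange model: random agent pairs trade the non-saved fraction of their money. A minimal sketch of the distributed-saving-propensity update (a standard Chakraborti-Chakrabarti-type rule, with trap agents omitted and parameters invented) is:

        # Kinetic exchange sketch with distributed saving propensities:
        import numpy as np

        rng = np.random.default_rng(3)
        n_agents, steps = 1000, 200000
        money = np.ones(n_agents)                # everyone starts with 1 unit
        lam = rng.random(n_agents)               # distributed saving propensities

        for _ in range(steps):
            i, j = rng.choice(n_agents, size=2, replace=False)
            eps = rng.random()
            pot = (1 - lam[i]) * money[i] + (1 - lam[j]) * money[j]  # traded pool
            money[i] = lam[i] * money[i] + eps * pot
            money[j] = lam[j] * money[j] + (1 - eps) * pot           # conserves total

        # A heavy (power-law-like) tail emerges for distributed propensities:
        print("top 1% holds", f"{np.sort(money)[-10:].sum() / money.sum():.1%}")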

  4. Using pilot data to size a two-arm randomized trial to find a nearly optimal personalized treatment strategy.

    PubMed

    Laber, Eric B; Zhao, Ying-Qi; Regh, Todd; Davidian, Marie; Tsiatis, Anastasios; Stanford, Joseph B; Zeng, Donglin; Song, Rui; Kosorok, Michael R

    2016-04-15

    A personalized treatment strategy formalizes evidence-based treatment selection by mapping patient information to a recommended treatment. Personalized treatment strategies can produce better patient outcomes while reducing cost and treatment burden. Thus, among clinical and intervention scientists, there is a growing interest in conducting randomized clinical trials when one of the primary aims is estimation of a personalized treatment strategy. However, at present, there are no appropriate sample size formulae to assist in the design of such a trial. Furthermore, because the sampling distribution of the estimated outcome under an estimated optimal treatment strategy can be highly sensitive to small perturbations in the underlying generative model, sample size calculations based on standard (uncorrected) asymptotic approximations or computer simulations may not be reliable. We offer a simple and robust method for powering a single stage, two-armed randomized clinical trial when the primary aim is estimating the optimal single stage personalized treatment strategy. The proposed method is based on inverting a plugin projection confidence interval and is thereby regular and robust to small perturbations of the underlying generative model. The proposed method requires elicitation of two clinically meaningful parameters from clinical scientists and uses data from a small pilot study to estimate nuisance parameters, which are not easily elicited. The method performs well in simulated experiments and is illustrated using data from a pilot study of time to conception and fertility awareness. Copyright © 2015 John Wiley & Sons, Ltd.

  5. [Application of 3D printing and computer-assisted surgical simulation in preoperative planning for acetabular fracture].

    PubMed

    Liu, Xin; Zeng, Can-Jun; Lu, Jian-Sen; Lin, Xu-Chen; Huang, Hua-Jun; Tan, Xin-Yu; Cai, Dao-Zhang

    2017-03-20

    To evaluate the feasibility and effectiveness of using 3D printing and computer-assisted surgical simulation in preoperative planning for acetabular fractures. A retrospective analysis was performed in 53 patients with pelvic fracture, who underwent surgical treatment between September, 2013 and December, 2015 with complete follow-up data. Among them, 19 patients were treated with CT three-dimensional reconstruction, computer-assisted virtual reset internal fixation, 3D model printing, and personalized surgery simulation before surgery (3D group), and 34 patients underwent routine preoperative examination (conventional group). The intraoperative blood loss, transfusion volume, times of intraoperative X-ray, operation time, Matta score and Merle D' Aubigne & Postel score were recorded in the 2 groups. Preoperative planning and postoperative outcomes in the two groups were compared. All the operations were completed successfully. In 3D group, significantly less intraoperative blood loss, transfusion volume, fewer times of X-ray, and shortened operation time were recorded compared with those in the conventional group (P<0.05). According to the Matta scores, excellent or good fracture reduction was achieved in 94.7% (18/19) of the patients in 3D group and in 82.4% (28/34) of the patients in conventional group; the rates of excellent and good hip function at the final follow-up were 89.5% (17/19) in the 3D group and 85.3% (29/34) in the conventional group (P>0.05). In the 3D group, the actual internal fixation well matched the preoperative design. 3D printing and computer-assisted surgical simulation for preoperative planning is feasible and accurate for management of acetabular fracture and can effectively improve the operation efficiency.

  6. On Fast Post-Processing of Global Positioning System Simulator Truth Data and Receiver Measurements and Solutions Data

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Day, John H. (Technical Monitor)

    2000-01-01

    Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a personal computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and are often the bottleneck even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and is one of the most important considerations. An approach is developed that speeds up post-processing by an order of magnitude. It is based on improving the bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of several successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is also described.
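
    The a priori information exploited here is that both the simulator truth data and the receiver data are time-ordered, so the interpolation bracket only ever moves forward and the per-sample search of a general-purpose routine can be dropped. A minimal sketch of that idea (illustrative, not the flight post-processing code):

        # Linear interpolation for monotonically increasing time tags:
        # the bracketing index resumes where it left off, giving
        # O(len(t_query) + len(t_ref)) instead of O(n log n).
        def interp_monotonic(t_query, t_ref, y_ref):
            out, k = [], 0
            for t in t_query:
                while k + 1 < len(t_ref) - 1 and t_ref[k + 1] <= t:
                    k += 1                     # resume search, never restart
                t0, t1 = t_ref[k], t_ref[k + 1]
                w = (t - t0) / (t1 - t0)
                out.append(y_ref[k] + w * (y_ref[k + 1] - y_ref[k]))
            return out

        print(interp_monotonic([0.5, 1.5, 2.25], [0, 1, 2, 3],
                               [0.0, 10.0, 20.0, 30.0]))
        # -> [5.0, 15.0, 22.5]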

  7. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  8. Increasing emergency medicine residents' confidence in disaster management: use of an emergency department simulator and an expedited curriculum.

    PubMed

    Franc, Jeffrey Michael; Nichols, Darren; Dong, Sandy L

    2012-02-01

    Disaster Medicine is an increasingly important part of medicine. Emergency Medicine residency programs have very high curriculum commitments, and adding Disaster Medicine training to this busy schedule can be difficult. Development of a short Disaster Medicine curriculum that is effective and enjoyable for the participants may be a valuable addition to Emergency Medicine residency training. A simulation-based curriculum was developed. The curriculum included four group exercises in which the participants developed a disaster plan for a simulated hospital. This was followed by a disaster simulation using the Disastermed.Ca Emergency Disaster Simulator computer software Version 3.5.2 (Disastermed.Ca, Edmonton, Alberta, Canada) and the disaster plan developed by the participants. Progress was assessed by a pre- and post-test, resident evaluations, faculty evaluation of Command and Control, and markers obtained from the Disastermed.Ca software. Twenty-five residents agreed to partake in the training curriculum. Seventeen completed the simulation. There was no statistically significant difference in pre- and post-test scores. Residents indicated that they felt the curriculum had been useful, and judged it to be preferable to a didactic curriculum. In addition, the residents' confidence in their ability to manage a disaster increased on both a personal and a departmental level. A simulation-based model of Disaster Medicine training, requiring approximately eight hours of classroom time, was judged by Emergency Medicine residents to be a valuable component of their medical training, and increased their confidence in personal and departmental disaster management capabilities.

  9. Muscle Stimulation Technology

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Under a Goddard Space Flight Center contract, Electrologic of America was able to refine the process of densely packing circuitry on personal computer boards, providing significant contributions to the closed-loop systems for the Remote Manipulator System Simulator. The microcircuitry work was then applied to the StimMaster FES Ergometer, an exercise device used to stimulate muscles suffering from paralysis. The electrical stimulation equipment was developed exclusively for V-Care Health Systems, Inc. Product still commercially available as of March 2002.

  10. Computational Modeling of Inflammation and Wound Healing

    PubMed Central

    Ziraldo, Cordelia; Mi, Qi; An, Gary; Vodovotz, Yoram

    2013-01-01

    Objective Inflammation is both central to proper wound healing and a key driver of chronic tissue injury via a positive-feedback loop incited by incidental cell damage. We seek to derive actionable insights into the role of inflammation in wound healing in order to improve outcomes for individual patients. Approach To date, dynamic computational models have been used to study the time evolution of inflammation in wound healing. Emerging clinical data on histo-pathological and macroscopic images of evolving wounds, as well as noninvasive measures of blood flow, suggested the need for tissue-realistic, agent-based, and hybrid mechanistic computational simulations of inflammation and wound healing. Innovation We developed a computational modeling system, Simple Platform for Agent-based Representation of Knowledge, to facilitate the construction of tissue-realistic models. Results A hybrid equation–agent-based model (ABM) of pressure ulcer formation in both spinal cord-injured and -uninjured patients was used to identify control points that reduce stress caused by tissue ischemia/reperfusion. An ABM of arterial restenosis revealed new dynamics of cell migration during neointimal hyperplasia that match histological features, but contradict the currently prevailing mechanistic hypothesis. ABMs of vocal fold inflammation were used to predict inflammatory trajectories in individuals, possibly allowing for personalized treatment. Conclusions The intertwined inflammatory and wound healing responses can be modeled computationally to make predictions in individuals, simulate therapies, and gain mechanistic insights. PMID:24527362

  11. Development of a personal dosimetry system based on optically stimulated luminescence of alpha-Al2O3:C for mixed radiation fields.

    PubMed

    Lee, S Y; Lee, K J

    2001-04-01

    To develop a personal optically stimulated luminescence (OSL) dosimetry system for mixed radiation fields using alpha-Al2O3:C, a discriminating badge filter system was designed by taking advantage of its optically stimulable properties and energy dependencies. This was done by designing a multi-element badge system for powder-layered alpha-Al2O3:C material and an optical reader system based on a high-intensity blue light-emitting diode (LED). The design of the multi-element OSL dosimeter badge system developed allows the measurement of a personal dose equivalent value Hp(d) in mixed beta and gamma radiation fields. Dosimetric properties of the personal OSL dosimeter badge system investigated here were the dose response, energy response and multi-readability. Based on the computational simulations and experiments of the proposed dosimeter design, it was demonstrated that a multi-element dosimeter system with an OSL technology based on alpha-Al2O3:C is suitable to obtain personal dose equivalent information in mixed radiation fields.

  12. Computational physics in RISC environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high-performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single-user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  13. Computational physics in RISC environments. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhoades, C.E. Jr.

    The new high-performance Reduced Instruction Set Computers (RISC) promise near Cray-level performance at near personal-computer prices. This paper explores the performance, conversion and compatibility issues associated with developing, testing and using our traditional, large-scale simulation models in the RISC environments exemplified by the IBM RS6000 and MIPS R3000 machines. The questions of operating systems (CTSS versus UNIX), compilers (Fortran, C, pointers) and data are addressed in detail. Overall, it is concluded that the RISC environments are practical for a very wide range of computational physics activities. Indeed, all but the very largest two- and three-dimensional codes will work quite well, particularly in a single-user environment. Easily projected hardware-performance increases will revolutionize the field of computational physics. The way we do research will change profoundly in the next few years. There is, however, nothing more difficult to plan, nor more dangerous to manage, than the creation of this new world.

  14. eduCRATE--a Virtual Hospital architecture.

    PubMed

    Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina

    2014-01-01

    eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space, and personalized design with avatars. Compared to existing similar products, the project brings the novelty of using languages for medical guides in order to ensure a maximum of flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution based on cloud computing and immersive multimedia is an attractive option in education because it is economical and matches the current working style of the young generation it addresses.

  15. To What Degree Are Undergraduate Students Using Their Personal Computers to Support Their Daily Study Practices?

    ERIC Educational Resources Information Center

    Sim, KwongNui; Butson, Russell

    2014-01-01

    This scoping study examines the degree to which twenty-two undergraduate students used their personal computers to support their academic study. The students were selected based on their responses to a questionnaire aimed at gauging their degree of computer skill. Computer activity data was harvested from the personal computers of eighteen…

  16. Simulation of complex pharmacokinetic models in Microsoft Excel.

    PubMed

    Meineke, Ingolf; Brockmöller, Jürgen

    2007-12-01

    With the arrival of powerful personal computers in the office, numerical methods have become accessible to everybody. Simulation of complex processes has therefore become an indispensable tool in research and education. In this paper Microsoft EXCEL is used as a platform for a universal differential equation solver. The software is designed as an add-in aiming at a minimum of required user input to perform a given task. Four examples are included to demonstrate both the simplicity of use and the versatility of possible applications. While the layout of the program is admittedly geared to the needs of pharmacokineticists, it can be used in any field where sets of differential equations are involved. The software package is available upon request.
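
    A minimal sketch of the same idea in Python (not the authors' Excel add-in, which is distributed separately): numerically integrating a one-compartment pharmacokinetic model with first-order absorption. The rate constants, volume, and dose are illustrative assumptions.

    ```python
    # Sketch only: one-compartment PK model with first-order absorption,
    # integrated numerically; parameter values are assumed, not from the paper.
    import numpy as np
    from scipy.integrate import solve_ivp

    ka, ke, V, dose = 1.2, 0.23, 42.0, 500.0   # 1/h, 1/h, L, mg (assumed)

    def rhs(t, y):
        gut, central = y
        return [-ka * gut, ka * gut - ke * central]

    sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
    t = np.linspace(0.0, 24.0, 97)
    conc = sol.sol(t)[1] / V                    # plasma concentration, mg/L
    print(f"Cmax ~ {conc.max():.2f} mg/L at t ~ {t[conc.argmax()]:.1f} h")
    ```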

  17. WMT: The CSDMS Web Modeling Tool

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can:

    - Design a model from a set of components
    - Edit component parameters
    - Save models to a web-accessible server
    - Share saved models with the community
    - Submit runs to an HPC system
    - Download simulation results

    The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API:

    - wmt-db: database of component, model, and simulation metadata and output
    - wmt-api: configure and connect components
    - wmt-exe: launch simulations on remote execution servers

    The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
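
    To make the layered client-server design concrete, here is a hypothetical sketch of a client fetching component metadata from a WMT-style REST service; the base URL, endpoint path, and JSON field names are invented for illustration and are not the documented WMT API.

    ```python
    # Hypothetical client sketch; endpoint and fields are assumptions, not WMT's API.
    import json
    import urllib.request

    BASE = "https://example.edu/wmt-api"            # placeholder server

    def get_json(path: str):
        with urllib.request.urlopen(f"{BASE}/{path}") as resp:
            return json.load(resp)

    for comp in get_json("components"):             # metadata for couplable components
        print(comp["name"], "provides:", comp.get("provides", []),
              "uses:", comp.get("uses", []))
    ```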

  18. Real-time software-based end-to-end wireless visual communications simulation platform

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Chung; Chang, Li-Fung; Wong, Andria H.; Sun, Ming-Ting; Hsing, T. Russell

    1995-04-01

    Wireless channel impairments pose many challenges to real-time visual communications. In this paper, we describe a real-time, software-based wireless visual communications simulation platform which can be used for performance evaluation in real time. This simulation platform consists of two personal computers serving as hosts. Major components of each PC host include a real-time programmable video codec, a wireless channel simulator, and a network interface for data transport between the two hosts. The three major components are interfaced in real time to show the interaction of various wireless channels and video coding algorithms. The programmable features of these components allow users to evaluate user-controlled wireless channel effects without physically carrying out experiments, which are limited in scope, time-consuming, and costly. Using this simulation platform as a testbed, we have experimented with several wireless channel effects including Rayleigh fading, antenna diversity, channel filtering, symbol timing, modulation, and packet loss.
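
    One of the listed channel effects, flat Rayleigh fading, can be sketched in a few lines: each channel tap is a zero-mean complex Gaussian applied to a QPSK symbol stream. This is the generic textbook model, not the platform's implementation.

    ```python
    # Sketch of flat Rayleigh fading over QPSK; all values are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n) / np.sqrt(2)

    # Rayleigh fading: zero-mean complex Gaussian taps with unit mean power.
    h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) * 0.1
    received = h * symbols + noise

    equalized = received / h      # ideal channel knowledge, for illustration only
    evm = np.sqrt(np.mean(np.abs(equalized - symbols) ** 2))
    print(f"error vector magnitude: {evm:.3f}")
    ```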

  19. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    (NCAPS) Christina M. Underhill, Ph.D. Approved for public release; distribution is unlimited. NPRST-TN-06-9, October 2006... Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS), Christina M. Underhill, Ph.D... documents one of the steps in our development of the Navy Computer Adaptive Personality Scales (NCAPS). NCAPS is a computer adaptive personality measure

  20. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real-time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)

  1. Redesigned Human Metabolic Simulator

    NASA Technical Reports Server (NTRS)

    Duffield, Bruce; Jeng, Frank; Lange, Kevin

    2008-01-01

    A design has been formulated for a proposed improved version of an apparatus that simulates atmospheric effects of human respiration by introducing controlled amounts of carbon dioxide, water vapor, and heat into the air. Denoted a human metabolic simulator (HMS), the apparatus is used for testing life-support equipment when human test subjects are not available. The prior version of the HMS, to be replaced, was designed to simulate the respiratory effects of as many as four persons. It exploits the catalytic combustion of methyl acetate, for which the respiratory quotient (the molar ratio of carbon dioxide produced to oxygen consumed) is very close to the human respiratory quotient of about 0.86. The design of the improved HMS provides for simulation of the respiratory effects of as many as eight persons at various levels of activity. The design would also increase safety by eliminating the use of combustion. The improved HMS (see figure) would include a computer that would exert overall control. The computer would calculate the required amounts of oxygen removal, carbon dioxide addition, water addition, and heat addition by use of empirical equations for metabolic profiles of respiration and heat. A blower would circulate air between the HMS and a chamber containing a life-support system to be tested. With the help of feedback from a mass flowmeter, the blower speed would be adjusted to regulate the rate of flow according to the number of persons to be simulated and to a temperature-regulation requirement (the air temperature would indirectly depend on the rate of flow, among other parameters). Oxygen would be removed from the circulating air by means of a commercially available molecular sieve configured as an oxygen concentrator. Oxygen, argon, and trace amounts of nitrogen would pass through a bed in the molecular sieve while carbon dioxide, the majority of nitrogen, and other trace gases would be trapped by the bed and subsequently returned to the chamber. If, as recommended, the oxygen concentrator were of a rotating twelve-bed design, then variations in the product stream could be made very small. Carbon dioxide would be added directly to the circulating air by simple injection from a supply tank. The rate of injection would be maintained at the required rate by use of a mass flowmeter/controller. In the same way, nitrogen would be added to make up for the small amount of nitrogen lost through the oxygen concentrator. Water vapor would be added to the circulating air by heating the corresponding required flow of water to steam in a heat exchanger. More heat, required to complete the simulation of the thermal effect of respiration, would be added through another heat exchanger. Heat would be supplied to both heat exchangers via a hot-oil loop.
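
    The control computation described above can be sketched as follows; the per-person O2 consumption rate is an assumed resting value (not a figure from the article), while the respiratory quotient of 0.86 is taken from the text.

    ```python
    # Back-of-envelope sketch: derive the CO2 injection rate from the O2 removal
    # rate via the human respiratory quotient (~0.86). The O2 rate is an assumed
    # resting value, for illustration only.
    O2_PER_PERSON_KG_H = 0.035     # assumed resting O2 consumption, kg/h/person
    RQ = 0.86                      # mol CO2 produced per mol O2 consumed
    M_O2, M_CO2 = 32.0, 44.0       # molar masses, g/mol

    def metabolic_rates(persons: int) -> tuple[float, float]:
        o2_removal = persons * O2_PER_PERSON_KG_H       # kg/h to remove
        co2_addition = o2_removal / M_O2 * RQ * M_CO2   # kg/h to inject
        return o2_removal, co2_addition

    o2, co2 = metabolic_rates(8)
    print(f"8 persons: remove {o2:.2f} kg O2/h, inject {co2:.2f} kg CO2/h")
    ```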

  2. All-optical 4-bit binary to binary coded decimal converter with the help of semiconductor optical amplifier-assisted Sagnac switch

    NASA Astrophysics Data System (ADS)

    Bhattachryya, Arunava; Kumar Gayen, Dilip; Chattopadhyay, Tanay

    2013-04-01

    An all-optical 4-bit binary to binary-coded decimal (BCD) converter, based on semiconductor optical amplifier (SOA)-assisted Sagnac interferometric switches, is proposed and described in this manuscript. The paper describes an all-optical conversion scheme using a set of all-optical switches. BCD is common in computer systems that display numeric values, especially in those consisting solely of digital logic with no microprocessor. In many personal computers, the basic input/output system (BIOS) keeps the date and time in BCD format. The operations of the circuit are studied theoretically and analyzed through numerical simulations. The model accounts for the SOA small-signal gain, linewidth enhancement factor and carrier lifetime, the switching pulse energy and width, and the Sagnac loop asymmetry. By undertaking a detailed numerical simulation, the influence of these key parameters on the metrics that determine the quality of switching is thoroughly investigated.
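
    As a functional reference for the mapping the optical circuit implements, here is 4-bit binary-to-BCD conversion in software; for inputs 0-15 the result is a tens digit and a ones digit, each representable in four bits.

    ```python
    # Software reference for 4-bit binary -> BCD (two 4-bit decimal digits).
    def binary4_to_bcd(value: int) -> tuple[int, int]:
        if not 0 <= value <= 15:
            raise ValueError("expected a 4-bit value")
        tens, ones = divmod(value, 10)
        return tens, ones          # each digit fits in 4 BCD bits

    for v in (7, 12, 15):
        tens, ones = binary4_to_bcd(v)
        print(f"{v:04b} -> BCD {tens:04b} {ones:04b}")
    ```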

  3. Experimental validation of spatial Fourier transform-based multiple sound zone generation with a linear loudspeaker array.

    PubMed

    Okamoto, Takuma; Sakaguchi, Atsushi

    2017-03-01

    Generating acoustically bright and dark zones using loudspeakers is gaining attention as one of the most important acoustic communication techniques for such uses as personal sound systems and multilingual guide services. Although most conventional methods are based on numerical solutions, an analytical approach based on the spatial Fourier transform with a linear loudspeaker array has been proposed, and its effectiveness over conventional acoustic energy difference maximization has been demonstrated in computer simulations. To establish the effectiveness of the proposal in actual environments, this paper experimentally validates the proposed approach with rectangular and Hann windows and compares it with three conventional methods: simple delay-and-sum beamforming, contrast maximization, and least squares-based pressure matching, using an actually implemented linear array of 64 loudspeakers in an anechoic chamber. The results of both the computer simulations and the actual experiments show that the proposed approach with a Hann window controls the bright and dark zones more accurately than the conventional methods.
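
    The simplest of the compared baselines, delay-and-sum beamforming, can be sketched as follows; the array geometry and steering frequency are illustrative, not the paper's 64-element setup.

    ```python
    # Sketch of delay-and-sum (phase-shift) beamforming for a linear array at a
    # single frequency; geometry and frequency are assumed values.
    import numpy as np

    c = 343.0                 # speed of sound, m/s
    f = 1000.0                # steering frequency, Hz
    spacing = 0.06            # element spacing, m
    n_elem = 16
    theta = np.deg2rad(30.0)  # steering angle from broadside

    k = 2 * np.pi * f / c
    positions = spacing * np.arange(n_elem)
    weights = np.exp(-1j * k * positions * np.sin(theta)) / n_elem

    # Resulting beam pattern over angle:
    angles = np.deg2rad(np.linspace(-90, 90, 361))
    steer = np.exp(1j * k * np.outer(np.sin(angles), positions))
    pattern = 20 * np.log10(np.abs(steer @ weights) + 1e-12)
    print(f"peak at {np.rad2deg(angles[pattern.argmax()]):.0f} degrees")
    ```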

  4. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  5. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  6. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  7. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  8. 47 CFR 15.102 - CPU boards and power supplies used in personal computers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computers. 15.102 Section 15.102 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL RADIO FREQUENCY DEVICES Unintentional Radiators § 15.102 CPU boards and power supplies used in personal computers. (a... modifications that must be made to a personal computer, peripheral device, CPU board or power supply during...

  9. Research and the Personal Computer.

    ERIC Educational Resources Information Center

    Blackburn, D. A.

    1989-01-01

    Discussed is the history and elements of the personal computer. Its uses as a laboratory assistant and generic toolkit for mathematical analysis and modeling are included. The future of the personal computer in research is addressed. (KR)

  10. Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide

    NASA Technical Reports Server (NTRS)

    Bartrand, Timothy A.; Willis, Edward A.

    1993-01-01

    This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.

  11. Seismic Wave Propagation on the Tablet Computer

    NASA Astrophysics Data System (ADS)

    Emoto, K.

    2015-12-01

    Tablet computers have become widely used in recent years, and their performance is improving year by year. Some are comparable to personal computers of a few years ago with respect to calculation speed and memory size, and the convenience and intuitive operation of the tablet computer are its advantages over the desktop PC. I developed an iPad application for numerical simulation of seismic wave propagation. The numerical simulation is based on the 2D finite difference method with the staggered-grid scheme. The number of grid points is 512 x 384 = 196,608, and the grid spacing is 200 m in both the horizontal and vertical directions; that is, the calculation area is 102 km x 77 km. The time step is 0.01 s. In order to reduce the user's waiting time, the image of the wave field is drawn simultaneously with the calculation rather than playing a movie after the whole calculation. P and S wave energies are plotted on the screen every 20 steps (0.2 s). There is a trade-off between the smoothness of the simulation and the resolution of the wave field image; in the current setting, it takes about 30 s to calculate 10 s of wave propagation (50 image updates). The seismogram at the receiver is displayed below the wave field and updated in real time. The default medium structure consists of 3 layers. The layer boundary is defined by 10 movable points with linear interpolation, so users can intuitively change it to an arbitrary shape by moving the points. Users can also easily change the source and receiver positions, and a favorite structure can be saved and loaded. For advanced simulation, users can introduce a random velocity fluctuation whose spectrum can be changed to an arbitrary shape. By using this application, everyone can simulate seismic wave propagation without special knowledge of the elastic wave equation. So far, the Japanese version of the application has been released on the App Store, and I am now preparing the English version.
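
    A simplified analog of the solver described above (a 2D scalar wave equation with an explicit finite-difference update, rather than the app's staggered-grid elastic scheme) can be sketched as follows; grid and velocity values are illustrative.

    ```python
    # Sketch: 2D scalar wave equation, explicit leapfrog finite differences with
    # periodic boundaries; a simplified analog, not the app's elastic scheme.
    import numpy as np

    nx, nz, dx, dt, v = 256, 192, 200.0, 0.01, 3500.0   # grid, m, s, m/s
    u_prev = np.zeros((nz, nx)); u = np.zeros((nz, nx))
    u[nz // 2, nx // 2] = 1.0                           # impulsive point source

    c2 = (v * dt / dx) ** 2                             # CFL factor (stable here)
    for _ in range(500):                                # 5 s of propagation
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        u_next = 2 * u - u_prev + c2 * lap
        u_prev, u = u, u_next

    print(f"wavefield RMS after 5 s: {np.sqrt(np.mean(u ** 2)):.3e}")
    ```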

  12. Sensitivity of reentrant driver localization to electrophysiological parameter variability in image-based computational models of persistent atrial fibrillation sustained by a fibrotic substrate

    NASA Astrophysics Data System (ADS)

    Deng, Dongdong; Murphy, Michael J.; Hakim, Joe B.; Franceschi, William H.; Zahid, Sohail; Pashakhanloo, Farhad; Trayanova, Natalia A.; Boyle, Patrick M.

    2017-09-01

    Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia, causing morbidity and mortality in millions worldwide. The atria of patients with persistent AF (PsAF) are characterized by the presence of extensive and distributed atrial fibrosis, which facilitates the formation of persistent reentrant drivers (RDs, i.e., spiral waves), which promote fibrillatory activity. Targeted catheter ablation of RD-harboring tissues has shown promise as a clinical treatment for PsAF, but the outcomes remain sub-par. Personalized computational modeling has been proposed as a means of non-invasively predicting optimal ablation targets in individual PsAF patients, but it remains unclear how RD localization dynamics are influenced by inter-patient variability in the spatial distribution of atrial fibrosis, action potential duration (APD), and conduction velocity (CV). Here, we conduct simulations in computational models of fibrotic atria derived from the clinical imaging of PsAF patients to characterize the sensitivity of RD locations to these three factors. We show that RDs consistently anchor to boundaries between fibrotic and non-fibrotic tissues, as delineated by late gadolinium-enhanced magnetic resonance imaging, but that changes in APD/CV can enhance or attenuate the likelihood that an RD will anchor to a specific site. These findings show that the level of uncertainty present in patient-specific atrial models reconstructed without any invasive measurements (i.e., incorporating each individual's unique distribution of fibrotic tissue from medical imaging alongside an average representation of AF-remodeled electrophysiology) is sufficiently high that a personalized ablation strategy based on targeting simulation-predicted RD trajectories alone may not produce the desired result.

  13. Monte Carlo simulation of electrothermal atomization on a desktop personal computer

    NASA Astrophysics Data System (ADS)

    Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.

    1996-07-01

    Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. a graphite furnace) because of the complexity of the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed that can be used under Windows or DOS. With this program, basic parameters such as furnace dimensions, sample placement and furnace heating, and kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the absorbance profile dependence on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
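
    The underlying Monte Carlo idea can be sketched as follows: atoms desorb from the heating furnace wall with an Arrhenius rate, producing a transient release profile. The activation energy, attempt frequency, and heating ramp are assumed values, not the program's.

    ```python
    # Conceptual sketch: Arrhenius desorption during a linear heating ramp;
    # all physical parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_atoms = 20_000
    Ea = 2.5 * 1.602e-19        # desorption activation energy, J (~2.5 eV assumed)
    kB = 1.381e-23              # Boltzmann constant, J/K
    nu = 1e13                   # attempt frequency, 1/s (assumed)
    dt = 1e-3                   # time step, s

    released = []
    alive = np.ones(n_atoms, dtype=bool)
    for step in range(1500):
        T = 300.0 + 1500.0 * step * dt        # linear heating ramp, K
        p = 1.0 - np.exp(-nu * np.exp(-Ea / (kB * T)) * dt)
        leave = alive & (rng.random(n_atoms) < p)
        released.append(leave.sum())
        alive &= ~leave

    peak = int(np.argmax(released))
    print(f"release peaks at t = {peak * dt:.2f} s, T = {300 + 1500 * peak * dt:.0f} K")
    ```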

  14. Scrap computer recycling in Taiwan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C.H.; Chang, S.L.; Wang, K.M.

    1999-07-01

    It is estimated that approximately 700,000 scrap personal computers will be generated each year in Taiwan. The disposal of such a huge amount of scrap computers presents a difficult task for the island due to the scarcity of landfills and incineration facilities available locally. Also, the hazardous materials contained in the scrap computers may cause serious pollution to the environment if they are not properly disposed of. Thus, the EPA of Taiwan declared scrap personal computers a producer-responsibility recycling product in July 1997, mandating that the manufacturers, importers and sellers of personal computers recover and recycle their scrap computers properly. Beginning on June 1, 1998, a scrap computer recycling plan was officially implemented on the island. Under this plan, consumers can deliver their unwanted personal computers to designated collection points to receive reward money. Currently, only six items are mandated to be recycled in this recycling plan: notebooks, monitors, and the hard disk, power supply, printed circuit board and shell of the main frame of the personal computer. This paper presents the current scrap computer recycling system in Taiwan.

  15. Simulation of an SEIR infectious disease model on the dynamic contact network of conference attendees

    PubMed Central

    2011-01-01

    Background The spread of infectious diseases crucially depends on the pattern of contacts between individuals. Knowledge of these patterns is thus essential to inform models and computational efforts. However, there are few empirical studies available that provide estimates of the number and duration of contacts between social groups. Moreover, their space and time resolutions are limited, so that data are not explicit at the person-to-person level, and the dynamic nature of the contacts is disregarded. In this study, we aimed to assess the role of data-driven dynamic contact patterns between individuals, and in particular of their temporal aspects, in shaping the spread of a simulated epidemic in the population. Methods We considered high-resolution data about face-to-face interactions between the attendees at a conference, obtained from the deployment of an infrastructure based on radiofrequency identification (RFID) devices that assessed mutual face-to-face proximity. The spread of epidemics along these interactions was simulated using an SEIR (Susceptible, Exposed, Infectious, Recovered) model, using both the dynamic network of contacts defined by the collected data, and two aggregated versions of such networks, to assess the role of the data temporal aspects. Results We show that, on the timescales considered, an aggregated network taking into account the daily duration of contacts is a good approximation to the full resolution network, whereas a homogeneous representation that retains only the topology of the contact network fails to reproduce the size of the epidemic. Conclusions These results have important implications for understanding the level of detail needed to correctly inform computational models for the study and management of real epidemics. Please see related article BMC Medicine, 2011, 9:88 PMID:21771290
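
    In the spirit of the study (though the real work used time-resolved RFID contact data), a discrete-time SEIR process over a contact list can be sketched as follows; the toy ring network and rates are illustrative.

    ```python
    # Sketch: SEIR dynamics over a static toy contact network; the study itself
    # used dynamic, empirically measured contacts. All rates are assumed.
    import random

    random.seed(3)
    contacts = [(i, (i + k) % 50) for i in range(50) for k in (1, 2)]  # toy ring
    state = {i: "S" for i in range(50)}
    state[0] = "I"
    beta, sigma, gamma = 0.08, 0.25, 0.15   # infection / E->I / recovery per step

    for day in range(120):
        nxt = dict(state)
        for a, b in contacts:
            for s, t in ((a, b), (b, a)):
                if state[s] == "I" and state[t] == "S" and random.random() < beta:
                    nxt[t] = "E"
        for i, st in state.items():
            if st == "E" and random.random() < sigma:
                nxt[i] = "I"
            elif st == "I" and random.random() < gamma:
                nxt[i] = "R"
        state = nxt

    print("final size:", sum(1 for s in state.values() if s != "S"), "of 50")
    ```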

  16. In silico cancer modeling: is it ready for primetime?

    PubMed Central

    Deisboeck, Thomas S; Zhang, Le; Yoon, Jeongah; Costa, Jose

    2011-01-01

    SUMMARY At the dawn of the era of personalized, systems-driven medicine, computational or in silico modeling and the simulation of disease processes is becoming increasingly important for hypothesis generation and data integration in experiments and clinics alike. Arguably, this is nowhere more visible than in oncology. To illustrate the field's vast potential as well as its current limitations we briefly review selected works on modeling malignant brain tumors. Implications for clinical practice, including trial design and outcome prediction, are also discussed. PMID:18852721

  17. Effects of Simulated Pathophysiology on the Performance of a Decision Support Medical Monitoring System for Early Detection of Hemodynamic Decompensation in Humans

    DTIC Science & Technology

    2014-10-01

    pulse oximeter (Cardiocap/5; Datex-Ohmeda, Louisville, CO). The EKG and pulse oximeter tracings were interfaced with a personal computer for continuous...responses to reduced central venous pressure (CVP) and pulse pressure (PP) elicited during graded lower body negative pressure (LBNP) to those observed...Johnson BD, Curry TB, Convertino VA, & Joyner MJ. The association between pulse pressure and stroke volume during lower body negative pressure and

  18. Measurement and Validation of Bidirectional Reflectance of Space Shuttle and Space Station Materials for Computerized Lighting Models

    NASA Technical Reports Server (NTRS)

    Fletcher, Lauren E.; Aldridge, Ann M.; Wheelwright, Charles; Maida, James

    1997-01-01

    Task illumination has a major impact on human performance: What a person can perceive in his environment significantly affects his ability to perform tasks, especially in space's harsh environment. Training for lighting conditions in space has long depended on physical models and simulations to emulate the effect of lighting, but such tests are expensive and time-consuming. To evaluate lighting conditions not easily simulated on Earth, personnel at NASA Johnson Space Center's (JSC) Graphics Research and Analysis Facility (GRAF) have been developing computerized simulations of various illumination conditions using the ray-tracing program, Radiance, developed by Greg Ward at Lawrence Berkeley Laboratory. Because these computer simulations are only as accurate as the data used, accurate information about the reflectance properties of materials and light distributions is needed. JSC's Lighting Environment Test Facility (LETF) personnel gathered material reflectance properties for a large number of paints, metals, and cloths used in the Space Shuttle and Space Station programs, and processed these data into reflectance parameters needed for the computer simulations. They also gathered lamp distribution data for most of the light sources used, and validated the ability to accurately simulate lighting levels by comparing predictions with measurements for several ground-based tests. The result of this study is a database of material reflectance properties for a wide variety of materials, and lighting information for most of the standard light sources used in the Shuttle/Station programs. The combination of the Radiance program and GRAF's graphics capability form a validated computerized lighting simulation capability for NASA.

  19. The Technology Refresh Program: Affording State-of-the Art Personal Computing.

    ERIC Educational Resources Information Center

    Spiwak, Rand

    2000-01-01

    Describes the Florida Community College Technology Refresh Program in which 28 Florida community colleges refresh their personal computer technology on a three-year cyclical basis through negotiation of a contract with Dell Computer Corporation. Discusses the contract highlights (such as a 22.5 percent discount on personal computers and on-site…

  20. Using Personal Computers To Acquire Special Education Information. Revised. ERIC Digest #429.

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Handicapped and Gifted Children, Reston, VA.

    This digest offers basic information about resources, available to users of personal computers, in the area of professional development in special education. Two types of resources are described: those that can be purchased on computer diskettes and those made available by linking personal computers through electronic telephone networks. Resources…

  1. Use of a personal computer for dynamical engineering illustrations in a classroom and over an instructional TV network

    NASA Technical Reports Server (NTRS)

    Watson, V. R.

    1983-01-01

    A personal computer has been used to illustrate physical phenomena and problem solution techniques in engineering classes. According to student evaluations, instruction of concepts was greatly improved through the use of these illustrations. This paper describes the class of phenomena that can be effectively illustrated, the techniques used to create these illustrations, and the techniques used to display the illustrations in regular classrooms and over an instructional TV network. The features of a personal computer required to apply these techniques are listed. The capabilities of some present personal computers are discussed and a forecast of the capabilities of future personal computers is presented.

  2. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2007-08-01

    Navy Personnel Research, Studies, and Technology Division. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS), Robert J. Schneider, Ph.D...TN-07-12, August 2007. Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS), Robert J. Schneider, Ph.D., Kerri L. Ferstl, Ph.D...03/31/2006 4. TITLE AND SUBTITLE 5a. CONTRACT NUMBER Revision and Expansion of Navy Computer Adaptive Personality Scales (NCAPS) 5b. GRANT NUMBER 5c

  3. Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS)

    DTIC Science & Technology

    2006-10-01

    Investigation of Item-Pair Presentation and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS), Christina M. Underhill, Ph.D...Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS), Christina M. Underhill, Ph.D. Reviewed and Approved by Jacqueline A. Mottern...and Construct Validity of the Navy Computer Adaptive Personality Scales (NCAPS) 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 0602236N and 0603236N 6

  4. Climate change negotiation simulations for students: responses across gender and age. A case study: San Francisco State University World Climate Exercises

    NASA Astrophysics Data System (ADS)

    Rasheva, E. A.

    2015-12-01

    For decades, role-play and simulation exercises have been utilized for learning and policy decision making. While the power of Model UN simulations in building first-person experience and understanding of complex international issues is well known, the effectiveness of simulations for inspiring citizen engagement in scientific public-policy issues is little studied. My work hypothesizes that climate-change negotiation simulations can enhance students' scientific literacy and policy advocacy. It aims to determine how age and gender influence the responsiveness of students to such simulations. During the 2015 fall semester, I am conducting World Climate exercises for fellow graduate and undergraduate students at San Francisco State University. At the end of the exercise, I will have collected the responses to an anonymous questionnaire in which the participants indicate age and gender. The questionnaire asks participants to describe their hopes and fears for the future and to propose public and personal actions for achieving a strong climate change agreement. I am tracking differences to determine whether participants' age and gender correlate with particular patterns of feeling and thinking. My future research will aim to determine whether and how strongly the World Climate Exercise has affected participants' actual policy engagement. This work will also reflect on my experiences as a World Climate facilitator. I will describe the facilitation process and then discuss some of my observations from the sessions. I will specify the challenges I have encountered and suggest strategies that can strengthen the learning process. World Climate is a computer-simulation-based climate change negotiations role-playing exercise developed by Climate Interactive in partnership with the System Dynamics Group at the MIT Sloan School of Management.

  5. Co-occurrence of addictive behaviours: personality factors related to substance use, gambling and computer gaming.

    PubMed

    Walther, Birte; Morgenstern, Matthis; Hanewinkel, Reiner

    2012-01-01

    To investigate co-occurrence and shared personality characteristics of problematic computer gaming, problematic gambling and substance use. Cross-sectional survey data were collected from 2,553 German students aged 12-25 years. Self-report measures of substance use (alcohol, tobacco and cannabis), problematic gambling (South Oaks Gambling Screen - Revised for Adolescents, SOGS-RA), problematic computer gaming (Video Game Dependency Scale, KFN-CSAS-II), and of twelve different personality characteristics were obtained. Analyses revealed positive correlations between tobacco, alcohol and cannabis use and a smaller positive correlation between problematic gambling and problematic computer gaming. Problematic computer gaming co-occurred only with cannabis use, whereas problematic gambling was associated with all three types of substance use. Multivariate multilevel analyses showed differential patterns of personality characteristics. High impulsivity was the only personality characteristic associated with all five addictive behaviours. Depression and extraversion were specific to substance users. Four personality characteristics were specifically associated with problematic computer gaming: irritability/aggression, social anxiety, ADHD, and low self-esteem. Problematic gamblers seem to be more similar to substance users than problematic computer gamers. From a personality perspective, results correspond to the inclusion of gambling in the same DSM-V category as substance use and question a one-to-one proceeding for computer gaming. Copyright © 2012 S. Karger AG, Basel.

  6. Precollege Computer Literacy: A Personal Computing Approach. Second Edition.

    ERIC Educational Resources Information Center

    Moursund, David

    Intended for elementary and secondary teachers and curriculum specialists, this booklet discusses and defines computer literacy as a functional knowledge of computers and their effects on students and the rest of society. It analyzes personal computing and the aspects of computers that have direct impact on students. Outlining computer-assisted…

  7. Learners' Field Dependence and the Effects of Personalized Narration on Learners' Computer Perceptions and Task-Related Attitudes in Multimedia Learning

    ERIC Educational Resources Information Center

    Liew, Tze Wei; Tan, Su-Mae; Seydali, Rouzbeh

    2014-01-01

    In this article, the effects of personalized narration in multimedia learning on learners' computer perceptions and task-related attitudes were examined. Twenty-six field independent and 22 field dependent participants studied the computer-based multimedia lessons on C-Programming, either with personalized narration or non-personalized narration.…

  8. Personality Types and Affinity for Computers

    DTIC Science & Technology

    1991-03-01

    differences on personality dimensions between the respondents, and to explore the relationship between these differences and computer affinity. The results...between the respondents, and to explore the relationship between these differences and computer affinity. The results revealed no significant differences...type to this measure of computer affinity. II. LITERATURE REVIEW The interest of this study was the relationship between a person's psychological

  9. Climate simulations and services on HPC, Cloud and Grid infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio

    2017-04-01

    Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially for the Climate community. These paradigms are modifying the way climate applications are executed. By using these technologies, the number, variety and complexity of experiments and resources are increasing substantially. But although computational capacity is increasing, the traditional applications and tools used by the community are not good enough to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To solve those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework to manage a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the FP7 of the European Commission (grant agreement no. 312979); the European Regional Development Fund (ERDF); and the Programa de Personal Investigador en Formación Predoctoral from Universidad de Cantabria and Government of Cantabria.

  10. Computer-based diagnosis of illness in historical persons.

    PubMed

    Peters, T J

    2013-01-01

    Retrospective diagnosis of illness in historical figures is a popular but somewhat unreliable pastime due to the lack of detailed information and reliable reports about clinical features and disease progression. Modern computer-based diagnostic programmes have been used to supplement historical documents and accounts, offering new and more objective approaches to the retrospective investigations of the medical conditions of historical persons. In the case of King George III, modern technology has been used to strengthen the findings of previous reports rejecting the popular diagnosis of variegate porphyria in the King, his grandson Augustus d'Esté and his antecedent King James VI and I. Alternative diagnoses based on these programmes are indicated. The Operational Criteria in Studies of Psychotic Illness (OPCRIT) programme and the Young mania scale have been applied to the features described for George III and suggest a diagnosis of bipolar disorder. The neuro-diagnostic programme SimulConsult was applied to Augustus d'Esté and suggests a diagnosis of neuromyelitis optica rather than acute porphyria with secondarily multiple sclerosis, as proposed by others. James VI and I's complex medical history and the clinical features of his behavioural traits were also subjected to SimulConsult analysis; acute porphyria was rejected and the unexpected diagnosis of attenuated (mild) Lesch-Nyhan disease offered. A brief review of these approaches along with full reference listings to the methodology including validation are provided. Textual analysis of the written and verbal outputs of historical figures indicate possible future developments in the diagnosis of medical disorders in historical figures.

  11. A Simplified Model for Detonation Based Pressure-Gain Combustors

    NASA Technical Reports Server (NTRS)

    Paxson, Daniel E.

    2010-01-01

    A time-dependent model is presented which simulates the essential physics of a detonative or otherwise constant volume, pressure-gain combustor for gas turbine applications. The model utilizes simple, global thermodynamic relations to determine an assumed instantaneous and uniform post-combustion state in one of many envisioned tubes comprising the device. A simple, second order, non-upwinding computational fluid dynamic algorithm is then used to compute the (continuous) flowfield properties during the blowdown and refill stages of the periodic cycle which each tube undergoes. The exhausted flow is averaged to provide mixed total pressure and enthalpy which may be used as a cycle performance metric for benefits analysis. The simplicity of the model allows for nearly instantaneous results when implemented on a personal computer. The results compare favorably with higher resolution numerical codes which are more difficult to configure, and more time consuming to operate.
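
    The core thermodynamic step, an instantaneous constant-volume heat release in which pressure rises with temperature (p2/p1 = T2/T1 for an ideal gas at fixed volume and mass), can be sketched with assumed values:

    ```python
    # Toy constant-volume combustion step; values are illustrative, not the paper's.
    cv = 718.0              # J/(kg K), air
    q = 900e3               # J/kg heat release (assumed)
    T1, p1 = 500.0, 2.0e5   # pre-combustion temperature (K) and pressure (Pa)

    T2 = T1 + q / cv
    p2 = p1 * (T2 / T1)     # ideal gas, constant volume and mass
    print(f"T2 = {T2:.0f} K, pressure ratio p2/p1 = {p2 / p1:.2f}")
    ```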

  12. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments of anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure process may be done through actual exposure, with visualization, by imagination, or using virtual reality (VR), which provides users with computer-simulated environments with and within which they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within, the environment. Starting from 1995, different experimental studies have been conducted in order to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review will discuss their outcome and provide guidelines for the use of VR exposure for the treatment of anxious patients.

  13. A CLIPS based personal computer hardware diagnostic system

    NASA Technical Reports Server (NTRS)

    Whitson, George M.

    1991-01-01

    Often the person designated to repair personal computers has little or no knowledge of how to repair a computer. Described here is a simple expert system to aid these inexperienced repair people. The first component of the system leads the repair person through a number of simple system checks such as making sure that all cables are tight and that the dip switches are set correctly. The second component of the system assists the repair person in evaluating error codes generated by the computer. The final component of the system applies a large knowledge base to attempt to identify the component of the personal computer that is malfunctioning. We have implemented and tested our design with a full system to diagnose problems for an IBM compatible system based on the 8088 chip. In our tests, the inexperienced repair people found the system very useful in diagnosing hardware problems.

  14. ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model

    NASA Technical Reports Server (NTRS)

    Hoffman, David J.; Viterna, Larry A.

    1991-01-01

    A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
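
    The Monte Carlo recipe such a simulation follows can be sketched for a single repairable block: draw alternating exponential failure and repair intervals and estimate availability over a mission; the MTBF/MTTR values below are illustrative.

    ```python
    # Sketch of the Monte Carlo availability idea for one repairable block;
    # MTBF/MTTR and mission length are assumed values, not from ETARA.
    import random

    random.seed(7)
    MTBF, MTTR, MISSION = 2000.0, 50.0, 8760.0   # hours

    def one_history() -> float:
        t, up_time = 0.0, 0.0
        while t < MISSION:
            up = random.expovariate(1.0 / MTBF)       # time to failure
            up_time += min(up, MISSION - t)
            t += up
            if t >= MISSION:
                break
            t += random.expovariate(1.0 / MTTR)       # repair interval
        return up_time / MISSION

    runs = [one_history() for _ in range(10_000)]
    print(f"simulated availability: {sum(runs) / len(runs):.4f} "
          f"(analytic {MTBF / (MTBF + MTTR):.4f})")
    ```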

  15. ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations

    NASA Astrophysics Data System (ADS)

    Freitag, Marc Dewi

    2013-02-01

    ME(SSY)**2 stands for "Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems." This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (2-body relaxation, stellar mass spectrum, collisions, tidal disruption, etc.). It is basically a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring most important physical processes, allows million-particle simulations, spanning a Hubble time, in a few CPU days on standard personal computers, and provides a wealth of data rivaled only by N-body simulations. The current version of the software requires the use of routines from the "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).

  16. Making On-line Science Course Materials Easily Translatable and Accessible Worldwide: Challenges and Solutions

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Alhadlaq, Hisham; Malley, Christopher V.; Perkins, Katherine K.; Olson, Jonathan; Alshaya, Fahad; Alabdulkareem, Saleh; Wieman, Carl E.

    2012-02-01

    The PhET Interactive Simulations Project partnered with the Excellence Research Center of Science and Mathematics Education at King Saud University with the joint goal of making simulations useable worldwide. One of the main challenges of this partnership is to make PhET simulations and the website easily translatable into any language. The PhET project team overcame this challenge by creating the Translation Utility. This tool allows a person fluent in both English and another language to easily translate any of the PhET simulations and requires minimal computer expertise. In this paper we discuss the technical issues involved in this software solution, as well as the issues involved in obtaining accurate translations. We share our solutions to many of the unexpected problems we encountered that would apply generally to making on-line scientific course materials available in many different languages, including working with: languages written right-to-left, different character sets, and different conventions for expressing equations, variables, units and scientific notation.

  17. Dosimetry applications in GATE Monte Carlo toolkit.

    PubMed

    Papadimitroulas, Panagiotis

    2017-09-01

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  18. An analysis of intergroup rivalry using Ising model and reinforcement learning

    NASA Astrophysics Data System (ADS)

    Zhao, Feng-Fei; Qin, Zheng; Shao, Zhuo

    2014-01-01

    Modeling of intergroup rivalry can help us better understand economic competitions, political elections and other similar activities. The result of intergroup rivalry depends on the co-evolution of individual behavior within one group and the impact from the rival group. In this paper, we model the rivalry behavior using the Ising model. Unlike other simulation studies using the Ising model, the evolution rules of each individual in our model are not static, but have the ability to learn from historical experience using a reinforcement learning technique, which makes the simulation closer to real human behavior. We studied the phase transition in intergroup rivalry and focused on the impact of the degree of social freedom, the personality of group members and the social experience of individuals. The results of computer simulation show that a society with a low degree of social freedom and highly educated, experienced individuals is more likely to be one-sided in intergroup rivalry.
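
    The abstract does not spell out the update rules, so the following is only a minimal Metropolis sketch of the Ising component on a square lattice; the reinforcement-learning adaptation of the rules is omitted, and the external field h standing in for the rival group's influence is an assumption of this illustration.

      import numpy as np

      def metropolis_ising(L=32, sweeps=100, T=2.0, h=0.05, seed=0):
          """Two-group allegiance dynamics as a 2D Ising model (+1/-1 spins)."""
          rng = np.random.default_rng(seed)
          s = rng.choice([-1, 1], size=(L, L))
          for _ in range(sweeps):
              for _ in range(L * L):  # one Monte Carlo sweep
                  i, j = rng.integers(L, size=2)
                  # four nearest neighbours with periodic boundaries
                  nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                        + s[i, (j + 1) % L] + s[i, (j - 1) % L])
                  dE = 2 * s[i, j] * (nb + h)  # energy cost of flipping s[i, j]
                  if dE <= 0 or rng.random() < np.exp(-dE / T):
                      s[i, j] = -s[i, j]
          return s.mean()  # net "one-sidedness" of the rivalry

      print(metropolis_ising())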

  19. Distributed communication and psychosocial performance in simulated space dwelling groups

    NASA Astrophysics Data System (ADS)

    Hienz, R. D.; Brady, J. V.; Hursh, S. R.; Ragusa, L. C.; Rouse, C. O.; Gasior, E. D.

    2005-05-01

    The present report describes the development and application of a distributed interactive multi-person simulation in a computer-generated planetary environment as an experimental test bed for modeling the human performance effects of variations in the types of communication modes available, and in the types of stress and incentive conditions underlying the completion of mission goals. The results demonstrated a high degree of interchangeability between communication modes (audio, text) when one mode was not available. Additionally, adding time-pressure stress to task completion reduced performance effectiveness, and these performance reductions were ameliorated by introducing positive incentives contingent upon improved performance. The results obtained confirmed that cooperative and productive psychosocial interactions can be maintained between individually isolated and dispersed members of simulated spaceflight crews communicating and problem-solving effectively over extended time intervals without the benefit of one another's physical presence.

  20. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time-consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password-protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  1. An Implemented Strategy for Campus Connectivity and Cooperative Computing.

    ERIC Educational Resources Information Center

    Halaris, Antony S.; Sloan, Lynda W.

    1989-01-01

    ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)

  2. Development of Personalized Radiant Cooling System for an Office Room

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khare, Vaibhav; Sharma, Anuj; Mathur, Jyotirmay

    2015-01-01

    The building industry nowadays is facing two major challenges increased concern for energy reduction and growing need for thermal comfort. These challenges have led many researchers to develop Radiant Cooling Systems that show a large potential for energy savings. This study aims to develop a personalized cooling system using the principle of radiant cooling integrated with conventional all-air system to achieve better thermal environment at the workspace. Personalized conditioning aims to create a microclimatic zone around a single workspace. In this way, the energy is deployed only where it is actually needed, and the individual s needs for thermal comfortmore » are fulfilled. To study the effect of air temperature along with air temperature distribution for workspace, air temperature near the vicinity of the occupant has been obtained as a result of Computational Fluid Dynamics (CFD) simulation using FLUENT. The analysis showed that personalized radiant system improves thermal environment near the workspace and allows all-air systems to work at higher thermostat temperature without compromising the thermal comfort, which in turn reduces its energy consumption.« less

  3. Computer Competence for the Applied Gerontologist.

    ERIC Educational Resources Information Center

    Dickel, C. Timothy; Young, W. Wayne

    This paper shares some ideas regarding the use of computers by persons who use their gerontology training in direct service to older persons and their families. It proposes that, as professionals serving older persons and their families look toward the future, they need to conscientiously incorporate computer competence into their practice. The…

  4. Personal-Computer Video-Terminal Emulator

    NASA Technical Reports Server (NTRS)

    Buckley, R. H.; Koromilas, A.; Smith, R. M.; Lee, G. E.; Giering, E. W.

    1985-01-01

    An OWL-1200 video-terminal emulator has been written for the IBM Personal Computer. The OWL-1200 is a simple user terminal with some intelligent capabilities, including screen formatting and block transmission of data. The emulator is written in PASCAL and Assembler for the IBM Personal Computer operating under DOS 1.1.

  5. Selecting Personal Computers.

    ERIC Educational Resources Information Center

    Djang, Philipp A.

    1993-01-01

    Describes a Multiple Criteria Decision Analysis Approach for the selection of personal computers that combines the capabilities of Analytic Hierarchy Process and Integer Goal Programming. An example of how decision makers can use this approach to determine what kind of personal computers and how many of each type to purchase is given. (nine…

  6. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    NASA Astrophysics Data System (ADS)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus reduced uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed on a 2 m grid within a few hours. In the context of a rapid pluvial flood event in Newcastle upon Tyne during 2012, the technique allows simulation of inundation for 31 km2 of the city centre in less than an hour on a 2 m grid; however, further grid refinement is required to fully capture important smaller flow pathways. Good agreement between the model and observed inundation is achieved for a variety of dam failure, slow fluvial inundation, rapid pluvial inundation, and defence breach scenarios in the UK.
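
    The framework's numerical core is a Godunov-type finite-volume scheme with HLLC fluxes; as a rough sketch of the idea (using the simpler two-wave HLL solver rather than HLLC, first-order only, and illustrative values), an approximate Riemann flux for the 1D shallow-water equations can be written as follows.

      import numpy as np

      G = 9.81  # gravitational acceleration (m/s^2)

      def hll_flux(hL, huL, hR, huR):
          """Approximate HLL Riemann flux for the 1D shallow-water equations.

          State U = (h, hu); physical flux F(U) = (hu, h*u**2 + g*h**2/2).
          """
          uL, uR = huL / hL, huR / hR
          cL, cR = np.sqrt(G * hL), np.sqrt(G * hR)  # gravity wave speeds
          sL = min(uL - cL, uR - cR)                 # leftmost wave speed estimate
          sR = max(uL + cL, uR + cR)                 # rightmost wave speed estimate
          FL = np.array([huL, huL * uL + 0.5 * G * hL**2])
          FR = np.array([huR, huR * uR + 0.5 * G * hR**2])
          if sL >= 0.0:
              return FL
          if sR <= 0.0:
              return FR
          UL, UR = np.array([hL, huL]), np.array([hR, huR])
          return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

      # dam-break interface: deep water at rest meets shallow water at rest
      print(hll_flux(2.0, 0.0, 1.0, 0.0))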

  7. A systematic review to identify areas of enhancements of pandemic simulation models for operational use at provincial and local levels

    PubMed Central

    2012-01-01

    Background In recent years, computer simulation models have supported development of pandemic influenza preparedness policies. However, U.S. policymakers have raised several concerns about the practical use of these models. In this review paper, we examine the extent to which the current literature already addresses these concerns and identify means of enhancing the current models for higher operational use. Methods We surveyed PubMed and other sources for published research literature on simulation models for influenza pandemic preparedness. We identified 23 models published between 1990 and 2010 that consider single-region (e.g., country, province, city) outbreaks and multi-pronged mitigation strategies. We developed a plan for examination of the literature based on the concerns raised by the policymakers. Results While examining the concerns about the adequacy and validity of data, we found that though the epidemiological data supporting the models appears to be adequate, it should be validated through as many updates as possible during an outbreak. Demographical data must improve its interfaces for access, retrieval, and translation into model parameters. Regarding the concern about credibility and validity of modeling assumptions, we found that the models often simplify reality to reduce computational burden. Such simplifications may be permissible if they do not interfere with the performance assessment of the mitigation strategies. We also agreed with the concern that social behavior is inadequately represented in pandemic influenza models. Our review showed that the models consider only a few social-behavioral aspects, including contact rates, withdrawal from work or school due to symptom appearance or to care for sick relatives, and compliance with social distancing, vaccination, and antiviral prophylaxis. The concern about the degree of accessibility of the models is palpable, since we found three models that are currently accessible by the public while other models are seeking public accessibility. Policymakers would prefer models scalable to any population size that can be downloaded and operated on personal computers. But scaling models to larger populations often requires computational resources beyond those of personal computers and laptops. As a limitation, we state that some existing models could not be included in our review due to their limited available documentation discussing the choice of relevant parameter values. Conclusions To adequately address the concerns of the policymakers, we need continuing model enhancements in critical areas including: updating of epidemiological data during a pandemic, smooth handling of large demographical databases, incorporation of a broader spectrum of social-behavioral aspects, updating information for contact patterns, adaptation of recent methodologies for collecting human mobility data, and improvement of computational efficiency and accessibility. PMID:22463370

  8. Optical design of a novel instrument that uses the Hartmann-Shack sensor and Zernike polynomials to measure and simulate customized refraction correction surgery outcomes and patient satisfaction

    NASA Astrophysics Data System (ADS)

    Yasuoka, Fatima M. M.; Matos, Luciana; Cremasco, Antonio; Numajiri, Mirian; Marcato, Rafael; Oliveira, Otavio G.; Sabino, Luis G.; Castro N., Jarbas C.; Bagnato, Vanderlei S.; Carvalho, Luis A. V.

    2016-03-01

    An optical system that conjugates the patient's pupil to the plane of a Hartmann-Shack (HS) wavefront sensor has been simulated using optical design software, and an optical bench prototype was assembled using a mechanical eye device, a beam splitter, an illumination system, lenses, mirrors, a mirrored prism, a movable mirror, a wavefront sensor and a CCD camera. The mechanical eye device is used to simulate aberrations of the eye. Rays emitted from this device travel via the beam splitter to the optical system; some rays fall on the CCD camera and others pass through the optical system and finally reach the sensor. Eye models based on typical in vivo eye aberrations were constructed using the optical design software Zemax. The simulated HS images for each case were acquired and processed using customized techniques. The simulated and real images for low-order aberrations were compared using centroid coordinates to ensure that the optical system is constructed precisely enough to match the simulated system. Afterwards, a simulated version of retinal images was constructed to show how these typical eyes would perceive an optotype positioned 20 ft away. Eye doctors can specify personalized corrections based on different Zernike polynomial values, and the optical images are rendered according to the new parameters. Optical images of how an eye would see with or without correction of certain aberrations are generated, in order to determine which aberrations can be corrected and to what degree. The patient can then "personalize" the correction to their own satisfaction. This new approach to wavefront sensing is a promising change of paradigm towards the betterment of the patient-physician relationship.
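
    For readers unfamiliar with the notation, the corrections above are expressed as coefficients of Zernike polynomials; a minimal helper for their radial part is sketched below (unnormalized, and normalization conventions vary between instruments, so this is illustrative only).

      from math import factorial

      def zernike_radial(n, m, rho):
          """Radial part R_n^m of the Zernike polynomial (m >= 0, n - m even)."""
          assert m >= 0 and (n - m) % 2 == 0
          return sum(
              (-1) ** k * factorial(n - k)
              / (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k))
              * rho ** (n - 2 * k)
              for k in range((n - m) // 2 + 1))

      # defocus (n=2, m=0) has radial part 2*rho**2 - 1; at the pupil edge rho = 1:
      print(zernike_radial(2, 0, 1.0))  # -> 1.0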

  9. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  10. Personalized Computer-Assisted Mathematics Problem-Solving Program and Its Impact on Taiwanese Students

    ERIC Educational Resources Information Center

    Chen, Chiu-Jung; Liu, Pei-Lin

    2007-01-01

    This study evaluated the effects of a personalized computer-assisted mathematics problem-solving program on the performance and attitude of Taiwanese fourth grade students. The purpose of this study was to determine whether the personalized computer-assisted program improved student performance and attitude over the nonpersonalized program.…

  11. Wireless Phone Threat Assessment for Aircraft Communication and Navigation Radios

    NASA Technical Reports Server (NTRS)

    Nguyens, T. X.; Koppen, S. V.; Smith, L. J.; Williams, R. A.; Salud, M. T.

    2005-01-01

    Emissions in aircraft communication and navigation bands are measured for the latest generation of wireless phones. The two wireless technologies considered, GSM/GPRS and CDMA2000, are the latest available to general consumers in the U.S. A base-station simulator is used to control the phones. The measurements are conducted using reverberation chambers, and the results are compared against FCC and aircraft installed equipment emission limits. The results are also compared against baseline emissions from laptop computers and personal digital assistant devices that are currently allowed to operate on aircraft.

  12. Acoustic Response of Underwater Munitions near a Sediment Interface: Measurement Model Comparisons and Classification Schemes

    DTIC Science & Technology

    2015-04-23

    List-of-figures and acronym fragments: pulse-compressed baseband signals and a SAS image for sequence 40 from TREX13; finite-element simulations and meshes for aluminum and steel replicas of a 100-mm UXO; PCB, pulse-compressed and baseband; PC SWAT, Personal Computer Shallow Water Acoustic Toolset; PondEx09/PondEx10, Pond Experiments 2009/2010.

  13. Enhancing Privacy in Participatory Sensing Applications with Multidimensional Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forrest, Stephanie; He, Wenbo; Groat, Michael

    2013-01-01

    Participatory sensing applications rely on individuals to share personal data to produce aggregated models and knowledge. In this setting, privacy concerns can discourage widespread adoption of new applications. We present a privacy-preserving participatory sensing scheme based on negative surveys for both continuous and multivariate categorical data. Without relying on encryption, our algorithms enhance the privacy of sensed data in an energy- and computation-efficient manner. Simulations and implementation on Android smart phones illustrate how multidimensional data can be aggregated in a useful and privacy-enhancing manner.
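
    The core trick is easy to state: for categorical data, each participant reports one category they do not belong to, and the aggregator recovers the true histogram from those negative reports alone. Below is a minimal sketch under the textbook uniform-reporting assumption; the paper's algorithms, which also cover continuous and multidimensional data, are more involved.

      import numpy as np

      def negative_survey(true_cats, c, rng):
          """Each participant reports a uniformly random category it is NOT in."""
          reports = []
          for t in true_cats:
              other = rng.integers(c - 1)
              reports.append(other if other < t else other + 1)  # skip own category
          return np.bincount(reports, minlength=c)

      rng = np.random.default_rng(1)
      c, n = 4, 100_000
      true_cats = rng.choice(c, size=n, p=[0.1, 0.2, 0.3, 0.4])
      y = negative_survey(true_cats, c, rng)

      # Unbiased reconstruction of the true counts: t_i = n - (c - 1) * y_i
      t_hat = n - (c - 1) * y
      print(t_hat / n)   # ≈ [0.1, 0.2, 0.3, 0.4], without any self-reports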

  14. Ethical sensitivity intervention in science teacher education: Using computer simulations and professional codes of ethics

    NASA Astrophysics Data System (ADS)

    Holmes, Shawn Yvette

    A simulation was created to emulate two Racial Ethical Sensitivity Test (REST) videos (Brabeck et al., 2000). The REST is a reliable assessment of ethical sensitivity to racial and gender intolerant behaviors in educational settings. Quantitative and qualitative analysis of the REST was performed using the Quick-REST survey and an interview protocol. The purpose of this study was to affect science educators' ability to recognize instances of racial and gender intolerant behaviors by leveraging the immersive qualities of simulations. The fictitious Hazelton High School virtual environment was created by the researcher and compared with the traditional REST. The study investigated whether computer simulations can influence the ethical sensitivity of preservice and inservice science teachers to racial and gender intolerant behaviors in school settings. The post-test-only research design involved 32 third-year science education students enrolled in science education classes at several southeastern universities and 31 science teachers from the same locale, some of whom were part of an NSF project. Participant samples were assigned to the video control group or the simulation experimental group. This resulted in four comparison groups: preservice video, preservice simulation, inservice video and inservice simulation. Participants experienced two REST scenarios in the appropriate format and then responded to Quick-REST survey questions for both scenarios. Additionally, the simulation groups answered in-simulation and post-simulation questions. Nonparametric analysis of the Quick-REST ascertained differences between comparison groups. Cronbach's alpha was calculated for internal consistency. The REST interview protocol was used to analyze recognition of intolerant behaviors in the in-simulation prompts. Post-simulation prompts were analyzed for emergent themes concerning the effect of the simulation on responses. The preservice video group had a significantly higher mean rank score than the other comparison groups. There were no significant differences across the remaining groups. Qualitative analyses of the in-simulation prompts suggest both preservice and inservice participants are unlikely to take action in an intolerant environment. Themes that emerged in the post-simulation responses indicated participants viewed the simulation as a reflective, interactive, personal, and organic environment.

  15. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid-response teams of 5-6 students, and teams were then randomly assigned to complete either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams that completed the computer-based simulation before the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.

  16. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair.

    PubMed

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed.
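
    Part of the reported speedup comes from treating chemical diffusion as a stencil convolution over the whole grid, which maps naturally onto GPUs. A minimal NumPy/SciPy sketch of one explicit diffusion step follows; the five-point kernel and all coefficients are illustrative, not the paper's.

      import numpy as np
      from scipy.ndimage import convolve

      # One explicit Euler step of C_{t+1} = C_t + D * dt * Laplacian(C_t),
      # written as a single five-point stencil convolution over the whole grid.
      D, dt = 0.1, 1.0                  # illustrative diffusion coefficient and step
      kernel = np.array([[0.0, 1.0, 0.0],
                         [1.0, -4.0, 1.0],
                         [0.0, 1.0, 0.0]])

      def diffuse(C, steps=100):
          for _ in range(steps):
              C = C + D * dt * convolve(C, kernel, mode="nearest")
          return C

      C = np.zeros((128, 128))
      C[64, 64] = 1.0                   # point source of a signaling chemical
      print(diffuse(C).sum())           # mass stays ≈ 1.0 away from the boundary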

  17. High-Performance Agent-Based Modeling Applied to Vocal Fold Inflammation and Repair

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2018-01-01

    Fast and accurate computational biology models offer the prospect of accelerating the development of personalized medicine. A tool capable of estimating treatment success can help prevent unnecessary and costly treatments and potentially harmful side effects. A novel high-performance Agent-Based Model (ABM) was adopted to simulate and visualize multi-scale complex biological processes arising in vocal fold inflammation and repair. The computational scheme was designed to organize the 3D ABM sub-tasks to fully utilize the resources available on current heterogeneous platforms consisting of multi-core CPUs and many-core GPUs. Subtasks are further parallelized and convolution-based diffusion is used to enhance the performance of the ABM simulation. The scheme was implemented using a client-server protocol allowing the results of each iteration to be analyzed and visualized on the server (i.e., in situ) while the simulation is running on the same server. The resulting simulation and visualization software enables users to interact with and steer the course of the simulation in real-time as needed. This high-resolution 3D ABM framework was used for a case study of surgical vocal fold injury and repair. The new framework is capable of completing the simulation, visualization and remote result delivery in under 7 s per iteration, where each iteration of the simulation represents 30 min in the real world. The case study model was simulated at the physiological scale of a human vocal fold. This simulation tracks 17 million biological cells as well as a total of 1.7 billion signaling chemical and structural protein data points. The visualization component processes and renders all simulated biological cells and 154 million signaling chemical data points. The proposed high-performance 3D ABM was verified through comparisons with empirical vocal fold data. Representative trends of biomarker predictions in surgically injured vocal folds were observed. PMID:29706894

  18. Drift trajectories of a floating human body simulated in a hydraulic model of Puget Sound.

    PubMed

    Ebbesmeyer, C C; Haglund, W D

    1994-01-01

    After a young man jumped off a 221-foot (67 m) high bridge, the drift of the body, which beached 20 miles (32 km) away at Alki Point in Seattle, Washington, was simulated with a hydraulic model. Simulations for the appropriate time period were performed using a small floating bead to represent the body in the hydraulic model at the University of Washington. Bead movements were videotaped and transferred to Computer Aided Drafting (AutoCAD) charts on a personal computer. Because of strong tidal currents in the narrow passage under the bridge (The Narrows near Tacoma, WA), small changes in the time of the jump (+/- 30 minutes) made large differences in the distance the body traveled (30 miles; 48 km). Hydraulic and other types of oceanographic models may be located by contacting technical experts known as physical oceanographers at local universities, and can be utilized to demonstrate trajectories of floating objects and the time required to arrive at selected locations. Potential applications for forensic death investigators include setting geographic and time limits for searches, determining the potential origin of remains found floating or beached, and confirming and correlating information regarding entry into the water and sightings of remains.

  19. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    The objective of quantum information processing is to harness the paradigm shift offered by quantum computing to solve classically hard, computationally challenging problems. Computationally challenging problems of interest include rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize due to poor scalability and the pervasive presence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.

  20. Uncertainty quantification for personalized analyses of human proximal femurs.

    PubMed

    Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar

    2016-02-29

    Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify the influence we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability of experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial inferior sides of the neck exhibited the largest probabilities for tensile and compression failure, although all probabilities were very small (pf < 0.001). In summary, uncertainty quantification with PC has been demonstrated to efficiently and accurately describe the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs. Copyright © 2015 Elsevier Ltd. All rights reserved.
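
    To give a feel for the machinery, here is a toy one-dimensional polynomial chaos expansion with a single standard-normal input and probabilists' Hermite polynomials; the femur model is replaced by a stand-in function and all settings are illustrative. The spectral coefficients are obtained by Gauss-Hermite quadrature, after which the output mean and variance fall out directly.

      import math
      import numpy as np
      from numpy.polynomial import hermite_e as He

      def model(x):                # stand-in for the finite-element femur model
          return np.exp(0.3 * x)   # a smooth response to the stochastic input

      order, nq = 6, 40
      nodes, weights = He.hermegauss(nq)      # weight function exp(-x^2 / 2)
      weights = weights / np.sqrt(2 * np.pi)  # normalize to the standard normal pdf

      # spectral coefficients c_k = E[model(X) * He_k(X)] / k!
      coeffs = []
      for k in range(order + 1):
          ek = np.zeros(k + 1); ek[k] = 1.0   # coefficient vector selecting He_k
          ck = np.sum(weights * model(nodes) * He.hermeval(nodes, ek)) / math.factorial(k)
          coeffs.append(ck)

      mean = coeffs[0]
      var = sum(c**2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
      # exact lognormal values: mean = exp(0.045), var = exp(0.09) * (exp(0.09) - 1)
      print(mean, var)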

  1. Familias 3 - Extensions and new functionality.

    PubMed

    Kling, Daniel; Tillmar, Andreas O; Egeland, Thore

    2014-11-01

    In relationship testing the aim is to determine the most probable pedigree structure given genetic marker data for a set of persons. Disaster Victim Identification (DVI) based on DNA data from presumed relatives of the missing persons can be considered a collection of relationship problems. Forensic calculations in investigative mode address questions like "How many markers and reference persons are needed?" Such questions can be answered by simulations. Mutations, deviations from Hardy-Weinberg Equilibrium (or more generally, accounting for population substructure) and silent alleles cannot be ignored when evaluating forensic evidence in case work. With the advent of new markers, so-called microvariants have become more common. Previous mutation models are no longer appropriate and a new model is proposed. This paper describes methods designed to deal with DVI problems and a new simulation model to study the distribution of likelihoods. Software addressing similar problems is available; however, for some problems, including DVI, we are not aware of freely available validated software. The Familias software has long been widely used by forensic laboratories worldwide to compute likelihoods in relationship scenarios, though previous versions have lacked desired functionality, such as that mentioned above. The extensions as well as some other novel features have been implemented in the new version, freely available at www.familias.no. The implementation and validation are briefly mentioned, leaving complete details to Supplementary sections. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
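
    For a flavor of the likelihoods involved (this is not Familias' implementation, and it deliberately ignores mutations, silent alleles and substructure, which are exactly the complications the software handles), the likelihood ratio for a simple alleged-father/child duo at one autosomal marker reduces to the textbook paternity-index formulas:

      def duo_paternity_index(child, af, freq):
          """LR for 'af is the father' vs 'an unrelated man', one autosomal marker.

          child, af: genotypes as 2-tuples of allele labels; freq: allele frequencies.
          No mutations, silent alleles or substructure (which Familias does handle).
          """
          def transmit(genotype, allele):        # P(parent passes `allele`)
              return genotype.count(allele) / 2.0

          a, b = child
          if a == b:                             # homozygous child needs allele a
              return transmit(af, a) / freq[a]
          # heterozygous child: paternal allele is a (maternal b) or vice versa
          return (transmit(af, a) * freq[b] + transmit(af, b) * freq[a]) \
                 / (2.0 * freq[a] * freq[b])

      freq = {"12": 0.10, "14": 0.25, "16": 0.30}
      print(duo_paternity_index(("12", "12"), ("12", "14"), freq))  # 1/(2*0.10) = 5.0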

  2. Can low-cost VOR and Omega receivers suffice for RNAV - A new computer-based navigation technique

    NASA Technical Reports Server (NTRS)

    Hollaar, L. A.

    1978-01-01

    It is shown that although RNAV is particularly valuable for the personal-transportation segment of general aviation, it has not gained complete acceptance. This is due, in part, to its high cost and the special handling it requires from air traffic control. VOR/DME RNAV calculations are ideally suited for analog computers, and the use of microprocessor technology has been suggested for reducing RNAV costs. Three navigation systems, VOR, Omega, and DR, are compared with respect to common navigational difficulties, such as station geometry, siting errors, ground disturbances, and terminal-area coverage. The Kalman filtering technique is described, with reference to its disadvantages in a system built from standard microprocessors. An integrated navigation system, using input data from various low-cost sensor systems, is presented and current simulation studies are noted.
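
    As a hedged sketch of the filtering idea discussed in the article, the snippet below runs a small linear Kalman filter that fuses two noisy position fixes (standing in for VOR and Omega measurements) under a constant-velocity motion model; all noise figures and dynamics are illustrative, not the article's.

      import numpy as np

      # State x = [position, velocity]; constant-velocity model, two position sensors
      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition
      H = np.array([[1.0, 0.0], [1.0, 0.0]])   # both sensors observe position only
      Q = 0.01 * np.eye(2)                     # process noise covariance
      R = np.diag([4.0, 1.0])                  # "VOR" noisier than "Omega"

      x = np.zeros(2)
      P = 10.0 * np.eye(2)
      rng = np.random.default_rng(0)

      true_pos = 0.0
      for _ in range(50):
          true_pos += 2.0 * dt                                 # aircraft at 2 units/s
          z = true_pos + rng.normal(0.0, np.sqrt(np.diag(R)))  # noisy VOR/Omega fixes
          x = F @ x                                            # predict
          P = F @ P @ F.T + Q
          S = H @ P @ H.T + R                                  # update
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print(x)   # fused estimate, close to [100, 2]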

  3. Preoperative planning with three-dimensional reconstruction of patient's anatomy, rapid prototyping and simulation for endoscopic mitral valve repair.

    PubMed

    Sardari Nia, Peyman; Heuts, Samuel; Daemen, Jean; Luyten, Peter; Vainer, Jindrich; Hoorntje, Jan; Cheriex, Emile; Maessen, Jos

    2017-02-01

    Mitral valve repair performed by an experienced surgeon is superior to mitral valve replacement for degenerative mitral valve disease; however, many surgeons are still deterred from adopting this procedure because of a steep learning curve. Simulation-based training and planning could improve surgical performance and reduce the learning curve. The aim of this study was to develop a patient-specific simulation for mitral valve repair and provide a proof of concept of personalized medicine in a patient prospectively planned for mitral valve surgery. A 65-year-old male with severe symptomatic mitral valve regurgitation was referred to our mitral valve heart team. On the basis of three-dimensional (3D) transoesophageal echocardiography and computed tomography, 3D reconstructions of the patient's anatomy were generated. By navigating through these reconstructions, the repair options and surgical access were chosen (minimally invasive repair). Using rapid prototyping and negative mould fabrication, we developed a process to cast a patient-specific mitral valve silicone replica for preoperative repair in a high-fidelity simulator. The mitral valve and negative mould were printed in systole to capture the pathology when the valve closes. A patient-specific mitral valve silicone replica was cast and mounted in the simulator. All repair techniques could be performed in the simulator to choose the best repair strategy. As the valve was printed in systole, no special testing other than adjusting the coaptation area was required. Subsequently, the patient was operated on; the mitral valve pathology was validated and the repair was performed successfully, as in the simulation. The patient-specific simulation and planning could be applied to surgical training, starting a (minimally invasive) mitral valve repair programme, planning of complex cases and the evaluation of new interventional techniques. Personalized medicine could be a possible pathway towards enhancing reproducibility, patient safety and the effectiveness of a complex surgical procedure. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  4. Resource Guide for Persons with Speech or Language Impairments.

    ERIC Educational Resources Information Center

    IBM, Atlanta, GA. National Support Center for Persons with Disabilities.

    The resource guide identifies products which assist speech or language impaired individuals in accessing IBM (International Business Machines) Personal Computers or the IBM Personal System/2 family of products. An introduction provides a general overview of ways computers can help persons with speech or language handicaps. The document then…

  5. Prediction system of the 1-AU arrival times of CME-associated interplanetary shocks using three-dimensional simulations

    NASA Astrophysics Data System (ADS)

    den, Mitsue; Amo, Hiroyoshi; Sugihara, Kohta; Takei, Toshifumi; Ogawa, Tomoya; Tanaka, Takashi; Watari, Shinichi

    We describe a prediction system for the 1-AU arrival times of interplanetary shock waves associated with coronal mass ejections (CMEs). The system is based on modeling of the shock propagation using a three-dimensional adaptive mesh refinement (AMR) code. Once a CME is observed by LASCO/SOHO, the ambient solar wind is first obtained by numerical simulation, reproducing the solar wind parameters observed at that time by the ACE spacecraft. We then input the expansion speed and occurrence position of that CME as initial conditions for a CME model, and a 3D simulation of the CME and the shock propagation is performed until the shock wave passes 1 AU. Entering the parameters, executing the simulation and retrieving the results are all available on the Web, so that someone who is not familiar with computers or simulations, or who is not a researcher, can use this system to predict the shock passage time. The simulated CME and shock evolution are visualized concurrently with the simulation, and snapshots appear on the web automatically so that the user can follow the propagation. This system is expected to be useful for space weather forecasters. We describe the system and the simulation model in detail.

  6. High-performance parallel computing in the classroom using the public goods game as an example

    NASA Astrophysics Data System (ADS)

    Perc, Matjaž

    2017-07-01

    The use of computers in statistical physics is common because the sheer number of equations that describe the behaviour of an entire system particle by particle often makes it impossible to solve them exactly. Monte Carlo methods form a particularly important class of numerical methods for solving problems in statistical physics. Although these methods are simple in principle, their proper use requires a good command of statistical mechanics, as well as considerable computational resources. The aim of this paper is to demonstrate how the usage of widely accessible graphics cards on personal computers can elevate the computing power in Monte Carlo simulations by orders of magnitude, thus allowing live classroom demonstration of phenomena that would otherwise be out of reach. As an example, we use the public goods game on a square lattice where two strategies compete for common resources in a social dilemma situation. We show that the second-order phase transition to an absorbing phase in the system belongs to the directed percolation universality class, and we compare the time needed to arrive at this result by means of the main processor and by means of a suitable graphics card. Parallel computing on graphics processing units has been developed actively during the last decade, to the point where today the learning curve for entry is anything but steep for those familiar with programming. The subject is thus ripe for inclusion in graduate and advanced undergraduate curricula, and we hope that this paper will facilitate this process in the realm of physics education. To that end, we provide a documented source code for an easy reproduction of presented results and for further development of Monte Carlo simulations of similar systems.
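
    To make the classroom example concrete, the snippet below is a serial (CPU-only) sketch of one Monte Carlo step of the spatial public goods game with the Fermi imitation rule; the GPU implementation discussed in the paper parallelizes exactly these payoff and update loops. The multiplication factor r and noise K are illustrative values.

      import numpy as np

      L, r, K = 50, 4.0, 0.5            # lattice size, multiplication factor, noise
      rng = np.random.default_rng(2)
      coop = rng.integers(0, 2, size=(L, L))   # 1 = cooperator, 0 = defector

      def groups(i, j):
          """The five sites whose groups player (i, j) belongs to (periodic)."""
          return [(i, j), ((i + 1) % L, j), ((i - 1) % L, j),
                  (i, (j + 1) % L), (i, (j - 1) % L)]

      def payoff(i, j):
          """Public-goods payoff of (i, j) summed over its five groups."""
          total = 0.0
          for ci, cj in groups(i, j):          # one group centred on each member
              members = groups(ci, cj)
              nc = sum(coop[m] for m in members)
              total += r * nc / len(members) - coop[i, j]   # share minus own cost
          return total

      for _ in range(L * L):                   # one full Monte Carlo step
          i, j = rng.integers(L, size=2)
          ni, nj = groups(i, j)[rng.integers(1, 5)]   # pick a random neighbour
          if coop[i, j] != coop[ni, nj]:
              # Fermi rule: imitate the neighbour with this probability
              p = 1.0 / (1.0 + np.exp((payoff(i, j) - payoff(ni, nj)) / K))
              if rng.random() < p:
                  coop[i, j] = coop[ni, nj]

      print(coop.mean())                       # fraction of cooperators after one step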

  7. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphics workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The human-computer interface and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  8. Analysing the performance of personal computers based on Intel microprocessors for sequence aligning bioinformatics applications.

    PubMed

    Nair, Pradeep S; John, Eugene B

    2007-01-01

    Aligning specific sequences against a very large number of other sequences is a central aspect of bioinformatics. With the widespread availability of personal computers in biology laboratories, sequence alignment is now often performed locally. This makes it necessary to analyse the performance of personal computers on sequence-alignment bioinformatics benchmarks. In this paper, we analyse the performance of a personal computer for the popular BLAST and FASTA sequence alignment suites. Results indicate that these benchmarks have a large number of recurring operations and use memory operations extensively. It seems that performance can be improved with a larger L1 cache.
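
    BLAST and FASTA are heuristic accelerations of dynamic-programming alignment; the recurring, memory-bound inner operations the authors profile resemble the Smith-Waterman recurrence sketched below (scoring only, with a linear gap penalty and illustrative parameters; not the benchmarks' actual code).

      def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
          """Best local-alignment score of a vs b in O(len(a) * len(b)) time."""
          rows, cols = len(a) + 1, len(b) + 1
          H = [[0] * cols for _ in range(rows)]
          best = 0
          for i in range(1, rows):
              for j in range(1, cols):
                  s = match if a[i - 1] == b[j - 1] else mismatch
                  H[i][j] = max(0,                    # local alignment: floor at 0
                                H[i - 1][j - 1] + s,  # align a[i-1] with b[j-1]
                                H[i - 1][j] + gap,    # gap in b
                                H[i][j - 1] + gap)    # gap in a
                  best = max(best, H[i][j])
          return best

      print(smith_waterman("ACACACTA", "AGCACACA"))   # best local score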

  9. Three-dimensional (3D) printing and its applications for aortic diseases.

    PubMed

    Hangge, Patrick; Pershad, Yash; Witting, Avery A; Albadawi, Hassan; Oklu, Rahmi

    2018-04-01

    Three-dimensional (3D) printing is a process which generates prototypes from virtual objects in computer-aided design (CAD) software. Since 3D printing enables the creation of customized objects, it is a rapidly expanding field in an age of personalized medicine. We discuss the use of 3D printing in surgical planning, training, and creation of devices for the treatment of aortic diseases. 3D printing can provide operators with a hands-on model to interact with complex anatomy, enable prototyping of devices for implantation based upon anatomy, or even provide pre-procedural simulation. Potential exists to expand upon current uses of 3D printing to create personalized implantable devices such as grafts. Future studies should aim to demonstrate the impact of 3D printing on outcomes to make this technology more accessible to patients with complex aortic diseases.

  10. A posture recognition based fall detection system for monitoring an elderly person in a smart home environment.

    PubMed

    Yu, Miao; Rhuma, Adel; Naqvi, Syed Mohsen; Wang, Liang; Chambers, Jonathon

    2012-11-01

    We propose a novel computer-vision-based fall detection system for monitoring an elderly person in a home care application. Background subtraction is applied to extract the foreground human body and the result is improved by certain post-processing. Information from ellipse fitting and a projection histogram along the axes of the ellipse are used as the features for distinguishing different postures of the human body. These features are then fed into a directed acyclic graph support vector machine (DAGSVM) for posture classification, the result of which is then combined with derived floor information to detect a fall. From a dataset of 15 people, we show that our fall detection system can achieve a high fall detection rate (97.08%) and a very low false detection rate (0.8%) in a simulated home environment.
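
    A rough sketch of the feature-extraction stage follows, using OpenCV's built-in background subtractor and ellipse fit; the projection-histogram features and the DAG-SVM classifier are simplified away (a plain one-vs-one SVM, as in scikit-learn, is suggested as a stand-in), so everything below is illustrative rather than the paper's pipeline.

      import cv2
      import numpy as np

      subtractor = cv2.createBackgroundSubtractorMOG2()

      def posture_features(frame):
          """Ellipse-based posture features from one frame, or None if no body found."""
          fg = subtractor.apply(frame)
          fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
          contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          if not contours:
              return None
          body = max(contours, key=cv2.contourArea)
          if len(body) < 5:                    # cv2.fitEllipse needs >= 5 points
              return None
          (cx, cy), axes, angle = cv2.fitEllipse(body)
          minor, major = sorted(axes)
          # orientation, elongation and vertical position separate standing,
          # sitting and lying postures; a fall shows up as a sudden change
          return [angle, major / max(minor, 1e-6), cy]

      # Features from labelled frames would then train a multi-class SVM, e.g.
      # sklearn.svm.SVC(decision_function_shape="ovo") as a stand-in for the DAGSVM.
      frame = np.zeros((240, 320, 3), np.uint8)
      cv2.ellipse(frame, (160, 120), (30, 80), 0, 0, 360, (255, 255, 255), -1)
      print(posture_features(frame))   # None until the background model settles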

  11. Four PPPPerspectives on computational creativity in theory and in practice

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2016-04-01

    Computational creativity is the modelling, simulating or replicating of creativity computationally. In examining and learning from these "creative systems", from what perspective should the creativity of a system be considered? Are we interested in the creativity of the system's output? Or of its creative processes? Features of the system? Or how it operates within its environment? Traditionally computational creativity has focused more on creative systems' products or processes, though this focus has widened recently. Creativity research offers the Four Ps of creativity: Person/Producer, Product, Process and Press/Environment. This paper presents the Four Ps, explaining each in the context of creativity research and how it relates to computational creativity. To illustrate the usefulness of the Four Ps in taking broader perspectives on creativity in its computational treatment, the concepts of novelty and value are explored using the Four Ps, highlighting aspects of novelty and value that may otherwise be overlooked. Analysis of recent research in computational creativity finds that although each of the Four Ps appears in the body of computational creativity work, individual pieces of work often do not acknowledge all Four Ps, missing opportunities to widen their work's relevance. We can see, though, that high-status computational creativity papers do typically address all Four Ps. This paper argues that the broader views of creativity afforded by the Four Ps is vital in guiding us towards more comprehensively useful computational investigations of creativity.

  12. Simulation Detection in Handwritten Documents by Forensic Document Examiners.

    PubMed

    Kam, Moshe; Abichandani, Pramod; Hewett, Tom

    2015-07-01

    This study documents the results of a controlled experiment designed to quantify the abilities of forensic document examiners (FDEs) and laypersons to detect simulations in handwritten documents. Nineteen professional FDEs and 26 laypersons (typical of a jury pool) were asked to inspect test packages that contained six (6) known handwritten documents written by the same person and two (2) questioned handwritten documents. Each questioned document was either written by the person who wrote the known documents, or written by a different person who tried to simulate the writing of the person who wrote the known document. The error rates of the FDEs were smaller than those of the laypersons when detecting simulations in the questioned documents. Among other findings, the FDEs never labeled a questioned document that was written by the same person who wrote the known documents as "simulation." There was a significant statistical difference between the responses of the FDEs and layperson for documents without simulations. © 2015 American Academy of Forensic Sciences.

  13. Simulation of an integrated system for the production of methane and single cell protein from biomass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, M.V.

    1989-01-01

    A numerical model was developed to simulate the operation of an integrated system for the production of methane and single-cell algal protein from a variety of biomass energy crops or waste streams. Economic analysis was performed at the end of each simulation. The model was capable of assisting in the determination of design parameters by providing relative economic information for various strategies. Three configurations of anaerobic reactors were simulated. These included fed-bed reactors, conventional stirred tank reactors, and continuously expanding reactors. A generic anaerobic digestion process model, using lumped substrate parameters, was developed for use by type-specific reactor models. The generic anaerobic digestion model provided a tool for the testing of conversion efficiencies and kinetic parameters for a wide range of substrate types and reactor designs. Dynamic growth models were used for the algae, with the growth of Eichhornia crassipes modeled as a function of daily incident radiation and temperature; Eichhornia crassipes was grown to provide biomass as a substrate for digestion. Computer simulations with the system model indicated that tropical or subtropical locations offered the most promise for a viable system. The availability of large quantities of digestible waste and low land prices were found to be desirable in order to take advantage of the economies of scale. Other simulations indicated that poultry and swine manure produced larger biogas yields than cattle manure. The model was created in a modular fashion to allow for testing of a wide variety of unit operations. Coding was performed in the Pascal language for use on personal computers.

  14. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    PubMed

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and interventional planning.

  15. Today's Personal Computers: Products for Every Need--Part II.

    ERIC Educational Resources Information Center

    Personal Computing, 1981

    1981-01-01

    Looks at microcomputers manufactured by Altos Computer Systems, Cromemco, Exidy, Intelligent Systems, Intertec Data Systems, Mattel, Nippon Electronics, Northstar, Personal Micro Computers, and Sinclair. (Part I of this article, examining other computers, appeared in the May 1981 issue.) Journal availability: Hayden Publishing Company, 50 Essex…

  16. Computer Self-Efficacy, Computer Anxiety, Performance and Personal Outcomes of Turkish Physical Education Teachers

    ERIC Educational Resources Information Center

    Aktag, Isil

    2015-01-01

    The purpose of this study is to determine the computer self-efficacy, performance outcome, personal outcome, and affect and anxiety level of physical education teachers. Influence of teaching experience, computer usage and participation of seminars or in-service programs on computer self-efficacy level were determined. The subjects of this study…

  17. Click! 101 Computer Activities and Art Projects for Kids and Grown-Ups.

    ERIC Educational Resources Information Center

    Bundesen, Lynne; And Others

    This book presents 101 computer activities and projects geared toward children and adults. The activities for both personal computers (PCs) and Macintosh were developed on the Windows 95 computer operating system, but they are adaptable to non-Windows personal computers as well. The book is divided into two parts. The first part provides an…

  18. Optimizing agent-based transmission models for infectious diseases.

    PubMed

    Willem, Lander; Stijven, Sean; Tijskens, Engelbert; Beutels, Philippe; Hens, Niel; Broeckhove, Jan

    2015-06-02

    Infectious disease modeling and computational power have evolved such that large-scale agent-based models (ABMs) have become feasible. However, the increasing hardware complexity requires adapted software designs to achieve the full potential of current high-performance workstations. We have found large performance differences with a discrete-time ABM for close-contact disease transmission due to data locality. Sorting the population according to the social contact clusters reduced simulation time by a factor of two. Data locality and model performance can also be improved by storing person attributes separately instead of using person objects. Next, decreasing the number of operations by sorting people by health status before processing disease transmission also has a large impact on model performance. Depending on the clinical attack rate, target population and computer hardware, the introduction of the sort phase decreased the run time by 26% to more than 70%. We have investigated the application of parallel programming techniques and found that the speedup is significant but drops quickly with the number of cores. We observed that the effect of scheduling and workload chunk size is model specific and can make a large difference. Investment in performance optimization of ABM simulator code can lead to significant run-time reductions. The key steps are straightforward: choosing an appropriate data structure for the population and sorting people by health status before effecting disease propagation. We believe these conclusions to be valid for a wide range of infectious disease ABMs. We recommend that future studies evaluate the impact of data management, algorithmic procedures and parallelization on model performance.
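
    The two data-layout ideas are easy to demonstrate: store person attributes in separate arrays (structure-of-arrays) instead of person objects, and sort by health status so the transmission loop scans one contiguous block of infectious people. A minimal NumPy sketch with invented attribute names:

      import numpy as np

      n = 1_000_000
      rng = np.random.default_rng(3)

      # Structure-of-arrays: one contiguous array per attribute, no Person objects
      age     = rng.integers(0, 100, size=n, dtype=np.int8)
      health  = rng.choice([0, 1, 2], size=n, p=[0.95, 0.01, 0.04])  # 0=S, 1=I, 2=R
      cluster = rng.integers(0, n // 10, size=n)

      # Sort by health status so infectious people occupy one contiguous block
      order = np.argsort(health, kind="stable")
      age, health, cluster = age[order], health[order], cluster[order]

      # The transmission step now scans a dense slice instead of branching per person
      start = np.searchsorted(health, 1)
      stop = np.searchsorted(health, 2)
      print(stop - start, "infectious people processed contiguously")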

  19. Computational simulation of passive leg-raising effects on hemodynamics during cardiopulmonary resuscitation.

    PubMed

    Shin, Dong Ah; Park, Jiheum; Lee, Jung Chan; Shin, Sang Do; Kim, Hee Chan

    2017-03-01

    The passive leg-raising (PLR) maneuver has been used for patients with circulatory failure to improve hemodynamic responsiveness by increasing cardiac output, which should also be beneficial and may exert synergistic effects during cardiopulmonary resuscitation (CPR). However, the impact of the PLR maneuver on CPR remains unclear due to difficulties in monitoring cardiac output in real time during CPR and a lack of clinical evidence. We developed a computational model that couples hemodynamic behavior during standard CPR with the PLR maneuver, and simulated the model by applying different angles of leg raising from 0° to 90° and compression rates from 80/min to 160/min. The simulation results showed that the PLR maneuver during CPR significantly improves cardiac output (CO), systemic perfusion pressure (SPP) and coronary perfusion pressure (CPP) by ∼40-65%, particularly under the recommended range of compression rates between 100/min and 120/min with 45° of leg raise, compared to standard CPR. However, such effects start to wane with further leg lifts, indicating the existence of an optimal angle of leg raise for each person to achieve the best hemodynamic response. We developed a CPR-PLR model and demonstrated the effects of PLR on hemodynamics by investigating changes in CO, SPP, and CPP under different compression rates and angles of leg raising. Our computational model will facilitate the study of PLR effects during CPR and the development of an advanced model combined with circulatory disorders, which will be a valuable asset for further studies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
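
    As a generic illustration only of the lumped-parameter hemodynamics such CPR models build on (not the authors' model; all parameters invented), a two-element Windkessel relates inflow to arterial pressure via C dP/dt = Q(t) - P/R and can be integrated in a few lines:

      import numpy as np

      # Two-element Windkessel: C * dP/dt = Q(t) - P / R
      R, C, dt = 1.0, 1.5, 0.001        # resistance, compliance, time step (invented)
      t = np.arange(0.0, 10.0, dt)
      Q = np.where((t % 1.0) < 0.3, 5.0, 0.0)  # crude periodic "compression" inflow

      P = np.empty_like(t)
      P[0] = 10.0
      for k in range(1, len(t)):
          dPdt = (Q[k - 1] - P[k - 1] / R) / C
          P[k] = P[k - 1] + dt * dPdt   # forward-Euler integration

      print(P[-1000:].mean())           # settles near R * mean(Q) = 1.5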

  20. The Role of Computational Modeling and Simulation in the Total Product Life Cycle of Peripheral Vascular Devices

    PubMed Central

    Morrison, Tina M.; Dreher, Maureen L.; Nagaraja, Srinidhi; Angelone, Leonardo M.; Kainz, Wolfgang

    2018-01-01

    The total product life cycle (TPLC) of medical devices has been defined by four stages: discovery and ideation, regulatory decision, product launch, and postmarket monitoring. Manufacturers of medical devices intended for use in the peripheral vasculature, such as stents, inferior vena cava (IVC) filters, and stent-grafts, mainly use computational modeling and simulation (CM&S) to aid device development and design optimization, supplement bench testing for regulatory decisions, and assess postmarket changes or failures. For example, computational solid mechanics and fluid dynamics enable the investigation of design limitations in the ideation stage. To supplement bench data in regulatory submissions, manufacturers can evaluate the effects of anatomical characteristics and expected in vivo loading environment on device performance. Manufacturers might also harness CM&S to aid root-cause analyses that are necessary when failures occur postmarket, when the device is exposed to broad clinical use. Once identified, CM&S tools can then be used for redesign to address the failure mode and re-establish the performance profile with the appropriate models. The Center for Devices and Radiological Health (CDRH) wants to advance the use of CM&S for medical devices and supports the development of virtual physiological patients, clinical trial simulations, and personalized medicine. Thus, the purpose of this paper is to describe specific examples of how CM&S is currently used to support regulatory submissions at different phases of the TPLC and to present some of the stakeholder-led initiatives for advancing CM&S for regulatory decision-making. PMID:29479395

  2. Computer Language Settings and Canadian Spellings

    ERIC Educational Resources Information Center

    Shuttleworth, Roger

    2011-01-01

    The language settings used on personal computers interact with the spell-checker in Microsoft Word, which directly affects the flagging of spellings that are deemed incorrect. This study examined the language settings of personal computers owned by a group of Canadian university students. Of 21 computers examined, only eight had their Windows…

  3. Specification of Computer Systems by Objectives.

    ERIC Educational Resources Information Center

    Eltoft, Douglas

    1989-01-01

    Discusses the evolution of mainframe and personal computers, and presents a case study of a network developed at the University of Iowa called the Iowa Computer-Aided Engineering Network (ICAEN) that combines Macintosh personal computers with Apollo workstations. Functional objectives are stressed as the best measure of system performance. (LRW)

  4. Transforming clinical imaging and 3D data for virtual reality learning objects: HTML5 and mobile devices implementation.

    PubMed

    Trelease, Robert B; Nieder, Gary L

    2013-01-01

    Web deployable anatomical simulations or "virtual reality learning objects" can easily be produced with QuickTime VR software, but their use for online and mobile learning is being limited by the declining support for web browser plug-ins for personal computers and unavailability on popular mobile devices like Apple iPad and Android tablets. This article describes complementary methods for creating comparable, multiplatform VR learning objects in the new HTML5 standard format, circumventing platform-specific limitations imposed by the QuickTime VR multimedia file format. Multiple types or "dimensions" of anatomical information can be embedded in such learning objects, supporting different kinds of online learning applications, including interactive atlases, examination questions, and complex, multi-structure presentations. Such HTML5 VR learning objects are usable on new mobile devices that do not support QuickTime VR, as well as on personal computers. Furthermore, HTML5 VR learning objects can be embedded in "ebook" document files, supporting the development of new types of electronic textbooks on mobile devices that are increasingly popular and self-adopted for mobile learning. © 2012 American Association of Anatomists.

  5. The impact of personalized probabilistic wall thickness models on peak wall stress in abdominal aortic aneurysms.

    PubMed

    Biehler, J; Wall, W A

    2018-02-01

    If computational models are ever to be used in high-stakes decision making in clinical practice, the use of personalized models and predictive simulation techniques is a must. This entails rigorous quantification of uncertainties as well as harnessing available patient-specific data to the greatest extent possible. Although researchers are beginning to realize that taking uncertainty in model input parameters into account is a necessity, the predominantly used probabilistic description for these uncertain parameters is based on elementary random variable models. In this work, we compare different probabilistic models for uncertain input parameters using the example of an uncertain wall thickness in finite element models of abdominal aortic aneurysms. We provide the first comparison between a random variable and a random field model for the aortic wall and investigate the impact on the probability distribution of the computed peak wall stress. Moreover, we show that the uncertainty about the prevailing peak wall stress can be reduced if noninvasively available, patient-specific data are harnessed for the construction of the probabilistic wall thickness model. Copyright © 2017 John Wiley & Sons, Ltd.
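
    The finite element pipeline is not shown in the record; the numpy sketch below only contrasts the two probabilistic thickness models the abstract compares, a single random variable versus a spatially correlated Gaussian random field, with a crude Laplace-law stress surrogate standing in for the finite element solve. All numerical values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    s = np.linspace(0.0, 1.0, 50)             # normalized coordinate along the wall
    mean_t, std_t, corr_len = 1.5, 0.3, 0.2   # mm; illustrative values only

    # Model A: one random variable -- the whole wall shares a single thickness.
    thickness_rv = np.full_like(s, mean_t + std_t * rng.standard_normal())

    # Model B: Gaussian random field -- thickness varies smoothly along the wall.
    cov = std_t**2 * np.exp(-((s[:, None] - s[None, :]) ** 2) / (2 * corr_len**2))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(s.size))   # jitter for stability
    thickness_rf = mean_t + L @ rng.standard_normal(s.size)

    # Crude Laplace-law stress surrogate (p*r/t) in place of the FE solve:
    # the field's peak stress is set by the local thickness minimum,
    # not by the average thickness.
    p, r = 0.016, 25.0                        # MPa, mm; illustrative
    print("random-variable peak stress:", (p * r / thickness_rv).max())
    print("random-field peak stress   :", (p * r / thickness_rf).max())
    ```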

  6. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components, i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
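
    Building the solution vector as "a list of components in the order of their inlet dependency" is a topological sort. A minimal sketch with hypothetical component names follows; PCTAP's actual component set and update functions are not public in this record, so the update rule is a placeholder.

    ```python
    from graphlib import TopologicalSorter  # standard library, Python 3.9+

    # Hypothetical components; each maps to the set feeding its inlet.
    inlet_deps = {
        "pump":           set(),
        "cold_plate":     {"pump"},
        "heat_exchanger": {"cold_plate"},
        "scrubber":       {"heat_exchanger"},
        "tank":           {"scrubber"},
    }

    # The "solution vector": components ordered by inlet dependency.
    solution_vector = list(TopologicalSorter(inlet_deps).static_order())
    print(solution_vector)

    def advance_one_step(state, dt):
        """Update components in order, so each sees its upstream neighbor's
        already-updated outlet state (placeholder update rule)."""
        for name in solution_vector:
            state[name] = state.get(name, 0.0) + dt
        return state
    ```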

  7. AIR TOXICS EMISSIONS FROM ELECTRONICS INCINERATION

    EPA Science Inventory

    The purpose of this project is to examine the emissions of air toxics from the combustion of electronics equipment, primarily personal computer components. Due to a shortage of recycling programs for personal computers and other personal electronics equipment, most of these mate...

  8. Domesticating the Personal Computer: The Mainstreaming of a New Technology and the Cultural Management of a Widespread Technophobia, 1964-.

    ERIC Educational Resources Information Center

    Reed, Lori

    2000-01-01

    Uses discourses on "computer-phobia" and "computer addiction" to describe the cultural work involved and marketing strategies used between 1960s-1990s regarding management of computer fear. Draws on popular discourses, advertisements, and advice literature to explore how the personal computer was successfully connected to middle-class family…

  9. Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.

    2013-01-01

    Spacecraft thermal protection systems are at risk of being damaged by airflow produced by Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This proposal describes an approach to validate the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft. The research described here is cutting edge. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has the potential to provide uncertainty estimates for any numerical simulation, thus lowering the cost of performing these verifications while increasing confidence in the predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. The proposed research project includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT and OPEN FOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions. The method accounts for all uncertainty terms from both numerical and input variables. Objective three is to compile a table of uncertainty parameters that could be used to estimate the error in a Computational Fluid Dynamics model of the Environmental Control System/spacecraft system. Previous studies have looked at the uncertainty in a Computational Fluid Dynamics model for a single output variable at a single point, for example the re-attachment length of a backward-facing step. To date, the author is the only person to have looked at the uncertainty in the entire computational domain. For the flow regime being analyzed (turbulent, three-dimensional, incompressible), the error at a single point can propagate into the solution both via flow physics and numerical methods. Calculating the uncertainty in using Computational Fluid Dynamics to accurately predict airflow speeds around encapsulated spacecraft is imperative to the success of future missions.
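
    The cited three-grid methodology is commonly implemented as Richardson extrapolation with a Grid Convergence Index (GCI); a small sketch of that calculation follows. The probe-point airflow speeds are invented.

    ```python
    import math

    def grid_convergence_index(f_coarse, f_medium, f_fine, r=2.0, safety=1.25):
        """Roache-style GCI from solutions on three systematically refined
        grids with constant refinement ratio r (finest grid listed last)."""
        # Observed order of accuracy from the three solutions.
        p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
        # Relative error band on the fine-grid solution.
        rel_err = abs((f_medium - f_fine) / f_fine)
        gci_fine = safety * rel_err / (r**p - 1.0)
        return p, gci_fine

    # Invented airflow speeds (m/s) at one probe point on coarse/medium/fine grids.
    p, gci = grid_convergence_index(4.61, 4.38, 4.30)
    print(f"observed order ~{p:.2f}, fine-grid uncertainty ~{100 * gci:.1f}%")
    ```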

  10. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  11. Student Ability, Confidence, and Attitudes Toward Incorporating a Computer into a Patient Interview.

    PubMed

    Ray, Sarah; Valdovinos, Katie

    2015-05-25

    To improve pharmacy students' ability to effectively incorporate a computer into a simulated patient encounter, and to improve their awareness of barriers to, attitudes towards, and confidence in using a computer during simulated patient encounters. Students completed a survey that assessed their awareness of, confidence in, and attitudes towards computer use during simulated patient encounters. Students were evaluated with a rubric on their ability to incorporate a computer into a simulated patient encounter. Students were resurveyed and reevaluated after instruction. Students improved in their ability to effectively incorporate computer usage into a simulated patient encounter. They also became more aware of barriers to such usage, improved their attitudes toward it, and gained more confidence in their ability to use a computer during simulated patient encounters. Instruction can improve pharmacy students' ability to incorporate a computer into simulated patient encounters. This skill is critical to developing efficiency while maintaining rapport with patients.

  12. FOREWORD: Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology and Mathematics

    NASA Astrophysics Data System (ADS)

    Kaski, K.; Salomaa, M.

    1990-01-01

    These are Proceedings of the Third Nordic Symposium on Computer Simulation in Physics, Chemistry, Biology, and Mathematics, held August 25-26, 1989, at Lahti (Finland). The Symposium belongs to an annual series of Meetings, the first one of which was arranged in 1987 at Lund (Sweden) and the second one in 1988 at Kolle-Kolle near Copenhagen (Denmark). Although these Symposia have thus far been essentially Nordic events, their international character has increased significantly; the trend is vividly reflected through contributions in the present Topical Issue. The interdisciplinary nature of Computational Science is central to the activity; this fundamental aspect is also responsible, in an essential way, for its rapidly increasing impact. Crucially important to a wide spectrum of superficially disparate fields is the common need for extensive - and often quite demanding - computational modelling. For such theoretical models, no closed-form (analytical) solutions are available or they would be extremely difficult to find; hence one must rather resort to the Art of performing computational investigations. Among the unifying features in the computational research are the methods of simulation employed; methods which frequently are quite closely related with each other even for faculties of science that are quite unrelated. Computer simulation in Natural Sciences is presently apprehended as a discipline on its own right, occupying a broad region somewhere between the experimental and theoretical methods, but also partially overlapping with and complementing them. - Whichever its proper definition may be, the computational approach serves as a novel and an extremely versatile tool with which one can equally well perform "pure" experimental modelling and conduct "computational theory". Computational studies that have earlier been made possible only through supercomputers have opened unexpected, as well as exciting, novel frontiers equally in mathematics (e.g., fractals), physics (fluid-dynamical and quantum-mechanical calculations; extensive numerical simulations of various condensed-matter systems; the development of stellar constellations, even the early Universe), chemistry (quantum-chemical calculations on the structures of new chemical compounds; chemical reactions and reaction dynamics), and biology (various models, for example, in population dynamics). We succeeded in our effort to assemble several internationally recognized researchers of Computational Science to deliver invited talks on a couple of exceptionally beautiful late-summer days in the modern premises of the Adult Education Center at Lahti. Among the plenary speakers, Per Bak described his highly original work on self-organized criticality. David Ceperley discussed pioneering numerical simulations of superfluid helium in which, for the first time, Feynman's path-integral formulation of quantum mechanics has been implemented on a computer. Jim Gunton presented his comprehensive studies of the Cahn-Hilliard equation for the dynamics of ordering in a condensed-matter system far from equilibrium, while Alex Hansen explained those on nonlinear breakdown in disordered materials. Representing the important field of computational chemistry, Bo Jönsson dealt with attractive forces between polyelectrolytes. Kurt Kremer gave an interesting account on computer-simulation studies of complex polymer systems, while Ole Mouritsen reviewed studies of interfacial fluctuations in lipid membranes. 
Pekka Pyykkö introduced his pioneering work which has led to predictions of completely novel chemical species. Annette Zippelius gave an expert introduction to the highly active field of neural networks. It is evident from each of these intriguing plenary contributions that, indeed, the computational approach is a frontier field of science, possibly providing the most versatile research method available today. We also arranged a competition for the best Posters presented at the Symposium; the Prizes were some of the newest books on the beauty of fractals. The First Prize was won by Hanna Viertio, the Second Prize by Miguel Zendejas and the Third Prize was shared by Leo Kärkkäinen and Kari Rummukainen. As for the future of Computational Science, we identify two principal avenues: (a) big science - large centers with ultrafast supercomputers, and (b) small science - active groups utilizing personal minisupercomputers or superworkstations. At present, it appears that the latter already compete extremely favourably in their performance with the massive supercomputers - at least in their throughput and, especially, in tasks where a broad range of diverse software support is not absolutely necessary. In view of this important emergence of "personal supercomputing", we envisage that the role and the development of large computer centers will have to be reviewed critically and modified accordingly. Furthermore, a promise for some radically new approaches to Computational Science could be provided by massively parallel computers; among them, maybe solutions based on ideas of neural computing could be utilized, especially for restricted applications. Therefore, in order not to overlook any important advances within such a forefront field, one should rather choose the strategy of actively following each and every one of these routes. In perspective of the large variety of simultaneous developments, we want to emphasize the importance of Nordic collaboration in sharing expertise and experience in the rapidly progressing research - it ought to be cultivated and could be expanded. Therefore, we think that it is vitally important to continue with and to further promote the kind of Nordic Symposia that have been held at Lund, Kolle-Kolle, and Lahti. We want to thank most cordially the plenary and invited speakers, contributors, students, and in particular the Conference Secretary, Ms Ulla Ahlfors and Dr Milja Mäkelä, who was responsible for the local arrangements. The work that they did served to make this Symposium a scientific success and a useful and pleasant experience for all the well over 100 participants. We also thank the City of Lahti for kindly arranging a refreshing reception at the Town Hall. We wish to express our gratitude to Nordiska Kulturfonden, NORDITA, the Research Institute for Theoretical Physics at the University of Helsinki, the Finnish Ministry of Education and the Academy of Finland for their financial support. March 1990

  13. Computer modelling as a tool for the exposure assessment of operators using faulty agricultural pesticide spraying equipment.

    PubMed

    Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard

    2013-01-01

    Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk to operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and further authorisation of plant protection products, may also be applied to assess the health risk to operators when faulty sprayers are used. To evaluate the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment by means of computer modelling. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. the toxicological end-point on which the risk assessment was based, was the no observable adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions, such as pesticide concentration in the plant protection product, type of sprayed crop, and number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed. Data used for computer modelling were obtained from simulated treatments with pesticide substitutes using variously damaged knapsack sprayers. These substitute preparations contained markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated from operator dosimetry under 'field' conditions for low, medium and high target field crops. The exposure modelling for high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage or whether individual protective measures had been adopted. For low and medium target field crops, exceedances ranged between 40-80% of cases. Computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to programming quality checks and maintenance systems for this equipment.
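
    As a schematic of the exposure-versus-AOEL comparison such modelling performs, the sketch below computes a systemic dose and checks it against an AOEL. Every number, including the PPE reduction factor, is hypothetical and does not reproduce the UK POEM tables; the numbers are chosen so that, as for the high-crop scenarios in the study, the AOEL is exceeded with or without protective equipment.

    ```python
    # Illustrative POEM-style operator exposure check; all values hypothetical.
    def operator_exposure(dermal_mg, inhaled_mg, absorption=0.1,
                          body_weight_kg=60.0, aoel_mg_per_kg=0.01,
                          ppe_reduction=0.9, ppe_used=False):
        """Return (systemic dose in mg/kg bw/day, True if within the AOEL)."""
        external = dermal_mg + inhaled_mg
        if ppe_used:
            external *= (1.0 - ppe_reduction)   # assumed PPE penetration factor
        dose = external * absorption / body_weight_kg
        return dose, dose <= aoel_mg_per_kg

    # A leaking knapsack sprayer raises dermal deposition; compare scenarios.
    for ppe in (False, True):
        dose, ok = operator_exposure(dermal_mg=80.0, inhaled_mg=4.0, ppe_used=ppe)
        print(f"PPE={ppe}: dose={dose:.4f} mg/kg bw/day, within AOEL: {ok}")
    ```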

  14. The Impact of Cognitive and Non-Cognitive Personality Traits on Computer Literacy Level

    ERIC Educational Resources Information Center

    Saparniene, Diana; Merkys, Gediminas; Saparnis, Gintaras

    2006-01-01

    Purpose: The paper deals with the study of students' computer literacy one of the purposes being demonstration the impact of the cognitive and non-cognitive personality traits (attention, verbal and non-verbal intelligence, emotional-motivational relationship with computer, learning strategies, etc.) on the quality of computer literacy.…

  15. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  16. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  17. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  18. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  19. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  20. A 3D visualization and simulation of the individual human jaw.

    PubMed

    Muftić, Osman; Keros, Jadranka; Baksa, Sarajko; Carek, Vlado; Matković, Ivo

    2003-01-01

    A new biomechanical three-dimensional (3D) model of the human mandible, based on a computer-generated virtual model, is proposed. Using maps obtained from special photographs of a real subject's face, it is possible to attribute personality to the virtual character, while computer animation provides movements and characteristics within the confines of the space and time of the virtual world. A simple two-dimensional model of the jaw cannot explain the biomechanics, in which the muscular forces, acting through the occlusal and condylar surfaces, are in a state of 3D equilibrium. In the model, all forces are resolved into components in a selected coordinate system. The muscular forces act on the jaw at the force levels necessary for chewing while maintaining mandibular balance, preventing dislocation and loading of nonarticular tissues. The work uses a new approach to computer-generated animation of virtual 3D characters (called "Body SABA"), combined in a single low-cost, easy-to-operate package.
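
    A minimal numpy sketch of the 3D equilibrium bookkeeping the abstract describes: given invented muscle and bite loads resolved into components, the two condylar reactions follow from force and moment balance. All geometry and magnitudes are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def skew(r):
        """Matrix form of the cross product: skew(r) @ f == np.cross(r, f)."""
        return np.array([[0.0, -r[2], r[1]],
                         [r[2], 0.0, -r[0]],
                         [-r[1], r[0], 0.0]])

    # Invented geometry (mm) and loads (N) in a jaw-fixed frame:
    # x lateral, y anterior, z superior; condyles lie on the x-axis.
    known = [
        (np.array([0.0, 0.0, 150.0]), np.array([-45.0, 30.0, -20.0])),  # masseter L
        (np.array([0.0, 0.0, 150.0]), np.array([ 45.0, 30.0, -20.0])),  # masseter R
        (np.array([0.0, 0.0, -100.0]), np.array([0.0, 90.0, -25.0])),   # bite reaction
    ]
    F_known = sum(f for f, r in known)
    M_known = sum(np.cross(r, f) for f, r in known)

    # Unknown condylar reactions f_L, f_R: 6 unknowns, 6 balance equations.
    r_L, r_R = np.array([-50.0, 0.0, 0.0]), np.array([50.0, 0.0, 0.0])
    A = np.block([[np.eye(3), np.eye(3)],
                  [skew(r_L), skew(r_R)]])
    b = -np.concatenate([F_known, M_known])
    # Rank-deficient (equal-and-opposite forces along the condylar axis cancel),
    # so take the minimum-norm least-squares solution.
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    f_L, f_R = sol[:3], sol[3:]
    print("left condyle reaction :", f_L)   # ~ [0, 0, -100] N for these loads
    print("right condyle reaction:", f_R)
    ```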

  1. A Research Program in Computer Technology. 1987 Annual Technical Report

    DTIC Science & Technology

    1990-07-01

    [OCR-garbled report documentation page.] Recoverable details: 1987 Annual Technical Report, "A Research Program in Computer Technology" (Unclassified); subject terms include distributed processing, survivable networks, local networks, personal computers, and workstation environment. The views expressed are the authors' and should not be interpreted as representing the official opinion or policy of DARPA, the U.S. Government, or any person or agency.

  2. Optics Program Modified for Multithreaded Parallel Computing

    NASA Technical Reports Server (NTRS)

    Lou, John; Bedding, Dave; Basinger, Scott

    2006-01-01

    A powerful high-performance computer program for simulating and analyzing adaptive and controlled optical systems has been developed by modifying the serial version of the Modeling and Analysis for Controlled Optical Systems (MACOS) program to impart capabilities for multithreaded parallel processing on computing systems ranging from supercomputers down to Symmetric Multiprocessing (SMP) personal computers. The modifications included the incorporation of OpenMP, a portable and widely supported application interface software, that can be used to explicitly add multithreaded parallelism to an application program under a shared-memory programming model. OpenMP was applied to parallelize ray-tracing calculations, one of the major computing components in MACOS. Multithreading is also used in the diffraction propagation of light in MACOS based on pthreads [POSIX Thread, (where "POSIX" signifies a portable operating system for UNIX)]. In tests of the parallelized version of MACOS, the speedup in ray-tracing calculations was found to be linear, or proportional to the number of processors, while the speedup in diffraction calculations ranged from 50 to 60 percent, depending on the type and number of processors. The parallelized version of MACOS is portable, and, to the user, its interface is basically the same as that of the original serial version of MACOS.
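
    The reported scaling, near-linear for ray tracing but only 50 to 60 percent for diffraction, is what Amdahl's law predicts when the parallelizable fraction of the work differs. The fractions below are assumptions chosen for illustration, not figures from the article.

    ```python
    def amdahl_speedup(parallel_fraction, n_procs):
        """Ideal speedup when only part of the work can run in parallel."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_procs)

    # Assumed fractions: ray tracing nearly fully parallel, diffraction
    # propagation only partially so.
    for label, f in (("ray tracing", 0.99), ("diffraction", 0.45)):
        print(label, [round(amdahl_speedup(f, n), 2) for n in (2, 4, 8)])
    ```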

  3. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  4. Interactive Media and Simulation Tools for Technical Training

    NASA Technical Reports Server (NTRS)

    Gramoll, Kurt

    1997-01-01

    Over the last several years, integration of multiple media sources into a single information system has been rapidly developing. It has been found that when sound, graphics, text, animations, and simulations are skillfully integrated, the sum of the parts exceeds the individual parts for effective learning. In addition, simulations can be used to design and understand complex engineering processes. With the recent introduction of many high-level authoring, animation, modeling, and rendering programs for personal computers, significant multimedia programs can be developed by practicing engineers, scientists and even managers for both training and education. However, even with these new tools, a considerable amount of time is required to produce an interactive multimedia program. The development of both CD-ROM and Web-based programs are discussed in addition to the use of technically oriented animations. Also examined are various multimedia development tools and how they are used to develop effective engineering education courseware. Demonstrations of actual programs in engineering mechanics are shown.

  5. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulations methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  6. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  7. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  8. A 3DHZETRN Code in a Spherical Uniform Sphere with Monte Carlo Verification

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.

    2014-01-01

    The computationally efficient HZETRN code has been used in recent trade studies for lunar and Martian exploration and is currently being used in the engineering development of the next generation of space vehicles, habitats, and extravehicular activity equipment. A new version (3DHZETRN), capable of transporting high charge (Z) and energy (HZE) ions and light ions (including neutrons) under space-like boundary conditions with enhanced neutron and light ion propagation, is under development. In the present report, new algorithms for light ion and neutron propagation with well-defined convergence criteria in 3D objects are developed and tested against Monte Carlo simulations to verify the solution methodology. The code will be available through the software system OLTARIS for shield design and validation, and provides a basis for personal computer software capable of space shield analysis and optimization.
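
    The verification pattern described, a deterministic transport result checked against Monte Carlo, can be illustrated with stand-in physics. The sketch below uses simple exponential attenuation through a slab rather than the 3DHZETRN transport equations; the cross section and depth are invented.

    ```python
    import math, random

    def deterministic_transmission(mu, depth):
        """Closed-form uncollided transmission through a slab: exp(-mu * x)."""
        return math.exp(-mu * depth)

    def monte_carlo_transmission(mu, depth, n, seed=0):
        """Sample exponential free paths; count particles whose first collision
        lies beyond the slab. An independent check on the deterministic result."""
        rng = random.Random(seed)
        hits = sum(1 for _ in range(n) if -math.log(1.0 - rng.random()) / mu > depth)
        return hits / n

    mu, depth = 0.5, 2.0           # illustrative cross section (1/cm) and depth (cm)
    exact = deterministic_transmission(mu, depth)
    for n in (10_000, 100_000, 1_000_000):
        mc = monte_carlo_transmission(mu, depth, n)
        print(f"n={n:>9,}: MC={mc:.4f}  exact={exact:.4f}  "
              f"rel.diff={abs(mc - exact) / exact:.2%}")
    ```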

  9. Developmental Stages in School Computer Use: Neither Marx Nor Piaget.

    ERIC Educational Resources Information Center

    Lengel, James G.

    Karl Marx's theory of stages can be applied to computer use in the schools. The first stage, the P Stage, comprises the entry of the computer into the school. Computer use at this stage is personal and tends to center around one personality. Social studies teachers are seldom among this select few. The second stage of computer use, the D Stage, is…

  10. FUN3D Manual: 12.9

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2016-01-01

    This manual describes the installation and execution of FUN3D version 12.9, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  11. FUN3D Manual: 13.2

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, William L.; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2017-01-01

    This manual describes the installation and execution of FUN3D version 13.2, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  12. FUN3D Manual: 12.6

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, William L.; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.; Rumsey, Christopher L.

    2015-01-01

    This manual describes the installation and execution of FUN3D version 12.6, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  13. FUN3D Manual: 12.7

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2015-01-01

    This manual describes the installation and execution of FUN3D version 12.7, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  14. FUN3D Manual: 12.5

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, William L.; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.; Rumsey, Christopher L.

    2014-01-01

    This manual describes the installation and execution of FUN3D version 12.5, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  15. FUN3D Manual: 12.8

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2015-01-01

    This manual describes the installation and execution of FUN3D version 12.8, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  16. FUN3D Manual: 12.4

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.; Rumsey, Christopher L.

    2014-01-01

    This manual describes the installation and execution of FUN3D version 12.4, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  17. FUN3D Manual: 13.1

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2017-01-01

    This manual describes the installation and execution of FUN3D version 13.1, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  18. FUN3D Manual: 13.0

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bill; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2016-01-01

    This manual describes the installation and execution of FUN3D version 13.0, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  19. FUN3D Manual: 13.3

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Carlson, Jan-Renee; Derlaga, Joseph M.; Gnoffo, Peter A.; Hammond, Dana P.; Jones, William T.; Kleb, Bil; Lee-Rausch, Elizabeth M.; Nielsen, Eric J.; Park, Michael A.

    2018-01-01

    This manual describes the installation and execution of FUN3D version 13.3, including optional dependent packages. FUN3D is a suite of computational fluid dynamics simulation and design tools that uses mixed-element unstructured grids in a large number of formats, including structured multiblock and overset grid systems. A discretely-exact adjoint solver enables efficient gradient-based design and grid adaptation to reduce estimated discretization error. FUN3D is available with and without a reacting, real-gas capability. This generic gas option is available only for those persons that qualify for its beta release status.

  20. 3D printed microfluidic mixer for point-of-care diagnosis of anemia.

    PubMed

    Plevniak, Kimberly; Campbell, Matthew; Mei He

    2016-08-01

    3D printing is an emerging fabrication tool for prototyping and manufacturing. We demonstrated simulation-guided computer design and 3D-printed prototyping for quick-turnaround development of microfluidic 3D mixers, which allow fast self-mixing of reagents with blood through capillary force. Combined with a smartphone, point-of-care diagnosis of anemia from finger-prick blood was successfully implemented and showed results consistent with clinical measurements. With its 3D fabrication flexibility and smartphone compatibility, this work presents a novel diagnostic strategy for advancing personalized medicine and mobile healthcare.

  1. A Research Program in Computer Technology. 1986 Annual Technical Report

    DTIC Science & Technology

    1989-08-01

    [OCR-garbled report documentation page.] Recoverable details: 1986 Annual Technical Report (July 1985 - June 1986), "A Research Program in Computer Technology," USC Information Sciences Institute report ISI/SR-87-178, authored by the ISI research staff (Unclassified); subject terms include distributed processing, local networks, personal computers, workstation environment, survivable networks, computer acquisition, and Strategic Computing.

  2. Optimization of droplets for UV-NIL using coarse-grain simulation of resist flow

    NASA Astrophysics Data System (ADS)

    Sirotkin, Vadim; Svintsov, Alexander; Zaitsev, Sergey

    2009-03-01

    A mathematical model and numerical method are described which make it possible to simulate the ultraviolet ("step and flash") nanoimprint lithography (UV-NIL) process adequately, even on standard personal computers. The model is derived from the 3D Navier-Stokes equations with the understanding that the resist motion is largely directed along the substrate surface and characterized by ultra-low values of the Reynolds number. The model is approximated numerically with a special coarse-grain finite difference method. A coarse-grain modeling tool for detailed analysis of resist spreading in UV-NIL at the structure-scale level is tested. The results demonstrate the tool's ability to calculate the optimal droplet dispensing for a given stamp design and process parameters. This dispensing provides uniformly filled areas and a homogeneous residual layer thickness in UV-NIL.
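
    As a rough structural illustration of coarse-grain resist-flow simulation, the sketch below relaxes an invented droplet map toward a uniform residual layer on a coarse grid. It substitutes plain mass-conserving diffusion for the paper's lubrication-type dynamics and is not the authors' method; the grid size, droplet layout, and step count are arbitrary.

    ```python
    import numpy as np

    # Toy coarse-grain resist-spreading sketch (not the published solver):
    # droplet volume deposited on a coarse grid relaxes under the stamp.
    n, steps, d = 64, 2000, 0.2        # grid size, time steps, diffusion number
    h = np.zeros((n, n))
    for cx, cy in [(16, 16), (16, 48), (48, 16), (48, 48), (32, 32)]:  # droplet map
        h[cx-3:cx+4, cy-3:cy+4] += 1.0

    for _ in range(steps):
        lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0) +
               np.roll(h, 1, 1) + np.roll(h, -1, 1) - 4 * h)
        h += d * lap                   # explicit, mass-conserving update

    print("mean residual layer     :", h.mean())
    print("nonuniformity (std/mean):", h.std() / h.mean())
    ```

    Varying the droplet map and re-running gives a crude way to compare dispensing patterns by their residual-layer nonuniformity, which is the optimization the abstract describes.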

  3. 47 CFR 32.2000 - Instructions for telecommunications plant accounts.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... equipment; 2122, Furniture; 2123, Office equipment; 2124, General purpose computers, costing $2,000 or less... for personal computers falling within Account 2124. Personal computers classifiable to Account 2124..., power, construction quarters, office space and equipment directly related to the construction project...

  4. REMOTE: Modem Communicator Program for the IBM personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGirt, F.

    1984-06-01

    REMOTE, a Modem Communicator Program, was developed to provide full duplex serial communication with arbitrary remote computers via either dial-up telephone modems or direct lines. The latest version of REMOTE (documented in this report) was developed for the IBM Personal Computer.

  5. 47 CFR 32.2000 - Instructions for telecommunications plant accounts.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... equipment; 2122, Furniture; 2123, Office equipment; 2124, General purpose computers, costing $2,000 or less... for personal computers falling within Account 2124. Personal computers classifiable to Account 2124..., power, construction quarters, office space and equipment directly related to the construction project...

  6. 47 CFR 32.2000 - Instructions for telecommunications plant accounts.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... equipment; 2122, Furniture; 2123, Office equipment; 2124, General purpose computers, costing $2,000 or less... for personal computers falling within Account 2124. Personal computers classifiable to Account 2124..., power, construction quarters, office space and equipment directly related to the construction project...

  7. 47 CFR 32.2000 - Instructions for telecommunications plant accounts.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... equipment; 2122, Furniture; 2123, Office equipment; 2124, General purpose computers, costing $2,000 or less... for personal computers falling within Account 2124. Personal computers classifiable to Account 2124..., power, construction quarters, office space and equipment directly related to the construction project...

  8. Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction.

    PubMed

    Nass, C; Lee, K M

    2001-09-01

    Would people exhibit similarity-attraction and consistency-attraction toward unambiguously computer-generated speech even when personality is clearly not relevant? In Experiment 1, participants (extrovert or introvert) heard a synthesized voice (extrovert or introvert) on a book-buying Web site. Participants accurately recognized personality cues in text to speech and showed similarity-attraction in their evaluation of the computer voice, the book reviews, and the reviewer. Experiment 2, in a Web auction context, added personality of the text to the previous design. The results replicated Experiment 1 and demonstrated consistency (voice and text personality)-attraction. To maximize liking and trust, designers should set parameters, for example, words per minute or frequency range, that create a personality that is consistent with the user and the content being presented.
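
    The closing design recommendation can be made concrete with a small sketch; the preset values below are invented, not the parameters used in the experiments.

    ```python
    # Hypothetical TTS presets following the paper's observation that speech
    # rate and pitch range cue personality; all numbers are illustrative.
    VOICE_PRESETS = {
        "extrovert": {"words_per_minute": 216, "pitch_hz": 140,
                      "pitch_range_hz": 80, "volume_db": 0.0},
        "introvert": {"words_per_minute": 156, "pitch_hz": 100,
                      "pitch_range_hz": 30, "volume_db": -6.0},
    }

    def pick_voice(user_personality, text_personality=None):
        """Similarity-attraction: match the user; consistency-attraction:
        keep voice and text personality aligned when content has a voice."""
        if text_personality and text_personality != user_personality:
            return VOICE_PRESETS[text_personality]  # stay consistent with content
        return VOICE_PRESETS[user_personality]      # otherwise match the user

    print(pick_voice("extrovert"))
    ```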

  9. A Context-Aware Ubiquitous Learning Approach for Providing Instant Learning Support in Personal Computer Assembly Activities

    ERIC Educational Resources Information Center

    Hsu, Ching-Kun; Hwang, Gwo-Jen

    2014-01-01

    Personal computer assembly courses have been recognized as being essential in helping students understand computer structure as well as the functionality of each computer component. In this study, a context-aware ubiquitous learning approach is proposed for providing instant assistance to individual students in the learning activity of a…

  10. Comparing the Use of the Interpersonal Computer, Personal Computer and Pen-and-Paper When Solving Arithmetic Exercises

    ERIC Educational Resources Information Center

    Alcoholado, Cristián; Diaz, Anita; Tagle, Arturo; Nussbaum, Miguel; Infante, Cristián

    2016-01-01

    This study aims to understand the differences in student learning outcomes and classroom behaviour when using the interpersonal computer, personal computer and pen-and-paper to solve arithmetic exercises. In this multi-session experiment, third grade students working on arithmetic exercises from various curricular units were divided into three…

  11. Design, development, and evaluation of an online virtual emergency department for training trauma teams.

    PubMed

    Youngblood, Patricia; Harter, Phillip M; Srivastava, Sakti; Moffett, Shannon; Heinrichs, Wm LeRoy; Dev, Parvati

    2008-01-01

    Training interdisciplinary trauma teams to work effectively together using simulation technology has led to a reduction in medical errors in emergency department, operating room, and delivery room contexts. High-fidelity patient simulators (PSs), the predominant method for training healthcare teams, are expensive to develop and implement and require that trainees be present in the same place at the same time. In contrast, online computer-based simulators are more cost effective and allow simultaneous participation by students in different locations and time zones. In this pilot study, the researchers created an online virtual emergency department (Virtual ED) for team training in crisis management, and compared the effectiveness of the Virtual ED with the PS. We hypothesized that there would be no difference in learning outcomes for graduating medical students trained with each method. In this pilot study, we used a pretest-posttest control-group experimental design in which 30 subjects were randomly assigned to either the Virtual ED or the PS system. In the Virtual ED each subject logged into the online environment and took the role of a team member. Four-person teams worked together in the Virtual ED, communicating in real time with live voice over Internet protocol, to manage computer-controlled patients who exhibited signs and symptoms of physical trauma. Each subject had the opportunity to be the team leader. The subjects' leadership behavior as demonstrated in both a pretest case and a posttest case was assessed by 3 raters, using a behaviorally anchored scale. In the PS environment, 4-person teams followed the same research protocol, using the same clinical scenarios in a Simulation Center. Guided by the Emergency Medicine Crisis Resource Management curriculum, both the Virtual ED and the PS groups applied the basic principles of team leadership and trauma management (Advanced Trauma Life Support) to manage 6 trauma cases: a pretest case, 4 training cases, and a posttest case. The subjects in each group were assessed individually with the same simulation method that they used for the training cases. Subjects who used either the Virtual ED or the PS showed significant improvement in performance between pretest and posttest cases (P < 0.05). In addition, there was no significant difference in subjects' performance between the 2 types of simulation, suggesting that the online Virtual ED may be as effective for learning team skills as the PS, the method widely used in Simulation Centers. Data on usability and attitudes toward both simulation methods as learning tools were equally positive. This study shows the potential value of using virtual learning environments for developing medical students' and resident physicians' team leadership and crisis management skills.

  12. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
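
    To make one of the four approaches named above concrete, the sketch below is a minimal discrete-event simulation of patients competing for treatment beds, written with only the Python standard library. The arrival rate, treatment time, bed count, and patient volume are illustrative assumptions, not figures from the conference.

        import heapq
        import random

        random.seed(1)

        # Illustrative parameters (assumptions, not values from the article).
        ARRIVAL_RATE = 8 / 60.0   # patients per minute
        MEAN_TREATMENT = 30.0     # minutes per patient
        N_BEDS = 4
        N_PATIENTS = 1000

        def simulate():
            """Minimal discrete-event simulation of an emergency department
            modeled as an M/M/c queue: Poisson arrivals, exponential service."""
            t = 0.0
            arrivals = []
            for _ in range(N_PATIENTS):
                t += random.expovariate(ARRIVAL_RATE)
                arrivals.append(t)
            bed_free_at = [0.0] * N_BEDS      # earliest time each bed is free
            heapq.heapify(bed_free_at)
            waits = []
            for arrival in arrivals:
                free = heapq.heappop(bed_free_at)   # next bed to become free
                start = max(arrival, free)          # wait if no bed is open
                waits.append(start - arrival)
                heapq.heappush(bed_free_at,
                               start + random.expovariate(1.0 / MEAN_TREATMENT))
            return sum(waits) / len(waits)

        print(f"mean wait for a bed: {simulate():.1f} minutes")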

  13. A Community-Based Participatory Approach to Personalized, Computer-Generated Nutrition Feedback Reports: The Healthy Environments Partnership

    PubMed Central

    Kannan, Srimathi; Schulz, Amy; Israel, Barbara; Ayra, Indira; Weir, Sheryl; Dvonch, Timothy J.; Rowe, Zachary; Miller, Patricia; Benjamin, Alison

    2008-01-01

    Background: Computer tailoring and personalizing recommendations for dietary health-promoting behaviors are in accordance with community-based participatory research (CBPR) principles, which emphasize research that benefits the participants and community involved. Objective: To describe the CBPR process utilized to computer-generate and disseminate personalized nutrition feedback reports (NFRs) for Detroit Healthy Environments Partnership (HEP) study participants. Methods: The CBPR process included discussion and feedback from HEP partners on several draft personalized reports. The nutrition feedback process included defining the feedback objectives; prioritizing the nutrients; customizing the report design; reviewing and revising the NFR template and readability; producing and disseminating the report; and participant follow-up. Lessons Learned: Application of CBPR principles in designing the NFR resulted in a reader-friendly product with useful recommendations to promote heart health. Conclusions: A CBPR process can enhance computer tailoring of personalized NFRs to address racial and socioeconomic disparities in cardiovascular disease (CVD). PMID:19337572

  14. DS-CDMA satellite diversity reception for personal satellite communication: Downlink performance analysis

    NASA Technical Reports Server (NTRS)

    DeGaudenzi, Riccardo; Giannetti, Filippo

    1995-01-01

    The downlink of a satellite-mobile personal communication system employing power-controlled Direct Sequence Code Division Multiple Access (DS-CDMA) and exploiting satellite diversity is analyzed, and its performance is compared with that of a more traditional system utilizing single-satellite reception. The analytical model developed has been thoroughly validated by means of extensive Monte Carlo computer simulations. It is shown how the capacity gain provided by diversity reception shrinks considerably in the presence of increasing traffic or under light shadowing conditions. Moreover, the quantitative results indicate that, to combat the system capacity reduction due to intra-system interference, no more than two satellites should be active over the same region. To achieve higher system capacity, and unlike in terrestrial cellular systems, Multi-User Detection (MUD) techniques are likely to be required in the mobile user terminal, considerably increasing its complexity.
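
    As a rough illustration of the Monte Carlo validation mentioned above, the toy sketch below estimates how selection diversity across independently shadowed satellite paths reduces outage. The lognormal-shadowing model, threshold, and parameter values are assumptions chosen for illustration; the paper's DS-CDMA downlink model is far more detailed.

        import random

        random.seed(2)

        # Illustrative toy model: lognormal shadowing on each satellite path.
        SHADOW_SIGMA_DB = 8.0     # shadowing standard deviation (assumed)
        THRESHOLD_DB = -5.0       # outage threshold relative to median (assumed)
        N_TRIALS = 100_000

        def outage_probability(n_satellites):
            """Estimate outage probability with selection diversity over n
            independent satellite paths (a crude stand-in for the paper's
            far more detailed DS-CDMA downlink model)."""
            outages = 0
            for _ in range(N_TRIALS):
                best = max(random.gauss(0.0, SHADOW_SIGMA_DB)
                           for _ in range(n_satellites))
                if best < THRESHOLD_DB:
                    outages += 1
            return outages / N_TRIALS

        for n in (1, 2, 3):
            print(f"{n} satellite(s): outage ~ {outage_probability(n):.4f}")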

  15. Three-dimensional (3D) printing and its applications for aortic diseases

    PubMed Central

    Hangge, Patrick; Pershad, Yash; Witting, Avery A.; Albadawi, Hassan

    2018-01-01

    Three-dimensional (3D) printing is a process which generates prototypes from virtual objects in computer-aided design (CAD) software. Since 3D printing enables the creation of customized objects, it is a rapidly expanding field in an age of personalized medicine. We discuss the use of 3D printing in surgical planning, training, and creation of devices for the treatment of aortic diseases. 3D printing can provide operators with a hands-on model to interact with complex anatomy, enable prototyping of devices for implantation based upon anatomy, or even provide pre-procedural simulation. Potential exists to expand upon current uses of 3D printing to create personalized implantable devices such as grafts. Future studies should aim to demonstrate the impact of 3D printing on outcomes to make this technology more accessible to patients with complex aortic diseases. PMID:29850416

  16. Self-reported neck symptoms and use of personal computers, laptops and cell phones among Finns aged 18-65.

    PubMed

    Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria

    2013-01-01

    The purpose of this study was to investigate the possible relation between self-reported neck symptoms (aches, pain or numbness) and use of computers/cell phones. The study was carried out as a cross-sectional study by posting a questionnaire to 15,000 working-age persons; 15.1% of the 6,121 respondents reported that they very often experienced physical symptoms in the neck. These respondents also reported many other symptoms very often (e.g. exhaustion at work), and 49% of them used a computer daily at work while 83.9% used cell phones. We compared the physical/mental symptoms of persons who had neck symptoms quite often or more frequently with those of the other respondents, and found significant differences both in physical/mental symptoms and in the use of cell phones and computers. The results suggest that these persons' neck symptoms can be associated with their use of cell phones or computers, and that this association should be taken into account in the future.

  17. The Simultaneous Production Model; A Model for the Construction, Testing, Implementation and Revision of Educational Computer Simulation Environments.

    ERIC Educational Resources Information Center

    Zillesen, Pieter G. van Schaick

    This paper introduces a hardware- and software-independent model for producing educational computer simulation environments. The model, which is based on the results of 32 studies of educational computer simulation program production, implies that educational computer simulation environments are specified, constructed, tested, implemented, and…

  18. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  19. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    NASA Astrophysics Data System (ADS)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to SEU errors in its main memory. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of the system's wireless sensor network nodes using a revised CPU simulator and a network simulator. We demonstrated that the SEU-induced failure rate of the embedded computer with 1 Gbit of MRAM-based working memory is less than 1 FIT (failure in time, i.e., one failure per 10^9 device-hours). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.
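
    The FIT arithmetic behind a claim like "less than 1 FIT for 1 Gbit of memory" is simple to sketch; the per-bit upset rate below is an assumed placeholder, not a value from the paper.

        # Illustrative arithmetic only: the per-bit upset rate below is an
        # assumption, not a figure from the paper.
        BITS = 1 * 2**30                 # 1 Gbit of MRAM working memory
        SEU_PER_BIT_HOUR = 1e-19         # assumed radiation-induced upsets per bit-hour

        failures_per_hour = BITS * SEU_PER_BIT_HOUR
        fit = failures_per_hour * 1e9    # 1 FIT = 1 failure per 10^9 device-hours
        print(f"memory failure rate: {fit:.3f} FIT")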

  20. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computers and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  1. The Relationship between Personality and Computer Deviance

    ERIC Educational Resources Information Center

    Burns, Cardra E.

    2013-01-01

    Computer deviance by employees, defined as malicious and nonmalicious computer use behaviors, has contributed to billions of dollars of monetary and productivity losses for public and private sector organizations. The purpose of this correlational study was to examine the relationship between personality characteristics and employees' computer…

  2. Forensic Odontology: Automatic Identification of Persons Comparing Antemortem and Postmortem Panoramic Radiographs Using Computer Vision.

    PubMed

    Heinrich, Andreas; Güttler, Felix; Wendt, Sebastian; Schenkl, Sebastian; Hubig, Michael; Wagner, Rebecca; Mall, Gita; Teichgräber, Ulf

    2018-06-18

    In forensic odontology, the comparison between antemortem and postmortem panoramic radiographs (PRs) is a reliable method for person identification. The purpose of this study was to improve and automate identification of unknown persons by comparison between antemortem and postmortem PRs using computer vision. The study includes 43,467 PRs from 24,545 patients (46% female/54% male). All PRs were filtered and evaluated with Matlab R2014b, including the Image Processing and Computer Vision System toolboxes. The matching process used SURF features to find the corresponding points between two PRs (unknown person and database entry) across the whole database. Of 40 randomly selected persons, 34 (85%) could be reliably identified by corresponding PR matching points between an already existing scan in the database and the most recent PR. The systematic matching yielded a maximum of 259 points for a successful identification between two different PRs of the same person, and a maximum of 12 corresponding matching points for other, non-identical persons in the database. Hence, 12 matching points is the threshold for reliable assignment. Operating with an automatic PR system and computer vision could be a successful and reliable tool for identification purposes. The applied method distinguishes itself by its fast and reliable identification of persons by PR, and is suitable even if dental characteristics were removed or added in the past. The system seems to be robust for large amounts of data. © Georg Thieme Verlag KG Stuttgart · New York.
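
    The match-and-threshold logic described above can be sketched in a few lines. The authors used Matlab's SURF implementation; the sketch below substitutes OpenCV's freely available ORB detector, uses hypothetical file names, and keeps the paper's 12-match decision threshold. The feature count and distance cutoff are assumptions.

        import cv2  # requires the opencv-python package

        MATCH_THRESHOLD = 12  # the paper's empirical cutoff for reliable identification

        def count_matches(path_a, path_b):
            """Count good keypoint matches between two panoramic radiographs.
            The paper used Matlab's SURF features; ORB is used here as a freely
            available substitute with the same match-and-threshold logic."""
            img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
            img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
            orb = cv2.ORB_create(nfeatures=2000)
            kp_a, des_a = orb.detectAndCompute(img_a, None)
            kp_b, des_b = orb.detectAndCompute(img_b, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = matcher.match(des_a, des_b)
            # Keep only tight matches; this distance cutoff is an assumption.
            good = [m for m in matches if m.distance < 40]
            return len(good)

        # Hypothetical file names, for illustration only.
        n = count_matches("postmortem_pr.png", "antemortem_pr.png")
        print("identification" if n > MATCH_THRESHOLD else "no match", f"({n} matches)")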

  3. Computational fluid dynamics simulation of airflow in the trachea and main bronchi for the subjects with left pulmonary artery sling

    PubMed Central

    2014-01-01

    Background Left pulmonary artery sling (LPAS) is a rare but severe congenital anomaly, in which stenoses are formed in the trachea and/or main bronchi. Multi-detector computed tomography (MDCT) provides useful anatomical images, but does not offer functional information. The objective of the present study is to quantitatively analyze the airflow in the trachea and main bronchi of LPAS subjects through computational fluid dynamics (CFD) simulation. Methods Five subjects (four LPAS patients, one normal control) aged 6-19 months are analyzed. The geometric model of the trachea and the two main bronchi is extracted from the MDCT images. The inlet velocity is determined based on the body weight and the inlet area. Both the geometric model and personalized inflow conditions are imported into the CFD software ANSYS. The pressure drop, mass flow ratio through the two bronchi, wall pressure, flow velocity and wall shear stress (WSS) are obtained and compared to the normal control. Results Due to the tracheal and/or bronchial stenosis, the pressure drop for the LPAS patients ranges from 78.9 to 914.5 Pa, much higher than for the normal control (0.7 Pa). The mass flow ratio through the two bronchi does not correlate with the sectional area ratio if the anomalous left pulmonary artery compresses the trachea or bronchi. It is suggested that the C-shaped trachea plays an important role in facilitating the airflow into the left bronchus through inertial force. For LPAS subjects, the distributions of velocities, wall pressure and WSS are less regular than for the normal control. At the stenotic site, high velocity, low wall pressure and high WSS are observed. Conclusions Using geometric models extracted from CT images and patient-specific inlet boundary conditions, CFD simulation can provide vital quantitative flow information for LPAS. Due to the stenosis, high pressure drops and inconsistent distributions of velocities, wall pressure and WSS are observed. The C-shaped trachea may facilitate a larger flow of air into the left bronchus under inertial force, and decrease the ventilation of the right lung. Quantitative and personalized information may help understand the mechanism of LPAS and the correlations between stenosis and dyspnea, and facilitate the structural and functional assessment of LPAS. PMID:24957947
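
    For intuition about why a stenosis inflates the pressure drop, a back-of-envelope Bernoulli estimate suffices. The density, flow rate, and diameters below are assumptions; the study's own numbers come from full 3D CFD, not from this formula.

        import math

        # Back-of-envelope Bernoulli estimate (illustrative values assumed;
        # the study's pressure drops came from full 3D CFD, not this formula).
        RHO = 1.2          # air density, kg/m^3
        FLOW = 1.0e-4      # inspiratory flow, m^3/s (assumed for an infant)

        def pressure_drop(d_normal_mm, d_stenosis_mm):
            """Dynamic-pressure rise when a flow of FLOW m^3/s passes from a
            normal airway diameter into a stenotic segment."""
            a1 = math.pi * (d_normal_mm / 2000.0) ** 2   # mm diameter -> m radius
            a2 = math.pi * (d_stenosis_mm / 2000.0) ** 2
            v1, v2 = FLOW / a1, FLOW / a2
            return 0.5 * RHO * (v2**2 - v1**2)           # Pa

        print(f"no stenosis (8 mm -> 8 mm): {pressure_drop(8, 8):.1f} Pa")
        print(f"severe stenosis (8 mm -> 3 mm): {pressure_drop(8, 3):.1f} Pa")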

  4. Golfing with protons: using research grade simulation algorithms for online games

    NASA Astrophysics Data System (ADS)

    Harold, J.

    2004-12-01

    Scientists have long known the power of simulations. By modeling a system in a computer, researchers can experiment at will, developing an intuitive sense of how a system behaves. The rapid increase in the power of personal computers, combined with technologies such as Flash, Shockwave and Java, allow us to bring research simulations into the education world by creating exploratory environments for the public. This approach is illustrated by a project funded by a small grant from NSF's Informal Science Education program, through an opportunity that provides education supplements to existing research awards. Using techniques adapted from a magnetospheric research program, several Flash based interactives have been developed that allow web site visitors to explore the motion of particles in the Earth's magnetosphere. These pieces were folded into a larger Space Weather Center web project at the Space Science Institute (www.spaceweathercenter.org). Rather than presenting these interactives as plasma simulations per se, the research algorithms were used to create games such as "Magneto Mini Golf", where the balls are protons moving in combined electric and magnetic fields. The "holes" increase in complexity, beginning with no fields and progressing towards a simple model of Earth's magnetosphere. The emphasis of the activity is gameplay, but because it is at its core a plasma simulation, the user develops an intuitive sense of charged particle motion as they progress. Meanwhile, the pieces contain embedded assessments that are measurable through a database driven tracking system. Mining that database not only provides helpful usability information, but allows us to examine whether users are meeting the learning goals of the activities. We will discuss the development and evaluation results of the project, as well as the potential for these types of activities to shift the expectations of what a web site can and should provide educationally.
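
    The kind of research-grade algorithm described above is, at its core, an integrator for charged-particle motion in electric and magnetic fields. The sketch below uses the standard Boris pusher with assumed field values and step size; the project's actual magnetospheric code is not published in this abstract.

        import math

        # Minimal Boris-type pusher for a proton in uniform E and B fields,
        # the same class of algorithm used in magnetospheric particle tracing.
        # Field values and step size are illustrative assumptions.
        Q_M = 9.58e7          # proton charge-to-mass ratio, C/kg
        E = (0.0, 0.0, 0.0)   # electric field, V/m
        B = (0.0, 0.0, 1e-8)  # magnetic field, T (order of magnetotail fields)
        DT = 1e-3             # time step, s

        def boris_step(v, e, b, dt):
            """Advance velocity one step with the Boris rotation scheme."""
            # Half electric-field kick.
            vm = [v[i] + 0.5 * Q_M * e[i] * dt for i in range(3)]
            # Magnetic rotation.
            t = [0.5 * Q_M * b[i] * dt for i in range(3)]
            t2 = sum(c * c for c in t)
            s = [2.0 * c / (1.0 + t2) for c in t]
            vp = [vm[0] + (vm[1] * t[2] - vm[2] * t[1]),
                  vm[1] + (vm[2] * t[0] - vm[0] * t[2]),
                  vm[2] + (vm[0] * t[1] - vm[1] * t[0])]
            vr = [vm[0] + (vp[1] * s[2] - vp[2] * s[1]),
                  vm[1] + (vp[2] * s[0] - vp[0] * s[2]),
                  vm[2] + (vp[0] * s[1] - vp[1] * s[0])]
            # Second half electric-field kick.
            return [vr[i] + 0.5 * Q_M * e[i] * dt for i in range(3)]

        v = [1e5, 0.0, 0.0]   # initial proton velocity, m/s
        for step in range(5):
            v = boris_step(v, E, B, DT)
        print("speed preserved by the rotation:", math.hypot(*v))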

  5. So, you are buying your first computer.

    PubMed

    Ferrara-Love, R

    1999-06-01

    Buying your first computer need not be that complicated. The first thing needed is an understanding of what you want and need the computer for. By making a list of the various essentials, you will be on your way to purchasing that computer. Once that is completed, you will need an understanding of what each of the components of the computer is, how it works, and what options you have. This way, you will be better able to discuss your needs with the salesperson. The focus of this article is limited to personal computers or PCs (i.e., IBMs [Armonk, NY], IBM clones, Compaq [Houston, TX], Gateway [North Sioux City, SD], and so on). I am not including Macintosh or Apple [Cupertino, CA] computers in this discussion; most software is made exclusively for PCs, or is at least on the market for PCs before becoming available in a Macintosh version.

  6. Compressible or incompressible blend of interacting monodisperse linear polymers near a surface.

    PubMed

    Batman, Richard; Gujrati, P D

    2007-08-28

    We consider a lattice model of a mixture of repulsive, attractive, or neutral monodisperse linear polymers of two species, A and B, with a third monomeric species C, which may be taken to represent free volume. The mixture is confined between two hard, parallel plates of variable separation whose interactions with A and C may be attractive, repulsive, or neutral, and may be different from each other. The interactions with A and C are all that are required to completely specify the effect of each surface on all three components. We numerically study various density profiles as we move away from the surface, by using the recursive method of Gujrati and Chhajer [J. Chem. Phys. 106, 5599 (1997)] that has already been previously applied to study polydisperse solutions and blends next to surfaces. The resulting density profiles show the oscillations that are seen in Monte Carlo simulations and the enrichment of the smaller species at a neutral surface. The method is computationally ultrafast and can be carried out on a personal computer (PC), even in the incompressible case, when Monte Carlo simulations are not feasible. The calculations of density profiles usually take less than 20 min on a PC.

  7. Accounting for receptor flexibility and enhanced sampling methods in computer-aided drug design.

    PubMed

    Sinko, William; Lindert, Steffen; McCammon, J Andrew

    2013-01-01

    Protein flexibility plays a major role in biomolecular recognition. In many cases, it is not obvious how molecular structure will change upon association with other molecules. In proteins, these changes can be major, with large deviations in overall backbone structure, or they can be more subtle, as in a side-chain rotation. Either way, the algorithms that predict the favorability of biomolecular association require relatively accurate predictions of the bound structure to give an accurate assessment of the energy involved in association. Here, we review a number of techniques that have been proposed to accommodate receptor flexibility in the simulation of small molecules binding to protein receptors. We investigate modifications to standard rigid-receptor docking algorithms and also explore enhanced sampling techniques, and the combination of free energy calculations with enhanced sampling techniques. The understanding of and allowance for receptor flexibility are helping to make computer simulations of ligand-protein binding more accurate. These developments may help improve the efficiency of drug discovery and development. Efficiency will be essential as we begin to see personalized medicine tailored to individual patients, which means specific drugs are needed for each patient's genetic makeup. © 2012 John Wiley & Sons A/S.

  8. Visualization Methods for Viability Studies of Inspection Modules for the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Mobasher, Amir A.

    2005-01-01

    An effective simulation of an object, process, or task must be similar to that object, process, or task. A simulation could consist of a physical device, a set of mathematical equations, a computer program, a person, or some combination of these. There are many reasons for the use of simulators. Although some of the reasons are unique to a specific situation, there are many general reasons and purposes for using simulators. These include, but are not limited to: (1) safety, (2) scarce resources, (3) teaching/education, (4) additional capabilities, (5) flexibility, and (6) cost. Robot simulators are in use for all of these reasons. Virtual environments such as simulators eliminate physical contact with humans and hence increase the safety of the work environment. Corporations with limited funding and resources may utilize simulators to accomplish their goals while saving manpower and money. A computer simulation is safer than working with a real robot. Robots are typically a scarce resource. Schools typically don't have a large number of robots, if any, and factories don't want robots kept from performing useful work unless absolutely necessary. Robot simulators are useful in teaching robotics. A simulator gives a student hands-on experience, if only with a simulator. The simulator is also more flexible: a user can quickly change the robot configuration, workcell, or even replace the robot with a different one altogether. In order to be useful, a robot simulator must create a model that accurately performs like the real robot. A powerful simulator is usually thought of as a combination of a CAD package with simulation capabilities. Computer Aided Design (CAD) techniques are used extensively by engineers in virtually all areas of engineering. Parts are designed interactively, aided by the graphical display of both wireframe and more realistic shaded renderings. Once a part's dimensions have been specified to the CAD package, designers can view the part from any direction to examine how it will look and perform in relation to other parts. If changes are deemed necessary, the designer can easily make the changes and view the results graphically. However, a complex process of moving parts intended for operation in a complex environment can only be fully understood through the process of animated graphical simulation. A CAD package with simulation capabilities allows the designer to develop geometrical models of the process being designed, as well as the environment in which the process will be used, and then test the process in graphical animation much as the actual physical system would be run. By being able to operate the system of moving and stationary parts, the designer is able to see in simulation how the system will perform under a wide variety of conditions. If, for example, undesired collisions occur between parts of the system, design changes can be easily made without the expense or potential danger of testing the physical system.

  9. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    PubMed

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.

  10. Application of ubiquitous computing in personal health monitoring systems.

    PubMed

    Kunze, C; Grossmann, U; Stork, W; Müller-Glaser, K D

    2002-01-01

    One way to significantly reduce the costs of public health systems is to make greater use of information technology. The Laboratory for Information Processing Technology (ITIV) at the University of Karlsruhe is developing a personal health monitoring system, which should improve health care and at the same time reduce costs by combining micro-technological smart sensors with personalized, mobile computing systems. In this paper we present how ubiquitous computing theory can be applied in the health-care domain.

  11. Exact and efficient simulation of concordant computation

    NASA Astrophysics Data System (ADS)

    Cable, Hugo; Browne, Daniel E.

    2015-11-01

    Concordant computation is a circuit-based model of quantum computation for mixed states, which assumes that all correlations within the register are discord-free (i.e. the correlations are essentially classical) at every step of the computation. The question of whether concordant computation always admits efficient simulation by a classical computer was first considered by Eastin in arXiv:quant-ph/1006.4402v1, where an answer in the affirmative was given for circuits consisting only of one- and two-qubit gates. Building on this work, we develop the theory of classical simulation of concordant computation. We present a new framework for understanding such computations, argue that a larger class of concordant computations admits efficient simulation, and provide alternative proofs for the main results of arXiv:quant-ph/1006.4402v1 with an emphasis on the exactness of simulation, which is crucial for this model. We include detailed analysis of the arithmetic complexity for solving equations in the simulation, as well as extensions to larger gates and qudits. We explore the limitations of our approach, and discuss the challenges faced in developing efficient classical simulation algorithms for all concordant computations.

  12. A general model for likelihood computations of genetic marker data accounting for linkage, linkage disequilibrium, and mutations.

    PubMed

    Kling, Daniel; Tillmar, Andreas; Egeland, Thore; Mostad, Petter

    2015-09-01

    Several applications necessitate an unbiased determination of relatedness, be it in linkage or association studies or in a forensic setting. An appropriate model to compute the joint probability of some genetic data for a set of persons given some hypothesis about the pedigree structure is then required. The increasing number of markers available through high-density SNP microarray typing and NGS technologies intensifies the demand, where using a large number of markers may lead to biased results due to strong dependencies between closely located loci, both within pedigrees (linkage) and in the population (allelic association or linkage disequilibrium (LD)). We present a new general model, based on a Markov chain for inheritance patterns and another Markov chain for founder allele patterns, the latter allowing us to account for LD. We also demonstrate a specific implementation for X chromosomal markers that allows for computation of likelihoods based on hypotheses of alleged relationships and genetic marker data. The algorithm can simultaneously account for linkage, LD, and mutations. We demonstrate its feasibility using simulated examples. The algorithm is implemented in the software FamLinkX, providing a user-friendly GUI for Windows systems (FamLinkX, as well as further usage instructions, is freely available at www.famlink.se ). Our software provides the necessary means to solve cases where no previous implementation exists. In addition, the software has the possibility to perform simulations in order to further study the impact of linkage and LD on computed likelihoods for an arbitrary set of markers.
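
    The paper's Markov chain over inheritance patterns emitting observed marker data has the structure of a hidden Markov model, so its likelihood can be computed with the classic forward algorithm, sketched below with illustrative matrices. This is a generic sketch, not FamLinkX's actual model.

        # Generic hidden-Markov forward algorithm: the same machinery the paper
        # builds on, with a Markov chain over hidden states (inheritance
        # patterns) emitting observed marker data. All numbers are illustrative;
        # this is not FamLinkX's actual model.

        INIT = [0.5, 0.5]                     # initial state distribution
        TRANS = [[0.9, 0.1],                  # state-to-state transitions;
                 [0.1, 0.9]]                  # tighter linkage -> fewer switches
        EMIT = [[0.8, 0.2],                   # P(observed genotype | state)
                [0.3, 0.7]]

        def likelihood(observations):
            """P(observations) summed over all hidden state paths."""
            alpha = [INIT[s] * EMIT[s][observations[0]] for s in range(2)]
            for obs in observations[1:]:
                alpha = [sum(alpha[p] * TRANS[p][s] for p in range(2)) * EMIT[s][obs]
                         for s in range(2)]
            return sum(alpha)

        print(f"likelihood of marker data: {likelihood([0, 0, 1, 0]):.5f}")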

  13. NASA Tech Briefs, February 2001. Volume 25, No. 2

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The topics include: 1) Application Briefs; 2) National Design Engineering Show Preview; 3) Marketing Inventions to Increase Income; 4) A Personal-Computer-Based Physiological Training System; 5) Reconfigurable Arrays of Transistors for Evolvable Hardware; 6) Active Tactile Display Device for Reading by a Blind Person; 7) Program Automates Management of IBM VM Computer Systems; 8) System for Monitoring the Environment of a Spacecraft Launch; 9) Measurement of Stresses and Strains in Muscles and Tendons; 10) Optical Measurement of Temperatures in Muscles and Tendons; 11) Small Low-Temperature Thermometer With Nanokelvin Resolution; 12) Heterodyne Interferometer With Phase-Modulated Carrier; 13) Rechargeable Batteries Based on Intercalation in Graphite; 14) Signal Processor for Doppler Measurements in Icing Research; 15) Model Optimizes Drying of Wet Sheets; 16) High-Performance POSS-Modified Polymeric Composites; 17) Model Simulates Semi-Solid Material Processing; 18) Modular Cryogenic Insulation; 19) Passive Venting for Alleviating Helicopter Tail-Boom Loads; 20) Computer Program Predicts Rocket Noise; 21) Process for Polishing Bare Aluminum to High Optical Quality; 22) External Adhesive Pressure-Wall Patch; 23) Java Implementation of Information-Sharing Protocol; 24) Electronic Bulletin Board Publishes Schedules in Real Time; 25) Apparatus Would Extract Water From the Martian Atmosphere; 26) Review of Research on Supercritical vs Subcritical Fluids; 27) Hybrid Regenerative Water-Recycling System; 28) Study of Fusion-Driven Plasma Thruster With Magnetic Nozzle; 29) Liquid/Vapor-Hydrazine Thruster Would Produce Small Impulses; and 30) Thruster Based on Sublimation of Solid Hydrazine

  14. Onboard functional and molecular imaging: A design investigation for robotic multipinhole SPECT

    PubMed Central

    Bowsher, James; Yan, Susu; Roper, Justin; Giles, William; Yin, Fang-Fang

    2014-01-01

    Purpose: Onboard imaging—currently performed primarily by x-ray transmission modalities—is essential in modern radiation therapy. As radiation therapy moves toward personalized medicine, molecular imaging, which views individual gene expression, may also be important onboard. Nuclear medicine methods, such as single photon emission computed tomography (SPECT), are premier modalities for molecular imaging. The purpose of this study is to investigate a robotic multipinhole approach to onboard SPECT. Methods: Computer-aided design (CAD) studies were performed to assess the feasibility of maneuvering a robotic SPECT system about a patient in position for radiation therapy. In order to obtain fast, high-quality SPECT images, a 49-pinhole SPECT camera was designed which provides high sensitivity to photons emitted from an imaging region of interest. This multipinhole system was investigated by computer-simulation studies. Seventeen hot spots 10 and 7 mm in diameter were placed in the breast region of a supine female phantom. Hot spot activity concentration was six times that of background. For the 49-pinhole camera and a reference, more conventional, broad field-of-view (FOV) SPECT system, projection data were computer simulated for 4-min scans and SPECT images were reconstructed. Hot-spot localization was evaluated using a nonprewhitening forced-choice numerical observer. Results: The CAD simulation studies found that robots could maneuver SPECT cameras about patients in position for radiation therapy. In the imaging studies, most hot spots were apparent in the 49-pinhole images. Average localization errors for 10-mm- and 7-mm-diameter hot spots were 0.4 and 1.7 mm, respectively, for the 49-pinhole system, and 3.1 and 5.7 mm, respectively, for the reference broad-FOV system. Conclusions: A robot could maneuver a multipinhole SPECT system about a patient in position for radiation therapy. The system could provide onboard functional and molecular imaging with 4-min scan times. PMID:24387490

  15. Onboard functional and molecular imaging: A design investigation for robotic multipinhole SPECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowsher, James, E-mail: james.bowsher@duke.edu; Giles, William; Yin, Fang-Fang

    2014-01-15

    Purpose: Onboard imaging—currently performed primarily by x-ray transmission modalities—is essential in modern radiation therapy. As radiation therapy moves toward personalized medicine, molecular imaging, which views individual gene expression, may also be important onboard. Nuclear medicine methods, such as single photon emission computed tomography (SPECT), are premier modalities for molecular imaging. The purpose of this study is to investigate a robotic multipinhole approach to onboard SPECT. Methods: Computer-aided design (CAD) studies were performed to assess the feasibility of maneuvering a robotic SPECT system about a patient in position for radiation therapy. In order to obtain fast, high-quality SPECT images, a 49-pinhole SPECT camera was designed which provides high sensitivity to photons emitted from an imaging region of interest. This multipinhole system was investigated by computer-simulation studies. Seventeen hot spots 10 and 7 mm in diameter were placed in the breast region of a supine female phantom. Hot spot activity concentration was six times that of background. For the 49-pinhole camera and a reference, more conventional, broad field-of-view (FOV) SPECT system, projection data were computer simulated for 4-min scans and SPECT images were reconstructed. Hot-spot localization was evaluated using a nonprewhitening forced-choice numerical observer. Results: The CAD simulation studies found that robots could maneuver SPECT cameras about patients in position for radiation therapy. In the imaging studies, most hot spots were apparent in the 49-pinhole images. Average localization errors for 10-mm- and 7-mm-diameter hot spots were 0.4 and 1.7 mm, respectively, for the 49-pinhole system, and 3.1 and 5.7 mm, respectively, for the reference broad-FOV system. Conclusions: A robot could maneuver a multipinhole SPECT system about a patient in position for radiation therapy. The system could provide onboard functional and molecular imaging with 4-min scan times.

  16. Investigating Nigerian Primary School Teachers' Preparedness to Adopt Personal Response System in ESL Classroom

    ERIC Educational Resources Information Center

    Agbatogun, Alaba Olaoluwakotansibe

    2012-01-01

    This study investigated the extent to which computer literacy dimensions (computer general knowledge, documents and documentations, communication and surfing as well as data inquiry), computer use and academic qualification as independent variables predicted primary school teachers' attitude towards the integration of Personal Response System in…

  17. The Use of Microcomputers in Distance Teaching Systems. ZIFF Papiere 70.

    ERIC Educational Resources Information Center

    Rumble, Greville

    Microcomputers have revolutionized distance education in virtually every area. Used alone, personal computers provide students with a wide range of utilities, including word processing, graphics packages, and spreadsheets. When linked to a mainframe computer or connected to other personal computers in local area networks, microcomputers can…

  18. Pentium Pro inside. 1; A treecode at 430 Gigaflops on ASCI Red

    NASA Technical Reports Server (NTRS)

    Warren, M. S.; Becker, D. J.; Sterling, T.; Salmon, J. K.; Goda, M. P.

    1997-01-01

    As an entry for the 1997 Gordon Bell performance prize, we present results from two methods of solving the gravitational N-body problem on the Intel Teraflops system at Sandia National Laboratory (ASCI Red). The first method, an O(N^2) algorithm, obtained 635 Gigaflops for a 1 million particle problem on 6800 Pentium Pro processors. The second solution method, a tree-code which scales as O(N log N), sustained 170 Gigaflops over a continuous 9.4 hour period on 4096 processors, integrating the motion of 322 million mutually interacting particles in a cosmology simulation, while saving over 100 Gigabytes of raw data. Additionally, the tree-code sustained 430 Gigaflops on 6800 processors for the first 5 time-steps of that simulation. This tree-code solution is approximately 10^5 times more efficient than the O(N^2) algorithm for this problem. As an entry for the 1997 Gordon Bell price/performance prize, we present two calculations from the disciplines of astrophysics and fluid dynamics. The simulations were performed on two 16-processor Pentium Pro Beowulf-class computers (Loki and Hyglac), constructed entirely from commodity personal computer technology at a cost of roughly $50k each in September 1996. The price of an equivalent system in August 1997 is less than $30k. At Los Alamos, Loki performed a gravitational tree-code N-body simulation of galaxy formation using 9.75 million particles, which sustained an average of 879 Mflops over a ten day period, and produced roughly 10 Gbytes of raw data.
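
    The efficiency gap reported above comes from replacing the brute-force pairwise sum with a tree that clusters distant particles. The brute-force O(N^2) kernel being replaced looks like the sketch below; the softening length and particle setup are assumed for illustration.

        import random

        random.seed(3)
        G = 1.0
        EPS = 0.05   # Plummer softening length (assumed)

        def accelerations(pos, mass):
            """Direct-summation O(N^2) gravity: the brute-force method that an
            O(N log N) treecode approximates by clustering distant particles."""
            n = len(pos)
            acc = [[0.0, 0.0, 0.0] for _ in range(n)]
            for i in range(n):
                xi, yi, zi = pos[i]
                for j in range(n):
                    if i == j:
                        continue
                    dx, dy, dz = pos[j][0] - xi, pos[j][1] - yi, pos[j][2] - zi
                    r2 = dx * dx + dy * dy + dz * dz + EPS * EPS
                    f = G * mass[j] / (r2 * r2 ** 0.5)
                    acc[i][0] += f * dx
                    acc[i][1] += f * dy
                    acc[i][2] += f * dz
            return acc

        n = 100
        pos = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n)]
        mass = [1.0 / n] * n
        print("acceleration of particle 0:", accelerations(pos, mass)[0])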

  19. Using a Computer Simulation to Improve Psychological Readiness for Job Interviewing in Unemployed Individuals of Pre-Retirement Age.

    PubMed

    Aysina, Rimma M; Efremova, Galina I; Maksimenko, Zhanna A; Nikiforov, Mikhail V

    2017-05-01

    Unemployed individuals of pre-retirement age face significant challenges in finding a new job. This may be partly due to their lack of psychological readiness to go through a job interview. We view psychological readiness as a component of psychological attitude: an active, conscious readiness to interact with a certain aspect of reality, based on previously acquired experience. It includes a person's particular competence to manage their activities and cope with anxiety. We created Job Interview Simulation Training (JIST), a computer-based simulator which allowed unemployed job seekers to practice interviewing repeatedly in a stress-free environment. We hypothesized that completion of JIST would be related to an increase in pre-retirement job seekers' psychological readiness for job interviewing in real life. Participants were randomized into control (n = 18) and experimental (n = 21) conditions. Both groups completed pre- and post-intervention job interview role-plays and self-report forms on psychological readiness for job interviewing. JIST consisted of 5 sessions of a simulated job interview, and the experimental group found it easy to use and navigate as well as helpful for preparing to interview. After finishing the JIST sessions, the experimental group showed a significant decrease in heart rate during the post-intervention role-play and a significant increase in self-rated psychological readiness, whereas the control group showed no changes in these variables. Future research may help clarify whether JIST is related to an increase in re-employment of pre-retirement job seekers.

  20. Predicting debris-flow initiation and run-out with a depth-averaged two-phase model and adaptive numerical methods

    NASA Astrophysics Data System (ADS)

    George, D. L.; Iverson, R. M.

    2012-12-01

    Numerically simulating debris-flow motion presents many challenges due to the complicated physics of flowing granular-fluid mixtures, the diversity of spatial scales (ranging from a characteristic particle size to the extent of the debris-flow deposit), and the unpredictability of the flow domain prior to a simulation. Accurately predicting debris flows requires models that are complex enough to represent the dominant effects of granular-fluid interaction, while remaining mathematically and computationally tractable. We have developed a two-phase depth-averaged mathematical model for debris-flow initiation and subsequent motion. Additionally, we have developed software that numerically solves the model equations efficiently on large domains. A unique feature of the mathematical model is that it includes the feedback between pore-fluid pressure and the evolution of the solid grain volume fraction, a process that regulates flow resistance. This feature endows the model with the ability to represent the transition from a stationary mass to a dynamic flow. With traditional approaches, slope stability analysis and flow simulation are treated separately, and the latter models are often initialized with force balances that are unrealistically far from equilibrium. Additionally, our new model relies on relatively few dimensionless parameters that are functions of well-known material properties constrained by physical data (e.g., hydraulic permeability, pore-fluid viscosity, debris compressibility, Coulomb friction coefficient). We have developed numerical methods and software for accurately solving the model equations. By employing adaptive mesh refinement (AMR), the software can efficiently resolve an evolving debris flow as it advances through irregular topography, without needing terrain-fit computational meshes. The AMR algorithms utilize multiple levels of grid resolution, so that computationally inexpensive coarse grids can be used where the flow is absent, and much higher resolution grids evolve with the flow. The reduction in computational cost due to AMR makes very large-scale problems tractable on personal computers. Model accuracy can be tested by comparison of numerical predictions and empirical data. These comparisons utilize controlled experiments conducted at the USGS debris-flow flume, which provide detailed data about flow mobilization and dynamics. Additionally, we have simulated historical large-scale debris flows, such as the (≈50 million m^3) debris flow that originated on Mt. Meager, British Columbia, in 2010. This flow took a very complex route through highly variable topography and provides a valuable benchmark for testing. Maps of the debris-flow deposit and data from seismic stations provide evidence regarding flow initiation, transit times and deposition. Our simulations reproduce many of the complex patterns of the event, such as run-out geometry and extent, and the large-scale nature of the flow and the complex topographical features demonstrate the utility of AMR in flow simulations.
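
    The payoff of AMR described above is that coarse cells cover the empty landscape while fine cells track the flow. The 1D flagging-and-subdivision sketch below shows that idea with assumed cell sizes, tolerances, and a toy flow snapshot; it is not the authors' actual algorithm.

        # The essence of the AMR strategy described above: keep a cheap coarse
        # grid, and refine only the cells where the flow actually is. This 1D
        # flagging-and-subdivision sketch is illustrative, not the authors'
        # production algorithm.

        COARSE_DX = 100.0     # coarse cell size, m (assumed)
        REFINE_RATIO = 4      # each flagged cell splits into 4 finer cells
        DEPTH_TOL = 0.01      # flag cells with flow depth above 1 cm (assumed)

        def flow_depth(x):
            """Toy debris-flow snapshot: a 300 m long flow centred at x = 1 km."""
            return 2.0 if 850.0 <= x <= 1150.0 else 0.0

        coarse = [COARSE_DX * (i + 0.5) for i in range(50)]   # cell centres, 0-5 km
        fine = []
        for xc in coarse:
            if flow_depth(xc) > DEPTH_TOL:                    # refinement criterion
                dx = COARSE_DX / REFINE_RATIO
                fine.extend(xc - COARSE_DX / 2 + dx * (k + 0.5)
                            for k in range(REFINE_RATIO))

        print(f"{len(coarse)} coarse cells, {len(fine)} fine cells covering "
              f"only the {len(fine) * COARSE_DX / REFINE_RATIO:.0f} m with flow")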

  1. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.
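
    Bivariate function generation of the kind delegated to the digital computer amounts to two-dimensional table interpolation in component performance maps. The sketch below shows bilinear lookup in a small map; the grid and values are invented for illustration, not TF30 data.

        import bisect

        # Bivariate function generation of the kind the digital computer
        # performed: bilinear interpolation in a component performance map.
        # The grid and table values below are illustrative, not TF30 data.

        SPEED = [60.0, 80.0, 100.0]          # corrected fan speed, percent
        RATIO = [1.0, 1.5, 2.0]              # fan pressure ratio
        AIRFLOW = [[55.0, 55.0, 48.0],       # corrected airflow table,
                   [80.0, 78.0, 70.0],       # rows indexed by SPEED,
                   [105.0, 102.0, 90.0]]     # columns by RATIO

        def bilinear(x, y):
            """Interpolate AIRFLOW at speed x, pressure ratio y."""
            i = min(max(bisect.bisect_right(SPEED, x) - 1, 0), len(SPEED) - 2)
            j = min(max(bisect.bisect_right(RATIO, y) - 1, 0), len(RATIO) - 2)
            tx = (x - SPEED[i]) / (SPEED[i + 1] - SPEED[i])
            ty = (y - RATIO[j]) / (RATIO[j + 1] - RATIO[j])
            top = AIRFLOW[i][j] * (1 - ty) + AIRFLOW[i][j + 1] * ty
            bot = AIRFLOW[i + 1][j] * (1 - ty) + AIRFLOW[i + 1][j + 1] * ty
            return top * (1 - tx) + bot * tx

        print(f"airflow at 90% speed, ratio 1.75: {bilinear(90.0, 1.75):.1f} lb/s")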

  2. 14 CFR 121.441 - Proficiency checks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... certificate holder may use any person nor may any person serve as a required pilot flight crewmember unless that person has satisfactorily completed either a proficiency check, or an approved simulator course of... check or the simulator training. (2) For all other pilots— (i) Within the preceding 24 calendar months...

  3. 14 CFR 121.441 - Proficiency checks.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... certificate holder may use any person nor may any person serve as a required pilot flight crewmember unless that person has satisfactorily completed either a proficiency check, or an approved simulator course of... check or the simulator training. (2) For all other pilots— (i) Within the preceding 24 calendar months...

  4. Developing a rich definition of the person/residence to support person-oriented models of consumer product usage

    EPA Science Inventory

    Person Oriented Models (POMs) provide a basis for simulating aggregate chemical exposures in a population over time (Price and Chaisson, 2005). POMs assign characteristics to simulated individuals that are used to determine the individual's probability of interacting with e...

  5. Business Profitability: A Simulation Study of Personalities and Organizational Success.

    ERIC Educational Resources Information Center

    Carsrud, A. L.; And Others

    There is no clearly established link between business success and the personality characteristics of the individual. To investigate the effect of the personalities of potential entrepreneurs/business owners on success, college senior business administration majors (N=152) participted in a business simulation game. Subjects first completed the Work…

  6. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  7. Do personal computers make doctors less personal?

    PubMed Central

    Rethans, Jan-Joost; Höppener, Paul; Wolfs, George; Diederiks, Jos

    1988-01-01

    Ten months after the installation of a computer in a general practice surgery a postal survey (piloted questionnaire) was sent to 390 patients. The patients' views of their relationship with their doctor after the computer was introduced were compared with their view of their relationship before the installation of the computer. More than 96% of the patients (n=263) stated that contact with their doctor was as easy and as personal as before. Most stated that the computer did not influence the duration of the consultation. Eighty one patients (30%) stated, however, that they thought that their privacy was reduced. Unlike studies of patients' attitudes performed before any actual experience of use of a computer in general practice, this study found that patients have little difficulty in accepting the presence of a computer in the consultation room. Nevertheless, doctors should inform their patients about any connections between their computer and other, external computers to allay fears about a decrease in privacy. PMID:3132287

  8. A Computer-Supported Method to Reveal and Assess Personal Professional Theories in Vocational Education

    ERIC Educational Resources Information Center

    van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.

    2016-01-01

    This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…

  9. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)
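
    DYNAMO programs reduce to level (stock) equations updated from rate (flow) equations by Euler integration. The sketch below mimics that structure in Python for an invented inventory-adjustment model; the parameters are assumptions for illustration.

        # A DYNAMO-style continuous-system model reduced to its essentials:
        # level (stock) equations updated from rate (flow) equations by Euler
        # integration. The inventory-ordering model below is illustrative.

        DT = 0.25                 # integration step, weeks
        TARGET = 100.0            # desired inventory (assumed)
        ADJUST_TIME = 4.0         # weeks to correct a discrepancy (assumed)
        SHIPMENTS = 10.0          # constant outflow, units/week (assumed)

        inventory = 60.0          # initial level
        for step in range(int(20 / DT)):                  # simulate 20 weeks
            orders = SHIPMENTS + (TARGET - inventory) / ADJUST_TIME  # rate equation
            inventory += DT * (orders - SHIPMENTS)                   # level equation
            if step % 16 == 15:
                print(f"week {(step + 1) * DT:4.0f}: inventory = {inventory:6.1f}")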

  10. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.
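
    The abstract does not specify PSimNet's wire format, so the sketch below shows only the generic framing such simulation protocols typically use: a fixed header carrying a message type, sequence number, and payload length ahead of a payload of simulation state. The field layout and type code are assumptions.

        import struct

        # Generic message framing of the kind training-simulation protocols
        # use; the PSimNet wire format itself is not given in the abstract,
        # so this field layout is an assumption.

        HEADER = struct.Struct("!HII")   # type, sequence, payload length (network order)

        def encode(msg_type, seq, payload):
            """Frame one simulation message for the wire."""
            return HEADER.pack(msg_type, seq, len(payload)) + payload

        def decode(frame):
            """Split a received frame back into header fields and payload."""
            msg_type, seq, length = HEADER.unpack_from(frame)
            payload = frame[HEADER.size:HEADER.size + length]
            return msg_type, seq, payload

        STATE_UPDATE = 7                 # hypothetical message-type code
        wire = encode(STATE_UPDATE, 42, struct.pack("!3d", 1.0, 2.0, 3.0))
        msg_type, seq, payload = decode(wire)
        print(msg_type, seq, struct.unpack("!3d", payload))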

  11. Reversible simulation of irreversible computation

    NASA Astrophysics Data System (ADS)

    Li, Ming; Tromp, John; Vitányi, Paul

    1998-09-01

    Computer computations are generally irreversible, while the laws of physics are reversible. This mismatch is penalized by, among other things, the generation of excess thermal entropy during the computation. Computing performance has improved to the extent that efficiency degrades unless all algorithms are executed reversibly, for example by a universal reversible simulation of irreversible computations. All known reversible simulations are either space hungry or time hungry. The leanest method was proposed by Bennett and can be analyzed using a simple 'reversible' pebble game. The reachable reversible simulation instantaneous descriptions (pebble configurations) of such pebble games are characterized completely. As a corollary we obtain the reversible simulation by Bennett and, moreover, show that it is a space-optimal pebble game. We also introduce irreversible steps and give a theorem on the tradeoff between the number of allowed irreversible steps and the memory gain in the pebble game. In this resource-bounded setting the limited erasing needs to be performed at precise instants during the simulation. The reversible simulation can be modified so that it is applicable also when the simulated computation time is unknown.
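
    Bennett's pebble game can be stated in a dozen lines: a pebble may be placed at or removed from node i+1 only while node i is pebbled, and the recursive strategy below reaches node 2^n while keeping only O(n) pebbles on the board, at the cost of 3^n moves instead of 2^n. This is a sketch of the standard strategy, not the paper's full analysis.

        # Reversible pebble game behind Bennett's space-saving simulation:
        # a pebble may be placed or removed at node i+1 only while node i
        # carries a pebble. The recursive strategy reaches node 2**n with
        # O(n) pebbles at the cost of 3**n moves.

        moves = []

        def pebble(base, n, put):
            """Recursively (un)pebble the node 2**n steps beyond a pebbled base."""
            if n == 0:
                moves.append(("put" if put else "remove", base + 1))
                return
            half = 2 ** (n - 1)
            pebble(base, n - 1, True)         # pebble the midpoint
            pebble(base + half, n - 1, put)   # (un)pebble the far end
            pebble(base, n - 1, False)        # reversibly clear the midpoint

        pebble(0, 3, True)       # reach checkpoint 8 of a 2**3-step computation
        placed, peak = {0}, 1    # node 0 starts pebbled
        for op, node in moves:
            if op == "put":
                placed.add(node)
            else:
                placed.discard(node)
            peak = max(peak, len(placed))
        print(f"{len(moves)} moves, peak of {peak} pebbles to reach node 8")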

  12. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700 and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
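
    At the core of such a simulator, the register is a vector of 2^n complex amplitudes, and a one-qubit gate updates each pair of amplitudes that differ only in the target bit; memory doubles with every added qubit, which is why 36 qubits of double-precision amplitudes approach 1 TB. A minimal single-node sketch (the parallel decomposition across processors is the paper's contribution and is not shown here):

        import math

        # The register is a vector of 2**n complex amplitudes; a one-qubit
        # gate touches each amplitude pair differing only in the target bit.

        def apply_single_qubit_gate(state, n_qubits, target, g00, g01, g10, g11):
            """Apply a 2x2 gate to `target` (0 = least significant qubit)."""
            stride = 1 << target
            for base in range(0, 1 << n_qubits, stride << 1):
                for i in range(base, base + stride):
                    a0, a1 = state[i], state[i + stride]
                    state[i] = g00 * a0 + g01 * a1
                    state[i + stride] = g10 * a0 + g11 * a1

        n = 3
        state = [0j] * (1 << n)
        state[0] = 1 + 0j                       # start in |000>
        h = 1 / math.sqrt(2)
        for q in range(n):                      # Hadamard on every qubit
            apply_single_qubit_gate(state, n, q, h, h, h, -h)
        print([round(abs(a) ** 2, 3) for a in state])   # uniform superposition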

  13. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  14. User's Guide for Mixed-Size Sediment Transport Model for Networks of One-Dimensional Open Channels

    USGS Publications Warehouse

    Bennett, James P.

    2001-01-01

    This user's guide describes a mathematical model for predicting the transport of mixed sizes of sediment by flow in networks of one-dimensional open channels. The simulation package is useful for general sediment routing problems, prediction of erosion and deposition following dam removal, and scour in channels at road embankment crossings or other artificial structures. The model treats input hydrographs as stepwise steady-state, and the flow computation algorithm automatically switches between sub- and supercritical flow as dictated by channel geometry and discharge. A variety of boundary conditions including weirs and rating curves may be applied both externally and internally to the flow network. The model may be used to compute flow around islands and through multiple openings in embankments, but the network must be 'simple' in the sense that the flow directions in all channels can be specified before simulation commences. The location and shape of channel banks are user specified, and all bed-elevation changes take place between these banks and above a user-specified bedrock elevation. Computation of sediment transport emphasizes the sand-size range (0.0625-2.0 millimeter) but the user may select any desired range of particle diameters including silt and finer (<0.0625 millimeter). As part of data input, the user may set the original bed-sediment composition of any number of layers of known thickness. The model computes the time evolution of total transport and the size composition of bed- and suspended-load sand through any cross section of interest. It also tracks bed-surface elevation and size composition. The model is written in the FORTRAN programming language for implementation on personal computers using the WINDOWS operating system and, along with certain graphical output display capability, is accessed from a graphical user interface (GUI). The GUI provides a framework for selecting input files and parameters of a number of components of the sediment-transport process. There are no restrictions in the use of the model as to numbers of channels, channel junctions, cross sections per channel, or points defining the cross sections. Following completion of the simulation computations, the GUI accommodates display of longitudinal plots of either bed elevation and size composition, or of transport rate and size composition of the various components, for individual channels and selected times during the simulation period. For individual cross sections, the GUI also allows display of time series of transport rate and size composition of the various components and of bed elevation and size composition.
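
    The sub-/supercritical switching mentioned above hinges on the Froude number Fr = v / sqrt(g * h). The sketch below applies that criterion to a rectangular channel with invented discharges and geometries; the model's actual algorithm handles arbitrary cross sections and is more general.

        import math

        # The sub-/supercritical switch comes down to the Froude number
        # Fr = v / sqrt(g * h). This rectangular-channel check is a simplified
        # illustration of that criterion, not the model's solver.

        G = 9.81   # gravitational acceleration, m/s^2

        def flow_regime(discharge, width, depth):
            """Classify open-channel flow from discharge (m^3/s) and geometry (m)."""
            velocity = discharge / (width * depth)
            froude = velocity / math.sqrt(G * depth)
            if froude < 1.0:
                return froude, "subcritical"
            return froude, "supercritical" if froude > 1.0 else "critical"

        for q, w, h in [(5.0, 10.0, 1.0), (5.0, 2.0, 0.3)]:
            fr, regime = flow_regime(q, w, h)
            print(f"Q={q} m^3/s, width={w} m, depth={h} m: Fr={fr:.2f} ({regime})")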

  15. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming, and therefore directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews the experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
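
    As a concrete instance of a space-filling construction, the sketch below draws random Latin hypercube designs and keeps the one with the largest minimum pairwise distance. This is the classic maximin criterion rather than the maximum projection design the review emphasizes; it is meant only to illustrate the kind of design being discussed.

    ```python
    import numpy as np

    def latin_hypercube(n, d, rng):
        """Random Latin hypercube: n points in [0,1]^d such that each
        of the n equal slices of every axis holds exactly one point."""
        perms = np.argsort(rng.random((d, n)), axis=1)
        return ((perms + rng.random((d, n))) / n).T

    def maximin_lhd(n, d, tries=200, seed=0):
        """Among random Latin hypercubes, keep the one maximizing the
        minimum pairwise distance (a common space-filling criterion)."""
        rng = np.random.default_rng(seed)
        best, best_score = None, -1.0
        for _ in range(tries):
            x = latin_hypercube(n, d, rng)
            dists = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
            score = dists[np.triu_indices(n, k=1)].min()
            if score > best_score:
                best, best_score = x, score
        return best

    design = maximin_lhd(n=10, d=2)  # 10 runs for a 2-factor simulator
    ```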

  16. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time consuming, and therefore directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews the experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  17. Computer Support of Operator Training: Constructing and Testing a Prototype of a CAL (Computer Aided Learning) Supported Simulation Environment.

    ERIC Educational Resources Information Center

    Zillesen, P. G. van Schaick; And Others

    Instructional feedback given to the learners during computer simulation sessions may be greatly improved by integrating educational computer simulation programs with hypermedia-based computer-assisted learning (CAL) materials. A prototype of a learning environment of this type called BRINE PURIFICATION was developed for use in corporate training…

  18. An intelligent rollator for mobility impaired persons, especially stroke patients.

    PubMed

    Hellström, Thomas; Lindahl, Olof; Bäcklund, Tomas; Karlsson, Marcus; Hohnloser, Peter; Bråndal, Anna; Hu, Xiaolei; Wester, Per

    2016-07-01

    An intelligent rollator (IRO) was developed that aims at obstacle detection and guidance to avoid collisions and accidental falls. The IRO is a retrofit four-wheeled rollator with an embedded computer, two solenoid brakes, rotation sensors on the wheels and IR distance sensors. The value reported by each distance sensor was compared in the computer to a nominal distance. A deviation indicated the presence of an obstacle and caused activation of one of the brakes in order to steer the direction of motion away from the obstacle. The IRO was tested by seven healthy subjects with simulated restricted and blurred sight and five stroke subjects on a standardised indoor track with obstacles. All tested subjects walked faster with the intelligence deactivated. Three out of five stroke patients experienced more detected obstacles with the intelligence activated. This suggests enhanced safety during walking with the IRO. Further studies are required to explore the full value of the IRO.
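
    The control rule described above, in which each reading is compared with its obstacle-free nominal value and a solenoid brake is engaged to steer away, can be sketched as follows. The sensor layout, nominal distances and threshold are invented for illustration; the actual IRO logic is more elaborate.

    ```python
    # Assumed two forward-facing IR sensors with known obstacle-free
    # nominal readings; a large shortfall implies an obstacle.
    NOMINAL_MM = {"left": 800, "right": 800}
    THRESHOLD_MM = 250  # deviation treated as an obstacle (assumed)

    def brake_command(readings_mm):
        """Return which solenoid brake to engage, or None."""
        for side in ("left", "right"):
            if NOMINAL_MM[side] - readings_mm[side] > THRESHOLD_MM:
                # Obstacle on this side: brake the opposite wheel so
                # the rollator turns away from the obstacle.
                return "right" if side == "left" else "left"
        return None

    print(brake_command({"left": 400, "right": 820}))  # -> 'right'
    ```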

  19. Satellite interference analysis and simulation using personal computers

    NASA Astrophysics Data System (ADS)

    Kantak, Anil

    1988-03-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles, generally related to the land mask of the receiving station site, for both satellites. Formulas for considering the Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effects of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided, with a flowchart, a sample run, results of the run, and the program code.
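
    As a taste of the Doppler portion of the analysis: in the non-relativistic limit, the received-frequency offset is proportional to the line-of-sight range rate between station and satellite. The report derives the full geometry from the orbital elements; the numbers below are assumed for illustration.

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def doppler_offset(f_carrier_hz, range_rate_m_s):
        """Frequency offset for a given range rate
        (positive range rate = satellite receding)."""
        return -f_carrier_hz * range_rate_m_s / C

    # A 4 GHz downlink closing at 3 km/s (values assumed):
    print(doppler_offset(4e9, -3000.0))  # ~ +40 kHz
    ```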

  20. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms the crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller and a data logging unit with an SD card. Data collected on the card, as text files, are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in the standards.
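
    The filtering step can be illustrated with a zero-phase low-pass filter applied to a noisy synthetic pulse. Standards such as SAE J211 specify channel frequency class (CFC) filters with particular coefficients; the Butterworth/filtfilt combination below is only an illustrative stand-in, and the sampling rate and cutoff are assumed.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def filter_crash_pulse(accel_g, fs_hz, cutoff_hz):
        """Zero-phase low-pass filtering of a sampled acceleration
        trace (illustrative; not the standard CFC coefficients)."""
        b, a = butter(2, cutoff_hz / (fs_hz / 2), btype='low')
        return filtfilt(b, a, accel_g)

    fs = 10_000.0                      # 10 kHz sampling (assumed)
    t = np.arange(0.0, 0.2, 1.0 / fs)
    raw = (30 * np.exp(-((t - 0.05) / 0.01) ** 2)   # synthetic pulse
           + np.random.normal(0.0, 1.0, t.size))    # sensor noise
    pulse = filter_crash_pulse(raw, fs, cutoff_hz=300.0)
    ```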

  1. Intelligent computer-aided training authoring environment

    NASA Technical Reports Server (NTRS)

    Way, Robert D.

    1994-01-01

    Although there has been much research into intelligent tutoring systems (ITS), there are few authoring systems available that support ITS metaphors. Instructional developers are generally obliged to use tools designed for creating on-line books. We are currently developing an authoring environment derived from NASA's research on intelligent computer-aided training (ICAT). The ICAT metaphor, currently in use at NASA, has proven effective in disciplines from satellite deployment to high school physics. This technique provides a personal trainer (PT) who instructs the student using a simulated work environment (SWE). The PT acts as a tutor, providing individualized instruction and assistance to each student. Teaching in an SWE allows the student to learn tasks by doing them, rather than by reading about them. This authoring environment will expedite ICAT development by providing a tool set that guides the trainer modeling process. Additionally, this environment provides a vehicle for distributing NASA's ICAT technology to the private sector.

  2. Satellite Interference Analysis and Simulation Using Personal Computers

    NASA Technical Reports Server (NTRS)

    Kantak, Anil

    1988-01-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles, generally related to the land mask of the receiving station site, for both satellites. Formulas for considering the Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effects of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided, with a flowchart, a sample run, results of the run, and the program code.

  3. [Use of personal computers by diplomates of anesthesiology in Japan].

    PubMed

    Yamamoto, K; Ohmura, S; Tsubokawa, T; Kita, M; Kushida, Y; Kobayashi, T

    1999-04-01

    Use of personal computers by diplomates of the Japanese Board of Anesthesiology working in Japanese university hospitals was investigated. Unsigned questionnaires were returned by 232 diplomates from 18 anesthesia departments. The ages of the respondents ranged from the twenties to the sixties. Personal computer systems are used by 223 diplomates (96.1%), while nine (3.9%) do not use them. The computer systems used are: Apple Macintosh 77%, IBM-compatible PC 21%, and UNIX 2%. Although 197 diplomates have e-mail addresses, only 162 of them actually send and receive e-mail. Diplomates in their fifties use e-mail most actively, and those in their sixties come second.

  4. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer-simulated, biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  5. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  6. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at

    PubMed Central

    Bukowski, Henryk; Hietanen, Jari K.; Samson, Dana

    2015-01-01

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations. PMID:26924936

  7. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    PubMed Central

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826

  8. Geometric data perturbation-based personal health record transactions in cloud computing.

    PubMed

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  9. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at.

    PubMed

    Bukowski, Henryk; Hietanen, Jari K; Samson, Dana

    2015-09-14

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.

  10. How we remember what we can do

    PubMed Central

    Declerck, Gunnar

    2015-01-01

    According to the motor simulation theory, the knowledge we possess of what we can do is based on simulation mechanisms triggered by an off-line activation of the brain areas involved in motor control. Action capabilities memory does not work by storing some content, but consists in the capacity, rooted in sensory-motor systems, to reenact off-line action sequences exhibiting the range of our powers. In this paper, I present several arguments from cognitive neuropsychology, but also first-person analysis of experience, against this hypothesis. The claim that perceptual access to affordances is mediated by motor simulation processes rests on a misunderstanding of what affordances are, and comes up against a computational reality principle. Motor simulation cannot provide access to affordances because (i) the affordances we are aware of at each moment are too many for their realization to be simulated by the brain and (ii) affordances are not equivalent to currently or personally feasible actions. The explanatory significance of the simulation theory must then be revised downwards compared to what is claimed by most of its advocates. One additional challenge is to determine the prerequisite, in terms of cognitive processing, for the motor simulation mechanisms to work. To overcome the limitations of the simulation theory, I propose a new approach: the direct content specification hypothesis. This hypothesis states that, at least for the most basic actions of our behavioral repertoire, the action possibilities we are aware of through perception are directly specified by perceptual variables characterizing the content of our experience. The cognitive system responsible for the perception of action possibilities is consequently far more direct, in terms of cognitive processing, than what is stated by the simulation theory. To support this hypothesis I review evidence from current neuropsychological research, in particular data suggesting a phenomenon of ‘fossilization’ of affordances. Fossilization can be defined as a gap between the capacities that are treated as available by the cognitive system and the capacities this system really has at its disposal. These considerations do not mean that motor simulation cannot contribute to explain how we gain perceptual knowledge of what we can do based on the memory of our past performances. However, when precisely motor simulation plays a role and what it is for exactly currently remain largely unknown. PMID:26507953

  11. Heat transfer simulation in a vertical Bridgman CdTe growth configuration

    NASA Astrophysics Data System (ADS)

    Martinez-Tomas, C.; Muñoz, V.; Triboulet, R.

    1999-02-01

    Modelling and numerical simulation of crystal growth processes have been shown to be powerful tools for understanding the physical effects of different parameters on the growth conditions. In this study, a finite difference/control volume technique for the study of heat transfer has been employed. This model takes into account the whole system: furnace temperature profile, air gap between furnace walls and ampoule, ampoule geometry, crucible coating if any, solid and liquid CdTe thermal properties, conduction, convection and radiation of heat, and phase change. We used the commercial code FLUENT, which can be run on a personal computer, for the numerical solution. Results show that the temperature field is very sensitive to the peculiarities of the charge and ampoule. As a consequence, significant differences between the velocity of the ampoule and that of the isotherm determining the solid/liquid interface have been found at the onset of growth.
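
    As a much-reduced illustration of the finite-difference side of such a model, the sketch below relaxes a one-dimensional conduction profile between two imposed end temperatures. The real model is multidimensional and adds convection, radiation and phase change; every number here is an assumption.

    ```python
    import numpy as np

    L, nx = 0.1, 101              # 10 cm domain, grid points (assumed)
    alpha = 1e-6                  # thermal diffusivity, m^2/s (assumed)
    dx = L / (nx - 1)
    dt = 0.4 * dx**2 / alpha      # satisfies the explicit stability limit

    T = np.full(nx, 1300.0)       # initial melt temperature, K
    T[0], T[-1] = 1400.0, 1200.0  # imposed furnace-profile endpoints

    for _ in range(20_000):       # march toward a near-steady profile
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    ```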

  12. Scaled Jump in Gravity-Reduced Virtual Environments.

    PubMed

    Kim, MyoungGon; Cho, Sunglk; Tran, Tanh Quang; Kim, Seong-Pil; Kwon, Ohung; Han, JungHyun

    2017-04-01

    The reduced gravity experienced on lunar or Martian surfaces can be simulated on Earth using a cable-driven system, in which the cable lifts a person to reduce his or her weight. This paper presents a novel cable-driven system designed for this purpose. It is integrated with a head-mounted display and a motion capture system. Focusing on jump motion within the system, this paper proposes to scale the jump and reports the experiments conducted to quantify the extent to which a jump can be scaled without the discrepancy between physical and virtual jumps being noticed by the user. With the tolerable range of scaling computed from these experiments, an application named retargeted jump is developed, in which a user can jump up onto virtual objects while physically jumping on a flat real-world floor. The core techniques presented in this paper can be extended to develop extreme-sport simulators such as parasailing and skydiving.
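
    The physics motivating the jump scaling is plain ballistics: apex height is v^2/(2g), so an unchanged takeoff speed carries a jumper roughly six times higher under lunar gravity. A worked example with an assumed takeoff speed:

    ```python
    def jump_height(v_takeoff_m_s, g_m_s2):
        """Ballistic apex height for a given takeoff speed and gravity."""
        return v_takeoff_m_s**2 / (2.0 * g_m_s2)

    v = 2.4  # vigorous takeoff speed, m/s (assumed)
    for body, g in [("Earth", 9.81), ("Mars", 3.71), ("Moon", 1.62)]:
        print(f"{body}: {jump_height(v, g):.2f} m")
    # Earth ~0.29 m, Mars ~0.78 m, Moon ~1.78 m: the cable offload makes
    # a physical jump behave as if g were reduced, and retargeted jump
    # scales the virtual jump further within the unnoticeable range.
    ```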

  13. Space Station Freedom Data Assessment Study

    NASA Technical Reports Server (NTRS)

    Johnson, Anngienetta R.; Deskevich, Joseph

    1990-01-01

    The SSF Data Assessment Study was initiated to identify payload and operations data requirements to be supported in the Space Station era. To initiate the study payload requirements from the projected SSF user community were obtained utilizing an electronic questionnaire. The results of the questionnaire were incorporated in a personal computer compatible database used for mission scheduling and end-to-end communications analyses. This paper discusses data flow paths and associated latencies, communications bottlenecks, resource needs versus availability, payload scheduling 'warning flags' and payload data loading requirements for each major milestone in the Space Station buildup sequence. This paper also presents the statistical and analytical assessments produced using the data base, an experiment scheduling program, and a Space Station unique end-to-end simulation model. The modeling concepts and simulation methodologies presented in this paper provide a foundation for forecasting communication requirements and identifying modeling tools to be used in the SSF Tactical Operations Planning (TOP) process.

  14. Envisioning the Handheld-Centric Classroom

    ERIC Educational Resources Information Center

    Norris, Cathleen; Soloway, Elliot

    2004-01-01

    While appropriate as an initial focus, it is time that the educational community move beyond an emphasis on 1:1 computing (each child having his/her own personal computer) to a vision of a handheld-centric classroom, where each child not only has his/her own personal, handheld computer, but also has access to networked PCs, probeware, digital…

  15. Computer Technology and Persons with Disabilities: Proceedings of the Conference (Northridge, California, October 17-19, 1985).

    ERIC Educational Resources Information Center

    Murphy, Harry J.

    Twenty-seven papers are presented from a conference on applications of computer technology for disabled persons. The following titles and authors are represented: "Computer Applications For Rehabilitation Organizations: Finding What You Need" (T. Backer); "Similarities In Cognitive Development Of Severely Physically Handicapped and Younger Regular…

  16. The Relationships among Unethical Computer Usage Behavior and Some Personality Characteristics of Turkish University Students

    ERIC Educational Resources Information Center

    Ceyhan, A. Aykut; Ceyhan, Esra

    2007-01-01

    This research aims to examine the relationships between unethical computer usage behavior and the personality characteristics of locus of control, adjustment to social norms, antisocial tendency, and aggression among Turkish university students. The research was applied to 217 university students. Data were collected through the Unethical Computer Using…

  17. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  18. Verifying the Simulation Hypothesis via Infinite Nested Universe Simulacrum Loops

    NASA Astrophysics Data System (ADS)

    Sharma, Vikrant

    2017-01-01

    The simulation hypothesis proposes that local reality exists as a simulacrum within a hypothetical computer's dimension. More specifically, Bostrom's trilemma proposes that the number of simulations an advanced 'posthuman' civilization could produce makes the proposition very likely. In this paper a hypothetical method to verify the simulation hypothesis is discussed, using infinite regression applied to a new type of infinite loop. Assign dimension n to any computer in our present reality, where dimension signifies the hierarchical level, within nested simulations, at which our reality exists. A computer simulating known reality would be dimension (n-1), and likewise a computer simulating an artificial reality, such as a video game, would be dimension (n+1). In this method, four key assumptions (among others) are made about the nature of the original computer dimension n. Summations show that regressing such a reality infinitely will create convergence, implying that it is feasible, with adequate compute capability, to verify whether local reality is a grand simulation. The action of reaching said convergence point halts the simulation of local reality. Sensitivities to the four assumptions and their implications are discussed.
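
    The convergence argument is not spelled out in the abstract; one plausible reading (an assumption on our part, not the paper's derivation) is a geometric series over nesting depth, in which each simulated level runs on a fraction r < 1 of its host's resources:

    ```latex
    % Total compute across infinitely many nested levels, assuming the
    % dimension-(n-1) simulation receives a fraction r < 1 of the
    % compute C available at dimension n:
    \sum_{k=0}^{\infty} C\,r^{k} = \frac{C}{1-r}, \qquad 0 < r < 1,
    % a finite quantity, consistent with the claimed convergence.
    ```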

  19. Bipartite graphs as models of population structures in evolutionary multiplayer games.

    PubMed

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, "games on graphs" study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner's dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner's dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures.

  20. Computer-based, personalized cognitive training versus classical computer games: a randomized double-blind prospective trial of cognitive stimulation.

    PubMed

    Peretz, Chava; Korczyn, Amos D; Shatil, Evelyn; Aharonson, Vered; Birnboim, Smadar; Giladi, Nir

    2011-01-01

    Many studies have suggested that cognitive training can result in cognitive gains in healthy older adults. We investigated whether personalized computerized cognitive training provides greater benefits than those obtained by playing conventional computer games. This was a randomized double-blind interventional study. Self-referred healthy older adults (n = 155, 68 ± 7 years old) were assigned to either a personalized, computerized cognitive training or to a computer games group. Cognitive performance was assessed at baseline and after 3 months by a neuropsychological assessment battery. Differences in cognitive performance scores between and within groups were evaluated using mixed effects models in 2 approaches: adherence only (AO; n = 121) and intention to treat (ITT; n = 155). Both groups improved in cognitive performance. The improvement in the personalized cognitive training group was significant (p < 0.03, AO and ITT approaches) in all 8 cognitive domains. However, in the computer games group it was significant (p < 0.05) in only 4 (AO) or 6 domains (ITT). In the AO analysis, personalized cognitive training was significantly more effective than playing games in improving visuospatial working memory (p = 0.0001), visuospatial learning (p = 0.0012) and focused attention (p = 0.0019). Personalized, computerized cognitive training appears to be more effective than computer games in improving cognitive performance in healthy older adults. Further studies are needed to evaluate the ecological validity of these findings. Copyright © 2011 S. Karger AG, Basel.

  1. Bluetooth Low Energy Peripheral Android Health App for Educational and Interoperability Testing Purposes.

    PubMed

    Frohner, Matthias; Urbauer, Philipp; Sauermann, Stefan

    2017-01-01

    Based on recent telemonitoring activities in Austria for enabling integrated health care, the communication interfaces between personal health devices (e.g. blood pressure monitors) and personal health gateway devices (e.g. smartphones routing received information to wide area networks) play an important role. In order to ease testing of the Bluetooth Low Energy interface functionality of personal health gateway devices, a personal health device simulator was developed. Based on specifications from the Bluetooth SIG, an XML software test configuration file structure is defined that declares the specific features of the personal health devices simulated. Using this configuration file, different scenarios are defined, e.g. sending a single measurement result from a blood pressure reading or sending multiple (historic) weight scale readings. The simulator is intended to be used for educational purposes in lectures, where the number of physical personal health devices can be reduced and learning can be improved. It could be shown that this simulator assists the development process of mHealth applications by reducing the time needed for development and testing.
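
    A scenario file in the spirit of the XML test configuration described might look like the following; the element and attribute names are invented here, since the paper's schema is not reproduced in the abstract.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical scenario definition for a simulated blood pressure
    # monitor; tag and attribute names are illustrative assumptions.
    SCENARIO = """
    <device type="blood-pressure-monitor">
      <scenario name="single-measurement">
        <reading systolic="122" diastolic="81" pulse="64"/>
      </scenario>
    </device>
    """

    root = ET.fromstring(SCENARIO)
    for scenario in root.iter("scenario"):
        for reading in scenario.iter("reading"):
            print(scenario.get("name"), dict(reading.attrib))
    ```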

  2. Personalization through the Application of Inverse Bayes to Student Modeling

    ERIC Educational Resources Information Center

    Lang, Charles William McLeod

    2015-01-01

    Personalization, the idea that teaching can be tailored to each student's needs, has been a goal for the educational enterprise for at least 2,500 years (Regian, Shute, & Shute, 2013, p.2). Recently personalization has picked up speed with the advent of mobile computing, the Internet and increases in computer processing power. These changes…

  3. Computer-Based Script Training for Aphasia: Emerging Themes from Post-Treatment Interviews

    ERIC Educational Resources Information Center

    Cherney, Leora R.; Halper, Anita S.; Kaye, Rosalind C.

    2011-01-01

    This study presents results of post-treatment interviews following computer-based script training for persons with chronic aphasia. Each of the 23 participants received 9 weeks of AphasiaScripts training. Post-treatment interviews were conducted with the person with aphasia and/or a significant other person. The 23 interviews yielded 584 coded…

  4. INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?

    PubMed

    Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P

    2015-01-01

    Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality. Sorting the population once reduced simulation time by a factor of two. Storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.

  5. Computing with Beowulf

    NASA Technical Reports Server (NTRS)

    Cohen, Jarrett

    1999-01-01

    Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems is still a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems, after the 11th-century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center, a laboratory for the Earth and space sciences, where computing managers threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the Universities Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.

  6. Expertise transfer for expert system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boose, J.H.

    This book is about the Expertise Transfer System, a computer program which interviews experts and helps them build expert systems, i.e. computer programs that use knowledge from experts to make decisions and judgements under conditions of uncertainty. The techniques are useful to anyone who uses decision-making information based on the expertise of others. The methods can also be applied to personal decision-making. The interviewing methodology is borrowed from a branch of psychology called Personal Construct Theory. It is not necessary to use a computer to take advantage of the techniques from Personal Construct Theory; the fundamental procedures used by the Expertise Transfer System can be performed using paper and pencil. It is not necessary that the reader understand very much about computers to understand the ideas in this book. The few relevant concepts from computer science and expert systems that are needed are explained in a straightforward manner. Ideas from Personal Construct Psychology are also introduced as needed.

  7. A pathway to personalization of integrated treatment: informatics and decision science in psychiatric rehabilitation.

    PubMed

    Spaulding, William; Deogun, Jitender

    2011-09-01

    Personalization of treatment is a current strategic goal for improving health care. Integrated treatment approaches such as psychiatric rehabilitation benefit from personalization because they involve matching diverse arrays of treatment options to individually unique profiles of need. The need for personalization is evident in the heterogeneity of people with severe mental illness and in the findings of experimental psychopathology. One pathway to personalization lies in analysis of the judgments and decision making of human experts and other participants as they respond to complex circumstances in pursuit of treatment and rehabilitation goals. Such analysis is aided by computer simulation of human decision making, which in turn informs development of computerized clinical decision support systems. This inspires a research program involving concurrent development of databases, domain ontology, and problem-solving algorithms, toward the goal of personalizing psychiatric rehabilitation through human collaboration with intelligent cyber systems. The immediate hurdle is to demonstrate that clinical decisions beyond diagnosis really do affect outcome. This can be done by supporting the hypothesis that a human treatment team with access to a reasonably comprehensive clinical database that tracks patient status and treatment response over time achieves better outcome than a treatment team without such access, in a controlled experimental trial. Provided the hypothesis can be supported, the near future will see prototype systems that can construct an integrated assessment, formulation, and rehabilitation plan from clinical assessment data and contextual information. This will lead to advanced systems that collaborate with human decision makers to personalize psychiatric rehabilitation and optimize outcome.

  8. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  9. Simulation Testing for Selection of Critical Care Medicine Trainees. A Pilot Feasibility Study.

    PubMed

    Cocciante, Adriano G; Nguyen, Martin N; Marane, Candida F; Panayiotou, Anita E; Karahalios, Amalia; Beer, Janet A; Johal, Navroop; Morris, John; Turner, Stacy; Hessian, Elizabeth C

    2016-04-01

    Selection of physicians into anesthesiology, intensive care, and emergency medicine training has traditionally relied on evaluation of curriculum vitae, letters of recommendation, and interviews, despite these methods being poor predictors of subsequent workplace performance. In this study, we evaluated the feasibility and face validity of incorporating assessment of nontechnical skills in simulation and personality traits into an existing junior doctor selection framework. Candidates short-listed for a critical care residency position were invited to participate in the study. On the interview day, consenting candidates participated in a simulation scenario and debriefing and completed a personality test (16 Personality Factor Questionnaire) and a survey. Timing of participants' progression through the stations and faculty staff numbers were evaluated. Nontechnical skills were evaluated and candidates ranked using the Ottawa Crisis Resource Management Global Rating Scale (Ottawa GRS). Nontechnical skills ranking and traditional selection method ranking were compared using the concordance correlation coefficient. Interrater reliability was assessed using the concordance correlation coefficient. Thirteen of 20 eligible participants consented to study inclusion. All participants completed the necessary stations without significant time delays. Eighteen staff members were required to conduct interviews, simulation, debriefing, and personality testing. Participants rated the simulation station to be acceptable, fair, and relevant and as providing an opportunity to demonstrate abilities. Personality testing was rated less fair, less relevant, and less acceptable, and as giving less opportunity to demonstrate abilities. Participants reported that simulation was equally as stressful as the interview, whereas personality testing was rated less stressful. Assessors rated both personality testing and simulation as acceptable and able to provide additional information about candidates. The Ottawa GRS showed moderate interrater concordance. There was moderate concordance between rankings based on traditional selection methods and Ottawa GRS rankings (ρ = 0.52; 95% confidence interval, -0.02 to 0.82; P = 0.06). A multistation selection process involving interviews, simulation, and personality testing is feasible and has face validity. A potential barrier to adoption is the high number of faculty required to conduct the process.

  10. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who used computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or attitudes toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities is expensive, dangerous, or impractical.

  11. Standard Isotherm Fit Information for Dry CO2 on Sorbents for 4-Bed Molecular Sieve

    NASA Technical Reports Server (NTRS)

    Cmarik, G. E.; Son, K. N.; Knox, J. C.

    2017-01-01

    Onboard the ISS, one of the systems tasked with removal of metabolic carbon dioxide (CO2) is a 4-bed molecular sieve (4BMS) system. In order to enable a 4-person mission to succeed, systems for removal of metabolic CO2 must reliably operate for several years while minimizing power, mass, and volume requirements. This minimization can be achieved through system redesign and/or changes to the separation material(s). A material screening process has identified the most reliable sorbent materials for the next 4BMS. Sorbent characterization will provide the information necessary to guide system design by providing inputs for computer simulations.

  12. The USL NASA PC R and D project: Detailed specifications of objects

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Chum, Frank Y.; Hall, Philip P.; Moreau, Dennis R.; Triantafyllopoulos, Spiros

    1984-01-01

    The specifications for a number of projects which are to be implemented within the University of Southwestern Louisiana NASA PC R and D Project are discussed. The goals and objectives of the PC development project and the interrelationships of the various components are discussed. Six projects are described. They are a NASA/RECON simulator, a user interface to multiple remote information systems, evaluation of various personal computer systems, statistical analysis software development, interactive presentation system development, and the development of a distributed processing environment. The relationships of these projects to one another and to the goals and objectives of the overall project are discussed.

  13. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data include static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.
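
    The division into sections, each with a primary zone and a buffer zone, resembles a halo-exchange domain decomposition. A hypothetical sketch of that data layout follows; all names and fields are illustrative, not taken from the patent.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class NetworkSection:
        """One section of the road network assigned to a compute node."""
        section_id: int
        primary_links: set = field(default_factory=set)  # owned links
        buffer_links: set = field(default_factory=set)   # halo copies

    def partition(link_ids, n_sections, halo=1):
        """Split link IDs into contiguous primary zones, each padded
        with a halo of neighbouring links as its buffer zone."""
        size = -(-len(link_ids) // n_sections)  # ceiling division
        sections = []
        for s in range(n_sections):
            lo, hi = s * size, min((s + 1) * size, len(link_ids))
            primary = set(link_ids[lo:hi])
            buf = set(link_ids[max(0, lo - halo):lo]
                      + link_ids[hi:hi + halo])
            sections.append(NetworkSection(s, primary, buf))
        return sections

    print(partition(list(range(10)), n_sections=3))
    ```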

  14. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation, without the need for dedicated local computer hardware, as a proof of principle.
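
    The cost result can be reproduced with a one-line model: if each of n machines is billed per started hour, wall time is T/n and cost is n times the ceiling of T/n machine-hours, which equals T exactly when n divides T. A quick check under that assumed pricing model:

    ```python
    import math

    def relative_cost(total_hours, n_machines):
        """Machine-hours billed when each machine's runtime is rounded
        up to a whole hour (assumed per-started-hour pricing)."""
        return n_machines * math.ceil(total_hours / n_machines)

    T = 12  # single-machine simulation time in hours (example)
    for n in (1, 2, 3, 4, 5, 6, 8, 12):
        print(n, relative_cost(T, n))
    # cost is minimal (= T) exactly for n in {1, 2, 3, 4, 6, 12},
    # i.e. when n is a factor of T, matching the abstract's finding
    ```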

  15. On efficiency of fire simulation realization: parallelization with greater number of computational meshes

    NASA Astrophysics Data System (ADS)

    Valasek, Lukas; Glasa, Jan

    2017-12-01

    Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms and of modelling fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used for testing the efficiency of selected strategies for allocating the computational resources of the cluster using a greater number of computational cores. Simulation results indicate that if the number of cores used is not equal to a multiple of the total number of cluster node cores, there are allocation strategies which provide more efficient calculations.

  16. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation.
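
    A toy example of the object-oriented style described, in which objects stand for molecular entities and a population-level property emerges from local stochastic interactions; the species, rate and counts are invented.

    ```python
    import random

    class Receptor:
        """A cell-surface receptor that may bind a ligand."""
        def __init__(self):
            self.bound = False

        def step(self, ligand_conc, p_per_unit=0.001):
            # Stochastic binding; probability grows with concentration.
            if not self.bound and random.random() < p_per_unit * ligand_conc:
                self.bound = True

    receptors = [Receptor() for _ in range(1000)]
    for _ in range(100):                 # simulate 100 time steps
        for r in receptors:
            r.step(ligand_conc=5.0)

    # The bound fraction is an emergent, observed quantity.
    print(sum(r.bound for r in receptors), "of 1000 receptors bound")
    ```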

  17. Qualification and Approval of Personal Computer-Based Aviation Training Devices

    DOT National Transportation Integrated Search

    1997-05-12

    This Advisory Circular (AC) provides information and guidance to potential training device manufacturers and aviation training consumers concerning a means, acceptable to the Administrator, by which personal computer-based aviation training devices (...

  18. 92. VIEW OF CHART RECORDERS AND PERSONAL COMPUTER LINING NORTHEAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    92. VIEW OF CHART RECORDERS AND PERSONAL COMPUTER LINING NORTHEAST CORNER OF AUTOPILOT ROOM - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  19. Overreliance on auditory feedback may lead to sound/syllable repetitions: simulations of stuttering and fluency-inducing conditions with a neural model of speech production

    PubMed Central

    Civier, Oren; Tasko, Stephen M.; Guenther, Frank H.

    2010-01-01

    This paper investigates the hypothesis that stuttering may result in part from impaired readout of feedforward control of speech, which forces persons who stutter (PWS) to produce speech with a motor strategy that is weighted too much toward auditory feedback control. Over-reliance on feedback control leads to production errors which, if they grow large enough, can cause the motor system to “reset” and repeat the current syllable. This hypothesis is investigated using computer simulations of a “neurally impaired” version of the DIVA model, a neural network model of speech acquisition and production. The model’s outputs are compared to published acoustic data from PWS’ fluent speech, and to combined acoustic and articulatory movement data collected from the dysfluent speech of one PWS. The simulations mimic the errors observed in the PWS subject’s speech, as well as the repairs of these errors. Additional simulations were able to account for enhancements of fluency gained by slowed/prolonged speech and masking noise. Together these results support the hypothesis that many dysfluencies in stuttering are due to a bias away from feedforward control and toward feedback control. PMID:20831971

  20. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost of a large-scale simulation. To improve the computational efficiency of large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion in turn. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized for performing a country-wide simulation. Thus, this parallel algorithm makes it possible to use ABMs for large-scale simulation with limited computational resources.
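
    A minimal sketch of the partition-and-parallelize idea behind the SRA: group agents by subregion and process the regions concurrently. This omits the "sliding" handling of cross-boundary contacts that gives the algorithm its name, and the within-region dynamics are placeholders.

    ```python
    from collections import defaultdict
    from multiprocessing import Pool

    def simulate_region(item):
        """One step of within-region interactions (placeholder: here we
        just recompute each agent's exposure from local prevalence)."""
        region, agents = item
        infected = sum(a["infected"] for a in agents)
        for a in agents:
            a["risk"] = infected / len(agents)
        return region, agents

    def sra_step(population):
        """Group agents by postal-code region, then let a process pool
        handle the regions in parallel."""
        regions = defaultdict(list)
        for agent in population:
            regions[agent["region"]].append(agent)
        with Pool() as pool:
            return dict(pool.map(simulate_region, regions.items()))
    ```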

  1. Quantum chemistry simulation on quantum computers: theories and experiments.

    PubMed

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomial scaling of resources. Traditionally, those simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.

  2. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which the computer program computations are based, computer simulation results, and a discussion of the computer simulation results.

  3. Comparative Implementation of High Performance Computing for Power System Dynamic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng

    Dynamic simulation for transient stability assessment is one of the most important, but intensive, computations for power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time consuming on a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising in accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computation accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.

  4. The effectiveness of interactive computer simulations on college engineering student conceptual understanding and problem-solving ability related to circular motion

    NASA Astrophysics Data System (ADS)

    Chien, Cheng-Chih

    In the past thirty years, the effectiveness of computer-assisted learning has been found to vary across individual studies. Today, with drastic technical improvement, computers have become widespread in schools and are used in a variety of ways. In this study, a design model involving educational technology, pedagogy, and content domain is proposed for the effective use of computers in learning. Computer simulation, constructivist and Vygotskian perspectives, and circular motion are the three elements of the specific Chain Model for instructional design. The goal of the physics course is to help students revise ideas that are not consistent with those of the physics community and rebuild new knowledge. To achieve the learning goal, the strategies of using conceptual conflicts and of using language to internalize specific tasks into mental functions were included. Computer simulations and accompanying worksheets were used to help students explore their own ideas and to generate questions for discussion. Using animated images to describe the dynamic processes involved in circular motion may reduce the complexity and possible miscommunications resulting from verbal explanations. The effectiveness of the instructional material on student learning was evaluated. The results of the problem-solving activities show that students using computer simulations had significantly higher scores than students not using computer simulations. For conceptual understanding, on the pretest students in the non-simulation group had significantly higher scores than students in the simulation group; no significant difference was observed between the two groups on the posttest. The relations of gender, prior physics experience, and frequency of computer use outside the course to student achievement were also studied. There were fewer female students than male students, and fewer students using computer simulations than students not using them. These characteristics affect the statistical power for detecting differences. In future research, more simulation interventions could be introduced to explore the potential of computer simulation in helping students learn. A test of conceptual understanding with more problems and an appropriate difficulty level may be needed.

  5. Dynamic gas temperature measurements using a personal computer for data acquisition and reduction

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Oberle, Lawrence G.; Greer, Lawrence C., III

    1993-01-01

    This report describes a dynamic gas temperature measurement system. It has a frequency response up to 1000 Hz and can be used to measure temperatures in hot, high-pressure, high-velocity flows. A personal computer is used for collecting and processing data, which results in a much shorter wait for results than previously. The data collection process and the user interface are described in detail. The changes made in porting the software from a mainframe to a personal computer are described in the appendices, as is the overall theory of operation.

  6. Computed tomographic colonography to screen for colorectal cancer, extracolonic cancer, and aortic aneurysm: model simulation with cost-effectiveness analysis.

    PubMed

    Hassan, Cesare; Pickhardt, Perry J; Laghi, Andrea; Kim, Daniel H; Zullo, Angelo; Iafrate, Franco; Di Giulio, Lorenzo; Morini, Sergio

    2008-04-14

    In addition to detecting colorectal neoplasia, abdominal computed tomography (CT) with colonography technique (CTC) can also detect unsuspected extracolonic cancers and abdominal aortic aneurysms (AAA). The efficacy and cost-effectiveness of this combined abdominal CT screening strategy are unknown. A computerized Markov model was constructed to simulate the occurrence of colorectal neoplasia, extracolonic malignant neoplasm, and AAA in a hypothetical cohort of 100,000 United States subjects 50 years of age. Simulated screening with CTC, using a 6-mm polyp size threshold for reporting, was compared with a competing model of optical colonoscopy (OC), both without and with abdominal ultrasonography for AAA detection (OC-US strategy). In the simulated population, CTC was the dominant screening strategy, gaining an additional 1458 and 462 life-years compared with the OC and OC-US strategies while being less costly, with savings of $266 and $449 per person, respectively. The additional gains for CTC were largely due to a decrease in AAA-related deaths, whereas the modeled benefit from extracolonic cancer downstaging was a relatively minor factor. At sensitivity analysis, OC-US became more cost-effective only when the CTC sensitivity for large polyps dropped to 61% or when broad variations of costs were simulated, such as an increase in CTC cost from $814 to $1300 or a decrease in OC cost from $1100 to $500. With the OC-US approach, suboptimal compliance had a strong negative influence on efficacy and cost-effectiveness. The estimated mortality from CT-induced cancer was less than the estimated colonoscopy-related mortality (8 vs 22 deaths), both of which were minor compared with the positive benefit from screening. When the detection of extracolonic findings such as AAA and extracolonic cancer is considered in addition to colorectal neoplasia in our model simulation, CT colonography is a dominant screening strategy (ie, more clinically effective and more cost-effective) than both colonoscopy and colonoscopy with 1-time ultrasonography.
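
    As a toy illustration of the Markov cohort mechanics described above (not the published model, whose states, calibrated transition probabilities, and costs are far richer), the sketch below cycles a hypothetical cohort through three states and accumulates life-years and costs; all numbers are placeholders.

        import numpy as np

        # Three-state Markov cohort: Well -> Cancer -> Dead.
        # All probabilities and costs are illustrative placeholders.
        P = np.array([[0.97, 0.02, 0.01],     # transitions from Well
                      [0.00, 0.80, 0.20],     # transitions from Cancer
                      [0.00, 0.00, 1.00]])    # Dead is absorbing
        cost = np.array([50.0, 5000.0, 0.0])  # annual cost per state ($)
        state = np.array([100_000.0, 0.0, 0.0])

        life_years = total_cost = 0.0
        for year in range(50):                # 50 annual cycles
            life_years += state[:2].sum()     # person-years alive this cycle
            total_cost += (state * cost).sum()
            state = state @ P
        print(life_years / 1e5, "life-years and $", total_cost / 1e5, "per person")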

  7. Demonstration of theoretical and experimental simulations in fiber optics course

    NASA Astrophysics Data System (ADS)

    Yao, Tianfu; Wang, Xiaolin; Shi, Jianhua; Lei, Bing; Liu, Wei; Wang, Wei; Hu, Haojun

    2017-08-01

    "Fiber optics" course plays a supporting effect in the curriculum frame of optics and photonics at both undergraduate and postgraduate levels. Moreover, the course can be treated as compulsory for students specialized in the fiber-related field, such as fiber communication, fiber sensing and fiber light source. The corresponding content in fiber optics requires the knowledge of geometrical and physical optics as background, including basic optical theory and fiber components in practice. Thus, to help the students comprehend the relatively abundant and complex content, it is necessary to investigate novel teaching method assistant the classic lectures. In this paper, we introduce the multidimensional pattern in fiber-optics teaching involving theoretical and laboratory simulations. First, the theoretical simulations is demonstrated based on the self-developed software named "FB tool" which can be installed in both smart phone with Android operating system and personal computer. FB tool covers the fundamental calculations relating to transverse modes, fiber lasers and nonlinearities and so on. By comparing the calculation results with other commercial software like COMSOL, SFTool shows high accuracy with high speed. Then the laboratory simulations are designed including fiber coupling, Erbium doped fiber amplifiers, fiber components and so on. The simulations not only supports students understand basic knowledge in the course, but also provides opportunities to develop creative projects in fiber optics.

  8. Structured light imaging system for structural and optical characterization of 3D tissue-simulating phantoms

    NASA Astrophysics Data System (ADS)

    Liu, Songde; Smith, Zach; Xu, Ronald X.

    2016-10-01

    There is a pressing need for a phantom standard to calibrate medical optical devices. However, 3D printing of tissue-simulating phantom standards is challenged by the lack of appropriate methods to characterize and reproduce surface topography and optical properties accurately. We have developed a structured light imaging system to characterize the surface topography and optical properties (absorption coefficient and reduced scattering coefficient) of 3D tissue-simulating phantoms. The system consists of a hyperspectral light source, a digital light projector (DLP), a CMOS camera, two polarizers, a rotational stage, a translation stage, a motion controller, and a personal computer. Tissue-simulating phantoms with different structural and optical properties were characterized by the proposed imaging system and validated against a standard integrating sphere system. The experimental results showed that the proposed system was able to achieve pixel-level optical properties with a percentage error of less than 11% for the absorption coefficient and less than 7% for the reduced scattering coefficient for phantoms without surface curvature. Meanwhile, the 3D topographic profile of the phantom can be effectively reconstructed with a deviation error of less than 1%. Our study demonstrated that the proposed structured light imaging system has the potential to characterize the structural profile and optical properties of 3D tissue-simulating phantoms.
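
    One standard way structured light systems recover topography, shown below purely as a sketch (the record does not specify the authors' exact reconstruction algorithm), is three-step phase-shifting: project three fringe patterns offset by 120 degrees and recover the wrapped phase per pixel.

        import numpy as np

        def wrapped_phase(I1, I2, I3):
            # Three-step (120 deg) phase-shifting formula; the wrapped phase
            # is later unwrapped and converted to surface height.
            return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

        # Synthetic fringes over a 256x256 field, for demonstration only.
        x = np.linspace(0, 4 * np.pi, 256)
        phi_true = np.tile(x, (256, 1))
        I1, I2, I3 = (100 + 50 * np.cos(phi_true + k * 2 * np.pi / 3)
                      for k in (-1, 0, 1))
        phi = wrapped_phase(I1, I2, I3)      # wrapped to (-pi, pi]
        print(phi.shape, float(phi.min()), float(phi.max()))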

  9. The Application and Impact of Computer-Generated Personalized Nutrition Education: A Review of the Literature.

    ERIC Educational Resources Information Center

    Brug, Johannes; Campbell, Marci; van Assema, Patricia

    1999-01-01

    Describes the process of providing people with computer-tailored nutrition education and reviews the studies on the impact of this type of education. Results indicate that computer-tailored nutrition education is more likely to be read, remembered, and experienced as personally relevant compared to standard materials. It also appears to have a…

  10. Influence of Recent Developments in Computer Technology on Professional Development in Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    Passmore, David Lynn

    Intended for developers of vocational education professionals and for educators making decisions about the usefulness of personal computers in education, this report deals with the effects of the personal computing revolution on professional development of vocational educators. The two major papers and published opinion pieces that make up this…

  11. Theory and Programs for Dynamic Modeling of Tree Rings from Climate

    Treesearch

    Paul C. van Deusen; Jennifer Koretz

    1988-01-01

    Computer programs written in GAUSS(TM) for IBM compatible personal computers are described that perform dynamic tree ring modeling with climate data; the underlying theory is also described. The programs and a separate users manual are available from the authors, although users must have the GAUSS software package on their personal computer. An example application of...
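
    Although the record above is truncated, the flavor of a dynamic ring-width model, last year's growth plus a climate response, can be sketched with an ordinary least-squares fit on synthetic data; the coefficients and series below are invented for illustration and do not reproduce the GAUSS programs.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 80
        precip = rng.normal(0.0, 1.0, n)          # synthetic climate series
        ring = np.zeros(n)
        for t in range(1, n):                     # AR(1) growth + climate forcing
            ring[t] = 0.5 * ring[t - 1] + 0.8 * precip[t] + rng.normal(0, 0.3)

        # Regress ring width on lagged ring width and current climate.
        X = np.column_stack([np.ones(n - 1), ring[:-1], precip[1:]])
        beta, *_ = np.linalg.lstsq(X, ring[1:], rcond=None)
        print("intercept, persistence, climate response:", beta.round(2))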

  12. Tablet Personal Computer Integration in Higher Education: Applying the Unified Theory of Acceptance and Use Technology Model to Understand Supporting Factors

    ERIC Educational Resources Information Center

    Moran, Mark; Hawkes, Mark; El Gayar, Omar

    2010-01-01

    Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…

  13. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist but is simulated by the experimenters. This allows simulated problem solving early in the design effort and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  14. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating (1) complex composite structural behavior in general and (2) specific aerospace propulsion structural components in particular.

  15. Computational composite mechanics for aerospace propulsion structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1987-01-01

    Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating: (1) complex composite structural behavior in general, and (2) specific aerospace propulsion structural components in particular.

  16. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  17. MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1

    DTIC Science & Technology

    1971-05-01

    A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air...

  18. Computational simulation of progressive fracture in fiber composites

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1986-01-01

    Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.

  19. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills

    PubMed Central

    Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics, which grows out of traditional psychometrics and incorporates techniques from educational data mining, machine learning, and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses. PMID:29238314

  20. Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills.

    PubMed

    Polyak, Stephen T; von Davier, Alina A; Peterschmidt, Kurt

    2017-01-01

    This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics, which grows out of traditional psychometrics and incorporates techniques from educational data mining, machine learning, and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses.
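
    The Bayesian evidence-tracing step described in both records reduces, per observed dialog event, to one application of Bayes' rule against a conditional probability table (CPT). The sketch below illustrates that update loop with invented CPT entries; it is not ACT's calibrated scoring engine.

        def update_mastery(prior, p_obs_given_mastered, p_obs_given_not):
            # One Bayes'-rule update of P(skill mastered) after an observation.
            num = p_obs_given_mastered * prior
            return num / (num + p_obs_given_not * (1.0 - prior))

        p = 0.30                        # weak initial estimate of mastery
        # (P(obs | mastered), P(obs | not mastered)) per conversation event;
        # values are illustrative placeholders for CPT rows.
        evidence = [(0.7, 0.2), (0.6, 0.3), (0.8, 0.4)]
        for pm, pn in evidence:
            p = update_mastery(p, pm, pn)
            print(f"P(mastered) -> {p:.3f}")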

  1. Development of kinetic analysis technique for PACS management and a screening examination in dynamic radiography

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie

    2005-04-01

    The purpose of this study was to develop a kinetic analysis method for PACS management and computer-aided diagnosis. We obtained dynamic chest radiographs (512x512, 8 bit, 4 fps, and 1344x1344, 12 bit, 3 fps) of five healthy volunteers during respiration twice using an image intensifier (I.I.) system, and of one healthy volunteer using a dynamic FPD system. Optical flows of the images were obtained using a customized block matching technique, divided by direction, and transformed into RGB color. Density was determined by the summed pixel length of movement during the respiration phase. The resulting static image was defined as the "kinetic map". Patient collation was evaluated by template matching on the three colors. For the same person, the correlation values and the similarity coefficient defined in this study were statistically significantly high (P<0.01). We used an artificial neural network (ANN) to judge whether two examinations belonged to the same person. The five volunteers were divided into two groups: three volunteers provided the training signals and two the unknown signals. The correlation value and similarity coefficient were used as input signals, and the ANN was designed to output the probability that the examinations belonged to the same person. The average specificity for the unknown signals was 98.2%. A kinetic map including an imitation tumor was used for simulation; the tumor was detected by temporal subtraction of kinetic maps with high sensitivity. Our analysis method was useful for risk management and computer-aided diagnosis.
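
    A minimal version of the block matching step described above, an exhaustive search minimizing the sum of absolute differences (SAD), can be sketched as follows; the study's customized matcher, direction binning, and color mapping are not reproduced.

        import numpy as np

        def block_match(prev, curr, block=8, search=4):
            # Naive SAD block matching: returns (dy, dx) per block.
            h, w = prev.shape
            flow = np.zeros((h // block, w // block, 2), dtype=int)
            for by in range(0, h - block + 1, block):
                for bx in range(0, w - block + 1, block):
                    ref = prev[by:by + block, bx:bx + block]
                    best, best_dv = np.inf, (0, 0)
                    for dy in range(-search, search + 1):
                        for dx in range(-search, search + 1):
                            y, x = by + dy, bx + dx
                            if 0 <= y <= h - block and 0 <= x <= w - block:
                                sad = np.abs(curr[y:y + block, x:x + block] - ref).sum()
                                if sad < best:
                                    best, best_dv = sad, (dy, dx)
                    flow[by // block, bx // block] = best_dv
            return flow

        a = np.random.rand(64, 64)
        b = np.roll(a, 2, axis=0)        # shift image down by 2 pixels
        print(block_match(a, b)[2, 2])   # expect [2 0]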

  2. Using Microcomputers Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    1985-01-01

    Presents a discussion of how computer simulations are used in two undergraduate social science courses and a faculty computer literacy course on simulations and artificial intelligence. Includes a list of 60 simulations for use on mainframes and microcomputers. Entries include type of hardware required, publisher's address, and cost. Sample…

  3. Learning Oceanography from a Computer Simulation Compared with Direct Experience at Sea

    ERIC Educational Resources Information Center

    Winn, William; Stahr, Frederick; Sarason, Christian; Fruland, Ruth; Oppenheimer, Peter; Lee, Yen-Ling

    2006-01-01

    Considerable research has compared how students learn science from computer simulations with how they learn from "traditional" classes. Little research has compared how students learn science from computer simulations with how they learn from direct experience in the real environment on which the simulations are based. This study compared two…

  4. Effect of Computer Simulations at the Particulate and Macroscopic Levels on Students' Understanding of the Particulate Nature of Matter

    ERIC Educational Resources Information Center

    Tang, Hui; Abraham, Michael R.

    2016-01-01

    Computer-based simulations can help students visualize chemical representations and understand chemistry concepts, but simulations at different levels of representation may vary in effectiveness on student learning. This study investigated the influence of computer activities that simulate chemical reactions at different levels of representation…

  5. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, result evaluation, storage and maintenance of the developed database, and its statistical processing, analysis, and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for automating certain psychological assessments is given. The finite sets, and the relations on them, that are necessary for creating a personality psychological test are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis, and interpretation. A software project for computer administering of personality psychological tests is suggested.
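
    The set-relational idea is easy to make concrete: items and scales are finite sets, and the scoring key is a relation (a subset of their Cartesian product). The sketch below is an invented toy, not the article's formal model.

        # Finite sets of test items and scales; the scoring key is a relation.
        items  = {1, 2, 3, 4}                       # question identifiers
        scales = {"extraversion", "neuroticism"}    # measured traits
        key = {(1, "extraversion"), (2, "extraversion"),
               (3, "neuroticism"), (4, "neuroticism")}

        answers = {1: 4, 2: 3, 3: 1, 4: 2}          # one subject's responses
        raw = {s: sum(v for i, v in answers.items() if (i, s) in key)
               for s in scales}
        print(raw)   # raw scale scores via the item-scale relation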

  6. Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.

    PubMed

    Durandau, Guillaume; Farina, Dario; Sartori, Massimo

    2018-03-01

    Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.
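
    One widely used link in EMG-driven pipelines of this kind is the activation dynamics stage: rectified, normalized EMG is low-pass filtered by a recursive difference equation and then shaped by a nonlinearity. The sketch below follows that common formulation with illustrative coefficients; it is not the authors' calibrated framework.

        import numpy as np

        def emg_to_activation(e, A=-2.0):
            # Recursive second-order filter (unity steady-state gain here:
            # 0.9 / (1 - 0.05 - 0.05) = 1), then nonlinear shaping so that
            # a(0) = 0 and a(1) = 1. Coefficients are illustrative.
            u = np.zeros_like(e)
            for t in range(2, len(e)):
                u[t] = 0.9 * e[t] + 0.05 * u[t - 1] + 0.05 * u[t - 2]
            return (np.exp(A * u) - 1.0) / (np.exp(A) - 1.0)

        emg = np.clip(np.random.default_rng(0).normal(0.3, 0.1, 200), 0.0, 1.0)
        a = emg_to_activation(emg)        # muscle activation in [0, 1]
        print(float(a.min()), float(a.max()))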

  7. Hospital influenza pandemic stockpiling needs: A computer simulation.

    PubMed

    Abramovich, Mark N; Hershey, John C; Callies, Byron; Adalja, Amesh A; Tosh, Pritish K; Toner, Eric S

    2017-03-01

    A severe influenza pandemic could overwhelm hospitals, but planning guidance that accounts for the dynamic interrelationships between planning elements is lacking. We developed a methodology to calculate pandemic supply needs based on operational considerations in hospitals and then tested the methodology at Mayo Clinic in Rochester, MN. We upgraded a previously designed computer modeling tool and input carefully researched resource data from the hospital to run 10,000 Monte Carlo simulations using various combinations of variables to determine resource needs across a spectrum of scenarios. Of 10,000 iterations, 1,315 fell within the parameters defined by our simulation design and logical constraints. From these valid iterations, we projected requirements by percentile for key supplies, pharmaceuticals, and personal protective equipment needed in a severe pandemic. We projected supply needs for a range of scenarios that use up to 100% of Mayo Clinic-Rochester's surge capacity of beds and ventilators. The results indicate that there are diminishing patient care benefits from stockpiling on the high side of the range, but that having some stockpile of critical resources, even a relatively modest one, is most important. We were able to display the probabilities of needing various supply levels across a spectrum of scenarios. The tool could be used to model many other hospital preparedness issues, but validation in other settings is needed.
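
    The core of such a tool is straightforward to sketch: sample scenario parameters, tally resource consumption per iteration, and read stockpile targets off the percentiles of the resulting distribution. The distributions and quantities below are invented placeholders, not Mayo Clinic inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        n_iter = 10_000                   # Monte Carlo iterations
        surge_patients = rng.lognormal(mean=5.0, sigma=0.5, size=n_iter)
        masks_per_patient = rng.uniform(20, 60, size=n_iter)
        mask_demand = surge_patients * masks_per_patient

        for q in (50, 75, 90, 95):        # candidate stockpile targets
            print(f"{q}th percentile mask demand: {np.percentile(mask_demand, q):,.0f}")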

  8. The Role of Personal Computers in Vocational Education: A Critical View.

    ERIC Educational Resources Information Center

    Passmore, David L.; And Others

    1984-01-01

    Personal computers are inexpensive, portable, accessible, and adaptable for vocational education instruction, administration, and communications. Successful infusion of microcomputers into vocational education requires staff orientation, improvement in software quality, and careful planning. (SK)

  9. Explicit Content Caching at Mobile Edge Networks with Cross-Layer Sensing

    PubMed Central

    Chen, Lingyu; Su, Youxing; Luo, Wenbin; Hong, Xuemin; Shi, Jianghong

    2018-01-01

    The deployment density and computational power of small base stations (BSs) are expected to increase significantly in the next generation mobile communication networks. These BSs form the mobile edge network, which is a pervasive and distributed infrastructure that can empower a variety of edge/fog computing applications. This paper proposes a novel edge-computing application called explicit caching, which stores selective contents at BSs and exposes such contents to local users for interactive browsing and download. We formulate the explicit caching problem as a joint content recommendation, caching, and delivery problem, which aims to maximize the expected user quality-of-experience (QoE) with varying degrees of cross-layer sensing capability. Optimal and effective heuristic algorithms are presented to solve the problem. The theoretical performance bounds of the explicit caching system are derived in simplified scenarios. The impacts of cache storage space, BS backhaul capacity, cross-layer information, and user mobility on the system performance are simulated and discussed in realistic scenarios. Results suggest that, compared with conventional implicit caching schemes, explicit caching can better exploit the mobile edge network infrastructure for personalized content dissemination. PMID:29565313

  10. Explicit Content Caching at Mobile Edge Networks with Cross-Layer Sensing.

    PubMed

    Chen, Lingyu; Su, Youxing; Luo, Wenbin; Hong, Xuemin; Shi, Jianghong

    2018-03-22

    The deployment density and computational power of small base stations (BSs) are expected to increase significantly in the next generation mobile communication networks. These BSs form the mobile edge network, which is a pervasive and distributed infrastructure that can empower a variety of edge/fog computing applications. This paper proposes a novel edge-computing application called explicit caching, which stores selective contents at BSs and exposes such contents to local users for interactive browsing and download. We formulate the explicit caching problem as a joint content recommendation, caching, and delivery problem, which aims to maximize the expected user quality-of-experience (QoE) with varying degrees of cross-layer sensing capability. Optimal and effective heuristic algorithms are presented to solve the problem. The theoretical performance bounds of the explicit caching system are derived in simplified scenarios. The impacts of cache storage space, BS backhaul capacity, cross-layer information, and user mobility on the system performance are simulated and discussed in realistic scenarios. Results suggest that, compared with conventional implicit caching schemes, explicit caching can better exploit the mobile edge network infrastructure for personalized content dissemination.
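
    Setting aside the joint recommendation and delivery aspects, the caching subproblem alone has a familiar knapsack flavor: fill limited base-station storage with the contents of highest expected value per byte. The greedy sketch below uses an invented catalog and is only a relaxation of the optimization formulated in the paper.

        # Greedy value-density caching under a storage budget (illustrative).
        contents = [                 # (name, size_MB, expected_requests)
            ("clipA", 300, 90), ("clipB", 120, 55), ("clipC", 700, 160),
            ("clipD", 250, 30), ("clipE", 80, 40),
        ]
        capacity_mb = 800

        ranked = sorted(contents, key=lambda c: c[2] / c[1], reverse=True)
        cache, used = [], 0
        for name, size, requests in ranked:
            if used + size <= capacity_mb:   # cache it if it still fits
                cache.append(name)
                used += size
        print(cache, used, "MB used")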

  11. [Application of computer-aided osteotomy template design in treatment of developmental dysplasia of the hip with Steel osteotomy].

    PubMed

    Tong, Kuang; Zhang, Yuanzhi; Zhang, Sheng; Yu, Bin

    2013-06-01

    To provide an accurate method for osteotomy in the treatment of developmental dysplasia of the hip with Steel osteotomy using three-dimensional reconstruction and reverse engineering techniques. Between January 2011 and December 2012, 13 children with developmental dysplasia of the hip underwent Steel osteotomy. 3D CT pelvic images were obtained and transferred via a DICOM network to a computer workstation to construct 3D models of the hip using Materialise Mimics 14.1 software in STL format. These models were imported into Imageware 12.0 software for Steel osteotomy simulation until a stable hip in the anatomical position was attained for dislocation or subluxation of the hip in older children. The osteotomy navigational templates were designed according to the anatomical features after a stable hip was reconstructed, and were manufactured using a rapid prototyping technique. The reconstructed hips in these children showed good matching and acetabular coverage. Computer-aided design of osteotomy templates provides personalized and accurate solutions in the treatment of developmental dysplasia of the hip with Steel osteotomy in older children.

  12. Computer-based simulation training in emergency medicine designed in the light of malpractice cases.

    PubMed

    Karakuş, Akan; Duran, Latif; Yavuz, Yücel; Altintop, Levent; Calişkan, Fatih

    2014-07-27

    Using computer-based simulation systems in medical education is becoming more and more common. Although the benefits of practicing with these systems in medical education have been demonstrated, the advantages of using computer-based simulation in emergency medicine education are less validated. The aim of the present study was to assess the success rates of final-year medical students in providing emergency medical treatment and to evaluate the effectiveness of computer-based simulation training in improving their knowledge. Twenty-four students trained with computer-based simulation and completed at least 4 hours of simulation-based education between Feb 1, 2010 and May 1, 2010; a traditionally trained control group (n = 24) was also chosen. After the end of training, students completed an examination on 5 randomized medical simulation cases. Across the 5 cases, simulation-trained students carried out an average of 3.9 correct medical approaches, compared with an average of 2.8 for the traditionally trained group (t = 3.90, p < 0.005). The success of students trained with simulation in cases requiring a complicated medical approach was statistically higher than that of students without simulation training (p ≤ 0.05). Computer-based simulation training can be significantly effective in learning medical treatment algorithms, and such programs may improve students' success rates, especially in making an adequate medical approach to complex emergency cases.

  13. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  14. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  15. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  16. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)
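
    The chemistry behind such a simulation fits in a few lines; the sketch below computes a strong acid-strong base titration curve with illustrative concentrations (it is not the program reviewed in the article).

        import math

        Va, Ca, Cb = 25.0, 0.10, 0.10        # mL HCl, mol/L HCl, mol/L NaOH
        for Vb in (0.0, 12.5, 24.9, 25.0, 25.1, 37.5):
            net_acid = Ca * Va - Cb * Vb     # net strong acid (mmol)
            Vtot = Va + Vb                   # total volume (mL)
            if net_acid > 0:
                pH = -math.log10(net_acid / Vtot)
            elif net_acid < 0:
                pH = 14 + math.log10(-net_acid / Vtot)   # excess strong base
            else:
                pH = 7.0                     # equivalence point
            print(f"Vb = {Vb:5.1f} mL -> pH = {pH:5.2f}")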

  17. Software for Brain Network Simulations: A Comparative Study

    PubMed Central

    Tikidji-Hamburyan, Ruben A.; Narayana, Vikram; Bozkus, Zeki; El-Ghazawi, Tarek A.

    2017-01-01

    Numerical simulations of brain networks are a critical part of our efforts in understanding brain functions under pathological and normal conditions. For several decades, the community has developed many software packages and simulators to accelerate research in computational neuroscience. In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database: NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. However, all simulators tend to expand the supported range of models by providing a universal environment for the computational study of individual neurons and brain networks. Next, our investigations of computational architecture and efficiency indicate that all simulators compile the most computationally intensive procedures into binary code, with the aim of maximizing their computational performance. However, not all simulators provide the simplest method for module development and/or guarantee efficient binary code. Third, a study of their amenability to high-performance computing reveals that NEST can almost transparently map an existing model on a cluster or multicore computer, while NEURON requires code modification if a model developed for a single computer has to be mapped on a computational cluster. Interestingly, parallelization is the weakest characteristic of BRIAN, which provides no support for cluster computations and limited support for multicore computers. Fourth, we identify the level of user support and frequency of usage for all simulators. Finally, we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package. The results indicate that BRIAN provides the most concise language for both cases considered. Furthermore, as expected, NEST mostly favors large network models, while NEURON is better suited for detailed models. Overall, the case studies reinforce our general observation that simulators have a bias in computational performance toward specific types of brain network models. PMID:28775687
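
    The conciseness finding for BRIAN is easy to appreciate from its Python API (brian2): the standard tutorial-style leaky integrate-and-fire group below is essentially complete in a handful of lines. This is a generic example, not one of the paper's two case-study networks.

        from brian2 import NeuronGroup, SpikeMonitor, run, ms

        tau = 10 * ms
        G = NeuronGroup(10, 'dv/dt = (2 - v) / tau : 1',
                        threshold='v > 1', reset='v = 0', method='exact')
        M = SpikeMonitor(G)
        run(100 * ms)
        print(M.num_spikes, "spikes in 100 ms")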

  18. Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory

    USGS Publications Warehouse

    Green, J. Wayne

    1991-01-01

    This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data- base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.

  19. [Individual psychological features in patients suffering from computer game addiction and in predisposed persons].

    PubMed

    Kardashian, R A

    2018-01-01

    To study personality characteristics in adolescents with computer game addiction. The study included students in grades 7 to 10, aged 12-17 years (14.6±2.4 years), their parents, and school teachers. The results showed the following combinations in patients: the 'genophilic' type of DPT with schizoid personality accentuation and the 'projection' type of psychological protection (PP), and the 'dignitophilic' type of DPT with labile personality accentuation and the 'denial' type of PP.

  20. Intermediate-sized natural gas fueled carbonate fuel cell power plants

    NASA Astrophysics Data System (ADS)

    Sudhoff, Frederick A.; Fleming, Donald K.

    1994-04-01

    This executive summary describes the accomplishments of the joint US Department of Energy (DOE) Morgantown Energy Technology Center (METC) and M-C POWER Corporation Cooperative Research and Development Agreement (CRADA) No. 93-013. The study addresses the intermediate power plant size between 2 megawatts (MW) and 200 MW; a 25 MW natural-gas-fueled carbonate fuel cell power plant was chosen for this purpose. In keeping with recent designs, the fuel cell operates under approximately three atmospheres of pressure. An expander/alternator is utilized to expand the exhaust gas to atmospheric conditions and generate additional power. A steam-bottoming cycle is not included in this study because it is not believed to be cost effective at this system size. The study also compares the simplicity and accuracy of a spreadsheet-based simulation with those of a full Advanced System for Process Engineering (ASPEN) simulation. The simple spreadsheet model can be run entirely on a personal computer, can be made available to all users, and is particularly advantageous to small business users.
