Sample records for simulation model view

  1. Systems Engineering Model and Training Application for Desktop Environment

    NASA Technical Reports Server (NTRS)

    May, Jeffrey T.

    2010-01-01

    This work provides a graphical-user-interface-based simulator for desktop training, operations and procedure development, and system reference. The simulator allows engineers to train on and further understand the dynamics of their system from their local desktops, and to evaluate the system at a pace and skill level matched to the user's competency and from a perspective matched to the user's need. The simulator does not require any special resources to execute and should generally be available for use. The interface is based on the concept of presenting the model of the system in the way that best suits the user's application or training needs. The three levels of view are the Component View, the System View (overall system), and the Console View (monitor). These views are portals into a single model, so changing the model from one view, or from a model manager graphical user interface, is reflected in all the other views.

  2. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of the process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions, with lower levels expanding the functionalities of each use case. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the PCS model development. The use-case-based PCS simulation model will first be used during the full-scope simulator development for the nuclear power plant and will then be supplied to the Shin-Kori 3 and 4 plants. Use-case-based simulation model development can be useful for the design and implementation of simulation models. (authors)

  3. Numerical simulation of nonlinear feedback model of saccade generation circuit implemented in the LabView graphical programming language.

    PubMed

    Jackson, M E; Gnadt, J W

    1999-03-01

    The object-oriented graphical programming language LabView was used to implement the numerical solution to a computational model of saccade generation in primates. The computational model simulates the activity and connectivity of anatomical structures known to be involved in saccadic eye movements. The LabView program provides a graphical user interface to the model that makes it easy to observe and modify the behavior of each element of the model. Essential elements of the source code of the LabView program are presented and explained. A copy of the model is available for download from the internet.

  4. A teleoperation training simulator with visual and kinesthetic force virtual reality

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Schenker, Paul

    1992-01-01

    A force-reflecting teleoperation training simulator with a high-fidelity real-time graphics display has been developed for operator training. A novel feature of this simulator is that it enables the operator to feel contact forces and torques through a force-reflecting controller during the execution of the simulated peg-in-hole task, providing the operator with the feel of visual and kinesthetic force virtual reality. A peg-in-hole task is used in our simulated teleoperation trainer as a generic teleoperation task. A quasi-static analysis of a two-dimensional peg-in-hole task model has been extended to a three-dimensional model analysis to compute contact forces and torques for a virtual realization of kinesthetic force feedback. The simulator allows the user to specify force reflection gains and stiffness (compliance) values of the manipulator hand for both the three translational and the three rotational axes in Cartesian space. Three viewing modes are provided for graphics display: single view, two split views, and stereoscopic view.

  5. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.

  6. Simulation study of geometric shape factor approach to estimating earth emitted flux densities from wide field-of-view radiation measurements

    NASA Technical Reports Server (NTRS)

    Weaver, W. L.; Green, R. N.

    1980-01-01

    A study was performed on the use of geometric shape factors to estimate earth-emitted flux densities from radiation measurements with wide field-of-view flat-plate radiometers on satellites. Sets of simulated irradiance measurements were computed for unrestricted and restricted field-of-view detectors. In these simulations, the earth radiation field was modeled using data from Nimbus 2 and 3. Geometric shape factors were derived and applied to these data to estimate flux densities on global and zonal scales. For measurements at a satellite altitude of 600 km, estimates of zonal flux density were in error by 1.0 to 1.2%, and global flux density errors were less than 0.2%. Estimates with unrestricted field-of-view detectors were about the same for Lambertian and non-Lambertian radiation models, but were affected by satellite altitude. The opposite was found for the restricted field-of-view detectors.
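
    The shape-factor inversion described in this record lends itself to a short worked sketch. The snippet below is only an illustration under simplifying assumptions (a spherical, uniformly emitting Earth; a nadir-pointing flat-plate detector; an assumed Earth radius and an invented irradiance value), not code from the study.

      # Minimal sketch: recover Earth-emitted flux density from a wide
      # field-of-view flat-plate measurement via a geometric shape factor.
      R_EARTH_KM = 6371.0          # assumed mean Earth radius
      ALTITUDE_KM = 600.0          # satellite altitude used in the study

      def flat_plate_shape_factor(altitude_km, r_earth_km=R_EARTH_KM):
          """View factor from a nadir-pointing flat plate to the full Earth disc."""
          return (r_earth_km / (r_earth_km + altitude_km)) ** 2

      def estimate_flux_density(measured_irradiance_wm2, altitude_km=ALTITUDE_KM):
          """Invert the shape factor to estimate the emitted flux density (W m^-2)."""
          return measured_irradiance_wm2 / flat_plate_shape_factor(altitude_km)

      # An illustrative measured irradiance of 200 W m^-2 at 600 km altitude
      # maps back to roughly 200 / 0.835 ~= 240 W m^-2 of emitted flux density.
      print(estimate_flux_density(200.0))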

  7. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-03

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.
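
    The translation pipeline described in this record (extract BIM data into interoperability objects, map entities through a Model View Definition, emit records in the simulation format) can be illustrated with a minimal sketch. All class names, entity mappings, and fields below are hypothetical stand-ins, not the patent's actual data structures.

      from dataclasses import dataclass

      @dataclass
      class InteropObject:
          """Holds data extracted from the BIM file plus its metadata."""
          ifc_type: str
          name: str
          properties: dict

      # Toy Model View Definition: which IFC entities the target tool needs
      # and what they become in the simulation file format (hypothetical).
      MVD_TEMPLATE = {
          "IfcWall":   "SimWall",
          "IfcWindow": "SimGlazing",
      }

      def translate(interop_objects, mvd=MVD_TEMPLATE):
          simulation_records = []
          for obj in interop_objects:
              target_type = mvd.get(obj.ifc_type)
              if target_type is None:
                  continue                  # entity not part of this view definition
              simulation_records.append({"type": target_type,
                                         "name": obj.name,
                                         **obj.properties})
          return simulation_records

      extracted = [InteropObject("IfcWall", "W-01", {"area_m2": 12.5}),
                   InteropObject("IfcBeam", "B-07", {"length_m": 4.0})]
      print(translate(extracted))           # only the wall survives the mapping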

  8. Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.

    PubMed

    Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas

    2012-01-01

    The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization and the lack of a standard means to save the results of simulations or to store the model geometry within the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.

  9. Digital Simulation and Modelling.

    ERIC Educational Resources Information Center

    Hawthorne, G. B., Jr.

    A basically tutorial point of view is taken in this general discussion. The author examines the basic concepts and principles of simulation and modelling and the application of digital computers to these tasks. Examples of existing simulations, a discussion of the applicability and feasibility of simulation studies, a review of simulation…

  10. Interoperability format translation and transformation between IFC architectural design file and simulation file formats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Tian-Jy; Kim, Younghun

    Automatically translating a building architecture file format (Industry Foundation Class) to a simulation file, in one aspect, may extract data and metadata used by a target simulation tool from a building architecture file. Interoperability data objects may be created and the extracted data is stored in the interoperability data objects. A model translation procedure may be prepared to identify a mapping from a Model View Definition to a translation and transformation function. The extracted data may be transformed using the data stored in the interoperability data objects, an input Model View Definition template, and the translation and transformation function to convert the extracted data to correct geometric values needed for a target simulation file format used by the target simulation tool. The simulation file in the target simulation file format may be generated.

  11. 47. MISSISSIPPI BASIN MODEL AT CLINTON SUBSTATION. VIEW OF STAGE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    47. MISSISSIPPI BASIN MODEL AT CLINTON SUBSTATION. VIEW OF STAGE TRANSMITTER AT ALTON, ILLINOIS MODEL SECTION, AND LOCK AND DAM (OLD #26) SIMULATOR. - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS

  12. Computer-aided operations engineering with integrated models of systems and operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.

  13. Implementation of interactive virtual simulation of physical systems

    NASA Astrophysics Data System (ADS)

    Sanchez, H.; Escobar, J. J.; Gonzalez, J. D.; Beltran, J.

    2014-03-01

    Considering the limited availability of laboratories for physics teaching and the difficulties this causes in the learning of school students in Santa Marta, Colombia, we have developed software to generate greater student interaction with physical phenomena and improve their understanding. The system is built on a Model-View-ViewModel (MVVM) architecture, which shares the benefits of MVC. Basically, this pattern consists of three parts: the Model, which is responsible for the business logic; the View, which is the part the user sees and is most familiar with, and whose role is to display data to the user and allow manipulation of the application's data; and the ViewModel, which sits between the Model and the View (analogous to the Controller in the MVC pattern) and is responsible for implementing the behavior of the view in response to user actions and for exposing the model's data in a form the view can easily bind to. The .NET Framework 4.0 and the Silverlight 4 and 5 editing packages are the main requirements for deploying the physics simulations, which are hosted in the web application and run in a web browser (Internet Explorer, Mozilla Firefox or Chrome). The implementation of this innovative application in educational institutions has shown that students improved their contextualization of physical phenomena.
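
    As a rough illustration of the MVVM split described above, the sketch below separates a toy physics Model, a ViewModel that exposes display-ready data and handles a user action, and a console View. It is written in Python rather than the authors' .NET/Silverlight stack, and all class names are hypothetical.

      import math

      class ProjectileModel:
          """Model: the physics / business logic only (no display concerns)."""
          def __init__(self, v0=20.0, angle_deg=45.0, g=9.81):
              self.v0, self.angle, self.g = v0, math.radians(angle_deg), g

          def position(self, t):
              return (self.v0 * math.cos(self.angle) * t,
                      self.v0 * math.sin(self.angle) * t - 0.5 * self.g * t ** 2)

      class ProjectileViewModel:
          """ViewModel: adapts model data for display and handles user actions."""
          def __init__(self, model):
              self.model = model
              self.time = 0.0

          def advance(self, dt):            # reacts to a user action from the view
              self.time += dt

          @property
          def display_text(self):           # model data exposed in a view-friendly form
              x, y = self.model.position(self.time)
              return f"t={self.time:.1f} s  x={x:.1f} m  y={y:.1f} m"

      class ConsoleView:
          """View: renders only what the ViewModel exposes."""
          def render(self, viewmodel):
              print(viewmodel.display_text)

      view_model = ProjectileViewModel(ProjectileModel())
      view = ConsoleView()
      for _ in range(3):
          view_model.advance(0.5)
          view.render(view_model)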

  14. Coarse-graining to the meso and continuum scales with molecular-dynamics-like models

    NASA Astrophysics Data System (ADS)

    Plimpton, Steve

    Many engineering-scale problems that industry or the national labs try to address with particle-based simulations occur at length and time scales well beyond the most optimistic hopes of traditional coarse-graining methods for molecular dynamics (MD), which typically start at the atomic scale and build upward. However classical MD can be viewed as an engine for simulating particles at literally any length or time scale, depending on the models used for individual particles and their interactions. To illustrate I'll highlight several coarse-grained (CG) materials models, some of which are likely familiar to molecular-scale modelers, but others probably not. These include models for water droplet freezing on surfaces, dissipative particle dynamics (DPD) models of explosives where particles have internal state, CG models of nano or colloidal particles in solution, models for aspherical particles, Peridynamics models for fracture, and models of granular materials at the scale of industrial processing. All of these can be implemented as MD-style models for either soft or hard materials; in fact they are all part of our LAMMPS MD package, added either by our group or contributed by collaborators. Unlike most all-atom MD simulations, CG simulations at these scales often involve highly non-uniform particle densities. So I'll also discuss a load-balancing method we've implemented for these kinds of models, which can improve parallel efficiencies. From the physics point-of-view, these models may be viewed as non-traditional or ad hoc. But because they are MD-style simulations, there's an opportunity for physicists to add statistical mechanics rigor to individual models. Or, in keeping with a theme of this session, to devise methods that more accurately bridge models from one scale to the next.

  15. A comparison of newborn stylized and tomographic models for dose assessment in paediatric radiology

    NASA Astrophysics Data System (ADS)

    Staton, R. J.; Pazik, F. D.; Nipper, J. C.; Williams, J. L.; Bolch, W. E.

    2003-04-01

    Establishment of organ doses from diagnostic and interventional examinations is a key component to quantifying the radiation risks from medical exposures and for formulating corresponding dose-reduction strategies. Radiation transport models of human anatomy provide a convenient method for simulating radiological examinations. At present, two classes of models exist: stylized mathematical models and tomographic voxel models. In the present study, organ dose comparisons are made for projection radiographs of both a stylized and a tomographic model of the newborn patient. Sixteen separate radiographs were simulated for each model at x-ray technique factors typical of newborn examinations: chest, abdomen, thorax and head views in the AP, PA, left LAT and right LAT projection orientation. For AP and PA radiographs of the torso (chest, abdomen and thorax views), the effective dose assessed for the tomographic model exceeds that for the stylized model with per cent differences ranging from 19% (AP abdominal view) to 43% (AP chest view). In contrast, the effective dose for the stylized model exceeds that for the tomographic model for all eight lateral views including those of the head, with per cent differences ranging from 9% (LLAT chest view) to 51% (RLAT thorax view). While organ positioning differences do exist between the models, a major factor contributing to differences in effective dose is the models' exterior trunk shape. In the tomographic model, a more elliptical shape is seen, thus providing less tissue shielding for internal organs in the AP and PA directions, with corresponding increased tissue shielding in the lateral directions. This observation is the opposite of that seen in comparisons of stylized and tomographic models of the adult.

  16. A simplified computational memory model from information processing.

    PubMed

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view.

  17. Pilot/vehicle model analysis of visual and motion cue requirements in flight simulation. [helicopter hovering

    NASA Technical Reports Server (NTRS)

    Baron, S.; Lancraft, R.; Zacharias, G.

    1980-01-01

    The optimal control model (OCM) of the human operator is used to predict the effect of simulator characteristics on pilot performance and workload. The piloting task studied is helicopter hover. Among the simulator characteristics considered were (computer generated) visual display resolution, field of view and time delay.

  18. Analysis of the Hexapod Work Space using integration of a CAD/CAE system and the LabVIEW software

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2015-11-01

    The paper presents the problems related to the integration of a CAD/CAE system with the LabVIEW software. The purpose of the integration is to determine the workspace of a hexapod model based on a mathematical model describing its motion. In the first stage of the integration task, a 3D model for simulating the movements of the hexapod was elaborated. This phase of the work was done in the "Motion Simulation" module of the CAD/CAE/CAM Siemens NX system. The first step was to define the components of the 3D model in the form of "links". Individual links were defined according to the way the hexapod elements act. In the model prepared for motion simulation, links were created corresponding to elements such as the electric actuators, top plate, bottom plate, ball-and-socket joints, and Phillips toggle joints. Constraints of the "joint" type (e.g. revolute joint, slider joint, spherical joint) were then defined between the created "link" components, so that the computer simulation corresponds to the operation of a real hexapod. The next stage of the work included implementing the mathematical model describing the functioning of the hexapod in the LabVIEW software. At this stage, particular attention was paid to determining procedures for integrating the virtual 3D hexapod model with the results of calculations performed in LabVIEW. The results relate to specific values of the stroke of the electric actuators depending on the position of the car on the hexapod. The integration made it possible to determine the safe operating space of a stationary hexapod, taking into consideration the security of a person in the driving simulator designed for the disabled.

  19. SU-E-T-36: A GPU-Accelerated Monte-Carlo Dose Calculation Platform and Its Application Toward Validating a ViewRay Beam Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

    Purpose: To build a fast, accurate and easily-deployable research platform for Monte-Carlo dose calculations. We port the dose calculation engine PENELOPE to C++, and accelerate calculations using GPU acceleration. Simulations of a Co-60 beam model provided by ViewRay demonstrate the capabilities of the platform. Methods: We built software that incorporates a beam model interface, CT-phantom model, GPU-accelerated PENELOPE engine, and GUI front-end. We rewrote the PENELOPE kernel in C++ (from Fortran) and accelerated the code on a GPU. We seamlessly integrated a Co-60 beam model (obtained from ViewRay) into our platform. Simulations of various field sizes and SSDs using a homogeneous water phantom generated PDDs, dose profiles, and output factors that were compared to experiment data. Results: With GPU acceleration using a dated graphics card (Nvidia Tesla C2050), a highly accurate simulation – including 100×100×100 grid, 3×3×3 mm³ voxels, <1% uncertainty, and 4.2×4.2 cm² field size – runs 24 times faster (20 minutes versus 8 hours) than when parallelizing on 8 threads across a new CPU (Intel i7-4770). Simulated PDDs, profiles and output ratios for the commercial system agree well with experiment data measured using radiographic film or ionization chamber. Based on our analysis, this beam model is precise enough for general applications. Conclusions: Using a beam model for a Co-60 system provided by ViewRay, we evaluate a dose calculation platform that we developed. Comparison to measurements demonstrates the promise of our software for use as a research platform for dose calculations, with applications including quality assurance and treatment plan verification.

  20. Simulation and Correction of Triana-Viewed Earth Radiation Budget with ERBE/ISCCP Data

    NASA Technical Reports Server (NTRS)

    Huang, Jian-Ping; Minnis, Patrick; Doelling, David R.; Valero, Francisco P. J.

    2002-01-01

    This paper describes the simulation of the earth radiation budget (ERB) as viewed by Triana and the development of correction models for converting Triana-viewed radiances into a complete ERB. A full range of Triana views and global radiation fields are simulated using a combination of datasets from ERBE (Earth Radiation Budget Experiment) and ISCCP (International Satellite Cloud Climatology Project) and analyzed with a set of empirical correction factors specific to the Triana views. The results show that the accuracy of global correction factors to estimate ERB from Triana radiances is a function of the Triana position relative to the Lagrange-1 (L1) or the Sun location. Spectral analysis of the global correction factor indicates that both shortwave (SW; 0.2-5.0 microns) and longwave (LW; 5-50 microns) parameters undergo seasonal and diurnal cycles that dominate the periodic fluctuations. The diurnal cycle, especially its amplitude, is also strongly dependent on the seasonal cycle. Based on these results, models are developed to correct the radiances for unviewed areas and anisotropic emission and reflection. A preliminary assessment indicates that these correction models can be applied to Triana radiances to produce the most accurate global ERB to date.

  1. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the viewpoint of information processing. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with the modeling algorithm by mapping nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and the functions of memorization and strengthening with information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from an information processing view. PMID:27876847

  2. The new car following model considering vehicle dynamics influence and numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Dihua; Liu, Hui; Zhang, Geng; Zhao, Min

    2015-12-01

    In this paper, the car-following model is investigated by considering vehicle dynamics from a cyber-physical view. Driving is a typical cyber-physical process, in which the cyber aspect of the vehicles' information and driving decisions is coupled tightly with the dynamics and physics of the vehicles and the traffic environment. However, the influence from the physical (vehicle) view has been ignored in previous car-following models. In order to describe car-following behavior more realistically in real traffic, a new car-following model considering vehicle dynamics (D-CFM for short) is proposed. In this paper, we take the full velocity difference (FVD) car-following model as a case. The stability condition is given on the basis of control theory. The analytical method and numerical simulation results show that the new model can describe the evolution of traffic congestion. The simulations also show that vehicles start with a more realistic acceleration than in earlier models.
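
    For context, a minimal sketch of the baseline full velocity difference (FVD) car-following rule that the paper takes as its starting case is shown below; the vehicle-dynamics extension (D-CFM) itself is not reproduced, and the optimal-velocity function and all parameter values are illustrative assumptions.

      import math

      def optimal_velocity(headway, v_max=33.0, h_c=25.0, alpha=0.1):
          # Assumed optimal-velocity function of the usual tanh form.
          return 0.5 * v_max * (math.tanh(alpha * (headway - h_c)) + math.tanh(alpha * h_c))

      def fvd_acceleration(headway, v, dv, kappa=0.41, lam=0.5):
          # headway: gap to the leader; dv: leader velocity minus own velocity.
          return kappa * (optimal_velocity(headway) - v) + lam * dv

      # A follower 20 m behind a slightly slower leader brakes (about -3.1 m/s^2).
      print(fvd_acceleration(headway=20.0, v=15.0, dv=-1.0))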

  3. SeaWiFS technical report series. Volume 10: Modeling of the SeaWiFS solar and lunar observations

    NASA Technical Reports Server (NTRS)

    Woodward, Robert H.; Barnes, Robert A.; Mcclain, Charles R.; Esaias, Wayne E.; Barnes, William L.; Mecherikunnel, Ann T.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)

    1993-01-01

    Post-launch stability monitoring of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) will include periodic sweeps of both an onboard solar diffuser plate and the moon. The diffuser views will provide short-term checks and the lunar views will monitor long-term trends in the instrument's radiometric stability. Models of the expected sensor response to these observations were created on the SeaWiFS computer at the National Aeronautics and Space Administration's (NASA) Goddard Space Flight Center (GSFC) using the Interactive Data Language (IDL) utility with a graphical user interface (GUI). The solar model uses the area of intersecting circles to simulate the ramping of sensor response while viewing the diffuser. This model is compared with preflight laboratory scans of the solar diffuser. The lunar model reads a high-resolution lunar image as input. The observations of the moon are simulated with a bright target recovery algorithm that includes ramping and ringing functions. Tests using the lunar model indicate that the integrated radiance of the entire lunar surface provides a more stable quantity than the mean of radiances from centralized pixels. The lunar model is compared to ground-based scans by the SeaWiFS instrument of a full moon in December 1992. Quality assurance and trend analyses routines for calibration and for telemetry data are also discussed.

  4. Formation of Novice Business Students' Mental Models through Simulation Gaming

    ERIC Educational Resources Information Center

    Palmunen, Lauri-Matti; Pelto, Elina; Paalumäki, Anni; Lainema, Timo

    2013-01-01

    Studies on students' perceptions of learning in business simulations often suggest that students like simulations and view them more positively than both lectures and case discussions. However, research on the actual learning outcomes deriving from participating in business simulations still needs to be pursued. Consequently, the purpose of this…

  5. Model and simulation of Krause model in dynamic open network

    NASA Astrophysics Data System (ADS)

    Zhu, Meixia; Xie, Guangqiang

    2017-08-01

    Modeling the evolution of opinions is an effective way to reveal how group consensus forms. This study is based on the modeling paradigm of the HK (Hegselmann-Krause) model. The paper analyzes the evolution of multi-agent opinions in dynamic open networks with member mobility. The simulation results show that, for a constant number of agents, the interval distribution of the initial opinions affects the number of final opinions: the wider the spread of initial opinions, the more opinion clusters eventually form. The trust threshold has a decisive effect on the number of opinions, with a negative correlation between the trust threshold and the number of opinion clusters. The higher the connectivity of the initial activity group, the more easily the subjective opinions converge rapidly during the evolution. A more open network is more conducive to unity of opinion; increasing or reducing the number of agents does not affect the consistency of the group effect, but it is not conducive to stability.
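
    Since the study builds on the HK bounded-confidence paradigm, a minimal sketch of one synchronous HK update is given below for reference (static, fully mixed population; the dynamic open-network and member-mobility aspects of the paper are not reproduced, and the numbers are illustrative).

      def hk_step(opinions, epsilon):
          """One synchronous Hegselmann-Krause update with confidence bound epsilon."""
          updated = []
          for x in opinions:
              neighbours = [y for y in opinions if abs(y - x) <= epsilon]
              updated.append(sum(neighbours) / len(neighbours))
          return updated

      opinions = [0.05, 0.10, 0.40, 0.45, 0.90]
      for _ in range(20):
          opinions = hk_step(opinions, epsilon=0.2)
      print(opinions)   # three opinion clusters remain; a larger epsilon yields fewer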

  6. 3D visualization of ultra-fine ICON climate simulation data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Spickermann, Dela; Böttinger, Michael

    2016-04-01

    Advances in high performance computing and model development allow the simulation of finer and more detailed climate experiments. The new ICON model is based on an unstructured triangular grid and can be used for a wide range of applications, ranging from global coupled climate simulations down to very detailed and high resolution regional experiments. It consists of an atmospheric and an oceanic component and scales very well for high numbers of cores. This allows us to conduct very detailed climate experiments with ultra-fine resolutions. ICON is jointly developed in partnership with DKRZ by the Max Planck Institute for Meteorology and the German Weather Service. This presentation discusses our current workflow for analyzing and visualizing this high resolution data. The ICON model has been used for eddy resolving (<10km) ocean simulations, as well as for ultra-fine cloud resolving (120m) atmospheric simulations. This results in very large 3D time dependent multi-variate data that need to be displayed and analyzed. We have developed specific plugins for the freely available visualization software ParaView and Vapor, which allow us to read and handle such large data volumes. Within ParaView, we can additionally compare prognostic variables with performance data side by side to investigate the performance and scalability of the model. With the simulation running in parallel on several hundred nodes, an equal load balance is imperative. In our presentation we show visualizations of high-resolution ICON oceanographic and HDCP2 atmospheric simulations that were created using ParaView and Vapor. Furthermore we discuss our current efforts to improve our visualization capabilities, thereby exploring the potential of regular in-situ visualization, as well as of in-situ compression / post visualization.

  7. Validation of the thermal code of RadTherm-IR, IR-Workbench, and F-TOM

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Grossmann, Peter; Malaplate, Alain

    2009-05-01

    System assessment by image simulation requires synthetic scenarios that can be viewed by the device to be simulated. In addition to physical modeling of the camera, a reliable modeling of scene elements is necessary. Software products for modeling of target data in the IR should be capable of (i) predicting surface temperatures of scene elements over a long period of time and (ii) computing sensor views of the scenario. For such applications, FGAN-FOM acquired the software products RadTherm-IR (ThermoAnalytics Inc., Calumet, USA) and IR-Workbench (OKTAL-SE, Toulouse, France). Inspection of the accuracy of simulation results by validation is necessary before using these products for applications. In the first step of validation, the performance of both "thermal solvers" was determined through comparison of the computed diurnal surface temperatures of a simple object with the corresponding values from measurements. CUBI is a rather simple geometric object with well-known material parameters, which makes it suitable for testing and validating object models in IR. It was used in this study as a test body. Comparison of calculated and measured surface temperature values will be presented, together with the results from the FGAN-FOM thermal object code F-TOM. In the second validation step, radiances of the simulated sensor views computed by RadTherm-IR and IR-Workbench will be compared with radiances retrieved from the recorded sensor images taken by the sensor that was simulated. Strengths and weaknesses of the models RadTherm-IR, IR-Workbench and F-TOM will be discussed.

  8. Modelling and simulating a crisis management system: an organisational perspective

    NASA Astrophysics Data System (ADS)

    Chaawa, Mohamed; Thabet, Inès; Hanachi, Chihab; Ben Said, Lamjed

    2017-04-01

    Crises are complex situations due to the dynamism of the environment, its unpredictability and the complexity of the interactions among several different and autonomous involved organisations. In such a context, establishing an organisational view as well as structuring organisations' communications and their functioning is a crucial requirement. In this article, we propose a multi-agent organisational model (OM) to abstract, simulate and analyse a crisis management system (CMS). The objective is to evaluate the CMS from an organisational view, to assess its strengths as well as its weaknesses and to provide deciders with some recommendations for a more flexible and reactive CMS. The proposed OM is illustrated through a real case study: a snowstorm in a Tunisian region. More precisely, we make the following contributions: first, we provide an environmental model that identifies the concepts involved in the crisis. Then, we define a role model that covers the involved actors. In addition, we specify the organisational structure and the interaction model that rule communications and structure actors' functioning. Those models, built following the GAIA methodology, abstract the CMS from an organisational perspective. Finally, we implemented a customisable multi-agent simulator based on the Janus platform to analyse, through several performed simulations, the organisational model.

  9. Simulation-Based Mission Rehearsal as a Human Activity System.

    DTIC Science & Technology

    1996-09-01

    To explain this demonstrated importance of the people involved in MR, a human activity system model of simulation-based rehearsal was developed. It provides...Implications of this human activity system view are discussed, including: places in the mission preparation process where simulation can benefit operations

  10. Simulated response of a multispectral scanner over wheat as a function of wavelength and view/illumination direction

    NASA Technical Reports Server (NTRS)

    Bauer, M. E. (Principal Investigator); Vanderbilt, V. C.; Robinson, B. F.; Biehl, L. L.; Vanderbilt, A. S.

    1981-01-01

    The reflectance response of wheat with view angle was analyzed. The analysis, which assumes there are no atmospheric effects and otherwise simulates the response of a multispectral scanner, is based upon spectra taken continuously in wavelength from 0.45 to 2.4 micrometers at more than 1200 view/illumination directions using an Exotech Model 20C spectroradiometer. Data were acquired six meters above four wheat canopies, each at a different growth stage. The analysis shows that the canopy reflective response is a pronounced function of illumination angle, scanner view angle and wavelength. The variation is greater at low solar elevations compared to high solar elevations.

  11. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  12. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  13. AI and simulation: What can they learn from each other

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano P.

    1988-01-01

    Simulation and Artificial Intelligence share a fertile common ground both from a practical and from a conceptual point of view. Strengths and weaknesses of both Knowledge-Based Systems and Modeling and Simulation are examined, and three types of systems that combine the strengths of both technologies are discussed. These types of systems are a practical starting point; however, the real strengths of both technologies will be exploited only when they are combined in a common knowledge representation paradigm. From an even deeper conceptual point of view, one might even argue that the ability to reason from a set of facts (i.e., an expert system) is less representative of human reasoning than the ability to make a model of the world, change it as required, and derive conclusions about the expected behavior of world entities. This is a fundamental problem in AI, and Modeling Theory can contribute to its solution. The application of Knowledge Engineering technology to a Distributed Processing Network Simulator (DPNS) is discussed.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keefer, Donald A.; Shaffer, Eric G.; Storsved, Brynne

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  15. RVA: A Plugin for ParaView 3.14

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-04

    RVA is a plugin developed for the 64-bit Windows version of the ParaView 3.14 visualization package. RVA is designed to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.

  16. Using simplifications of reality in the real world: Robust benefits of models for decision making

    NASA Astrophysics Data System (ADS)

    Hunt, R. J.

    2008-12-01

    Models are by definition simplifications of reality; the degree and nature of simplification, however, are debated. One view is "the world is 3D, heterogeneous, and transient, thus good models are too" - the more a model directly simulates the complexity of the real world, the better it is considered to be. An alternative view is to only use simple models up front because real-world complexity can never be truly known. A third view is to construct and calibrate as many models as there are predictions. A fourth is to build highly parameterized models and either look at an ensemble of results, or use mathematical regularization to identify an optimal, most reasonable parameter set and fit. Although each view may have utility for a given decision-making process, there are common threads that perhaps run through all views. First, the model-construction process itself can help the decision-making process because it raises the discussion of opposing parties from one of contrasting professional opinions to discussion of reasonable types and ranges of model inputs and processes. Secondly, no matter what view is used to guide the model building, model predictions might be expected to perform poorly in the future due to unanticipated changes and stressors to the underlying system simulated. Although this does not reduce the obligation of the modeler to build representative tools for the system, it should serve to temper expectations of model performance. Finally, perhaps the most under-appreciated utility of models is for calculating the reduction in prediction uncertainty resulting from different data collection strategies - an attractive feature separate from the calculation and minimization of absolute prediction uncertainty itself. This type of model output facilitates focusing on efficient use of current and future monitoring resources - something valued by many decision-makers regardless of background, system managed, and societal context.

  17. Attitude Estimation for Large Field-of-View Sensors

    NASA Technical Reports Server (NTRS)

    Cheng, Yang; Crassidis, John L.; Markley, F. Landis

    2005-01-01

    The QUEST measurement noise model for unit vector observations has been widely used in spacecraft attitude estimation for more than twenty years. It was derived under the approximation that the noise lies in the tangent plane of the respective unit vector and is axially symmetrically distributed about the vector. For large field-of-view sensors, however, this approximation may be poor, especially when the measurement falls near the edge of the field of view. In this paper a new measurement noise model is derived based on a realistic noise distribution in the focal-plane of a large field-of-view sensor, which shows significant differences from the QUEST model for unit vector observations far away from the sensor boresight. An extended Kalman filter for attitude estimation is then designed with the new measurement noise model. Simulation results show that with the new measurement model the extended Kalman filter achieves better estimation performance using large field-of-view sensor observations.
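
    For reference, a minimal sketch of the standard QUEST measurement-noise covariance, the approximation that this paper revisits for large field-of-view sensors, is shown below; the new focal-plane noise model and the extended Kalman filter are not reproduced, and the numbers are illustrative.

      import numpy as np

      def quest_covariance(b_unit, sigma):
          """QUEST model: R = sigma^2 * (I - b b^T), noise in the tangent plane of b."""
          b = np.asarray(b_unit, dtype=float).reshape(3, 1)
          return sigma ** 2 * (np.eye(3) - b @ b.T)

      b = np.array([0.6, 0.0, 0.8])          # unit-norm line-of-sight vector
      R = quest_covariance(b, sigma=1e-3)
      print(R)                               # singular along the boresight direction b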

  18. Intubation simulation with a cross-sectional visual guidance.

    PubMed

    Rhee, Chi-Hyoung; Kang, Chul Won; Lee, Chang Ha

    2013-01-01

    We present an intubation simulation with deformable objects and a cross-sectional visual guidance using a general haptic device. Our method deforms the tube model when it collides with the human model. A mass-spring model with Euler integration is used for the tube deformation. To give trainees a more effective understanding of the intubation process, we provide a cross-sectional view of the oral cavity and the tube. Our system also applies stereoscopic rendering to improve the depth perception and the realism of the simulation.
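
    A minimal sketch of a mass-spring chain updated with explicit Euler integration, the deformation scheme named above, is given below; the actual tube model is three-dimensional and coupled to haptic collision forces, and all parameter values here are illustrative.

      import numpy as np

      def euler_step(x, v, rest_len, k, c, m, dt):
          """One explicit Euler update of a 1-D chain of point masses and springs."""
          f = np.zeros_like(x)
          for i in range(len(x) - 1):
              stretch = (x[i + 1] - x[i]) - rest_len
              f[i] += k * stretch           # spring pulls node i toward node i+1
              f[i + 1] -= k * stretch
          f -= c * v                        # simple viscous damping
          return x + dt * v, v + dt * f / m

      x = np.linspace(0.0, 1.0, 5)          # node positions along the tube (metres)
      v = np.zeros_like(x)
      for _ in range(1000):
          x, v = euler_step(x, v, rest_len=0.2, k=50.0, c=0.5, m=0.1, dt=0.002)
      print(x)   # the over-stretched chain relaxes toward ~[0.1, 0.3, 0.5, 0.7, 0.9]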

  19. Troughs on Martian Ice Sheets: Analysis of Their Closure and Mass Balance

    NASA Technical Reports Server (NTRS)

    Fountain, A.; Kargel, J.; Lewis, K.; MacAyeal, D.; Pfeffer, T.; Zwally, J.

    2000-01-01

    At the Copenhagen workshop on Martian polar processes, Ralf Greve commented that the flow regime surrounding scarps and troughs of the Martian polar ice sheets cannot be modeled using traditional "plan view" ice-sheet models. Such models are inadequate because they typically use reduced equations that embody certain simplifications applicable only to terrestrial ice sheets where the upper ice sheet surface is smooth. In response to this suggestion, we have constructed a 2-dimensional, time dependent "side view" (two spatial dimensions: one horizontal, one vertical) model of scarp closure that is designed to overcome the difficulties described by Greve. The purpose of the model is to evaluate the scales of stress variation and styles of flow closure so as to estimate errors that may be encountered by "plan view" models. We show that there may be avenues whereby the complications associated with scarp closure can be overcome in "plan view" models through appropriate parameterizations of 3-dimensional effects. Following this, we apply the flow model to simulate the evolution of a typical scarp on the North Polar Cap of Mars. Our simulations investigate: (a) the role of "radiation trapping" (see our companion abstract) in creating and maintaining "spiral-like" scarps on the ice sheet, (b) the consequences of different flowlaws and ice compositions on scarp evolution and, in particular, scarp age, and (c) the role of dust and debris in scarp evolution.

  20. Human task animation from performance models and natural language input

    NASA Technical Reports Server (NTRS)

    Esakov, Jeffrey; Badler, Norman I.; Jung, Moon

    1989-01-01

    Graphical manipulation of human figures is essential for certain types of human factors analyses such as reach, clearance, fit, and view. In many situations, however, the animation of simulated people performing various tasks may be based on more complicated functions involving multiple simultaneous reaches, critical timing, resource availability, and human performance capabilities. One rather effective means for creating such a simulation is through a natural language description of the tasks to be carried out. Given an anthropometrically-sized figure and a geometric workplace environment, various simple actions such as reach, turn, and view can be effectively controlled from language commands or standard NASA checklist procedures. The commands may also be generated by external simulation tools. Task timing is determined from actual performance models, if available, such as strength models or Fitts' Law. The resulting action specifications are animated on a Silicon Graphics Iris workstation in real-time.
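
    As an illustration of the performance-model timing mentioned above, the sketch below applies the Shannon form of Fitts' law to a single simulated reach; the coefficients and the reach geometry are illustrative, not taken from the paper.

      import math

      def fitts_movement_time(distance, target_width, a=0.1, b=0.15):
          """Shannon form of Fitts' law: MT = a + b * log2(D/W + 1), in seconds."""
          return a + b * math.log2(distance / target_width + 1.0)

      # An illustrative 0.6 m reach to a 0.05 m-wide switch takes roughly
      # 0.1 + 0.15 * log2(13) ~= 0.66 s with these coefficients.
      print(fitts_movement_time(0.6, 0.05))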

  1. Methods Data Qualification Interim Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Sam Alessi; Tami Grimmett; Leng Vang

    The overall goal of the Next Generation Nuclear Plant (NGNP) Data Management and Analysis System (NDMAS) is to maintain data provenance for all NGNP data, including the Methods component of NGNP data. Multiple means are available to access data stored in NDMAS. A web portal environment allows users to access data, view the results of qualification tests and view graphs and charts of various attributes of the data. NDMAS also has methods for the management of the data output from VHTR simulation models and data generated from experiments designed to verify and validate the simulation codes. These simulation models represent the outcome of the mathematical representation of VHTR components and systems. The methods data management approaches described herein will handle data that arise from experiment, simulation, and external sources for the main purpose of facilitating parameter estimation and model verification and validation (V&V). A model integration environment entitled ModelCenter is used to automate the storing of data from simulation model runs to the NDMAS repository. This approach does not adversely change the way computational scientists conduct their work. The method is to be used mainly to store the results of model runs that need to be preserved for auditing purposes or for display to the NDMAS web portal. This interim report describes the current development of NDMAS for Methods data and discusses the data, and its qualification, that are currently part of NDMAS.

  2. Simulation study of a geometric shape factor technique for estimating earth-emitted radiant flux densities from wide-field-of-view radiation measurements

    NASA Technical Reports Server (NTRS)

    Weaver, W. L.; Green, R. N.

    1980-01-01

    Geometric shape factors were computed and applied to satellite simulated irradiance measurements to estimate Earth-emitted flux densities for global and zonal scales and for areas smaller than the detector field of view (FOV). Wide field of view flat plate detectors were emphasized, but spherical detectors were also studied. The radiation field was modeled after data from the Nimbus 2 and 3 satellites. At a satellite altitude of 600 km, zonal estimates were in error by 1.0 to 1.2 percent and global estimates by less than 0.2 percent. Estimates with unrestricted field of view (UFOV) detectors were about the same for Lambertian and limb-darkening radiation models. The opposite was found for restricted field of view detectors. The UFOV detectors are found to be poor estimators of flux density from the total FOV and are shown to be much better as estimators of flux density from a circle centered in the FOV with an area significantly smaller than that for the total FOV.
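
    A minimal Python sketch of the basic geometric idea (not the paper's full estimation technique): for a nadir-pointing flat-plate detector above a spherical, uniformly emitting Earth, the shape factor is (R/(R+h))^2, and an irradiance measurement is converted to a flux-density estimate by dividing by it. The irradiance value below is made up.

      R_EARTH_KM = 6371.0

      def flat_plate_shape_factor(altitude_km):
          """Shape (view) factor from a nadir-pointing flat plate to the
          full Earth disk, assuming a spherical, uniformly emitting Earth."""
          return (R_EARTH_KM / (R_EARTH_KM + altitude_km)) ** 2

      def estimate_flux_density(measured_irradiance_w_m2, altitude_km):
          """Invert the measurement: flux density at the top of the
          atmosphere is approximately irradiance / shape factor."""
          return measured_irradiance_w_m2 / flat_plate_shape_factor(altitude_km)

      print(f"F at 600 km: {flat_plate_shape_factor(600.0):.3f}")
      print(f"estimated flux density: {estimate_flux_density(200.0, 600.0):.1f} W/m^2")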

  3. Burn severity mapping using simulation modeling and satellite imagery

    Treesearch

    Eva C. Karau; Robert E. Keane

    2010-01-01

    Although burn severity maps derived from satellite imagery provide a landscape view of fire impacts, fire effects simulation models can provide spatial fire severity estimates and add a biotic context in which to interpret severity. In this project, we evaluated two methods of mapping burn severity in the context of rapid post-fire assessment for four wildfires in...

  4. Novel Characterization of Capsule X-Ray Drive at the National Ignition Facility [Using ViewFactor Experiments to Measure Hohlraum X-Radiation Drive from the Capsule Point-of-View in Ignition Experiments on the National Ignition Facility

    DOE PAGES

    MacLaren, S. A.; Schneider, M. B.; Widmann, K.; ...

    2014-03-13

    Here, indirect drive experiments at the National Ignition Facility are designed to achieve fusion by imploding a fuel capsule with x rays from a laser-driven hohlraum. Previous experiments have been unable to determine whether a deficit in measured ablator implosion velocity relative to simulations is due to inadequate models of the hohlraum or ablator physics. ViewFactor experiments allow for the first time a direct measure of the x-ray drive from the capsule point of view. The experiments show a 15%–25% deficit relative to simulations and thus explain nearly all of the disagreement with the velocity data. In addition, the data from this open geometry provide much greater constraints on a predictive model of laser-driven hohlraum performance than the nominal ignition target.

  5. Virtual Simulations and Serious Games in a Laptop-Based University: Gauging Faculty and Student Perceptions

    ERIC Educational Resources Information Center

    Kapralos, Bill; Hogan, Michelle; Pribetic, Antonin I.; Dubrowski, Adam

    2011-01-01

    Purpose: Gaming and interactive virtual simulation environments support a learner-centered educational model allowing learners to work through problems acquiring knowledge through an active, experiential learning approach. To develop effective virtual simulations and serious games, the views and perceptions of learners and educators must be…

  6. Development of Human Posture Simulation Method for Assessing Posture Angles and Spinal Loads

    PubMed Central

    Lu, Ming-Lun; Waters, Thomas; Werren, Dwight

    2015-01-01

    Video-based posture analysis employing a biomechanical model is gaining a growing popularity for ergonomic assessments. A human posture simulation method of estimating multiple body postural angles and spinal loads from a video record was developed to expedite ergonomic assessments. The method was evaluated by a repeated measures study design with three trunk flexion levels, two lift asymmetry levels, three viewing angles and three trial repetitions as experimental factors. The study comprised two phases evaluating the accuracy of simulating self and other people’s lifting posture via a proxy of a computer-generated humanoid. The mean values of the accuracy of simulating self and humanoid postures were 12° and 15°, respectively. The repeatability of the method for the same lifting condition was excellent (~2°). The least simulation error was associated with side viewing angle. The estimated back compressive force and moment, calculated by a three dimensional biomechanical model, exhibited a range of 5% underestimation. The posture simulation method enables researchers to simultaneously quantify body posture angles and spinal loading variables with accuracy and precision comparable to on-screen posture matching methods. PMID:26361435

  7. Determination of Ice Cloud Models Using MODIS and MISR Data

    NASA Technical Reports Server (NTRS)

    Xie, Yu; Yang, Ping; Kattawar, George W.; Minnis, Patrick; Hu, Yongxiang; Wu, Dong L.

    2012-01-01

    Representation of ice clouds in radiative transfer simulations is subject to uncertainties associated with the shapes and sizes of ice crystals within cirrus clouds. In this study, we examined several ice cloud models consisting of smooth, roughened, homogeneous and inhomogeneous hexagonal ice crystals with various aspect ratios. The sensitivity of the bulk scattering properties and solar reflectances of cirrus clouds to specific ice cloud models is investigated using the improved geometric optics method (IGOM) and the discrete ordinates radiative transfer (DISORT) model. The ice crystal habit fractions in the ice cloud model may significantly affect the simulations of cloud reflectances. A new algorithm was developed to help determine an appropriate ice cloud model for application to the satellite-based retrieval of ice cloud properties. The ice cloud particle size retrieved from Moderate Resolution Imaging Spectroradiometer (MODIS) data, collocated with Multi-angle Imaging Spectroradiometer (MISR) observations, is used to infer the optical thicknesses of ice clouds for nine MISR viewing angles. The relative differences between view-dependent cloud optical thickness and the averaged value over the nine MISR viewing angles can vary from -0.5 to 0.5 and are used to evaluate the ice cloud models. In the case for 2 July 2009, the ice cloud model with mixed ice crystal habits is the best fit to the observations (the root mean square (RMS) error of cloud optical thickness reaches 0.365). This ice cloud model also produces consistent cloud property retrievals for the nine MISR viewing configurations within the measurement uncertainties.
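
    A small Python sketch of the view-consistency metric described above: relative differences of the nine view-dependent optical thickness retrievals about their mean, plus the RMS spread. The retrieval values are hypothetical.

      import numpy as np

      # Hypothetical optical thickness retrievals for the nine MISR view angles
      tau_views = np.array([2.10, 2.05, 1.95, 1.90, 1.88, 1.92, 2.00, 2.08, 2.15])

      tau_mean = tau_views.mean()
      relative_diff = (tau_views - tau_mean) / tau_mean     # the quantity quoted as -0.5 to 0.5
      rms = np.sqrt(np.mean((tau_views - tau_mean) ** 2))   # RMS spread across views

      print("relative differences:", np.round(relative_diff, 3))
      print(f"RMS of optical thickness across views: {rms:.3f}")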

  8. Automotive Underhood Thermal Management Analysis Using 3-D Coupled Thermal-Hydrodynamic Computer Models: Thermal Radiation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pannala, S; D'Azevedo, E; Zacharia, T

    The goal of the radiation modeling effort was to develop and implement a radiation algorithm that is fast and accurate for the underhood environment. As part of this CRADA, a net-radiation model was chosen to simulate radiative heat transfer in an underhood of a car. The assumptions (diffuse-gray and uniform radiative properties in each element) reduce the problem tremendously and all the view factors for radiation thermal calculations can be calculated once and for all at the beginning of the simulation. The cost for online integration of heat exchanges due to radiation is found to be less than 15% of the baseline CHAD code and thus very manageable. The off-line view factor calculation is constructed to be very modular and has been completely integrated to read CHAD grid files and the output from this code can be read into the latest version of CHAD. Further integration has to be performed to accomplish the same with STAR-CD. The main outcome of this effort is a highly scalable and portable simulation capability to model view factors for the underhood environment (e.g., a view factor calculation that took 14 hours on a single processor took only 14 minutes on 64 processors). The code has also been validated using a simple test case where analytical solutions are available. This simulation capability gives underhood designers in the automotive companies the ability to account for thermal radiation, which usually is critical in the underhood environment and also turns out to be one of the most computationally expensive components of underhood simulations. This report starts off with the original work plan as elucidated in the proposal in section B. This is followed by the technical work plan to accomplish the goals of the project in section C. In section D, background to the current work is provided with references to the previous efforts this project leverages on. The results are discussed in section E. This report ends with conclusions and future scope of work in section F.
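
    A minimal Python sketch of validating a view-factor calculation against an analytic case, in the spirit of the simple test case mentioned above (an illustration only, not the CRADA code): the view factor from a small planar element to a coaxial parallel disk has the closed form r^2/(r^2 + h^2), which a cosine-weighted Monte Carlo ray sample should reproduce.

      import numpy as np

      def mc_view_factor_element_to_disk(h, r, n_rays=200_000, seed=0):
          """Monte Carlo view factor from a small planar element (facing +z)
          to a coaxial parallel disk of radius r at distance h, using
          cosine-weighted ray directions."""
          rng = np.random.default_rng(seed)
          u = rng.random(n_rays)
          sin_theta = np.sqrt(u)              # cosine-weighted hemisphere sampling
          cos_theta = np.sqrt(1.0 - u)
          rho = h * sin_theta / cos_theta     # radius where the ray crosses z = h
          return np.mean(rho <= r)

      h, r = 1.0, 0.5
      print(f"Monte Carlo: {mc_view_factor_element_to_disk(h, r):.4f}")
      print(f"Analytic:    {r**2 / (r**2 + h**2):.4f}")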

  9. Object Creation and Human Factors Evaluation for Virtual Environments

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1998-01-01

    The main objective of this project is to provide test objects for simulated environments utilized by the recently established Army/NASA Virtual Innovations Lab (ANVIL) at Marshall Space Flight Center, Huntsville, Al. The objective of the ANVIL lab is to provide virtual reality (VR) models and environments and to provide visualization and manipulation methods for the purpose of training and testing. Visualization equipment used in the ANVIL lab includes head-mounted and boom-mounted immersive virtual reality display devices. Objects in the environment are manipulated using data glove, hand controller, or mouse. These simulated objects are solid or surfaced three dimensional models. They may be viewed or manipulated from any location within the environment and may be viewed on-screen or via immersive VR. The objects are created using various CAD modeling packages and are converted into the virtual environment using dVise. This enables the object or environment to be viewed from any angle or distance for training or testing purposes.

  10. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  11. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE PAGES

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; ...

    2016-06-01

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  12. A tool for multi-scale modelling of the renal nephron

    PubMed Central

    Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.

    2011-01-01

    We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210

  13. Application of 3-Dimensional Printing Technology to Construct an Eye Model for Fundus Viewing Study

    PubMed Central

    Li, Xinhua; Gao, Zhishan; Yuan, Dongqing; Liu, Qinghuai

    2014-01-01

    Objective To construct a life-sized eye model using three-dimensional (3D) printing technology for fundus viewing study of the viewing system. Methods We devised our schematic model eye based on Navarro's eye and redesigned some parameters because of the change of the corneal material and the implantation of intraocular lenses (IOLs). Optical performance of our schematic model eye was compared with Navarro's schematic eye and two other reported physical model eyes using the ZEMAX optical design software. With computer aided design (CAD) software, we designed the 3D digital model of the main structure of the physical model eye, which was used for three-dimensional (3D) printing. Together with the main printed structure, a polymethyl methacrylate (PMMA) aspherical cornea, a variable iris, and IOLs were assembled into a physical eye model. Angle scale bars were glued from the posterior to the periphery of the retina. We then fabricated three other physical models with different states of ametropia. Optical parameters of these physical eye models were measured to verify the 3D printing accuracy. Results In on-axis calculations, our schematic model eye possessed a spot diagram similar in size to those of Navarro's and Bakaraju's model eyes, and much smaller than that of Arianpour's model eye. Moreover, the spherical aberration of our schematic eye was much less than that of the other three model eyes. In off-axis simulation, it possessed slightly higher coma and similar astigmatism, field curvature and distortion. The MTF curves showed that all the model eyes diminished in resolution with increasing field of view, and the trend of diminishing resolution of our physical eye model was similar to that of Navarro's eye. The measured parameters of our eye models with different states of ametropia were in line with the theoretical values. Conclusions The schematic eye model we designed simulates the optical performance of the human eye well, and the fabricated physical one can be used as a tool in fundus range viewing research. PMID:25393277

  14. Application of 3-dimensional printing technology to construct an eye model for fundus viewing study.

    PubMed

    Xie, Ping; Hu, Zizhong; Zhang, Xiaojun; Li, Xinhua; Gao, Zhishan; Yuan, Dongqing; Liu, Qinghuai

    2014-01-01

    To construct a life-sized eye model using three-dimensional (3D) printing technology for fundus viewing study of the viewing system. We devised our schematic model eye based on Navarro's eye and redesigned some parameters because of the change of the corneal material and the implantation of intraocular lenses (IOLs). Optical performance of our schematic model eye was compared with Navarro's schematic eye and two other reported physical model eyes using the ZEMAX optical design software. With computer aided design (CAD) software, we designed the 3D digital model of the main structure of the physical model eye, which was used for three-dimensional (3D) printing. Together with the main printed structure, a polymethyl methacrylate (PMMA) aspherical cornea, a variable iris, and IOLs were assembled into a physical eye model. Angle scale bars were glued from the posterior to the periphery of the retina. We then fabricated three other physical models with different states of ametropia. Optical parameters of these physical eye models were measured to verify the 3D printing accuracy. In on-axis calculations, our schematic model eye possessed a spot diagram similar in size to those of Navarro's and Bakaraju's model eyes, and much smaller than that of Arianpour's model eye. Moreover, the spherical aberration of our schematic eye was much less than that of the other three model eyes. In off-axis simulation, it possessed slightly higher coma and similar astigmatism, field curvature and distortion. The MTF curves showed that all the model eyes diminished in resolution with increasing field of view, and the trend of diminishing resolution of our physical eye model was similar to that of Navarro's eye. The measured parameters of our eye models with different states of ametropia were in line with the theoretical values. The schematic eye model we designed simulates the optical performance of the human eye well, and the fabricated physical one can be used as a tool in fundus range viewing research.

  15. Cellular automata model for use with real freeway data

    DOT National Transportation Integrated Search

    2002-01-01

    The exponential rate of increase in freeway traffic is expanding the need for accurate and realistic methods to model and predict traffic flow. Traffic modeling and simulation facilitates an examination of both microscopic and macroscopic views o...
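
    The record is truncated, but a standard example of this kind of freeway cellular automaton is the Nagel-Schreckenberg model; a minimal Python sketch follows (not necessarily the exact model used in the report):

      import numpy as np

      def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
          """One Nagel-Schreckenberg update on a circular single-lane road:
          accelerate, brake to the gap ahead, random slowdown, then move."""
          rng = rng or np.random.default_rng()
          order = np.argsort(pos)
          pos, vel = pos[order], vel[order]
          gaps = (np.roll(pos, -1) - pos - 1) % road_len
          vel = np.minimum(vel + 1, v_max)
          vel = np.minimum(vel, gaps)
          vel = np.maximum(vel - (rng.random(len(vel)) < p_slow), 0)
          return (pos + vel) % road_len, vel

      rng = np.random.default_rng(2)
      road_len, n_cars = 200, 40
      pos = rng.choice(road_len, n_cars, replace=False)
      vel = np.zeros(n_cars, dtype=int)
      for _ in range(500):
          pos, vel = nasch_step(pos, vel, road_len, rng=rng)
      print(f"mean speed after 500 steps: {vel.mean():.2f} cells/step")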

  16. A Dynamic Multi-Projection-Contour Approximating Framework for the 3D Reconstruction of Buildings by Super-Generalized Optical Stereo-Pairs.

    PubMed

    Yan, Yiming; Su, Nan; Zhao, Chunhui; Wang, Liguo

    2017-09-19

    In this paper, a novel framework for the 3D reconstruction of buildings is proposed, focusing on remote sensing super-generalized stereo-pairs (SGSPs). 3D reconstruction cannot be performed well using nonstandard stereo pairs: reliable stereo matching cannot be achieved when the image pairs are collected from widely differing views, so dense 3D points cannot be obtained for building regions and further 3D shape reconstruction fails. We defined SGSPs as two or more optical images collected in less constrained views but covering the same buildings. It is even more difficult to reconstruct the 3D shape of a building from SGSPs using traditional frameworks. As a result, a dynamic multi-projection-contour approximating (DMPCA) framework was introduced for SGSP-based 3D reconstruction. The key idea is to optimize a group of parameters of a simulated 3D model, using binary feature-images, so as to minimize the total difference between the projection-contours of the building in the SGSPs and those of the simulated 3D model. The simulated 3D model, defined by the group of parameters, could then approximate the actual 3D shape of the building. Certain parameterized 3D basic-unit-models of typical buildings were designed, and a simulated projection system was established to obtain a simulated projection-contour in different views. Moreover, the artificial bee colony algorithm was employed to solve the optimization. With SGSPs collected by satellite and by our unmanned aerial vehicle, the DMPCA framework was verified by a group of experiments, which demonstrated the reliability and advantages of this work.
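
    A toy Python sketch of the projection-contour matching idea: a simulated silhouette parameterized by a few numbers is fitted to an observed binary contour image by minimizing pixel disagreement. SciPy's differential evolution is used here as a stand-in for the artificial bee colony optimizer named in the abstract, and the rectangular "building" is purely illustrative.

      import numpy as np
      from scipy.optimize import differential_evolution

      GRID = 64

      def silhouette(width, height):
          """Binary projection-contour image of a box-shaped building seen side-on."""
          img = np.zeros((GRID, GRID), dtype=bool)
          w, h = int(round(width)), int(round(height))
          img[GRID - h:, (GRID - w) // 2:(GRID + w) // 2] = True
          return img

      observed = silhouette(30, 40)   # stands in for a contour extracted from an SGSP

      def mismatch(params):
          """Total pixel disagreement between simulated and observed contours."""
          return np.count_nonzero(silhouette(*params) ^ observed)

      result = differential_evolution(mismatch, bounds=[(5, 60), (5, 60)],
                                      seed=1, maxiter=200)
      print("recovered (width, height):", np.round(result.x, 1))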

  17. SURVIAC Bulletin: AFRL Research Audit Trail Viewer (ATV). Volume 19, Issue 1, 2003

    DTIC Science & Technology

    2003-01-01

    Trail Viewer, the analyst obtained a close-up view of the detailed aircraft model using the Orbit View, enabled the SkyBox, enabled fictional ter... trails and element projections, several simulated terrain types and SkyBox environments to help the user maintain perspective, file based

  18. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    NASA Astrophysics Data System (ADS)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to the information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper takes Facebook-like social networks into account, and models the process of information diffusion under information overload. The term view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations and provide the simulation results, which are consistent with the theoretical analysis results perfectly.

  19. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
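
    A minimal Python sketch of the decoupled-update idea behind such a framework: a fast haptics loop and a slower render loop sharing one model state. This illustrates the pattern only, not the authors' framework; the force law and rates are placeholders.

      import threading
      import time

      class SharedToolState:
          """Shared model state (the 'M' in MVC), guarded by a lock."""
          def __init__(self):
              self._lock = threading.Lock()
              self.position = 0.0
              self.force = 0.0

          def haptics_update(self):
              with self._lock:
                  self.position += 0.001               # placeholder integration step
                  self.force = -10.0 * self.position   # toy spring-like feedback

          def snapshot(self):
              with self._lock:
                  return self.position, self.force

      def haptics_loop(state, stop, rate_hz=1000):      # fast controller loop
          while not stop.is_set():
              state.haptics_update()
              time.sleep(1.0 / rate_hz)

      def render_loop(state, stop, rate_hz=60):         # slower viewer loop
          while not stop.is_set():
              pos, force = state.snapshot()
              print(f"render frame: pos={pos:.3f} force={force:.2f}")
              time.sleep(1.0 / rate_hz)

      state, stop = SharedToolState(), threading.Event()
      threads = [threading.Thread(target=haptics_loop, args=(state, stop)),
                 threading.Thread(target=render_loop, args=(state, stop))]
      for t in threads:
          t.start()
      time.sleep(0.2)
      stop.set()
      for t in threads:
          t.join()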

  20. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.

  1. ARM Cloud Radar Simulator Package for Global Climate Models Value-Added Product

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yuying; Xie, Shaocheng

    It has been challenging to directly compare U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility ground-based cloud radar measurements with climate model output because of limitations or features of the observing processes and the spatial gap between model and the single-point measurements. To facilitate the use of ARM radar data in numerical models, an ARM cloud radar simulator was developed to convert model data into pseudo-ARM cloud radar observations that mimic the instrument view of a narrow atmospheric column (as compared to a large global climate model [GCM] grid-cell), thus allowing meaningful comparison between model output and ARM cloud observations. The ARM cloud radar simulator value-added product (VAP) was developed based on the CloudSat simulator contained in the community satellite simulator package, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) (Bodas-Salcedo et al., 2011), which has been widely used in climate model evaluation with satellite data (Klein et al., 2013, Zhang et al., 2010). The essential part of the CloudSat simulator is the QuickBeam radar simulator that is used to produce CloudSat-like radar reflectivity, but is capable of simulating reflectivity for other radars (Marchand et al., 2009; Haynes et al., 2007). Adapting QuickBeam to the ARM cloud radar simulator within COSP required two primary changes: one was to set the frequency to 35 GHz for the ARM Ka-band cloud radar, as opposed to 94 GHz used for the CloudSat W-band radar, and the second was to invert the view from the ground to space so as to attenuate the beam correctly. In addition, the ARM cloud radar simulator uses a finer vertical resolution (100 m compared to 500 m for CloudSat) to resolve the more detailed structure of clouds captured by the ARM radars. The ARM simulator has been developed following the COSP workflow (Figure 1) and using the capabilities available in COSP wherever possible. The ARM simulator is written in Fortran 90, as is COSP. It is incorporated into COSP to facilitate use by the climate modeling community. In order to evaluate simulator output, the observational counterpart of the simulator output, radar reflectivity-height histograms (CFADs), is also generated from the ARM observations. This report includes an overview of the ARM cloud radar simulator VAP and the required simulator-oriented ARM radar data product (radarCFAD) for validating simulator output, as well as a user guide for operating the ARM radar simulator VAP.
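
    A small Python sketch of building the reflectivity-height histogram (CFAD) used to compare simulator output with the ARM observations; the synthetic reflectivity profile below is made up.

      import numpy as np

      def radar_cfad(reflectivity_dbz, heights_m, dbz_bins, height_bins):
          """CFAD: for each height bin, the frequency distribution of radar
          reflectivity, normalized so that each height level sums to 1."""
          counts, _, _ = np.histogram2d(heights_m, reflectivity_dbz,
                                        bins=[height_bins, dbz_bins])
          with np.errstate(invalid="ignore", divide="ignore"):
              return counts / counts.sum(axis=1, keepdims=True)

      rng = np.random.default_rng(0)
      heights = rng.uniform(0, 12_000, 10_000)                  # m
      dbz = -30 + heights / 1_000 + rng.normal(0, 5, 10_000)    # made-up profile

      cfad = radar_cfad(dbz, heights, np.arange(-40, 21, 5), np.arange(0, 12_001, 500))
      print(cfad.shape)   # (height bins, reflectivity bins)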

  2. Simulated Engineer Assessment of the Communications Zone Model (SEAC) (Documentation and Users Manual)

    DTIC Science & Technology

    1988-06-01

    became apparent. ESC originally planned to construct a dedicated model, i.e., one specifically designed to address Korea. However, it reconsidered the... model based on object-oriented programming design techniques, and uses the process view of simulation to achieve its purpose.

  3. Microcalorimetric Study of the Aging Reactions of Atomized Magnesium Powder

    DTIC Science & Technology

    1992-02-20

    Figure 1. Cutaway View of the Thermometrics/LKB Model 2277 Microcalorimeter. Figure 2. Schematic Diagram of the Gas Flow Ampoule System. ...approximately 3.5-4.0 grams) were used to simulate degradation in a non-airtight containerized storage environment.

  4. Consistent View of Protein Fluctuations from All-Atom Molecular Dynamics and Coarse-Grained Dynamics with Knowledge-Based Force-Field.

    PubMed

    Jamroz, Michal; Orozco, Modesto; Kolinski, Andrzej; Kmiecik, Sebastian

    2013-01-08

    It is widely recognized that atomistic Molecular Dynamics (MD), a classical simulation method, captures the essential physics of protein dynamics. That idea is supported by a theoretical study showing that various MD force-fields provide a consensus picture of protein fluctuations in aqueous solution [Rueda, M. et al. Proc. Natl. Acad. Sci. U.S.A. 2007, 104, 796-801]. However, atomistic MD cannot be applied to most biologically relevant processes due to its limitation to relatively short time scales. Much longer time scales can be accessed by properly designed coarse-grained models. We demonstrate that the aforementioned consensus view of protein dynamics from short (nanosecond) time scale MD simulations is fairly consistent with the dynamics of the coarse-grained protein model - the CABS model. The CABS model employs stochastic dynamics (a Monte Carlo method) and a knowledge-based force-field, which is not biased toward the native structure of a simulated protein. Since CABS-based dynamics allows for the simulation of entire folding (or multiple folding events) in a single run, integration of the CABS approach with all-atom MD promises a convenient (and computationally feasible) means for the long-time multiscale molecular modeling of protein systems with atomistic resolution.
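
    A small Python sketch of the kind of comparison behind the "consistent view" claim: per-residue fluctuation (RMSF) profiles computed from two trajectories and correlated. The synthetic arrays below merely stand in for aligned MD and CABS trajectories.

      import numpy as np

      def rmsf(trajectory):
          """Per-residue root-mean-square fluctuation from an
          (n_frames, n_residues, 3) array of aligned coordinates."""
          mean_structure = trajectory.mean(axis=0)
          return np.sqrt(((trajectory - mean_structure) ** 2).sum(axis=-1).mean(axis=0))

      rng = np.random.default_rng(42)
      true_flex = np.linspace(0.5, 2.5, 100)                    # per-residue flexibility
      md_traj = rng.normal(0, true_flex[None, :, None], (500, 100, 3))
      cabs_traj = rng.normal(0, 1.1 * true_flex[None, :, None], (300, 100, 3))

      r = np.corrcoef(rmsf(md_traj), rmsf(cabs_traj))[0, 1]
      print(f"correlation of fluctuation profiles: {r:.2f}")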

  5. Space Transfer Concepts and Analyses for Exploration Missions. Technical Directive 12: Beamed Power Systems Study

    NASA Technical Reports Server (NTRS)

    Eder, D.

    1992-01-01

    Parametric models were constructed for Earth-based laser-powered electric orbit transfer from low Earth orbit to geosynchronous orbit. These models were used to carry out performance, cost/benefit, and sensitivity analyses of laser-powered transfer systems, including end-to-end life cycle cost analyses for complete systems. Comparisons with conventional orbit transfer systems were made, indicating large potential cost savings for laser-powered transfer. Approximate optimization was done to determine the best parameter values for the systems. Orbit transfer flight simulations were conducted to explore effects of parameters not practical to model with a spreadsheet. The simulations considered view factors that determine when power can be transferred from ground stations to an orbit transfer vehicle and conducted sensitivity analyses for numbers of ground stations, Isp including dual-Isp transfers, and plane change profiles. Optimal steering laws were used for simultaneous altitude and plane change. Viewing geometry and low-thrust orbit raising were simultaneously simulated. A very preliminary investigation of relay mirrors was made.

  6. Constraining the noise-free distribution of halo spin parameters

    NASA Astrophysics Data System (ADS)

    Benson, Andrew J.

    2017-11-01

    Any measurement made using an N-body simulation is subject to noise due to the finite number of particles used to sample the dark matter distribution function, and the lack of structure below the simulation resolution. This noise can be particularly significant when attempting to measure intrinsically small quantities, such as halo spin. In this work, we develop a model to describe the effects of particle noise on halo spin parameters. This model is calibrated using N-body simulations in which the particle noise can be treated as a Poisson process on the underlying dark matter distribution function, and we demonstrate that this calibrated model reproduces measurements of halo spin parameter error distributions previously measured in N-body convergence studies. Utilizing this model, along with previous measurements of the distribution of halo spin parameters in N-body simulations, we place constraints on the noise-free distribution of halo spins. We find that the noise-free median spin is 3 per cent lower than that measured directly from the N-body simulation, corresponding to a shift of approximately 40 times the statistical uncertainty in this measurement arising purely from halo counting statistics. We also show that measurement of the spin of an individual halo to 10 per cent precision requires at least 4 × 10⁴ particles in the halo; for haloes containing 200 particles, the fractional error on spins measured for individual haloes is of order unity. N-body simulations should be viewed as the results of a statistical experiment applied to a model of dark matter structure formation. When viewed in this way, it is clear that determination of any quantity from such a simulation should be made through forward modelling of the effects of particle noise.
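
    A toy Python illustration of the particle-noise effect described above (not the paper's calibrated model): treating the measured specific angular momentum as the mean of N noisy per-particle vectors shows both the positive bias in its magnitude and the roughly 1/sqrt(N) shrinkage of the scatter as the particle count grows.

      import numpy as np

      def spin_noise_stats(n_particles, n_halos=300, seed=0):
          """Bias and scatter in |mean angular momentum| measured from
          n_particles noisy per-particle vectors (true magnitude = 1)."""
          rng = np.random.default_rng(seed)
          true_j = np.array([0.0, 0.0, 1.0])
          measured = np.empty(n_halos)
          for i in range(n_halos):
              particles = true_j + rng.normal(0.0, 5.0, (n_particles, 3))
              measured[i] = np.linalg.norm(particles.mean(axis=0))
          return measured.mean() - 1.0, measured.std()

      for n in (100, 1_000, 10_000):
          bias, scatter = spin_noise_stats(n)
          print(f"N = {n:>6d}: bias = {bias:+.3f}, scatter = {scatter:.3f}")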

  7. Technology transfer of operator-in-the-loop simulation

    NASA Technical Reports Server (NTRS)

    Yae, K. H.; Lin, H. C.; Lin, T. C.; Frisch, H. P.

    1994-01-01

    The technology developed for operator-in-the-loop simulation in space teleoperation has been applied to Caterpillar's backhoe, wheel loader, and off-highway truck. On an SGI workstation, the simulation integrates computer modeling of kinematics and dynamics, real-time computation and visualization, and an interface with the operator through the operator's console. The console is interfaced with the workstation through an IBM-PC in which the operator's commands are digitized and sent through an RS-232 serial port. The simulation gave visual feedback adequate for the operator in the loop, with the camera's field of vision projected on a large screen in multiple view windows. The view control can emulate either stationary or moving cameras. This simulator created an innovative engineering design environment by integrating computer software and hardware with the human operator's interactions. The backhoe simulation has been adopted by Caterpillar in building a virtual reality tool for backhoe design.

  8. Transient thermal modeling of the nonscanning ERBE detector

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.

    1983-01-01

    A numerical model to predict the transient thermal response of the ERBE nonscanning wide field of view total radiometer channel was developed. The model, which uses Monte Carlo techniques to characterize the radiative component of heat transfer, is described and a listing of the computer program is provided. Application of the model to simulate the actual blackbody calibration procedure is discussed. The use of the model to establish a real time flight data interpretation strategy is recommended. Modification of the model to include a simulated Earth radiation source field and a filter dome is indicated.

  9. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
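
    A minimal Python sketch of the two ingredients named above, Gompertzian tumour growth plus a size-dependent intravasation rate sampled as Poisson events on a daily grid; all parameter values are illustrative, not the fitted clinical values of the paper.

      import numpy as np

      rng = np.random.default_rng(7)

      def gompertz_cells(t_days, n0=1e6, k=1e11, alpha=0.03):
          """Gompertzian tumour size (cell count) at time t in days."""
          return n0 * np.exp(np.log(k / n0) * (1.0 - np.exp(-alpha * t_days)))

      dt, horizon = 1.0, 1500.0
      times = np.arange(0.0, horizon, dt)
      rate_per_day = 1e-9 * gompertz_cells(times) ** 0.66   # illustrative rate law
      events_per_day = rng.poisson(rate_per_day * dt)

      print(f"total simulated intravasation events: {events_per_day.sum()}")
      print(f"tumour size at day {horizon:.0f}: {gompertz_cells(horizon):.2e} cells")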

  10. Simulation studies of wide and medium field of view earth radiation data analysis

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1978-01-01

    A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.
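
    A minimal Python analogue of the estimation step (simple linear least squares); the weighting matrix below is a made-up stand-in for the real shape factors relating zonal fluxes to WFOV measurements.

      import numpy as np

      rng = np.random.default_rng(3)
      n_zones, n_meas = 10, 60
      true_zonal_flux = 240.0 + 60.0 * np.cos(np.linspace(-np.pi / 2, np.pi / 2, n_zones))

      # Hypothetical measurement matrix: each WFOV measurement is a smooth
      # weighted average over neighbouring zones.
      A = np.zeros((n_meas, n_zones))
      for i in range(n_meas):
          center = i / n_meas * (n_zones - 1)
          A[i] = np.exp(-0.5 * ((np.arange(n_zones) - center) / 1.5) ** 2)
      A /= A.sum(axis=1, keepdims=True)

      measurements = A @ true_zonal_flux + rng.normal(0, 2.0, n_meas)
      estimated, *_ = np.linalg.lstsq(A, measurements, rcond=None)
      print("max zonal error (W/m^2):", round(float(np.abs(estimated - true_zonal_flux).max()), 2))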

  11. Spectral Bio-indicator Simulations for Tracking Photosynthetic Activities in a Corn Field

    NASA Technical Reports Server (NTRS)

    Cheng, Yen-Ben; Middleton, Elizabeth M.; Huemmrich, K. Fred; Zhang, Qingyuan; Corp, Lawrence; Campbell, Petya; Kustas, William

    2011-01-01

    Accurate assessment of vegetation canopy optical properties plays a critical role in monitoring natural and managed ecosystems under environmental changes. In this context, radiative transfer (RT) models simulating vegetation canopy reflectance have been demonstrated to be a powerful tool for understanding and estimating spectral bio-indicators. In this study, two narrow band spectroradiometers were utilized to acquire observations over corn canopies for two summers. These in situ spectral data were then used to validate a two-layer Markov chain-based canopy reflectance model for simulating the Photochemical Reflectance Index (PRI), which has been widely used in recent vegetation photosynthetic light use efficiency (LUE) studies. The in situ PRI derived from narrow band hyperspectral reflectance exhibited clear responses to: 1) viewing geometry, which affects the observed light environment; and 2) seasonal variation corresponding to the growth stage. The RT model (ACRM) successfully simulated the responses to the variable viewing geometry. The best simulations were obtained when the model was set to run in the two-layer mode using the sunlit leaves as the upper layer and shaded leaves as the lower layer. Simulated PRI values yielded much better correlations to in situ observations when the cornfield was dominated by green foliage during the early growth, vegetative and reproductive stages (r = 0.78 to 0.86) than in the later senescent stage (r = 0.65). Further sensitivity analyses were conducted to show the important influences of leaf area index (LAI) and the sunlit/shaded ratio on PRI observations.
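
    For reference, the Photochemical Reflectance Index itself is a simple two-band index; a small Python sketch (the reflectance values below are made up):

      import numpy as np

      def pri(r531, r570):
          """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570)."""
          r531, r570 = np.asarray(r531, dtype=float), np.asarray(r570, dtype=float)
          return (r531 - r570) / (r531 + r570)

      # Illustrative canopy reflectances at two viewing geometries
      print(np.round(pri([0.045, 0.052], [0.050, 0.051]), 4))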

  12. Assessment of CMIP5 historical simulations of rainfall over Southeast Asia

    NASA Astrophysics Data System (ADS)

    Raghavan, Srivatsan V.; Liu, Jiandong; Nguyen, Ngoc Son; Vu, Minh Tue; Liong, Shie-Yui

    2018-05-01

    We present preliminary analyses of the historical (1986-2005) climate simulations of a ten-member subset of the Coupled Model Inter-comparison Project Phase 5 (CMIP5) global climate models over Southeast Asia. The objective of this study was to evaluate the general circulation models' performance in simulating the mean state of climate over this less-studied, climate-vulnerable region, with a focus on precipitation. Results indicate that most of the models are unable to reproduce the observed state of climate over Southeast Asia. Though the multi-model ensemble mean is a better representation of the observations, the uncertainties in the individual models are far too high. There is no particular model that performed well in simulating the historical climate of Southeast Asia. There seems to be no significant influence of the spatial resolutions of the models on the quality of simulation, despite the view that higher-resolution models fare better. The study results emphasize careful consideration of models for impact studies and the need to improve the next generation of models in their ability to simulate regional climates better.
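
    A small Python sketch of the kind of evaluation described above: RMSE of each model against observations, compared with the RMSE of the multi-model ensemble mean. The rainfall fields are synthetic placeholders.

      import numpy as np

      rng = np.random.default_rng(11)

      # Synthetic seasonal-mean rainfall (mm/day): one "observed" field and
      # ten model fields with different biases and noise.
      obs = 6.0 + rng.normal(0, 1.0, (20, 30))
      models = np.stack([obs + rng.normal(b, 1.5, obs.shape)
                         for b in rng.normal(0, 1.0, 10)])

      def rmse(field, reference):
          return np.sqrt(np.mean((field - reference) ** 2))

      print("individual-model RMSE:", np.round([rmse(m, obs) for m in models], 2))
      print(f"ensemble-mean RMSE: {rmse(models.mean(axis=0), obs):.2f}")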

  13. Imaging simulation of active EO-camera

    NASA Astrophysics Data System (ADS)

    Pérez, José; Repasi, Endre

    2018-04-01

    A modeling scheme for active imaging through atmospheric turbulence is presented. The model consists of two parts: In the first part, the illumination laser beam is propagated to a target that is described by its reflectance properties, using the well-known split-step Fourier method for wave propagation. In the second part, the reflected intensity distribution imaged on a camera is computed using an empirical model developed for passive imaging through atmospheric turbulence. The split-step Fourier method requires carefully chosen simulation parameters. These simulation requirements together with the need to produce dynamic scenes with a large number of frames led us to implement the model on GPU. Validation of this implementation is shown for two different metrics. This model is well suited for Gated-Viewing applications. Examples of imaging simulation results are presented here.
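
    A minimal Python sketch of one split-step Fourier propagation step of the kind referred to above: Fresnel-propagate the complex field in the frequency domain, then apply a phase screen in the spatial domain. The white-noise screen and beam parameters are crude placeholders, not a Kolmogorov turbulence model.

      import numpy as np

      def split_step(field, wavelength, dx, dz, phase_screen):
          """One split-step: Fresnel propagation over dz, then a phase screen."""
          n = field.shape[0]
          fx = np.fft.fftfreq(n, d=dx)
          fx2 = fx[:, None] ** 2 + fx[None, :] ** 2
          transfer = np.exp(-1j * np.pi * wavelength * dz * fx2)   # Fresnel kernel
          propagated = np.fft.ifft2(np.fft.fft2(field) * transfer)
          return propagated * np.exp(1j * phase_screen)

      n, dx = 256, 2e-3
      x = (np.arange(n) - n / 2) * dx
      r2 = x[:, None] ** 2 + x[None, :] ** 2
      beam = np.exp(-r2 / (2 * 0.05 ** 2)).astype(complex)          # Gaussian beam
      screen = 0.3 * np.random.default_rng(0).standard_normal((n, n))

      out = split_step(beam, wavelength=1.55e-6, dx=dx, dz=200.0, phase_screen=screen)
      print("peak output intensity:", round(float(np.abs(out).max() ** 2), 3))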

  14. A geostationary Earth orbit satellite model using Easy Java Simulation

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Hwee Goh, Giam

    2013-01-01

    We develop an Easy Java Simulation (EJS) model for students to visualize geostationary orbits near Earth, modelled using a Java 3D implementation of the EJS 3D library. The simplified physics model is described and simulated using a simple constant angular velocity equation. We discuss four computer model design ideas: (1) a simple and realistic 3D view and associated learning in the real world; (2) comparative visualization of permanent geostationary satellites; (3) examples of non-geostationary orbits of different rotation senses, periods and planes; and (4) an incorrect physics model for conceptual discourse. General feedback from the students has been relatively positive, and we hope teachers will find the computer model useful in their own classes.
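
    The physics behind the model is compact enough to show directly; a short Python sketch of the geostationary radius from Kepler's third law and the constant-angular-velocity position update (the EJS model itself is Java, so this only mirrors the equations):

      import math

      MU_EARTH = 3.986004418e14      # m^3/s^2
      SIDEREAL_DAY = 86164.1         # s

      # Kepler's third law: r = (mu * T^2 / (4 * pi^2))^(1/3)  ->  ~42,164 km
      r_geo = (MU_EARTH * SIDEREAL_DAY ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
      omega = 2.0 * math.pi / SIDEREAL_DAY    # constant angular velocity

      def satellite_position(t_seconds):
          """Equatorial-plane position for the constant angular velocity model."""
          angle = omega * t_seconds
          return r_geo * math.cos(angle), r_geo * math.sin(angle)

      print(f"geostationary radius: {r_geo / 1e3:.0f} km")
      x, y = satellite_position(6 * 3600)
      print(f"position after 6 h: ({x / 1e3:.0f} km, {y / 1e3:.0f} km)")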

  15. Using LabVIEW for Applying Mathematical Models in Representing Phenomena

    ERIC Educational Resources Information Center

    Faraco, G.; Gabriele, L.

    2007-01-01

    Simulations make it possible to explore physical and biological phenomena, where conducting the real experiment is impracticable or difficult. The implementation of a software program describing and simulating a given physical situation encourages the understanding of a phenomenon itself. Fifty-nine students, enrolled at the Mathematical Methods…

  16. Simulating video-assisted thoracoscopic lobectomy: a virtual reality cognitive task simulation.

    PubMed

    Solomon, Brian; Bizekis, Costas; Dellis, Sophia L; Donington, Jessica S; Oliker, Aaron; Balsam, Leora B; Zervos, Michael; Galloway, Aubrey C; Pass, Harvey; Grossi, Eugene A

    2011-01-01

    Current video-assisted thoracoscopic surgery training models rely on animals or mannequins to teach procedural skills. These approaches lack inherent teaching/testing capability and are limited by cost, anatomic variations, and single use. In response, we hypothesized that video-assisted thoracoscopic surgery right upper lobe resection could be simulated in a virtual reality environment with commercial software. An anatomy explorer (Maya [Autodesk Inc, San Rafael, Calif] models of the chest and hilar structures) and simulation engine were adapted. Design goals included freedom of port placement, incorporation of well-known anatomic variants, teaching and testing modes, haptic feedback for the dissection, ability to perform the anatomic divisions, and a portable platform. Preexisting commercial models did not provide sufficient surgical detail, and extensive modeling modifications were required. Video-assisted thoracoscopic surgery right upper lobe resection simulation is initiated with a random vein and artery variation. The trainee proceeds in a teaching or testing mode. A knowledge database currently includes 13 anatomic identifications and 20 high-yield lung cancer learning points. The "patient" is presented in the left lateral decubitus position. After initial camera port placement, the endoscopic view is displayed and the thoracoscope is manipulated via the haptic device. The thoracoscope port can be relocated; additional ports are placed using an external "operating room" view. Unrestricted endoscopic exploration of the thorax is allowed. An endo-dissector tool allows for hilar dissection, and a virtual stapling device divides structures. The trainee's performance is reported. A virtual reality cognitive task simulation can overcome the deficiencies of existing training models. Performance scoring is being validated as we assess this simulator for cognitive and technical surgical education. Copyright © 2011. Published by Mosby, Inc.

  17. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post processing. Building on the work from the Center For Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific public. The ultimate end goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  18. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of individual and combinational effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help in addressing JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  19. LEM-CF Premixed Tool Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-01-19

    The purpose of LEM-CF Premixed Tool Kit is to process premixed flame simulation data from the LEM-CF solver (https://fileshare.craft-tech.com/clusters/view/lem-cf) into a large-eddy simulation (LES) subgrid model database. These databases may be used with a user-defined-function (UDF) that is included in the Tool Kit. The subgrid model UDF may be used with the ANSYS FLUENT flow solver or other commercial flow solvers.

  20. User interface for ground-water modeling: Arcview extension

    USGS Publications Warehouse

    Tsou, Ming‐shu; Whittemore, Donald O.

    2001-01-01

    Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.

  1. Modeling and Simulation of Ceramic Arrays to Improve Ballistic Performance

    DTIC Science & Technology

    2014-04-30

    Subject terms: Adhesive Layer Effect, .30cal AP M2 Projectile, 7.62x39 PS Projectile, SPH, Aluminum (Al5083). Impacts by the .30cal AP-M2 projectile are modeled using SPH elements in AutoDyn; center-strike model validation runs use SiC tiles (tiles from Supplier, sintered SiC). Smoothed-particle hydrodynamics (SPH) is used for all parts, with an SPH size of 0.4 used initially and a size of 0.2 used to capture...

  2. Volume and tissue composition preserving deformation of breast CT images to simulate breast compression in mammographic imaging

    NASA Astrophysics Data System (ADS)

    Han, Tao; Chen, Lingyun; Lai, Chao-Jen; Liu, Xinming; Shen, Youtao; Zhong, Yuncheng; Ge, Shuaiping; Yi, Ying; Wang, Tianpeng; Shaw, Chris C.

    2009-02-01

    Images of mastectomy breast specimens have been acquired with a bench-top experimental cone beam CT (CBCT) system. The resulting images have been segmented to model an uncompressed breast for simulation of various CBCT techniques. To further simulate conventional or tomosynthesis mammographic imaging for comparison with the CBCT technique, a deformation technique was developed to convert the CT data for an uncompressed breast to a compressed breast without altering the breast volume or regional breast density. With this technique, 3D breast deformation is separated into two 2D deformations in the coronal and axial views. To preserve the total breast volume and regional tissue composition, each 2D deformation step was achieved by altering the square pixels into rectangular ones with the pixel areas unchanged and resampling with the original square pixels using bilinear interpolation. The compression was modeled by first stretching the breast in the superior-inferior direction in the coronal view. The image data were first deformed by distorting the voxels with a uniform distortion ratio. These deformed data were then deformed again using distortion ratios varying with the breast thickness and re-sampled. The deformation procedures were then applied in the axial view to stretch the breast in the chest-wall-to-nipple direction while shrinking it in the medial-to-lateral direction; the data were re-sampled and converted into data for uniform cubic voxels. Threshold segmentation was applied to the final deformed image data to obtain the 3D compressed breast model. Our results show that the original segmented CBCT image data were successfully converted into those for a compressed breast with the same volume and regional density preserved. Using this compressed breast model, conventional and tomosynthesis mammograms were simulated for comparison with CBCT.

  3. Agent-based modeling and systems dynamics model reproduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    North, M. J.; Macal, C. M.

    2009-01-01

    Reproducibility is a pillar of the scientific endeavour. We view computer simulations as laboratories for electronic experimentation and therefore as tools for science. Recent studies have addressed model reproduction and found it to be surprisingly difficult to replicate published findings. There have been enough failed simulation replications to raise the question, 'can computer models be fully replicated?' This paper answers in the affirmative by reporting on a successful reproduction study using Mathematica, Repast and Swarm for the Beer Game supply chain model. The reproduction process was valuable because it demonstrated the original result's robustness across modelling methodologies and implementation environments.

  4. Spectral and spatial variability of undisturbed and disturbed grass under different view and illumination directions

    NASA Astrophysics Data System (ADS)

    Borel-Donohue, Christoph C.; Shivers, Sarah Wells; Conover, Damon

    2017-05-01

    It is well known that disturbed grass-covered surfaces show variability with view and illumination conditions. A good example is a grass field in a soccer stadium that shows stripes indicating in which direction the grass was mowed. These spatial variations are due to a complex interplay of the spectral characteristics of grass blades, their density, length, and orientations. Viewing a grass surface from nadir or from near-horizontal directions results in observing different components. Views from a vertical direction show more variation due to reflections from the randomly oriented grass blades and their shadows. Views from near horizontal show a mixture of reflected and transmitted light from grass blades. An experiment was performed on a mowed grass surface on which paths of simulated heavy foot traffic were laid down in different directions. High-spatial-resolution hyperspectral data cubes were taken by an imaging spectrometer covering the visible through near infrared over a period of several hours. Ground-truth reflectance spectra of undisturbed and disturbed areas were obtained with a handheld spectrometer. Close-range images of selected areas were taken with a handheld camera and used to reconstruct the 3D geometry of the grass using structure-from-motion algorithms. Computer graphics rendering, using ray tracing of the reconstructed and procedurally created grass surfaces, was used to compute BRDF models. In this paper, we discuss differences between the observed and simulated spectral and spatial variability. Based on the measurements and/or simulations, we derive simple spectral index methods to detect spatial disturbances and apply scattering models.
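
    The sketch below illustrates the general form such a spectral index method can take: a normalized difference between a near-infrared and a red band, thresholded to flag disturbed pixels. The band choice, threshold, and synthetic band images are assumptions, not the indices actually derived in the paper.

      # Hedged sketch of a simple normalized-difference index for flagging disturbed grass.
      # The band choice (red vs. near-infrared) and the threshold are illustrative assumptions.
      import numpy as np

      def normalized_difference(nir, red):
          """(NIR - red) / (NIR + red), computed per pixel."""
          nir = nir.astype(float)
          red = red.astype(float)
          return (nir - red) / np.clip(nir + red, 1e-6, None)

      nir_band = np.random.rand(100, 100) * 0.5 + 0.3   # stand-in hyperspectral band images
      red_band = np.random.rand(100, 100) * 0.1 + 0.05

      index = normalized_difference(nir_band, red_band)
      disturbed_mask = index < 0.55                      # assumed threshold for trampled grass
      print("fraction flagged as disturbed:", disturbed_mask.mean())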

  5. Simulation of Satellite, Airborne and Terrestrial LiDAR with DART (I):Waveform Simulation with Quasi-Monte Carlo Ray Tracing

    NASA Technical Reports Server (NTRS)

    Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing

    2016-01-01

    Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.
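
    The sketch below illustrates the generic rejection-sampling idea that underlies selecting a scattering direction from a phase function whose cumulative distribution is not inverted analytically, using a Henyey-Greenstein phase function purely as a stand-in; it is not the DART box-method implementation.

      # Generic rejection ("box") sampling of a scattering angle from a phase function whose
      # CDF is not inverted analytically. Henyey-Greenstein is used only as a stand-in;
      # DART's actual box method and phase functions are not reproduced here.
      import numpy as np

      def hg_phase(cos_theta, g=0.6):
          """Henyey-Greenstein phase function (normalized over cos(theta) in [-1, 1])."""
          return 0.5 * (1.0 - g**2) / (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5

      def sample_cos_theta(g=0.6, rng=np.random.default_rng(0)):
          p_max = hg_phase(1.0, g)              # the phase function peaks in the forward direction
          while True:
              mu = rng.uniform(-1.0, 1.0)       # candidate scattering cosine
              if rng.uniform(0.0, p_max) < hg_phase(mu, g):
                  return mu                     # accept with probability P(mu) / P_max

      samples = [sample_cos_theta() for _ in range(5)]
      print(samples)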

  6. A Composite View of Ozone Evolution in the 1995-96 Northern Winter Polar Vortex Developed from Airborne Lidar and Satellite Observations

    NASA Technical Reports Server (NTRS)

    Douglass, Anne R.; Schoeberl, M. R.; Kawa, S. R.

    2000-01-01

    The processes which contribute to the ozone evolution in the high-latitude lower stratosphere are evaluated using a three-dimensional model simulation and ozone observations. The model uses winds and temperatures from the Goddard Earth Observing System Data Assimilation System. The simulation results are compared with ozone observations from three platforms: the differential absorption lidar (DIAL), which was flown on the NASA DC-8 as part of the Vortex Ozone Transport Experiment; the Microwave Limb Sounder (MLS) on the Upper Atmosphere Research Satellite; and the Polar Ozone and Aerosol Measurement (POAM II) solar occultation instrument, on board the French Satellite Pour l'Observation de la Terre. Comparisons of the different data sets with the model simulation are shown to provide complementary information and a consistent view of the ozone evolution. The model ozone in December and January is shown to be sensitive to the ozone vertical gradient and the model vertical transport, and only weakly sensitive to the model photochemistry. The most consistent comparison between observed and modeled ozone evolution is found for a simulation in which the vertical profiles between 12 and 20 km within the polar vortex closely match the December DIAL observations. Diabatic trajectory calculations are used to estimate quantitatively the uncertainty due to vertical advection. The transport uncertainty is significant and should be accounted for when comparing observations with model ozone. The model ozone evolution during December and January is broadly consistent with the observations when these transport uncertainties are taken into account.

  7. Modeling Early Galaxies Using Radiation Hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This simulation uses a flux-limited diffusion solver to explore the radiation hydrodynamics of early galaxies, in particular the ionizing radiation created by Population III stars. At the time of this rendering, the simulation has evolved to a redshift of 3.5. The simulation volume is 11.2 comoving megaparsecs and has a uniform grid of 1024³ cells, with over 1 billion dark matter and star particles. This animation shows a combined view of the baryon density, dark matter density, radiation energy, and emissivity from this simulation. The multi-variate rendering is particularly useful because it shows both the baryonic ("normal") and dark matter, while the pressure and temperature variables are properties of only the baryonic matter. Visible in the gas density are "bubbles", or shells, created by the radiation feedback from young stars. Seeing the bubbles from feedback provides confirmation of the physics model implemented. Features such as these are difficult to identify algorithmically but easily found when viewing the visualization. The simulation was performed on Kraken at the National Institute for Computational Sciences. Visualization was produced using resources of the Argonne Leadership Computing Facility at Argonne National Laboratory.

  8. AOD trends during 2001-2010 from observations and model simulations

    NASA Astrophysics Data System (ADS)

    Pozzer, A.; de Meij, A.; Yoon, J.; Tost, H.; Georgoulias, A. K.; Astitha, M.

    2015-05-01

    The aerosol optical depth (AOD) trend between 2001 and 2010 is estimated globally and regionally from observations and from results of simulations with the EMAC (ECHAM5/MESSy Atmospheric Chemistry) model. Although interannual variability is applied only to anthropogenic and biomass-burning emissions, the model is able to quantitatively reproduce the AOD trends as observed by the MODIS (Moderate Resolution Imaging Spectroradiometer) satellite sensor, while some discrepancies are found when compared to MISR (Multi-angle Imaging SpectroRadiometer) and SeaWiFS (Sea-viewing Wide Field-of-view Sensor) observations. Thanks to an additional simulation without any change in emissions, it is shown that decreasing AOD trends over the US and Europe are due to the decrease in the emissions, while over the Sahara Desert and the Middle East region, the meteorological changes play a major role. Over Southeast Asia, both meteorology and emissions changes are equally important in defining AOD trends. Additionally, decomposing the regional AOD trends into individual aerosol components reveals that the soluble components are the most dominant contributors to the total AOD, as their influence on the total AOD is enhanced by the aerosol water content.
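
    For readers unfamiliar with how such trends are typically computed, the sketch below fits an ordinary least-squares linear trend to a synthetic monthly regional-mean AOD series; real input would be regional means from the satellite retrievals or from the EMAC output.

      # Sketch of a linear trend estimate for a regional AOD time series.
      # The monthly series below is synthetic, not MODIS/MISR/SeaWiFS or EMAC data.
      import numpy as np

      months = np.arange(120)                                   # 2001-2010, monthly
      aod = 0.25 - 0.0004 * months + 0.02 * np.random.default_rng(1).normal(size=120)

      slope, intercept = np.polyfit(months, aod, deg=1)         # OLS fit: AOD = a*t + b
      print(f"trend: {slope * 120:+.3f} AOD units per decade")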

  9. "Put Myself Into Your Place": Embodied Simulation and Perspective Taking in Autism Spectrum Disorders.

    PubMed

    Conson, Massimiliano; Mazzarella, Elisabetta; Esposito, Dalila; Grossi, Dario; Marino, Nicoletta; Massagli, Angelo; Frolli, Alessandro

    2015-08-01

    Embodied cognition theories hold that cognitive processes are grounded in bodily states. Embodied processes in autism spectrum disorders (ASD) have classically been investigated in studies on imitation. Several observations suggested that, unlike typical individuals who are able to copy the model's actions from the model's position, individuals with ASD tend to reenact the model's actions from their own egocentric perspective. Here, we performed two behavioral experiments to directly test the ability of ASD individuals to adopt another person's point of view. In Experiment 1, participants had to explicitly judge the left/right location of a target object in a scene from their own or the actor's point of view (visual perspective taking task). In Experiment 2, participants had to perform left/right judgments on front-facing or back-facing human body images (own body transformation task). Both tasks can be solved by mentally simulating one's own body motion to imagine oneself transforming into the position of another person (embodied simulation strategy), or by resorting to visual/spatial processes, such as mental object rotation (nonembodied strategy). Results of both experiments showed that individuals with ASD solved the tasks mainly relying on a nonembodied strategy, whereas typical controls adopted an embodied strategy. Moreover, in the visual perspective taking task, ASD participants had more difficulty than controls in inhibiting the other's perspective when directed to keep their own point of view. These findings suggested that, in social cognitive tasks, individuals with ASD do not resort to embodied simulation and have difficulties in cognitive control over self- and other-perspective. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.

  10. The Mixed Instrumental Controller: Using Value of Information to Combine Habitual Choice and Mental Simulation

    PubMed Central

    Pezzulo, Giovanni; Rigoli, Francesco; Chersi, Fabian

    2013-01-01

    Instrumental behavior depends on both goal-directed and habitual mechanisms of choice. Normative views cast these mechanisms in terms of model-free and model-based methods of reinforcement learning, respectively. An influential proposal hypothesizes that model-free and model-based mechanisms coexist and compete in the brain according to their relative uncertainty. In this paper we propose a novel view in which a single Mixed Instrumental Controller produces both goal-directed and habitual behavior by flexibly balancing and combining model-based and model-free computations. The Mixed Instrumental Controller performs a cost-benefit analysis to decide whether to choose an action immediately, based on the available “cached” value of actions (linked to model-free mechanisms), or to improve value estimation by mentally simulating the expected outcome values (linked to model-based mechanisms). Since mental simulation entails cognitive effort and increases the reward delay, it is activated only when the associated “Value of Information” exceeds its costs. The model proposes a method to compute the Value of Information, based on the uncertainty of action values and on the distance between alternative cached action values. Overall, the model by default chooses on the basis of the lighter model-free estimates, and integrates them with costly model-based predictions only when useful. Mental simulation uses a sampling method to produce reward expectancies, which are used to update the cached value of one or more actions; in turn, this updated value is used for the choice. The key predictions of the model are tested in different settings of a double T-maze scenario. Results are discussed in relation with neurobiological evidence on the hippocampus – ventral striatum circuit in rodents, which has been linked to goal-directed spatial navigation. PMID:23459512
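
    The sketch below is a schematic rendering of the gating idea only: a choice is made from cached values unless a crude Value-of-Information proxy exceeds the simulation cost, in which case sampled rollouts refresh the cached values first. The proxy, the update rule, and all numbers are simplified assumptions, not the equations of the paper.

      # Schematic sketch of a VoI-gated choice between cached (model-free) action values and a
      # costly model-based rollout. The VoI heuristic (high uncertainty, nearly tied values)
      # and all numbers are simplified assumptions, not the paper's equations.
      import numpy as np

      rng = np.random.default_rng(2)
      cached_q = np.array([1.0, 1.1])          # cached values for two actions (e.g., T-maze arms)
      cached_sd = np.array([0.6, 0.7])         # uncertainty about each cached value
      simulation_cost = 0.15                   # assumed effort + delay cost of mental simulation

      # Crude VoI proxy: information is worth more when values are uncertain and nearly tied.
      voi = cached_sd.mean() / (abs(cached_q[0] - cached_q[1]) + 1e-6)

      if voi > simulation_cost:
          # "Mental simulation": sample outcomes from an internal model, refresh cached values.
          rollouts = rng.normal(loc=[0.8, 1.4], scale=0.2, size=(20, 2))
          cached_q = 0.5 * cached_q + 0.5 * rollouts.mean(axis=0)

      action = int(np.argmax(cached_q))
      print("chosen action:", action, "updated values:", cached_q)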

  12. The NASA environmental models of Mars

    NASA Technical Reports Server (NTRS)

    Kaplan, D. I.

    1991-01-01

    NASA environmental models are discussed with particular attention given to the Mars Global Reference Atmospheric Model (Mars-GRAM) and the Mars Terrain simulator. The Mars-GRAM model takes into account seasonal, diurnal, and surface topography and dust storm effects upon the atmosphere. It is also capable of simulating appropriate random density perturbations along any trajectory path through the atmosphere. The Mars Terrain Simulator is a software program that builds pseudo-Martian terrains by layering the effects of geological processes upon one another. Output pictures of the constructed surfaces can be viewed from any vantage point under any illumination conditions. Attention is also given to the document 'Environment of Mars, 1988' in which scientific models of the Martian atmosphere and Martian surface are presented.

  13. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  14. Thermal IR exitance model of a plant canopy

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Smith, J. A.; Link, L. E.

    1981-01-01

    A thermal IR exitance model of a plant canopy based on a mathematical abstraction of three horizontal layers of vegetation was developed. Canopy geometry within each layer is quantitatively described by the foliage and branch orientation distributions and number density. Given this geometric information for each layer and the driving meteorological variables, a system of energy budget equations was determined and solved for average layer temperatures. These estimated layer temperatures, together with the angular distributions of radiating elements, were used to calculate the emitted thermal IR radiation as a function of view angle above the canopy. The model was applied to a lodgepole pine (Pinus contorta) canopy over a diurnal cycle. Simulated and measured radiometric average temperatures of the midcanopy layer corresponded within 2 °C. Simulation results suggested that canopy geometry can significantly influence the effective radiant temperature recorded at varying sensor view angles.
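
    A minimal sketch of the directional-exitance idea is given below: the effective radiant temperature for a view angle is a Stefan-Boltzmann-weighted mix of component temperatures, with weights standing in for the angle-dependent probability of seeing each layer or the soil. The weights and temperatures are illustrative, not the lodgepole pine values.

      # Sketch of a view-angle-dependent effective radiant temperature for a layered canopy:
      # a Stefan-Boltzmann-weighted mix of layer and soil temperatures, where the weights stand
      # in for the fraction of each component seen at that view angle. Values are illustrative.
      import numpy as np

      SIGMA = 5.670374419e-8                      # Stefan-Boltzmann constant, W m-2 K-4

      def effective_radiant_temperature(component_T, weights):
          """Blend component temperatures (K) using fractional view weights that sum to 1."""
          w = np.asarray(weights, dtype=float)
          exitance = np.sum(w * SIGMA * np.asarray(component_T, dtype=float) ** 4)
          return (exitance / SIGMA) ** 0.25

      component_T = [293.0, 291.0, 289.0, 287.0]  # upper, middle, lower layers and soil (K)
      nadir_weights = [0.35, 0.25, 0.15, 0.25]    # near-nadir view sees more soil between crowns
      oblique_weights = [0.60, 0.30, 0.08, 0.02]  # oblique view is dominated by the upper layer

      print(effective_radiant_temperature(component_T, nadir_weights))
      print(effective_radiant_temperature(component_T, oblique_weights))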

  15. A semiempirical model for interpreting microwave emission from semiarid land surfaces as seen from space

    NASA Technical Reports Server (NTRS)

    Kerr, Yann H.; Njoku, Eni G.

    1990-01-01

    A radiative-transfer model for simulating microwave brightness temperatures over land surfaces is described. The model takes into account sensor viewing conditions (spacecraft altitude, viewing angle, frequency, and polarization) and atmospheric parameters over a soil surface characterized by its moisture, roughness, and temperature and covered with a layer of vegetation characterized by its temperature, water content, single scattering albedo, structure, and percent coverage. In order to reduce the influence of atmospheric and surface temperature effects, the brightness temperatures are expressed as polarization ratios that depend primarily on the soil moisture and roughness, canopy water content, and percentage of cover. The sensitivity of the polarization ratio to these parameters is investigated. Simulation of the temporal evolution of the microwave signal over semiarid areas in the African Sahel is presented and compared to actual satellite data from the SMMR instrument on Nimbus-7.
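
    Assuming the commonly used definition of the microwave polarization ratio, PR = (TbV - TbH) / (TbV + TbH), the short sketch below shows the computation; the brightness temperatures are illustrative rather than SMMR values.

      # Sketch of the commonly used microwave polarization ratio, which normalizes out much of
      # the surface temperature dependence: PR = (Tb_V - Tb_H) / (Tb_V + Tb_H).
      import numpy as np

      tb_v = np.array([268.0, 262.0, 255.0])    # vertically polarized brightness temperatures (K)
      tb_h = np.array([255.0, 245.0, 232.0])    # horizontally polarized brightness temperatures (K)

      pr = (tb_v - tb_h) / (tb_v + tb_h)
      print(pr)   # a higher ratio generally points to wetter or less vegetated surfaces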

  16. Improved estimation of leaf area index and leaf chlorophyll content of a potato crop using multi-angle spectral data - potential of unmanned aerial vehicle imagery

    NASA Astrophysics Data System (ADS)

    Roosjen, Peter P. J.; Brede, Benjamin; Suomalainen, Juha M.; Bartholomeus, Harm M.; Kooistra, Lammert; Clevers, Jan G. P. W.

    2018-04-01

    In addition to single-angle reflectance data, multi-angular observations can be used as an additional information source for the retrieval of properties of an observed target surface. In this paper, we studied the potential of multi-angular reflectance data for the improvement of leaf area index (LAI) and leaf chlorophyll content (LCC) estimation by numerical inversion of the PROSAIL model. The potential for improvement of LAI and LCC was evaluated for both measured data and simulated data. The measured data was collected on 19 July 2016 by a frame-camera mounted on an unmanned aerial vehicle (UAV) over a potato field, where eight experimental plots of 30 × 30 m were designed with different fertilization levels. Dozens of viewing angles, covering the hemisphere up to around 30° from nadir, were obtained by a large forward and sideways overlap of collected images. Simultaneously to the UAV flight, in situ measurements of LAI and LCC were performed. Inversion of the PROSAIL model was done based on nadir data and based on multi-angular data collected by the UAV. Inversion based on the multi-angular data performed slightly better than inversion based on nadir data, indicated by the decrease in RMSE from 0.70 to 0.65 m2/m2 for the estimation of LAI, and from 17.35 to 17.29 μg/cm2 for the estimation of LCC, when nadir data were used and when multi-angular data were used, respectively. In addition to inversions based on measured data, we simulated several datasets at different multi-angular configurations and compared the accuracy of the inversions of these datasets with the inversion based on data simulated at nadir position. In general, the results based on simulated (synthetic) data indicated that when more viewing angles, more well distributed viewing angles, and viewing angles up to larger zenith angles were available for inversion, the most accurate estimations were obtained. Interestingly, when using spectra simulated at multi-angular sampling configurations as were captured by the UAV platform (view zenith angles up to 30°), already a huge improvement could be obtained when compared to solely using spectra simulated at nadir position. The results of this study show that the estimation of LAI and LCC by numerical inversion of the PROSAIL model can be improved when multi-angular observations are introduced. However, for the potato crop, PROSAIL inversion for measured data only showed moderate accuracy and slight improvements.
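
    A minimal sketch of a look-up-table style inversion is given below: spectra are simulated over a parameter grid and the entry with the lowest RMSE against the multi-angular observation is selected. The forward model here is a toy placeholder standing in for PROSAIL, and the grids, angles, and noise level are assumptions.

      # Sketch of a look-up-table inversion: simulate reflectance for many (LAI, LCC)
      # combinations, then pick the entry whose multi-angular reflectance best matches the
      # observation (lowest RMSE). `forward_model` is a hypothetical stand-in for PROSAIL.
      import numpy as np

      rng = np.random.default_rng(3)
      view_zeniths = np.array([0.0, 10.0, 20.0, 30.0])          # degrees, as sampled by the UAV

      def forward_model(lai, lcc, vza):
          """Toy reflectance model (placeholder): darker with LAI/chlorophyll, angle-dependent."""
          return 0.4 * np.exp(-0.5 * lai) + 0.001 * (80.0 - lcc) + 0.0005 * vza

      lai_grid = np.linspace(0.5, 6.0, 56)
      lcc_grid = np.linspace(10.0, 80.0, 71)
      params = np.array([(lai, lcc) for lai in lai_grid for lcc in lcc_grid])
      lut = np.array([forward_model(lai, lcc, view_zeniths) for lai, lcc in params])

      # "Observed" multi-angular reflectance: synthetic truth plus noise.
      obs = forward_model(3.0, 45.0, view_zeniths) + rng.normal(0, 0.002, size=view_zeniths.size)

      rmse = np.sqrt(np.mean((lut - obs) ** 2, axis=1))
      best_lai, best_lcc = params[np.argmin(rmse)]
      print(f"retrieved LAI={best_lai:.2f}, LCC={best_lcc:.1f}")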

  17. Modeling of the ITER-like wide-angle infrared thermography view of JET.

    PubMed

    Aumeunier, M-H; Firdaouss, M; Travère, J-M; Loarer, T; Gauthier, E; Martin, V; Chabaud, D; Humbert, E

    2012-10-01

    Infrared (IR) thermography systems are mandatory to ensure safe plasma operation in fusion devices. However, IR measurements are made much more complicated in a metallic environment because of the spurious contributions of reflected fluxes. This paper presents a full predictive photonic simulation able to assess accurately the surface temperature measurement with classical IR thermography from a given plasma scenario, taking into account the optical properties of the PFC materials. The simulation has been carried out for the ITER-like wide-angle infrared camera view of JET and compared with experimental data. The consequences and effects of the low emissivity and of the bidirectional reflectivity distribution function used in the model for the metallic PFCs on the contribution of the reflected flux to the analysis are discussed.

  18. Novel characterization of capsule x-ray drive at the National Ignition Facility.

    PubMed

    MacLaren, S A; Schneider, M B; Widmann, K; Hammer, J H; Yoxall, B E; Moody, J D; Bell, P M; Benedetti, L R; Bradley, D K; Edwards, M J; Guymer, T M; Hinkel, D E; Hsing, W W; Kervin, M L; Meezan, N B; Moore, A S; Ralph, J E

    2014-03-14

    Indirect drive experiments at the National Ignition Facility are designed to achieve fusion by imploding a fuel capsule with x rays from a laser-driven hohlraum. Previous experiments have been unable to determine whether a deficit in measured ablator implosion velocity relative to simulations is due to inadequate models of the hohlraum or ablator physics. ViewFactor experiments allow for the first time a direct measure of the x-ray drive from the capsule point of view. The experiments show a 15%-25% deficit relative to simulations and thus explain nearly all of the disagreement with the velocity data. In addition, the data from this open geometry provide much greater constraints on a predictive model of laser-driven hohlraum performance than the nominal ignition target.

  19. Demonstration of a High-Fidelity Predictive/Preview Display Technique for Telerobotic Servicing in Space

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Bejczy, Antal K.

    1993-01-01

    A highly effective predictive/preview display technique for telerobotic servicing in space under several seconds communication time delay has been demonstrated on a large laboratory scale in May 1993, involving the Jet Propulsion Laboratory as the simulated ground control station and, 2500 miles away, the Goddard Space Flight Center as the simulated satellite servicing set-up. The technique is based on a high-fidelity calibration procedure that enables a high-fidelity overlay of 3-D graphics robot arm and object models over given 2-D TV camera images of robot arm and objects. To generate robot arm motions, the operator can confidently interact in real time with the graphics models of the robot arm and objects overlaid on an actual camera view of the remote work site. The technique also enables the operator to generate high-fidelity synthetic TV camera views showing motion events that are hidden in a given TV camera view or for which no TV camera views are available. The positioning accuracy achieved by this technique for a zoomed-in camera setting was about +/-5 mm, well within the allowable +/-12 mm error margin at the insertion of a 45 cm long tool in the servicing task.

  20. Dynamic tracking of prosthetic valve motion and deformation from bi-plane x-ray views: feasibility study

    NASA Astrophysics Data System (ADS)

    Hatt, Charles R.; Wagner, Martin; Raval, Amish N.; Speidel, Michael A.

    2016-03-01

    Transcatheter aortic valve replacement (TAVR) requires navigation and deployment of a prosthetic valve within the aortic annulus under fluoroscopic guidance. To support improved device visualization in this procedure, this study investigates the feasibility of frame-by-frame 3D reconstruction of a moving and expanding prosthetic valve structure from simultaneous bi-plane x-ray views. In the proposed method, a dynamic 3D model of the valve is used in a 2D/3D registration framework to obtain a reconstruction of the valve. For each frame, valve model parameters describing position, orientation, expansion state, and deformation are iteratively adjusted until forward projections of the model match both bi-plane views. Simulated bi-plane imaging of a valve at different signal-difference-to-noise ratio (SDNR) levels was performed to test the approach. 20 image sequences with 50 frames of valve deployment were simulated at each SDNR. The simulation achieved a target registration error (TRE) of the estimated valve model of 0.93 +/- 2.6 mm (mean +/- S.D.) for the lowest SDNR of 2. For higher SDNRs (5 to 50) a TRE of 0.04 mm +/- 0.23 mm was achieved. A tabletop phantom study was then conducted using a TAVR valve. The dynamic 3D model was constructed from high resolution CT scans and a simple expansion model. TRE was 1.22 +/- 0.35 mm for expansion states varying from undeployed to fully deployed, and for moderate amounts of inter-frame motion. Results indicate that it is feasible to use bi-plane imaging to recover the 3D structure of deformable catheter devices.
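
    The sketch below conveys the registration idea in simplified form: pose and expansion parameters of a 3D point model are optimized until its projections match two observed views. The point cloud, the orthogonal "projection" operator, and the optimizer choice are stand-ins for the actual valve model, x-ray geometry, and registration framework.

      # Sketch of the 2D/3D registration idea: adjust pose/expansion parameters of a 3-D model
      # until its forward projections match both bi-plane views. The point model and the two
      # orthogonal projections are toy stand-ins for the real valve model and x-ray geometry.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)
      model_points = rng.normal(size=(50, 3))              # stand-in 3-D valve model (point cloud)

      def transform(points, params):
          tx, ty, tz, scale = params                       # translation + expansion state
          return scale * points + np.array([tx, ty, tz])

      def project(points_3d):
          """Two toy bi-plane views: drop z for view A, drop x for view B."""
          return points_3d[:, :2], points_3d[:, 1:]

      true_params = np.array([5.0, -2.0, 1.0, 1.4])
      view_a_obs, view_b_obs = project(transform(model_points, true_params))

      def cost(params):
          view_a, view_b = project(transform(model_points, params))
          return np.sum((view_a - view_a_obs) ** 2) + np.sum((view_b - view_b_obs) ** 2)

      result = minimize(cost, x0=np.array([0.0, 0.0, 0.0, 1.0]))
      print("recovered parameters:", np.round(result.x, 3))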

  2. Field of View Evaluation for Flight Simulators Used in Spatial Disorientation Training

    DTIC Science & Technology

    2014-01-01

    Naval Medical Research Unit Dayton technical report, "Field of View Evaluation for Flight Simulators Used in Spatial Disorientation Training" (Williams, H. P.), covering the period 30 July 2013 to 30 June 2014. The report evaluates the field of view provided by flight simulator systems used for spatial disorientation (SD) training and discusses implications and recommendations for SD training.

  3. Modelling of countermeasures for AFV protection against IR SACLOS systems

    NASA Astrophysics Data System (ADS)

    Walmsley, R.; Butters, B.; Ayling, R.; Richardson, M.

    2005-11-01

    Countermeasures consisting of obscurants and decoys can be used separately or in combination in attempting to defeat an attack on an Armoured Fighting Vehicle (AFV) by an IR SACLOS missile system. The engagement can occur over a wide range of conditions of wind speed, wind direction, and AFV route relative to the SACLOS firing post, and the countermeasures need to be evaluated over the full set of conditions. Simulation with a man in the loop can be expensive and very time consuming. Without a man in the loop, a fully computer-based simulation can be used to identify the scenarios in which defeat of the SACLOS system may be possible. These instances can then be examined in more detail using the same simulation application or by using the conditions in a more detailed modelling and simulation facility. An IR imaging tracker is used instead of the man in the loop to simulate the SACLOS operator. The missile is guided onto the target either by the clear view of the AFV or by the AFV position predicted by the tracker while the AFV is obscured. The modelled scenarios feature a typical AFV modelled as a 3D object with a nominal 8-12 μm signature. The modelled obscurant munitions are hypothetical but are based on achievable designs using current obscurant material performance and dissemination methods. Some general results and conclusions about the method are presented, with a view to further work and the use of decoys with the obscurant to present a reappearing alternative target.

  4. Sensitivities of simulated satellite views of clouds to subgrid-scale overlap and condensate heterogeneity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hillman, Benjamin R.; Marchand, Roger T.; Ackerman, Thomas P.

    Satellite simulators are often used to account for limitations in satellite retrievals of cloud properties in comparisons between models and satellite observations. The purpose of the simulator framework is to enable more robust evaluation of model cloud properties, so that differences between models and observations can more confidently be attributed to model errors. However, these simulators are subject to uncertainties themselves. A fundamental uncertainty exists in connecting the spatial scales at which cloud properties are retrieved with those at which clouds are simulated in global models. In this study, we create a series of sensitivity tests using 4 km global model output from the Multiscale Modeling Framework to evaluate the sensitivity of simulated satellite retrievals when applied to climate models whose grid spacing is many tens to hundreds of kilometers. In particular, we examine the impact of cloud and precipitation overlap and of condensate spatial variability. We find the simulated retrievals are sensitive to these assumptions. Specifically, using maximum-random overlap with homogeneous cloud and precipitation condensate, which is often used in global climate models, leads to large errors in MISR- and ISCCP-simulated cloud cover and in CloudSat-simulated radar reflectivity. To correct for these errors, an improved treatment of unresolved clouds and precipitation is implemented for use with the simulator framework and is shown to substantially reduce the identified errors.
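
    For reference, the sketch below implements the standard maximum-random overlap rule for total cloud cover from a column of layer cloud fractions (adjacent cloudy layers overlap maximally, layers separated by clear air overlap randomly); the layer fractions are illustrative.

      # Sketch of the standard maximum-random overlap rule for total cloud cover.
      # The layer cloud fractions below are illustrative.
      import numpy as np

      def max_random_total_cover(fractions):
          """Total cloud cover from top-to-bottom layer cloud fractions under max-random overlap."""
          clear = 1.0
          prev = 0.0
          for c in fractions:
              if prev < 1.0:
                  clear *= (1.0 - max(c, prev)) / (1.0 - prev)
              prev = c
          return 1.0 - clear

      layer_fractions = [0.2, 0.3, 0.0, 0.4, 0.4]
      print("max-random total cover:", round(max_random_total_cover(layer_fractions), 3))
      print("random-only cover:", round(1 - np.prod([1 - c for c in layer_fractions]), 3))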

  5. [Tasseled cap triangle (TCT)-leaf area index (LAI) model of rice fields based on PROSAIL model and its application].

    PubMed

    Li, Ya Ni; Lu, Lei; Liu, Yong

    2017-12-01

    The tasseled cap triangle (TCT)-leaf area index (LAI) isoline is a model that reflects the distribution of LAI isolines in the spectral space constituted by the reflectance of the red and near-infrared (NIR) bands, and the LAI retrieval model developed on this basis is more accurate than the commonly used statistical relationship models. This study used ground-based measurements of a rice field, validated the applicability of the PROSAIL model in simulating the canopy reflectance of the rice field, and calibrated the input parameters of the model. The ranges of values of the PROSAIL input parameters for simulating rice canopy reflectance were determined. Based on this, the TCT-LAI isoline model of the rice field was established, and a look-up table (LUT) required for remote sensing retrieval of LAI was developed. The LUT was then applied to Landsat 8 and WorldView-3 data to retrieve the LAI of the rice field. The results showed that the LAI retrieved using the LUT developed from the TCT-LAI isoline model had a good linear relationship with the measured LAI (R² = 0.76, RMSE = 0.47). Compared with the LAI retrieved from Landsat 8, the LAI values retrieved from WorldView-3 varied over a wider range, and the data distribution was more scattered. When the Landsat 8 and WorldView-3 reflectance data were resampled to 1 km to retrieve LAI, the MODIS LAI product was significantly underestimated compared to the retrieved LAI.

  6. Angular radiation temperature simulation for time-dependent capsule drive prediction in inertial confinement fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jing, Longfei; Yang, Dong; Li, Hang

    2015-02-15

    The x-ray drive on a capsule in an inertial confinement fusion setup is crucial for ignition. Unfortunately, a direct measurement has not been possible so far. We propose an angular radiation temperature simulation to predict the time-dependent drive on the capsule. A simple model, based on the view-factor method for the simulation of the radiation temperature, is presented and compared with the experimental data obtained using the OMEGA laser facility and the simulation results acquired with the VISRAD code. We found a good agreement between the time-dependent measurements and the simulation results obtained using this model. The validated model was then used to analyze the experimental results from the Shenguang-III prototype laser facility. More specifically, the variations of the peak radiation temperatures at different view angles with the albedo of the hohlraum, the motion of the laser spots, the closure of the laser entrance holes, and the deviation of the laser power were investigated. Furthermore, the time-dependent radiation temperature at different orientations and the drive history on the capsule were calculated. The results indicate that the radiation temperature from “U20W112” (named according to the diagnostic hole ID on the target chamber) can be used to approximately predict the drive temperature on the capsule. In addition, the influence of the capsule on the peak radiation temperature is also presented.
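
    A minimal sketch of the view-factor idea is given below: the radiation temperature seen from one direction is a view-factor-weighted average of the T^4 fluxes from the visible patches (laser spots, re-emitting wall, laser entrance holes). The patch temperatures and view factors are illustrative, not values from OMEGA or the Shenguang-III prototype.

      # Sketch of the view-factor idea: the radiation temperature seen from a given diagnostic
      # direction is the view-factor-weighted average of fluxes from the visible hohlraum
      # patches. View factors and patch temperatures are illustrative placeholders.
      import numpy as np

      def viewed_radiation_temperature(patch_T, view_factors):
          """Effective radiation temperature for one viewing direction (T^4-weighted average)."""
          f = np.asarray(view_factors, dtype=float)
          f = f / f.sum()                              # view factors to visible patches sum to 1
          return float(np.sum(f * np.asarray(patch_T, dtype=float) ** 4) ** 0.25)

      patch_T = [260.0, 180.0, 20.0]                   # laser spots, re-emitting wall, LEH (arbitrary units)
      print(viewed_radiation_temperature(patch_T, [0.25, 0.65, 0.10]))   # one diagnostic view
      print(viewed_radiation_temperature(patch_T, [0.10, 0.75, 0.15]))   # a different view angle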

  7. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.

  8. A Conceptual View of the Officer Procurement Model (TOPOPS). Technical Report No. 73-73.

    ERIC Educational Resources Information Center

    Akman, Allan; Nordhauser, Fred

    This report presents the conceptual design of a computer-based linear programming model of the Air Force officer procurement system called TOPOPS. The TOPOPS model is an aggregate model which simulates officer accession and training and is directed at optimizing officer procurement in terms of either minimizing cost or maximizing accession quality…

  9. Physiological Based Simulator Fidelity Design Guidance

    NASA Technical Reports Server (NTRS)

    Schnell, Thomas; Hamel, Nancy; Postnikov, Alex; Hoke, Jaclyn; McLean, Angus L. M. Thom, III

    2012-01-01

    The evolution of the role of flight simulation has reinforced assumptions in aviation that the degree of realism in a simulation system directly correlates to the training benefit, i.e., more fidelity is always better. The construct of fidelity has several dimensions, including physical fidelity, functional fidelity, and cognitive fidelity. Interaction of different fidelity dimensions has an impact on trainee immersion, presence, and transfer of training. This paper discusses research results of a recent study that investigated if physiological-based methods could be used to determine the required level of simulator fidelity. Pilots performed a relatively complex flight task consisting of mission task elements of various levels of difficulty in a fixed base flight simulator and a real fighter jet trainer aircraft. Flight runs were performed using one forward visual channel of 40 deg. field of view for the lowest level of fidelity, 120 deg. field of view for the middle level of fidelity, and unrestricted field of view and full dynamic acceleration in the real airplane. Neuro-cognitive and physiological measures were collected under these conditions using the Cognitive Avionics Tool Set (CATS) and nonlinear closed form models for workload prediction were generated based on these data for the various mission task elements. One finding of the work described herein is that simple heart rate is a relatively good predictor of cognitive workload, even for short tasks with dynamic changes in cognitive loading. Additionally, we found that models that used a wide range of physiological and neuro-cognitive measures can further boost the accuracy of the workload prediction.

  10. Molecular Modeling of the Binding Structures in the Interlayer Adsorption of a Tetracycline Antibiotic by Smectite Clays

    NASA Astrophysics Data System (ADS)

    Aristilde, L.

    2009-12-01

    A controlling factor in the fate of antibiotics in the environment is their sequestration in soil particles including clay minerals. Of special interest is the interlayer adsorption by smectite clays, which has been shown to influence both the bioavailability and persistence of antibiotics in the soil environment. However, the interlayer structures of the bound antibiotics, essential to an accurate understanding of the adsorption mechanisms, are not well understood. Molecular simulations of oxytetracycline (OTC) with a model montmorillonite (MONT) clay were performed to gain insights into these structures for tetracycline antibiotics. Monte Carlo simulations were used for explorations of the clay layer spacing required for the adsorption of the antibiotic under different hydration states of the clay interlayer; these preliminary results were validated with previous X-ray diffraction patterns obtained following sorption experiments of OTC with MONT. Molecular dynamics relaxation simulations were performed subsequently in order to obtain geometry-optimized structures of the binding conformations of the intercalated antibiotic in the model MONT layers. This study contributes to a mechanistic understanding of the factors controlling the interlayer adsorption of the tetracycline antibiotics by the expandable smectite clay minerals. Figure 1. Optimized Monte Carlo simulation cell of OTC in the interlayer of MONT: perspective side view (top) and bottom view (bottom).

  11. Exploitation of Digital Surface Models Generated from WORLDVIEW-2 Data for SAR Simulation Techniques

    NASA Astrophysics Data System (ADS)

    Ilehag, R.; Auer, S.; d'Angelo, P.

    2017-05-01

    GeoRaySAR, an automated SAR simulator developed at DLR, identifies buildings in high-resolution SAR data by utilizing geometric knowledge extracted from digital surface models (DSMs). Hitherto, the simulator has utilized DSMs generated from LiDAR data acquired by airborne sensors, with vegetation pre-filtered. Discarding the need for pre-optimized model input, DSMs generated from high-resolution optical data (acquired with WorldView-2) are used for the extraction of building-related SAR image parts in this work. An automatic preprocessing of the DSMs has been developed for separating buildings from elevated vegetation (trees, bushes) and reducing the noise level. Based on that, automated simulations are triggered considering the properties of real SAR images. Locations in three cities, Munich, London and Istanbul, were chosen as study areas to determine advantages and limitations related to WorldView-2 DSMs as input for GeoRaySAR. In addition, the impact of DSM quality on building extraction is evaluated, including an evaluation of a building DSM (a DSM containing only buildings). The results indicate that building extents can be detected from DSMs derived from optical satellite data with varying success, depending on the quality of the DSM as well as on the SAR imaging perspective.

  12. ViSimpl: Multi-View Visual Analysis of Brain Simulation Data

    PubMed Central

    Galindo, Sergio E.; Toharia, Pablo; Robles, Oscar D.; Pastor, Luis

    2016-01-01

    After decades of independent morphological and functional brain research, a key point in neuroscience nowadays is to understand the combined relationships between the structure of the brain and its components and their dynamics on multiple scales, ranging from circuits of neurons at micro or mesoscale to brain regions at macroscale. With such a goal in mind, there is a vast amount of research focusing on modeling and simulating activity within neuronal structures, and these simulations generate large and complex datasets which have to be analyzed in order to gain the desired insight. In such context, this paper presents ViSimpl, which integrates a set of visualization and interaction tools that provide a semantic view of brain data with the aim of improving its analysis procedures. ViSimpl provides 3D particle-based rendering that allows visualizing simulation data with their associated spatial and temporal information, enhancing the knowledge extraction process. It also provides abstract representations of the time-varying magnitudes supporting different data aggregation and disaggregation operations and giving also focus and context clues. In addition, ViSimpl tools provide synchronized playback control of the simulation being analyzed. Finally, ViSimpl allows performing selection and filtering operations relying on an application called NeuroScheme. All these views are loosely coupled and can be used independently, but they can also work together as linked views, both in centralized and distributed computing environments, enhancing the data exploration and analysis procedures. PMID:27774062

  14. Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres

    NASA Technical Reports Server (NTRS)

    McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.

    1999-01-01

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.

  15. Modeling the performance of direct-detection Doppler lidar systems including cloud and solar background variability.

    PubMed

    McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D

    1999-10-20

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.

  16. Magnetic biosensors: Modelling and simulation.

    PubMed

    Nabaei, Vahid; Chandrawati, Rona; Heidari, Hadi

    2018-04-30

    In the past few years, magnetoelectronics has emerged as a promising new platform technology in various biosensors for the detection, identification, localisation and manipulation of a wide spectrum of biological, physical and chemical agents. The methods are based on the exposure of the magnetic field of a magnetically labelled biomolecule interacting with a complementary biomolecule bound to a magnetic field sensor. This Review presents various schemes of magnetic biosensor techniques from both the simulation and modelling and the analytical and numerical analysis points of view, and the performance variations under magnetic fields at steady and nonstationary states. This is followed by magnetic sensor modelling and simulations using advanced multiphysics modelling software (e.g., the Finite Element Method, FEM) and in-house tools. Furthermore, the outlook and future directions of modelling and simulations of magnetic biosensors in different technologies and materials are critically discussed. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  17. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  18. Reliability of analog quantum simulation

    DOE PAGES

    Sarovar, Mohan; Zhang, Jun; Zeng, Lishan

    2017-01-03

    Analog quantum simulators (AQS) will likely be the first nontrivial application of quantum technology for predictive simulation. However, there remain questions regarding the degree of confidence that can be placed in the results of AQS since they do not naturally incorporate error correction. Specifically, how do we know whether an analog simulation of a quantum model will produce predictions that agree with the ideal model in the presence of inevitable imperfections? At the same time there is a widely held expectation that certain quantum simulation questions will be robust to errors and perturbations in the underlying hardware. Resolving these two points of view is a critical step in making the most of this promising technology. In this paper we formalize the notion of AQS reliability by determining sensitivity of AQS outputs to underlying parameters, and formulate conditions for robust simulation. Our approach naturally reveals the importance of model symmetries in dictating the robust properties. Finally, to demonstrate the approach, we characterize the robust features of a variety of quantum many-body models.

  19. Conceiving processes in atmospheric models-General equations, subscale parameterizations, and 'superparameterizations'

    NASA Astrophysics Data System (ADS)

    Gramelsberger, Gabriele

    The scientific understanding of atmospheric processes has been rooted in the mechanical and physical view of nature ever since dynamic meteorology gained ground in the late 19th century. Conceiving the atmosphere as a giant 'air mass circulation engine' entails applying hydro- and thermodynamical theory to the subject in order to describe the atmosphere's behaviour on small scales. But when it comes to forecasting, it turns out that this view is far too complex to be computed. The limitation of analytical methods precludes an exact solution, forcing scientists to make use of numerical simulation. However, simulation introduces two prerequisites to meteorology: First, the partitioning of the theoretical view into two parts-the large-scale behaviour of the atmosphere, and the effects of smaller-scale processes on this large-scale behaviour, so-called parametrizations; and second, the dependency on computational power in order to achieve a higher resolution. The history of today's atmospheric circulation modelling can be reconstructed as the attempt to improve the handling of these basic constraints. It can be further seen as the old schism between theory and application under new circumstances, which triggers a new discussion about the question of how processes may be conceived in atmospheric modelling.

  20. The Identification and Modeling of Visual Cue Usage in Manual Control Task Experiments

    NASA Technical Reports Server (NTRS)

    Sweet, Barbara Townsend; Trejo, Leonard J. (Technical Monitor)

    1999-01-01

    Many fields of endeavor require humans to conduct manual control tasks while viewing a perspective scene. Manual control refers to tasks in which continuous, or nearly continuous, control adjustments are required. Examples include flying an aircraft, driving a car, and riding a bicycle. Perspective scenes can arise through natural viewing of the world, simulation of a scene (as in flight simulators), or through imaging devices (such as the cameras on an unmanned aerospace vehicle). Designers frequently have some degree of control over the content and characteristics of a perspective scene; airport designers can choose runway markings, vehicle designers can influence the size and shape of windows, as well as the location of the pilot, and simulator database designers can choose scene complexity and content. Little theoretical framework exists to help designers determine the answers to questions related to perspective scene content. An empirical approach is most commonly used to determine optimum perspective scene configurations. The goal of the research effort described in this dissertation has been to provide a tool for modeling the characteristics of human operators conducting manual control tasks with perspective-scene viewing. This is done for the purpose of providing an algorithmic, as opposed to empirical, method for analyzing the effects of changing perspective scene content for closed-loop manual control tasks.

  1. Variation of linear and circular polarization persistence for changing field of view and collection area in a forward scattering environment

    NASA Astrophysics Data System (ADS)

    van der Laan, John D.; Wright, Jeremy B.; Scrymgeour, David A.; Kemme, Shanalyn A.; Dereniak, Eustace L.

    2016-05-01

    We present experimental and simulation results for a laboratory-based forward-scattering environment, where 1 μm diameter polystyrene spheres are suspended in water to model the optical scattering properties of fog. Circular polarization maintains its degree of polarization better than linear polarization as the optical thickness of the scattering environment increases. Both simulation and experiment quantify circular polarization's superior persistence, compared to that of linear polarization, and show that it is much less affected by variations in the field of view and collection area of the optical system. Our experimental environment's lateral extent was physically finite, causing a significant difference between measured and simulated degree of polarization values for incident linearly polarized light, but not for circularly polarized light. Through simulation we demonstrate that circular polarization is less susceptible to the finite environmental extent as well as the collection optic's limiting configuration.
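
    The degree-of-polarization quantities compared in this record follow directly from the Stokes vector, so a worked example may help connect the measurement to the simulation output. A minimal sketch is given below; the Stokes values are illustrative placeholders, not data from the study.

```python
import numpy as np

def degree_of_polarization(stokes):
    """Total, linear, and circular degree of polarization from a Stokes vector [I, Q, U, V]."""
    I, Q, U, V = stokes
    dop = np.sqrt(Q**2 + U**2 + V**2) / I   # total degree of polarization
    dolp = np.sqrt(Q**2 + U**2) / I         # linear part
    docp = abs(V) / I                       # circular part
    return dop, dolp, docp

# Illustrative Stokes vector for partially depolarized, mostly circularly polarized light
print(degree_of_polarization([1.0, 0.05, 0.02, 0.60]))
```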

  2. Use of simulated satellite radiances from a mesoscale numerical model to understand kinematic and dynamic processes

    NASA Technical Reports Server (NTRS)

    Kalb, Michael; Robertson, Franklin; Jedlovec, Gary; Perkey, Donald

    1987-01-01

    Techniques by which mesoscale numerical weather prediction model output and radiative transfer codes are combined to simulate the radiance fields that a given passive temperature/moisture satellite sensor would see if viewing the evolving model atmosphere are introduced. The goals are to diagnose the dynamical atmospheric processes responsible for recurring patterns in observed satellite radiance fields, and to develop techniques to anticipate the ability of satellite sensor systems to depict atmospheric structures and provide information useful for numerical weather prediction (NWP). The concept of linking radiative transfer and dynamical NWP codes is demonstrated with time sequences of simulated radiance imagery in the 24 TIROS vertical sounder channels derived from model integrations for March 6, 1982.
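
    The central idea of this record, mapping a model atmosphere to the radiance a sounder channel would see, can be illustrated with a weighting-function average of the model temperature profile. The sketch below is a deliberately simplified brightness-temperature calculation; the profile, the Gaussian weighting function, and the omission of surface and moisture terms are all assumptions for illustration, not the TIROS channel physics used in the study.

```python
import numpy as np

# Illustrative model temperature profile on pressure levels (hPa -> K)
p = np.array([1000.0, 850.0, 700.0, 500.0, 300.0, 200.0, 100.0])
T = np.array([288.0, 280.0, 272.0, 255.0, 230.0, 218.0, 210.0])

# Hypothetical channel weighting function peaking near 500 hPa, normalized to unit sum
W = np.exp(-0.5 * ((p - 500.0) / 150.0) ** 2)
W /= W.sum()

# Simulated channel brightness temperature as the weighted mean of the profile
Tb = np.sum(W * T)
print(f"simulated brightness temperature: {Tb:.1f} K")
```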

  3. NATO Human View Architecture and Human Networks

    NASA Technical Reports Server (NTRS)

    Handley, Holly A. H.; Houston, Nancy P.

    2010-01-01

    The NATO Human View is a system architectural viewpoint that focuses on the human as part of a system. Its purpose is to capture the human requirements and to inform on how the human impacts the system design. The viewpoint contains seven static models that include different aspects of the human element, such as roles, tasks, constraints, training and metrics. It also includes a Human Dynamics component to perform simulations of the human system under design. One of the static models, termed Human Networks, focuses on the human-to-human communication patterns that occur as a result of ad hoc or deliberate team formation, especially teams distributed across space and time. Parameters of human teams that affect system performance can be captured in this model. Human centered aspects of networks, such as differences in operational tempo (sense of urgency), priorities (common goal), and team history (knowledge of the other team members), can be incorporated. The information captured in the Human Network static model can then be included in the Human Dynamics component so that the impact of distributed teams is represented in the simulation. As the NATO militaries transform to a more networked force, the Human View architecture is an important tool that can be used to make recommendations on the proper mix of technological innovations and human interactions.

  4. Cognitive Dissonance Reduction as Constraint Satisfaction.

    ERIC Educational Resources Information Center

    Shultz, Thomas R.; Lepper, Mark R.

    1996-01-01

    It is argued that the reduction of cognitive dissonance can be viewed as a constraint satisfaction problem, and a computational model of the process of consonance seeking is proposed. Simulations from this model matched psychological findings from the insufficient justification and free-choice paradigms of cognitive dissonance theory. (SLD)

  5. The subtle business of model reduction for stochastic chemical kinetics

    NASA Astrophysics Data System (ADS)

    Gillespie, Dan T.; Cao, Yang; Sanft, Kevin R.; Petzold, Linda R.

    2009-02-01

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
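
    As a point of reference for the reduction being analyzed, the unreduced reaction set S1⇌S2→S3 can be simulated exactly with the Gillespie direct method. The sketch below shows that baseline; the rate constants, initial populations, and end time are illustrative choices, not values from the paper.

```python
import numpy as np

def ssa_s1_s2_s3(c1=1.0, c2=2.0, c3=0.1, x0=(100, 0, 0), t_end=50.0, seed=0):
    """Gillespie direct method for S1 <-> S2 -> S3 with rate constants c1 (forward), c2 (reverse), c3."""
    rng = np.random.default_rng(seed)
    x1, x2, x3 = x0
    t, history = 0.0, [(0.0, x1, x2, x3)]
    while t < t_end:
        a = np.array([c1 * x1, c2 * x2, c3 * x2])   # propensities of the three channels
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)              # time to the next reaction event
        r = rng.choice(3, p=a / a0)                 # which channel fires
        if r == 0:
            x1, x2 = x1 - 1, x2 + 1                 # S1 -> S2
        elif r == 1:
            x1, x2 = x1 + 1, x2 - 1                 # S2 -> S1
        else:
            x2, x3 = x2 - 1, x3 + 1                 # S2 -> S3
        history.append((t, x1, x2, x3))
    return history

print(ssa_s1_s2_s3()[-1])   # final time and populations of one realization
```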

  6. The subtle business of model reduction for stochastic chemical kinetics.

    PubMed

    Gillespie, Dan T; Cao, Yang; Sanft, Kevin R; Petzold, Linda R

    2009-02-14

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S(1)<=>S(2)-->S(3), whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S(3)-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.

  7. Plan View Pattern Control for Steel Plates through Constrained Locally Weighted Regression

    NASA Astrophysics Data System (ADS)

    Shigemori, Hiroyasu; Nambu, Koji; Nagao, Ryo; Araki, Tadashi; Mizushima, Narihito; Kano, Manabu; Hasebe, Shinji

    A technique for performing parameter identification in a locally weighted regression model using foresight information on the physical properties of the object of interest as constraints was proposed. This method was applied to plan view pattern control of steel plates, and a reduction of shape nonconformity (crop) at the plate head end was confirmed by computer simulation based on real operation data.

  8. Images of turbulent, absorbing-emitting atmospheres and their application to windshear detection

    NASA Astrophysics Data System (ADS)

    Watt, David W.; Philbrick, Daniel A.

    1991-03-01

    The simulation of images generated by thermally radiating, optically thick turbulent media is discussed and the time-dependent evolution of these images is modeled. The characteristics of these images are particularly applicable to the atmosphere in the 13-15 μm band, and their behavior may have application in detecting aviation hazards. The image is generated by volumetric thermal emission by atmospheric constituents within the field-of-view of the detector. The structure of the turbulent temperature field and the attenuating properties of the atmosphere interact with the field-of-view's geometry to produce a localized region which dominates the optical flow of the image. The simulations discussed in this paper model the time-dependent behavior of images generated by atmospheric flows viewed from an airborne platform. The images are modelled by (1) generating a random field of temperature fluctuations having the proper spatial structure, (2) adding these fluctuations to the baseline temperature field of the atmospheric event, (3) accumulating the image on the detector from radiation emitted in the imaging volume, (4) allowing the individual radiating points within the imaging volume to move with the local velocity, and (5) recalculating the thermal field and generating a new image. This approach was used to simulate the images generated by the temperature and velocity fields of a windshear. The simulation generated pairs of images separated by a small time interval. These image pairs were analyzed by image cross-correlation. The displacement of the cross-correlation peak was used to infer the velocity at the localized region. The localized region was found to depend weakly on the shape of the velocity profile. Prediction of the localized region, the effects of imaging from a moving platform, alternative image analysis schemes, and possible applications to aviation hazards are discussed.
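
    The velocity-inference step in this record, locating the displacement of the cross-correlation peak between an image pair, can be sketched with an FFT-based correlation. The synthetic images below stand in for the simulated radiance image pairs; they are not the windshear fields from the study.

```python
import numpy as np

def peak_displacement(img_a, img_b):
    """Integer-pixel shift that maps img_b onto img_a, from the cross-correlation peak."""
    A = np.fft.fft2(img_a - img_a.mean())
    B = np.fft.fft2(img_b - img_b.mean())
    xcorr = np.fft.ifft2(A * np.conj(B)).real
    dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    ny, nx = img_a.shape
    if dy > ny // 2:           # wrap large positive indices to negative shifts
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return dy, dx

rng = np.random.default_rng(1)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, -5), axis=(0, 1))   # apply a known displacement
print(peak_displacement(b, a))               # expect (3, -5)
```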

  9. Repercussion of geometric and dynamic constraints on the 3D rendering quality in structurally adaptive multi-view shooting systems

    NASA Astrophysics Data System (ADS)

    Ali-Bey, Mohamed; Moughamir, Saïd; Manamanni, Noureddine

    2011-12-01

    In this paper a simulator of a multi-view shooting system with parallel optical axes and structurally variable configuration is proposed. The considered system is dedicated to the production of 3D contents for auto-stereoscopic visualization. The global shooting/viewing geometrical process, which is the kernel of this shooting system, is detailed and the different viewing, transformation and capture parameters are then defined. An appropriate perspective projection model is afterward derived to work out a simulator. At first, the latter is used to validate the global geometrical process in the case of a static configuration. Next, the simulator is used to show the limitations of a static configuration of this shooting system type by considering the case of dynamic scenes, and then a dynamic scheme is achieved to allow a correct capture of this kind of scenes. After that, the effect of the different geometrical capture parameters on the 3D rendering quality and the necessity or not of their adaptation is studied. Finally, some dynamic effects and their repercussions on the 3D rendering quality of dynamic scenes are analyzed using error images and some image quantization tools. Simulation and experimental results are presented throughout this paper to illustrate the different studied points. Some conclusions and perspectives end the paper.
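
    The shooting/viewing chain described here rests on a pinhole perspective projection evaluated once per viewpoint of the parallel-axis rig. A minimal sketch follows; the focal length, baseline, and number of viewpoints are illustrative parameters, not the authors' configuration.

```python
def project(point_3d, cam_x, focal=0.05):
    """Pinhole projection of a scene point for a camera at (cam_x, 0, 0) whose optical axis is +Z."""
    X, Y, Z = point_3d
    x = focal * (X - cam_x) / Z   # horizontal image coordinate (metres on the sensor)
    y = focal * Y / Z             # vertical image coordinate
    return x, y

baseline = 0.065                  # illustrative inter-camera spacing (m)
point = (0.2, 0.1, 2.0)           # scene point 2 m in front of the rig
for i in range(4):                # four parallel-axis viewpoints
    print(i, project(point, cam_x=i * baseline))
```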

  10. THE IMPORTANCE OF WIDE-FIELD FOREGROUND REMOVAL FOR 21 cm COSMOLOGY: A DEMONSTRATION WITH EARLY MWA EPOCH OF REIONIZATION OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pober, J. C.; Hazelton, B. J.; Beardsley, A. P.

    2016-03-01

    In this paper we present observations, simulations, and analysis demonstrating the direct connection between the location of foreground emission on the sky and its location in cosmological power spectra from interferometric redshifted 21 cm experiments. We begin with a heuristic formalism for understanding the mapping of sky coordinates into the cylindrically averaged power spectra measurements used by 21 cm experiments, with a focus on the effects of the instrument beam response and the associated sidelobes. We then demonstrate this mapping by analyzing power spectra with both simulated and observed data from the Murchison Widefield Array. We find that removing a foreground model that includes sources in both the main field of view and the first sidelobes reduces the contamination in high k∥ modes by several per cent relative to a model that only includes sources in the main field of view, with the completeness of the foreground model setting the principal limitation on the amount of power removed. While small, a percent-level amount of foreground power is in itself more than enough to prevent recovery of any Epoch of Reionization signal from these modes. This result demonstrates that foreground subtraction for redshifted 21 cm experiments is truly a wide-field problem, and algorithms and simulations must extend beyond the instrument's main field of view to potentially recover the full 21 cm power spectrum.

  11. Analysis procedures and subjective flight results of a simulator validation and cue fidelity experiment

    NASA Technical Reports Server (NTRS)

    Carr, Peter C.; Mckissick, Burnell T.

    1988-01-01

    A joint experiment to investigate simulator validation and cue fidelity was conducted by the Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and NASA Langley Research Center. The primary objective was to validate the use of a closed-loop pilot-vehicle mathematical model as an analytical tool for optimizing the tradeoff between simulator fidelity requirements and simulator cost. The validation process includes comparing model predictions with simulation and flight test results to evaluate various hypotheses for differences in motion and visual cues and information transfer. A group of five pilots flew air-to-air tracking maneuvers in the Langley differential maneuvering simulator and visual motion simulator and in an F-14 aircraft at Ames-Dryden. The simulators used motion and visual cueing devices including a g-seat, a helmet loader, wide field-of-view horizon, and a motion base platform.

  12. Air freight demand models: An overview

    NASA Technical Reports Server (NTRS)

    Dajani, J. S.; Bernstein, G. W.

    1978-01-01

    A survey is presented of some of the approaches which have been considered in freight demand estimation. The few existing continuous time computer simulations of aviation systems are reviewed, with a view toward the assessment of this approach as a tool for structuring air freight studies and for relating the different components of the air freight system. The variety of available data types and sources, without which the calibration, validation and testing of both modal split and simulation models would be impossible, are also reviewed.

  13. Modified Light Use Efficiency Model for Assessment of Carbon Sequestration in Grasslands of Kazakhstan: Combining Ground Biomass Data and Remote-sensing

    NASA Technical Reports Server (NTRS)

    Propastin, Pavel A.; Kappas, Martin W.; Herrmann, Stefanie M.; Tucker, Compton J.

    2012-01-01

    A modified light use efficiency (LUE) model was tested in the grasslands of central Kazakhstan in terms of its ability to characterize spatial patterns and interannual dynamics of net primary production (NPP) at a regional scale. In this model, the LUE of the grassland biome (εn) was simulated from ground-based NPP measurements, absorbed photosynthetically active radiation (APAR) and meteorological observations using a new empirical approach. Using coarse-resolution satellite data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), monthly NPP was calculated from 1998 to 2008 over a large grassland region in Kazakhstan. The modelling results were verified against scaled up plot-level observations of grassland biomass and another available NPP data set derived from a field study in a similar grassland biome. The results indicated the reliability of productivity estimates produced by the model for regional monitoring of grassland NPP. The method for simulation of εn suggested in this study can be used in grassland regions where no carbon flux measurements are accessible.
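
    The backbone of any LUE model is the product of light use efficiency and absorbed radiation accumulated over time. The sketch below shows that relationship in its simplest monthly form; the efficiency value and the APAR series are placeholders, not the SeaWiFS-derived quantities used in the study.

```python
import numpy as np

# Illustrative monthly APAR over a growing season (MJ m-2 month-1)
apar = np.array([20.0, 45.0, 80.0, 110.0, 95.0, 60.0, 25.0])

# Hypothetical light use efficiency of the grassland biome (g C per MJ APAR)
epsilon_n = 0.45

npp_monthly = epsilon_n * apar                      # g C m-2 month-1
print("seasonal NPP:", npp_monthly.sum(), "g C m-2")
```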

  14. Simulating Wake Vortex Detection with the Sensivu Doppler Wind Lidar Simulator

    NASA Technical Reports Server (NTRS)

    Ramsey, Dan; Nguyen, Chi

    2014-01-01

    In support of NASA's Atmospheric Environment Safety Technologies NRA research topic on Wake Vortex Hazard Investigation, Aerospace Innovations (AI) investigated a set of techniques for detecting wake vortex hazards from arbitrary viewing angles, including axial perspectives. This technical report describes an approach to this problem and presents results from its implementation in a virtual lidar simulator developed at AI. Three-dimensional data volumes from NASA's Terminal Area Simulation System (TASS) containing strong turbulent vortices were used as the atmospheric domain for these studies, in addition to an analytical vortex model in 3-D space. By incorporating a third-party radiative transfer code (BACKSCAT 4), user-defined aerosol layers can be incorporated into atmospheric models, simulating attenuation and backscatter in different environmental conditions and altitudes. A hazard detection algorithm is described that uses a two-component spectral model to identify vortex signatures observable from arbitrary angles.

  15. Interaction in Spoken Word Recognition Models: Feedback Helps.

    PubMed

    Magnuson, James S; Mirman, Daniel; Luthra, Sahil; Strauss, Ted; Harris, Harlan D

    2018-01-01

    Human perception, cognition, and action requires fast integration of bottom-up signals with top-down knowledge and context. A key theoretical perspective in cognitive science is the interactive activation hypothesis: forward and backward flow in bidirectionally connected neural networks allows humans and other biological systems to approximate optimal integration of bottom-up and top-down information under real-world constraints. An alternative view is that online feedback is neither necessary nor helpful; purely feed forward alternatives can be constructed for any feedback system, and online feedback could not improve processing and would preclude veridical perception. In the domain of spoken word recognition, the latter view was apparently supported by simulations using the interactive activation model, TRACE, with and without feedback: as many words were recognized more quickly without feedback as were recognized faster with feedback. However, these simulations used only a small set of words and did not address a primary motivation for interaction: making a model robust in noise. We conducted simulations using hundreds of words, and found that the majority were recognized more quickly with feedback than without. More importantly, as we added noise to inputs, accuracy and recognition times were better with feedback than without. We follow these simulations with a critical review of recent arguments that online feedback in interactive activation models like TRACE is distinct from other potentially helpful forms of feedback. We conclude that in addition to providing the benefits demonstrated in our simulations, online feedback provides a plausible means of implementing putatively distinct forms of feedback, supporting the interactive activation hypothesis.

  16. Interaction in Spoken Word Recognition Models: Feedback Helps

    PubMed Central

    Magnuson, James S.; Mirman, Daniel; Luthra, Sahil; Strauss, Ted; Harris, Harlan D.

    2018-01-01

    Human perception, cognition, and action requires fast integration of bottom-up signals with top-down knowledge and context. A key theoretical perspective in cognitive science is the interactive activation hypothesis: forward and backward flow in bidirectionally connected neural networks allows humans and other biological systems to approximate optimal integration of bottom-up and top-down information under real-world constraints. An alternative view is that online feedback is neither necessary nor helpful; purely feed forward alternatives can be constructed for any feedback system, and online feedback could not improve processing and would preclude veridical perception. In the domain of spoken word recognition, the latter view was apparently supported by simulations using the interactive activation model, TRACE, with and without feedback: as many words were recognized more quickly without feedback as were recognized faster with feedback. However, these simulations used only a small set of words and did not address a primary motivation for interaction: making a model robust in noise. We conducted simulations using hundreds of words, and found that the majority were recognized more quickly with feedback than without. More importantly, as we added noise to inputs, accuracy and recognition times were better with feedback than without. We follow these simulations with a critical review of recent arguments that online feedback in interactive activation models like TRACE is distinct from other potentially helpful forms of feedback. We conclude that in addition to providing the benefits demonstrated in our simulations, online feedback provides a plausible means of implementing putatively distinct forms of feedback, supporting the interactive activation hypothesis. PMID:29666593

  17. Simulating the Birth of Massive Star Clusters: Is Destruction Inevitable?

    NASA Astrophysics Data System (ADS)

    Rosen, Anna

    2013-10-01

    Very early in its operation, the Hubble Space Telescope (HST) opened an entirely new frontier: study of the demographics and properties of star clusters far beyond the Milky Way. However, interpretation of HST's observations has proven difficult, and has led to the development of two conflicting models. One view is that most massive star clusters are disrupted during their infancy by feedback from newly formed stars (i.e., "infant mortality"), independent of cluster mass or environment. The other model is that most star clusters survive their infancy and are disrupted later by mass-dependent dynamical processes. Since observations at present have failed to discriminate between these views, we propose a theoretical investigation to provide new insight. We will perform radiation-hydrodynamic simulations of the formation of massive star clusters, including for the first time a realistic treatment of the most important stellar feedback processes. These simulations will elucidate the physics of stellar feedback, and allow us to determine whether cluster disruption is mass-dependent or -independent. We will also use our simulations to search for observational diagnostics that can distinguish bound from unbound clusters, and to predict how cluster disruption affects the cluster luminosity function in a variety of galactic environments.

  18. Radiometry simulation within the end-to-end simulation tool SENSOR

    NASA Astrophysics Data System (ADS)

    Wiest, Lorenz; Boerner, Anko

    2001-02-01

    An end-to-end simulation is a valuable tool for sensor system design, development, optimization, testing, and calibration. This contribution describes the radiometry module of the end-to-end simulation tool SENSOR. It features MODTRAN 4.0-based look-up tables in conjunction with a cache-based multilinear interpolation algorithm to speed up radiometry calculations. It employs a linear reflectance parameterization to reduce look-up table size, considers effects due to the topology of a digital elevation model (surface slope, sky view factor) and uses a reflectance class feature map to assign Lambertian and BRDF reflectance properties to the digital elevation model. The overall consistency of the radiometry part is demonstrated by good agreement between ATCOR 4-retrieved reflectance spectra of a simulated digital image cube and the original reflectance spectra used to simulate this image data cube.
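
    The look-up-table strategy described in this record amounts to multilinear interpolation over a gridded radiative-transfer table, with caching of repeated queries. The sketch below uses a synthetic table; the grid axes, the analytic fill values, and the lru_cache memoization are assumptions for illustration, not the SENSOR implementation.

```python
import numpy as np
from functools import lru_cache
from scipy.interpolate import RegularGridInterpolator

# Synthetic 3-D look-up table: at-sensor radiance vs. (visibility, water vapour, reflectance)
vis = np.linspace(5.0, 50.0, 10)      # km
wv = np.linspace(0.5, 4.0, 8)         # g cm-2
refl = np.linspace(0.0, 1.0, 11)      # linear reflectance axis
table = vis[:, None, None] * 0.01 + wv[None, :, None] * 0.1 + refl[None, None, :] * 2.0

interp = RegularGridInterpolator((vis, wv, refl), table)  # multilinear by default

@lru_cache(maxsize=4096)
def radiance(v, w, r):
    """Cached multilinear interpolation of the radiance table."""
    return float(interp([v, w, r])[0])

print(radiance(23.0, 2.1, 0.3))
```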

  19. Application of Microsoft's ActiveX and DirectX technologies to the visualization of physical system dynamics

    NASA Astrophysics Data System (ADS)

    Mann, Christopher; Narasimhamurthi, Natarajan

    1998-08-01

    This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarovar, Mohan; Zhang, Jun; Zeng, Lishan

    Analog quantum simulators (AQS) will likely be the first nontrivial application of quantum technology for predictive simulation. However, there remain questions regarding the degree of confidence that can be placed in the results of AQS since they do not naturally incorporate error correction. Specifically, how do we know whether an analog simulation of a quantum model will produce predictions that agree with the ideal model in the presence of inevitable imperfections? At the same time there is a widely held expectation that certain quantum simulation questions will be robust to errors and perturbations in the underlying hardware. Resolving these two points of view is a critical step in making the most of this promising technology. In this paper we formalize the notion of AQS reliability by determining sensitivity of AQS outputs to underlying parameters, and formulate conditions for robust simulation. Our approach naturally reveals the importance of model symmetries in dictating the robust properties. Finally, to demonstrate the approach, we characterize the robust features of a variety of quantum many-body models.

  1. Modeling of information diffusion in Twitter-like social networks under information overload.

    PubMed

    Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin

    2014-01-01

    Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a given type user is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of theoretical analysis results, we conduct simulations and provide the simulation results, which are consistent with the theoretical analysis results perfectly. These results are of importance to understand the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations.
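
    A toy forward simulation of the view-scope idea, counting how often a tracked message lands in followers' bounded timelines as users generate and forward content, might look like the sketch below. The network size, view-scope length, and behaviour probabilities are made-up parameters, and the sketch is not the analytical model of the paper.

```python
import random

random.seed(0)
N_USERS, VIEW_SCOPE, STEPS = 200, 10, 200
followers = {u: random.sample(range(N_USERS), k=random.randint(1, 20)) for u in range(N_USERS)}
feeds = {u: [] for u in range(N_USERS)}   # newest-first timelines truncated to the view scope

def deliver(author, msg):
    """Push msg into each follower's view scope; under overload the oldest items drop out."""
    hits = 0
    for f in followers[author]:
        feeds[f].insert(0, msg)
        del feeds[f][VIEW_SCOPE:]
        hits += (msg == "tracked")
    return hits

appearances = deliver(0, "tracked")       # user 0 generates the tracked message
for step in range(STEPS):
    u = random.randrange(N_USERS)
    if feeds[u] and random.random() < 0.5:                  # forwarding behaviour
        appearances += deliver(u, random.choice(feeds[u]))
    else:                                                   # generating behaviour (background traffic)
        deliver(u, f"noise-{step}")

print("times the tracked message appeared in view scopes:", appearances)
```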

  2. Modeling of Information Diffusion in Twitter-Like Social Networks under Information Overload

    PubMed Central

    Li, Wei

    2014-01-01

    Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a given type user is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of theoretical analysis results, we conduct simulations and provide the simulation results, which are consistent with the theoretical analysis results perfectly. These results are of importance to understand the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations. PMID:24795541

  3. Modeling of power electronic systems with EMTP

    NASA Technical Reports Server (NTRS)

    Tam, Kwa-Sur; Dravid, Narayan V.

    1989-01-01

    In view of the potential impact of power electronics on power systems, there is need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components as well as to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System is described along with ElectroMagnetic Transients Program (EMTP) and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.

  4. Hierarchical lattice models of hydrogen-bond networks in water

    NASA Astrophysics Data System (ADS)

    Dandekar, Rahul; Hassanali, Ali A.

    2018-06-01

    We develop a graph-based model of the hydrogen-bond network in water, with a view toward quantitatively modeling the molecular-level correlational structure of the network. The networks formed are studied by constructing the model on two infinite-dimensional lattices. Our models are built bottom up, based on microscopic information coming from atomistic simulations, and we show that the predictions of the model are consistent with known results from ab initio simulations of liquid water. We show that simple entropic models can predict the correlations and clustering of local-coordination defects around tetrahedral waters observed in the atomistic simulations. We also find that orientational correlations between bonds are longer ranged than density correlations, determine the directional correlations within closed loops, and show that the patterns of water wires within these structures are also consistent with previous atomistic simulations. Our models show the existence of density and compressibility anomalies, as seen in the real liquid, and the phase diagram of these models is consistent with the singularity-free scenario previously proposed by Sastry and coworkers [Phys. Rev. E 53, 6144 (1996), 10.1103/PhysRevE.53.6144].

  5. Simulation of 100-300 GHz solid-state harmonic sources

    NASA Technical Reports Server (NTRS)

    Zybura, Michael F.; Jones, J. Robert; Jones, Stephen H.; Tait, Gregory B.

    1995-01-01

    Accurate and efficient simulations of the large-signal time-dependent characteristics of second-harmonic Transferred Electron Oscillators (TEO's) and Heterostructure Barrier Varactor (HBV) frequency triplers have been obtained. This is accomplished by using a novel and efficient harmonic-balance circuit analysis technique which facilitates the integration of physics-based hydrodynamic device simulators. The integrated hydrodynamic device/harmonic-balance circuit simulators allow TEO and HBV circuits to be co-designed from both a device and a circuit point of view. Comparisons have been made with published experimental data for both TEO's and HBV's. For TEO's, excellent correlation has been obtained at 140 GHz and 188 GHz in second-harmonic operation. Excellent correlation has also been obtained for HBV frequency triplers operating near 200 GHz. For HBV's, both a lumped quasi-static equivalent circuit model and the hydrodynamic device simulator have been linked to the harmonic-balance circuit simulator. This comparison illustrates the importance of representing active devices with physics-based numerical device models rather than analytical device models.

  6. One-dimensional collision carts computer model and its design ideas for productive experiential learning

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang

    2012-05-01

    We develop an Easy Java Simulation (EJS) model for students to experience the physics of idealized one-dimensional collision carts. The physics model is described and simulated by both continuous dynamics and discrete transition during collision. In designing the simulations, we discuss briefly three pedagogical considerations namely (1) a consistent simulation world view with a pen and paper representation, (2) a data table, scientific graphs and symbolic mathematical representations for ease of data collection and multiple representational visualizations and (3) a game for simple concept testing that can further support learning. We also suggest using a physical world setup augmented by simulation by highlighting three advantages of real collision carts equipment such as a tacit 3D experience, random errors in measurement and the conceptual significance of conservation of momentum applied to just before and after collision. General feedback from the students has been relatively positive, and we hope teachers will find the simulation useful in their own classes.
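
    The discrete collision transition mentioned here reduces to conservation of momentum plus a coefficient of restitution. A minimal sketch follows; the masses, velocities, and restitution values are illustrative, and the EJS model itself is not reproduced.

```python
def collide(m1, u1, m2, u2, e=1.0):
    """Post-collision velocities of two carts; e=1 is perfectly elastic, e=0 perfectly inelastic."""
    p = m1 * u1 + m2 * u2                      # total momentum, conserved in both cases
    v1 = (p + m2 * e * (u2 - u1)) / (m1 + m2)
    v2 = (p + m1 * e * (u1 - u2)) / (m1 + m2)
    return v1, v2

print(collide(1.0, 2.0, 1.0, 0.0, e=1.0))   # equal masses, elastic: velocities exchange -> (0.0, 2.0)
print(collide(1.0, 2.0, 1.0, 0.0, e=0.0))   # perfectly inelastic: carts move together -> (1.0, 1.0)
```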

  7. Simulating fail-stop in asynchronous distributed systems

    NASA Technical Reports Server (NTRS)

    Sabel, Laura; Marzullo, Keith

    1994-01-01

    The fail-stop failure model appears frequently in the distributed systems literature. However, in an asynchronous distributed system, the fail-stop model cannot be implemented. In particular, it is impossible to reliably detect crash failures in an asynchronous system. In this paper, we show that it is possible to specify and implement a failure model that is indistinguishable from the fail-stop model from the point of view of any process within an asynchronous system. We give necessary conditions for a failure model to be indistinguishable from the fail-stop model, and derive lower bounds on the amount of process replication needed to implement such a failure model. We present a simple one-round protocol for implementing one such failure model, which we call simulated fail-stop.

  8. Distributed Observer Network (DON), Version 3.0, User's Guide

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    The Distributed Observer Network (DON) is a data presentation tool developed by the National Aeronautics and Space Administration (NASA) to distribute and publish simulation results. Leveraging the display capabilities inherent in modern gaming technology, DON places users in a fully navigable 3-D environment containing graphical models and allows the users to observe how those models evolve and interact over time in a given scenario. Each scenario is driven with data that has been generated by authoritative NASA simulation tools and exported in accordance with a published data interface specification. This decoupling of the data from the source tool enables DON to faithfully display a simulator's results and ensure that every simulation stakeholder will view the exact same information every time.

  9. Some Results of Weak Anticipative Concept Applied in Simulation Based Decision Support in Enterprise

    NASA Astrophysics Data System (ADS)

    Kljajić, Miroljub; Kofjač, Davorin; Kljajić Borštnar, Mirjana; Škraba, Andrej

    2010-11-01

    The simulation models are used for decision support and learning in enterprises and in schools. Three cases of successful applications demonstrate the usefulness of weak anticipative information. Job shop scheduling with a makespan criterion presents a real case of customized flexible furniture production optimization; the genetic algorithm for job shop scheduling optimization is presented. Simulation-based inventory control describes inventory optimization for products with stochastic lead time and demand; dynamic programming and fuzzy control algorithms reduce the total cost without producing stock-outs in most cases. The value of decision-making information based on simulation is also discussed. All cases are discussed from the optimization, modeling and learning points of view.

  10. Two-Dimensional Computational Fluid Dynamics and Conduction Simulations of Heat Transfer in Horizontal Window Frames with Internal Cavities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustavsen, Arlid; Kohler, Christian; Dalehaug, Arvid

    2008-12-01

    This paper assesses the accuracy of the simplified frame cavity conduction/convection and radiation models presented in ISO 15099 and used in software for rating and labeling window products. Temperatures and U-factors for typical horizontal window frames with internal cavities are compared; results from Computational Fluid Dynamics (CFD) simulations with detailed radiation modeling are used as a reference. Four different frames were studied. Two were made of polyvinyl chloride (PVC) and two of aluminum. For each frame, six different simulations were performed, two with a CFD code and four with a building-component thermal-simulation tool using the Finite Element Method (FEM). This FEM tool addresses convection using correlations from ISO 15099; it addressed radiation with either correlations from ISO 15099 or with a detailed, view-factor-based radiation model. Calculations were performed using the CFD code with and without fluid flow in the window frame cavities; the calculations without fluid flow were performed to verify that the CFD code and the building-component thermal-simulation tool produced consistent results. With the FEM code, the practice of subdividing small frame cavities was examined, in some cases not subdividing, in some cases subdividing cavities with interconnections smaller than five millimeters (mm) (ISO 15099) and in some cases subdividing cavities with interconnections smaller than seven mm (a breakpoint that has been suggested in other studies). For the various frames, the calculated U-factors were found to be quite comparable (the maximum difference between the reference CFD simulation and the other simulations was found to be 13.2 percent). A maximum difference of 8.5 percent was found between the CFD simulation and the FEM simulation using ISO 15099 procedures. The ISO 15099 correlation works best for frames with high U-factors. For more efficient frames, the relative differences among various simulations are larger. Temperature was also compared, at selected locations on the frames. Small differences were found in the results from model to model. Finally, the effectiveness of the ISO cavity radiation algorithms was examined by comparing results from these algorithms to detailed radiation calculations (from both programs). Our results suggest that improvements in cavity heat transfer calculations can be obtained by using detailed radiation modeling (i.e. view-factor or ray-tracing models), and that incorporation of these strategies may be more important for improving the accuracy of results than the use of CFD modeling for horizontal cavities.
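
    The gap between correlation-based and detailed (view-factor) radiation treatments can be appreciated from the standard two-surface gray-body exchange formula. The sketch below evaluates it for two large parallel surfaces bounding a cavity; the temperatures and emissivities are illustrative, not the ISO 15099 frame cases studied here.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m-2 K-4

def parallel_plate_radiation(t_hot, t_cold, eps_hot, eps_cold):
    """Net radiative flux (W m-2) between two large parallel gray surfaces (view factor of 1)."""
    return SIGMA * (t_hot**4 - t_cold**4) / (1.0 / eps_hot + 1.0 / eps_cold - 1.0)

# Illustrative frame-cavity surface temperatures (K) and emissivities
print(parallel_plate_radiation(t_hot=285.0, t_cold=275.0, eps_hot=0.9, eps_cold=0.9))
```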

  11. Signs of Asymmetry in Exploding Stars

    NASA Astrophysics Data System (ADS)

    Hensley, Kerry

    2018-03-01

    Supernova explosions enrich the interstellar medium and can even briefly outshine their host galaxies. However, the mechanism behind these massive explosions still isn't fully understood. Could probing the asymmetry of supernova remnants help us better understand what drives these explosions? [Hubble image of the remnant of supernova 1987A, one of the first remnants discovered to be asymmetrical. ESA/Hubble, NASA] Stellar Send-Offs: High-mass stars end their lives spectacularly. Each supernova explosion churns the interstellar medium and unleashes high-energy radiation and swarms of neutrinos. Supernovae also suffuse the surrounding interstellar medium with heavy elements that are incorporated into later generations of stars and the planets that form around them. The bubbles of expanding gas these explosions leave behind often appear roughly spherical, but mounting evidence suggests that many supernova remnants are asymmetrical. While asymmetry in supernova remnants can arise when the expanding material plows into the non-uniform interstellar medium, it can also be an intrinsic feature of the explosion itself. [Simulation results, clockwise from top left: mass density, calcium mass fraction, oxygen mass fraction, nickel-56 mass fraction. Adapted from Wollaeger et al. 2017] Coding Explosions: The presence or absence of asymmetry in a supernova remnant can hold clues as to what drove the explosion. But how can we best observe asymmetry in a supernova remnant? Modeling lets us explore different observational approaches. A team of scientists led by Ryan T. Wollaeger (Los Alamos National Laboratory) used radiative transfer and radiative hydrodynamics simulations to model the explosion of a core-collapse supernova. Wollaeger and collaborators introduced asymmetry into the explosion by creating a single-lobed, fast-moving outflow along one axis. Their simulations showed that while some chemical elements lingered near the origin of the explosion or were distributed evenly throughout the remnant, calcium was isolated to the asymmetrical region, hinting that spectral lines of calcium may be good tracers of asymmetry. [Bolometric (top) and gamma-ray (bottom) synthetic light curves for the authors' model for a range of simulated viewing angles. Adapted from Wollaeger et al. 2017] Synthesizing Spectra: Wollaeger and collaborators then generated synthetic light curves and spectra from their models to determine which spectral features or characteristics indicated the presence of the asymmetric outflow lobe. They found that when an asymmetric outflow lobe is present, the peak luminosity of the explosion depends on the angle at which you view it; the highest luminosity occurs when the lobe is viewed from the side, while the lowest luminosity, nearly 40% dimmer, is seen when the explosion is viewed down the barrel of the lobe. The dense outflow shades the central radioactive source from view, lowering the luminosity. This effect also plays out in the gamma-ray light curves; when viewed down the barrel, the shading of the central source by a high-density lobe slows the rise of the gamma-ray luminosity and changes the shape of the light curve compared to views from other vantage points. Another promising avenue for exploring asymmetry is a near-infrared band encompassing an emission line of singly-ionized calcium near 815 nm. Since calcium is confined within the outflow lobe in the simulation, its emission lines are blueshifted when the lobe points toward the observer. The authors point out that there is much more to be done in their models, such as including the effects of shock heating of circumstellar material, which can contribute strongly to the light curve, but these simulations bring us a step closer to understanding the nature of asymmetrical supernova remnants and the explosions that create them. Citation: Ryan T. Wollaeger et al 2017 ApJ 845 168. doi:10.3847/1538-4357/aa82bd

  12. Ultrafast impact dynamics of reactive materials (Dlott)

    DTIC Science & Technology

    2013-04-16

    Kalia, A. Nakano, B. E. Hohman, and K. L. McNesby, Multimillion atom reactive simulations of nanostructured energetic materials, J. Propul. Power 23...34Materials for Energy Applications - Experiment, Modeling and Simulations ", Mar. 2011, Los Angeles, CA. 7. (invited) Studium Conference on in situ...intermetallics. 7,20-24 The dynamics of conventional reactive materials containing micron to millimeter particles are usually viewed within a

  13. Mathematical Models in Educational Planning. Education and Development, Technical Reports.

    ERIC Educational Resources Information Center

    Organisation for Economic Cooperation and Development, Paris (France).

    This volume contains papers, presented at a 1966 OECD meeting, on the possibilities of applying a number of related techniques such as mathematical model building, simulation, and systematic control theory to the problems of educational planning. The authors and their papers are (1) Richard Stone, "A View of the Conference," (2) Hector…

  14. Mechanical Modeling and Computer Simulation of Protein Folding

    ERIC Educational Resources Information Center

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  15. Reconciling Simulated and Observed Views of Clouds: MODIS, ISCCP, and the Limits of Instrument Simulators in Climate Models

    NASA Technical Reports Server (NTRS)

    Pincus, Robert; Platnick, Steven E.; Ackerman, Steve; Hemler, Richard; Hofmann, Patrick

    2011-01-01

    The properties of clouds that may be observed by satellite instruments, such as optical depth and cloud top pressure, are only loosely related to the way clouds are represented in models of the atmosphere. One way to bridge this gap is through "instrument simulators," diagnostic tools that map the model representation to synthetic observations so that differences between simulator output and observations can be interpreted unambiguously as model error. But simulators may themselves be restricted by limited information available from the host model or by internal assumptions. This work examines the extent to which instrument simulators are able to capture essential differences between MODIS and ISCCP, two similar but independent estimates of cloud properties. We find that the stark differences between MODIS and ISCCP observations of total cloudiness and the distribution of cloud optical thickness can be traced to different approaches to marginal pixels, which MODIS excludes and ISCCP treats as homogeneous. These pixels, which likely contain broken clouds, cover about 15% of the planet and contain almost all of the optically thinnest clouds observed by either instrument. Instrument simulators cannot reproduce these differences because the host model does not consider unresolved spatial scales and so cannot produce broken pixels. Nonetheless, MODIS and ISCCP observations are consistent for all but the optically thinnest clouds, and models can be robustly evaluated using instrument simulators by excluding ambiguous observations.
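
    The simulator output being compared in this record is essentially a joint histogram of cloud-top pressure and optical thickness accumulated over pixels or subcolumns. The sketch below builds such a diagnostic from synthetic retrievals; the bin edges and the random sample are illustrative, not the operational ISCCP or MODIS products.

```python
import numpy as np

# Illustrative ISCCP-style bin edges
tau_edges = np.array([0.0, 0.3, 1.3, 3.6, 9.4, 23.0, 60.0, 380.0])              # optical thickness
ctp_edges = np.array([50.0, 180.0, 310.0, 440.0, 560.0, 680.0, 800.0, 1000.0])  # cloud-top pressure (hPa)

# Synthetic per-pixel (or per-subcolumn) cloud retrievals
rng = np.random.default_rng(2)
tau = rng.lognormal(mean=1.0, sigma=1.0, size=10_000)
ctp = rng.uniform(60.0, 1000.0, size=10_000)

counts, _, _ = np.histogram2d(ctp, tau, bins=[ctp_edges, tau_edges])
cloud_fraction = counts / tau.size          # fraction of pixels in each (ctp, tau) class
print(cloud_fraction.round(3))
```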

  16. A practical laboratory study simulating the percutaneous lumbar transforaminal epidural injection: training model in fresh cadaveric sheep spine.

    PubMed

    Suslu, Husnu

    2012-01-01

    Laboratory training models are essential for developing and refining treatment skills before the clinical application of surgical and invasive procedures. A simple simulation model is needed for young trainees to learn how to handle instruments, and to perform safe lumbar transforaminal epidural injections. Our aim is to present a model of a fresh cadaveric sheep lumbar spine that simulates the lumbar transforaminal epidural injection. The material consists of a 2-year-old fresh cadaveric sheep spine. A 4-step approach was designed for lumbar transforaminal epidural injection under C-arm fluoroscopy. For the lumbar transforaminal epidural injection, the fluoroscope was adjusted to get a proper oblique view while the material was stabilized in a prone position. The procedure then begins under C-arm guidance. The model simulates well the steps of standard lumbar transforaminal epidural injections in the human spine. The cadaveric sheep spine represents a good method for training and it simulates fluoroscopic lumbar transforaminal epidural steroid injection procedures performed in the human spine.

  17. Evaluation of collimation and imaging configuration in scintimammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsui, B.M.W.; Frey, E.C.; Wessell, D.E.

    1996-12-31

    Conventional scintimammography (SM) with 99mTc sestamibi has been limited to taking a single lateral view of the breast using a parallel-hole high resolution (LEHR) collimator. The collimator is placed close to the breast for best possible spatial resolution. However, the collimator geometry precludes imaging the breast from other views. We evaluated using a pinhole collimator instead of a LEHR collimator in SM for improved spatial resolution and detection efficiency, and to allow additional imaging views. Results from theoretical calculations indicated that pinhole collimators could be designed with higher spatial resolution and detection efficiency than LEHR when imaging small to medium size breasts. The geometrical shape of the pinhole collimator allows imaging of the breasts from both the lateral and craniocaudal views. The dual-view images allow better determination of the location of the tumors within the breast and improved detection of tumors located in the medial region of the breast. A breast model that simulates the shape and composition of the breast and breast tumors with different sizes and locations was added to an existing 3D mathematical cardiac-torso (MCAT) phantom. A cylindrically shaped phantom with 10 cm diameter and spherical inserts with different sizes and 99mTc sestamibi uptakes with respect to the background provides a physical model of a breast with tumors. Simulation studies using the breast and MCAT phantoms and experimental studies using the cylindrical phantom confirmed the utility of the pinhole collimator in SM for improved breast tumor detection.
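
    The resolution/efficiency trade-off discussed here is usually estimated from standard geometric collimator formulas. The sketch below applies common textbook approximations; the functional forms are hedged approximations and the dimensions are illustrative, not the authors' collimator designs.

```python
def pinhole(d_eff, b, l):
    """Approximate geometric resolution and on-axis point-source efficiency of a pinhole aperture.
    d_eff: effective aperture diameter, b: source-to-pinhole distance, l: pinhole-to-detector distance."""
    resolution = d_eff * (l + b) / l
    efficiency = d_eff**2 / (16.0 * b**2)       # on-axis approximation
    return resolution, efficiency

def parallel_hole(d, t, l_eff, b, k=0.26):
    """Approximate geometric resolution and efficiency of a parallel-hole collimator.
    d: hole diameter, t: septal thickness, l_eff: effective hole length, b: source-to-collimator distance."""
    resolution = d * (l_eff + b) / l_eff
    efficiency = (k * d * d / (l_eff * (d + t))) ** 2
    return resolution, efficiency

print("pinhole (cm, rel.):      ", pinhole(d_eff=0.4, b=5.0, l=25.0))
print("parallel-hole (cm, rel.):", parallel_hole(d=0.15, t=0.02, l_eff=2.4, b=5.0))
```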

  18. Evaluation of the HadGEM3-A simulations in view of detection and attribution of human influence on extreme events in Europe

    NASA Astrophysics Data System (ADS)

    Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal

    2018-04-01

    A detailed analysis is carried out to assess the HadGEM3-A global atmospheric model skill in simulating extreme temperatures, precipitation and storm surges in Europe in view of their attribution to human influence. The analysis is performed based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperature of the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended to be used in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value theory parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North-Atlantic atmospheric weather regimes as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extreme events. However, biases are found in many physical processes. Slightly excessive drying may be the cause of an overestimated summer interannual variability and too intense heat waves, especially in central/northern Europe. However, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as well as the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltics region. There, the simulated weather conditions appear not to lead to strong enough storm surges, but winds were found to be in very good agreement with reanalyses. The performance in reproducing atmospheric weather patterns indicates that biases mainly originate from local and regional physical processes. This makes local bias adjustment meaningful for climate change attribution.
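
    The generalized extreme value comparison mentioned in this record can be reproduced in outline by fitting block maxima. The sketch below fits a GEV to synthetic annual temperature maxima; the data are random placeholders, not HadGEM3-A output, and scipy's shape parameter is the negative of the climate-convention shape.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
annual_tmax = 30.0 + rng.gumbel(loc=0.0, scale=1.5, size=54)   # synthetic 54-year block maxima (deg C)

shape, loc, scale = stats.genextreme.fit(annual_tmax)           # maximum-likelihood GEV fit
print(f"shape={shape:.2f} location={loc:.2f} scale={scale:.2f}")

# 20-year return level: the value exceeded with probability 1/20 in any given year
print("20-year return level:", stats.genextreme.ppf(1.0 - 1.0 / 20.0, shape, loc, scale))
```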

  19. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    DTIC Science & Technology

    2010-05-01

    ... a modeling, simulation, and instrumentation (MS&I) environment. This methodology uses the DoDAF product set to document operational and systems ... engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views used to

  20. Dual exposure view of exterior and interior of Apollo Mission simulator

    NASA Image and Video Library

    1967-08-01

    S67-50585 (1967) --- This is an intentional double exposure showing the Apollo Mission Simulator in the Mission Simulation and Training Facility, Building 5 at the Manned Spacecraft Center. In the exterior view astronauts William A. Anders, Michael Collins, and Frank Borman (reading from top of stairs) are about to enter the simulator. The interior view shows the three astronauts in the simulator. They are (left to right) Borman, Collins, and Anders. Photo credit: NASA

  1. Simulation of Mercury's magnetosheath with a combined hybrid-paraboloid model

    NASA Astrophysics Data System (ADS)

    Parunakian, David; Dyadechkin, Sergey; Alexeev, Igor; Belenkaya, Elena; Khodachenko, Maxim; Kallio, Esa; Alho, Markku

    2017-08-01

    In this paper we introduce a novel approach for modeling planetary magnetospheres that involves a combination of the hybrid model and the paraboloid magnetosphere model (PMM); we further refer to it as the combined hybrid model. While both of these individual models have been successfully applied in the past, their combination enables us both to overcome the traditional difficulties of hybrid models to develop a self-consistent magnetic field and to compensate the lack of plasma simulation in the PMM. We then use this combined model to simulate Mercury's magnetosphere and investigate the geometry and configuration of Mercury's magnetosheath controlled by various conditions in the interplanetary medium. The developed approach provides a unique comprehensive view of Mercury's magnetospheric environment for the first time. Using this setup, we compare the locations of the bow shock and the magnetopause as determined by simulations with the locations predicted by stand-alone PMM runs and also verify the magnetic and dynamic pressure balance at the magnetopause. We also compare the results produced by these simulations with observational data obtained by the magnetometer on board the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft along a dusk-dawn orbit and discuss the signatures of the magnetospheric features that appear in these simulations. Overall, our analysis suggests that combining the semiempirical PMM with a self-consistent global kinetic model creates new modeling possibilities which individual models cannot provide on their own.

  2. A scene model of exosolar systems for use in planetary detection and characterisation simulations

    NASA Astrophysics Data System (ADS)

    Belu, A.; Thiébaut, E.; Ollivier, M.; Lagache, G.; Selsis, F.; Vakili, F.

    2007-12-01

    Context: Instrumental projects that will improve the direct optical finding and characterisation of exoplanets have advanced sufficiently to trigger organized investigation and development of corresponding signal processing algorithms. The first step is the availability of field-of-view (FOV) models. These can then be submitted to various instrumental models, which in turn produce simulated data, enabling the testing of processing algorithms. Aims: We aim to set the specifications of a physical model for typical FOVs of these instruments. Methods: The dynamic in resolution and flux between the various sources present in such a FOV imposes a multiscale, independent layer approach. From review of current literature and through extrapolations from currently available data and models, we derive the features of each source-type in the field of view likely to pass the instrumental filter at exo-Earth level. Results: Stellar limb darkening is shown to cause bias in leakage calibration if unaccounted for. Occurrence of perturbing background stars or galaxies in the typical FOV is unlikely. We extract galactic interstellar medium background emissions for current target lists. Galactic background can be considered uniform over the FOV, and it should show no significant drift with parallax. Our model specifications have been embedded into a Java simulator, soon to be made open-source. We have also designed an associated FITS input/output format standard that we present here. Work supported in part by the ESA/ESTEC contract 18701/04/NL/HB, led by Thales Alenia Space.
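
    The layered field-of-view idea plus a FITS exchange format can be sketched by writing each source layer to its own image extension with astropy. The layer names, pixel grid, and header keyword below are illustrative choices, not the standard defined by the project.

```python
import numpy as np
from astropy.io import fits

ny, nx = 256, 256
star = np.zeros((ny, nx)); star[ny // 2, nx // 2] = 1.0e6        # unresolved host star
planet = np.zeros((ny, nx)); planet[ny // 2, nx // 2 + 12] = 1.0 # faint companion, offset a few pixels
exozodi = np.full((ny, nx), 1.0e-2)                              # smooth exozodiacal background

hdus = [fits.PrimaryHDU()]
for name, layer in [("STAR", star), ("PLANET", planet), ("EXOZODI", exozodi)]:
    hdu = fits.ImageHDU(data=layer, name=name)
    hdu.header["BUNIT"] = "photon/s/pixel"   # illustrative unit keyword
    hdus.append(hdu)

fits.HDUList(hdus).writeto("fov_scene.fits", overwrite=True)     # one extension per independent layer
```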

  3. Simulations of Convection Zone Flows and Measurements from Multiple Viewing Angles

    NASA Technical Reports Server (NTRS)

    Duvall, Thomas L.; Hanasoge, Shravan

    2011-01-01

    A deep-focusing time-distance measurement technique has been applied to linear acoustic simulations of a solar interior perturbed by convective flows. The simulations are for the full sphere for r/R greater than 0.2. From these it is straightforward to simulate the observations from different viewing angles and to test how multiple viewing angles enhance detectability. Some initial results will be presented.

  4. Parallel stochastic simulation of macroscopic calcium currents.

    PubMed

    González-Vélez, Virginia; González-Vélez, Horacio

    2007-06-01

    This work introduces MACACO, a macroscopic calcium currents simulator. It provides a parameter-sweep framework which computes macroscopic Ca(2+) currents from the individual aggregation of unitary currents, using a stochastic model for L-type Ca(2+) channels. MACACO uses a simplified 3-state Markov model to simulate the response of each Ca(2+) channel to different voltage inputs to the cell. In order to provide an accurate systematic view for the stochastic nature of the calcium channels, MACACO is composed of an experiment generator, a central simulation engine and a post-processing script component. Due to the computational complexity of the problem and the dimensions of the parameter space, the MACACO simulation engine employs a grid-enabled task farm. Having been designed as a computational biology tool, MACACO heavily borrows from the way cell physiologists conduct and report their experimental work.
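
    As a rough illustration of the stochastic aggregation described above, the sketch below simulates a population of hypothetical three-state (closed/open/inactivated) channels in discrete time and sums their unitary currents into a macroscopic current. The state names, rate expressions, and unitary current are assumptions for illustration, not the MACACO parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_channels(n_channels=1000, dt=0.05, t_max=50.0, v_mv=0.0, i_unit=-0.3):
        """Discrete-time simulation of a hypothetical 3-state (C <-> O <-> I)
        Ca channel model; returns time vector and macroscopic current.
        Rates (per ms) are illustrative only."""
        k_co = 0.2 * np.exp(v_mv / 40.0)   # closed -> open, voltage dependent
        k_oc = 0.1                         # open -> closed
        k_oi = 0.05                        # open -> inactivated
        k_io = 0.01                        # inactivated -> open
        n_steps = int(t_max / dt)
        state = np.zeros(n_channels, dtype=int)   # 0 = closed, 1 = open, 2 = inactivated
        t = np.arange(n_steps) * dt
        i_macro = np.zeros(n_steps)
        for k in range(n_steps):
            u = rng.random(n_channels)
            closed, open_, inact = state == 0, state == 1, state == 2
            state = np.where(closed & (u < k_co * dt), 1, state)
            state = np.where(open_ & (u < k_oc * dt), 0, state)
            state = np.where(open_ & (u >= k_oc * dt) & (u < (k_oc + k_oi) * dt), 2, state)
            state = np.where(inact & (u < k_io * dt), 1, state)
            i_macro[k] = i_unit * np.count_nonzero(state == 1)
        return t, i_macro

    t, i = simulate_channels()
    print("peak macroscopic current:", i.min(), "(illustrative units)")
    ```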

  5. Connectionist Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  6. Estimating the number and size of phloem sieve plate pores using longitudinal views and geometric reconstruction.

    PubMed

    Bussières, Philippe

    2014-05-12

    Because it is difficult to obtain transverse views of the plant phloem sieve plate pores, which are short tubes, to estimate their number and diameters, a method based on longitudinal views is proposed. This method uses recent methods to estimate the number and the sizes of approximately circular objects from their images, given by slices perpendicular to the objects. Moreover, because such longitudinal views are obtained from slices that are rather close to the plate centres whereas the pore size may vary with the pore distance from the plate edge, a sieve plate reconstruction model was developed and incorporated in the method to consider this bias. The method was successfully tested with published longitudinal views of phloem of Soybean and an exceptional entire transverse view from the same tissue. The method was also validated with simulated slices in two sieve plates from Cucurbita and Phaseolus. This method will likely be useful to estimate and to model the hydraulic conductivity and the architecture of the plant phloem, and it could have applications for other materials with approximately cylindrical structures.

  7. Numerical simulation on hydromechanical coupling in porous media adopting three-dimensional pore-scale model.

    PubMed

    Liu, Jianjun; Song, Rui; Cui, Mengmeng

    2014-01-01

    A novel approach to simulating hydromechanical coupling in pore-scale models of porous media is presented in this paper. Parameters of the sandstone samples, such as the stress-strain curve, Poisson's ratio, and permeability under different pore pressures and confining pressures, are tested at laboratory scale. A micro-CT scanner is employed to scan the samples for three-dimensional images, which serve as input to construct the model. Accordingly, four physical models possessing the same pore and rock matrix characteristics as the natural sandstones are developed. Based on the micro-CT images, three-dimensional finite element models of both the rock matrix and the pore space are established with the MIMICS and ICEM software platforms. The Navier-Stokes equations and an elastic constitutive equation are used as the mathematical model for the simulation. A hydromechanical coupling analysis in the pore-scale finite element model of porous media is simulated with ANSYS and CFX software. The permeability of the sandstone samples under different pore pressures and confining pressures is thereby predicted. The simulation results agree well with the benchmark data. By reproducing the rock's stress state underground, the accuracy of pore-scale permeability prediction for porous rock is improved. Consequently, the effects of pore pressure and confining pressure on permeability are revealed from the microscopic view.
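
    For context, the permeability reported by such a pore-scale flow simulation is commonly back-calculated from the simulated flow rate and the applied pressure drop via Darcy's law; the sketch below shows that conversion with illustrative numbers, not the paper's sandstone data.

    ```python
    def darcy_permeability(q_m3s, mu_pas, length_m, area_m2, dp_pa):
        """Apparent permeability [m^2] from a single-phase pore-scale flow
        simulation via Darcy's law: k = Q*mu*L / (A*dP). Inputs are the
        simulated volumetric flow rate, fluid viscosity, sample length,
        cross-sectional area, and applied pressure drop."""
        return q_m3s * mu_pas * length_m / (area_m2 * dp_pa)

    # Illustrative numbers only
    k = darcy_permeability(q_m3s=2.0e-10, mu_pas=1.0e-3,
                           length_m=2.0e-3, area_m2=4.0e-6, dp_pa=1.0e4)
    print(f"k = {k:.3e} m^2  (~{k / 9.869e-13 * 1000:.1f} mD)")
    ```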

  8. Numerical Simulation on Hydromechanical Coupling in Porous Media Adopting Three-Dimensional Pore-Scale Model

    PubMed Central

    Liu, Jianjun; Song, Rui; Cui, Mengmeng

    2014-01-01

    A novel approach to simulating hydromechanical coupling in pore-scale models of porous media is presented in this paper. Parameters of the sandstone samples, such as the stress-strain curve, Poisson's ratio, and permeability under different pore pressures and confining pressures, are tested at laboratory scale. A micro-CT scanner is employed to scan the samples for three-dimensional images, which serve as input to construct the model. Accordingly, four physical models possessing the same pore and rock matrix characteristics as the natural sandstones are developed. Based on the micro-CT images, three-dimensional finite element models of both the rock matrix and the pore space are established with the MIMICS and ICEM software platforms. The Navier-Stokes equations and an elastic constitutive equation are used as the mathematical model for the simulation. A hydromechanical coupling analysis in the pore-scale finite element model of porous media is simulated with ANSYS and CFX software. The permeability of the sandstone samples under different pore pressures and confining pressures is thereby predicted. The simulation results agree well with the benchmark data. By reproducing the rock's stress state underground, the accuracy of pore-scale permeability prediction for porous rock is improved. Consequently, the effects of pore pressure and confining pressure on permeability are revealed from the microscopic view. PMID:24955384

  9. User's guide to the stand-damage model: a component of the gypsy moth life system model

    Treesearch

    J. J. Colbert; George Racin

    1995-01-01

    The Stand-Damage Model (a component of the Gypsy Moth Life System Model) simulates the growth of a mixed hardwood forest and incorporates the effects of defoliation by gypsy moth or tree harvesting as prescribed by the user. It can be used to assess the damage from expected defoliation, view the differences between various degrees of defoliation, and describe the...

  10. Sources, Sinks, and Transatlantic Transport of North African Dust Aerosol: A Multimodel Analysis and Comparison With Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Kim, Dongchul; Chin, Mian; Yu, Hongbin; Diehl, Thomas; Tan, Qian; Kahn, Ralph A.; Tsigaridis, Kostas; Bauer, Susanne E.; Takemura, Toshihiko; Pozzoli, Luca; hide

    2014-01-01

    This study evaluates model-simulated dust aerosols over North Africa and the North Atlantic from five global models that participated in the Aerosol Comparison between Observations and Models phase II model experiments. The model results are compared with satellite aerosol optical depth (AOD) data from Moderate Resolution Imaging Spectroradiometer (MODIS), Multiangle Imaging Spectroradiometer (MISR), and Sea-viewing Wide Field-of-view Sensor, dust optical depth (DOD) derived from MODIS and MISR, AOD and coarse-mode AOD (as a proxy of DOD) from ground-based Aerosol Robotic Network Sun photometer measurements, and dust vertical distributions/centroid height from Cloud Aerosol Lidar with Orthogonal Polarization and Atmospheric Infrared Sounder satellite AOD retrievals. We examine the following quantities of AOD and DOD: (1) the magnitudes over land and over ocean in our study domain, (2) the longitudinal gradient from the dust source region over North Africa to the western North Atlantic, (3) seasonal variations at different locations, and (4) the dust vertical profile shape and the AOD centroid height (altitude above or below which half of the AOD is located). The different satellite data show consistent features in most of these aspects; however, the models display large diversity in all of them, with significant differences among the models and between models and observations. By examining dust emission, removal, and mass extinction efficiency in the five models, we also find remarkable differences among the models that all contribute to the discrepancies of model-simulated dust amount and distribution. This study highlights the challenges in simulating the dust physical and optical processes, even in the best known dust environment, and stresses the need for observable quantities to constrain the model processes.
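
    The AOD centroid height defined in parentheses above can be computed directly from a vertical extinction profile; a minimal sketch follows, using a synthetic exponential dust profile rather than model or CALIOP data.

    ```python
    import numpy as np

    def aod_centroid_height(z_km, ext_km1):
        """Altitude above/below which half of the column AOD lies, computed
        from a vertical extinction profile via cumulative optical depth.
        z_km: layer mid-point altitudes (ascending); ext_km1: extinction [1/km]."""
        dz = np.gradient(z_km)                # layer thicknesses
        tau_layers = ext_km1 * dz             # per-layer optical depth
        cum_tau = np.cumsum(tau_layers)
        return np.interp(0.5 * cum_tau[-1], cum_tau, z_km)

    # Illustrative exponential dust profile with a 2 km scale height
    z = np.linspace(0.0, 10.0, 101)
    ext = 0.05 * np.exp(-z / 2.0)
    total_aod = np.sum(ext * np.gradient(z))
    print(f"AOD = {total_aod:.3f}, centroid height = {aod_centroid_height(z, ext):.2f} km")
    ```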

  11. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  12. Variability-aware compact modeling and statistical circuit validation on SRAM test array

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Spanos, Costas J.

    2016-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose a variability-aware compact model characterization methodology based on stepwise parameter selection. Transistor I-V measurements are obtained from a bit-transistor-accessible SRAM test array fabricated using a collaborating foundry's 28nm FDSOI technology. Our in-house customized Monte Carlo simulation bench can incorporate these statistical compact models, and the simulated distributions of SRAM writability performance closely match measurements. Our proposed statistical compact model parameter extraction methodology also has the potential of predicting non-Gaussian behavior in statistical circuit performances through mixtures of Gaussian distributions.
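
    To illustrate the last point, the sketch below draws a device-parameter variation from a two-component Gaussian mixture and propagates it through a toy linear sensitivity to a write-margin proxy. The mixture weights, means, and sensitivity are invented for illustration and are not extracted 28nm FDSOI statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_gaussian_mixture(n, weights, means, sigmas):
        """Draw n samples from a 1-D Gaussian mixture (weights sum to 1)."""
        comp = rng.choice(len(weights), size=n, p=weights)
        return rng.normal(np.asarray(means)[comp], np.asarray(sigmas)[comp])

    # Toy Monte Carlo: threshold-voltage variation -> write-margin proxy
    dvth = sample_gaussian_mixture(100_000, weights=[0.8, 0.2],
                                   means=[0.0, 0.03], sigmas=[0.02, 0.05])
    write_margin = 0.15 - 2.0 * dvth          # hypothetical linear sensitivity
    print("P(write failure) ~", np.mean(write_margin < 0.0))
    ```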

  13. Simulation-based instruction of technical skills

    NASA Technical Reports Server (NTRS)

    Towne, Douglas M.; Munro, Allen

    1991-01-01

    A rapid intelligent tutoring development system (RAPIDS) was developed to facilitate the production of interactive, real-time graphical device models for use in instructing the operation and maintenance of complex systems. The tools allowed subject matter experts to produce device models by creating instances of previously defined objects and positioning them in the emerging device model. These simulation authoring functions, as well as those associated with demonstrating procedures and functional effects on the completed model, required no previous programming experience or use of frame-based instructional languages. Three large simulations were developed in RAPIDS, each involving more than a dozen screen-sized sections. Seven small, single-view applications were developed to explore the range of applicability. Three workshops were conducted to train others in the use of the authoring tools. Participants learned to employ the authoring tools in three to four days and were able to produce small working device models on the fifth day.

  14. Symbolic modeling of human anatomy for visualization and simulation

    NASA Astrophysics Data System (ADS)

    Pommert, Andreas; Schubert, Rainer; Riemer, Martin; Schiemann, Thomas; Tiede, Ulf; Hoehne, Karl H.

    1994-09-01

    Visualization of human anatomy in a 3D atlas requires both spatial and more abstract symbolic knowledge. Within our 'intelligent volume' model, which integrates these two levels, we developed and implemented a semantic network model for describing human anatomy. Concepts for structuring (abstraction levels, domains, views, generic and case-specific modeling, inheritance) are introduced. The model, tools for its generation and exploration, and applications in our 3D anatomical atlas are presented and discussed.

  15. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
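
    The dependency-graph idea can be sketched in a few lines: each variable registers its dependencies and an evaluator, and requesting a variable pulls in its dependencies on demand with caching. The class and variable names below are illustrative and do not reflect the Arcos API.

    ```python
    from typing import Callable, Dict, List

    class DependencyGraph:
        """Minimal sketch of dependency-graph-driven model evaluation."""

        def __init__(self) -> None:
            self._deps: Dict[str, List[str]] = {}
            self._eval: Dict[str, Callable[..., float]] = {}
            self._cache: Dict[str, float] = {}

        def register(self, name: str, deps: List[str], fn: Callable[..., float]) -> None:
            self._deps[name], self._eval[name] = deps, fn

        def get(self, name: str) -> float:
            # Recursively evaluate (and cache) dependencies before the variable itself.
            if name not in self._cache:
                args = [self.get(d) for d in self._deps[name]]
                self._cache[name] = self._eval[name](*args)
            return self._cache[name]

    g = DependencyGraph()
    g.register("porosity", [], lambda: 0.4)
    g.register("saturation", [], lambda: 0.7)
    g.register("water_content", ["porosity", "saturation"], lambda p, s: p * s)
    print(g.get("water_content"))   # 0.28 -- dependencies evaluated on demand
    ```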

  16. Quantifying driver's field-of-view in tractors: methodology and case study.

    PubMed

    Gilad, Issachar; Byran, Eyal

    2015-01-01

    When driving a car, visual awareness is important for operating and controlling the vehicle. Operating a tractor is even more complex, because the driving is always accompanied by another task (e.g., ploughing) that demands constant changes of body posture to achieve the needed Field-of-View (FoV). Therefore, the cockpit must be well designed to provide the best FoV. Today, the driver's FoV is analyzed mostly by computer simulations of a cockpit model and a Digital Human Model (DHM) positioned inside. The outcome is an 'Eye view' that displays what the DHM 'sees'. This paper suggests a new approach that adds quantitative information to the current display, presented on three tractor models as case studies. Based on the results, the design can be modified. This may assist the engineer to analyze, compare and improve the design, to better address the driver's needs.

  17. A scalable delivery framework and a pricing model for streaming media with advertisements

    NASA Astrophysics Data System (ADS)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisement revenues are used to subsidize the price of the media content. The price is determined based on the total ad viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  18. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system of system simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes (project/directories) BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.

  19. Simulating Quantile Models with Applications to Economics and Management

    NASA Astrophysics Data System (ADS)

    Machado, José A. F.

    2010-05-01

    The massive increase in the speed of computers over the past forty years changed the way that social scientists, applied economists and statisticians approach their trades, and also the very nature of the problems that they could feasibly tackle. The new methods that use computer power intensively go by the names of "computer-intensive" or "simulation" methods. My lecture will start with a bird's-eye view of the uses of simulation in Economics and Statistics. Then I will turn to my own research on uses of computer-intensive methods. From a methodological point of view, the question I address is how to infer marginal distributions having estimated a conditional quantile process ("Counterfactual Decomposition of Changes in Wage Distributions Using Quantile Regression," Journal of Applied Econometrics 20, 2005). Illustrations will be provided of the use of the method to perform counterfactual analysis in several different areas of knowledge.
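
    The core simulation idea behind inferring a marginal distribution from a conditional quantile process can be sketched as follows: draw a covariate value and a uniform quantile level, then evaluate the (interpolated) conditional quantile function. The linear coefficients below are synthetic stand-ins for estimated quantile-regression fits, not results from the cited paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Grid of quantile levels and synthetic coefficients beta(tau); in practice
    # these would come from quantile regressions fitted at each tau.
    taus = np.linspace(0.05, 0.95, 19)
    beta0 = 1.0 + 2.0 * taus          # intercept rises with tau
    beta1 = 0.5 + 1.5 * taus          # returns to x also rise with tau

    def simulate_marginal(x_sample, n_draws=100_000):
        """Simulate the marginal outcome distribution implied by a conditional
        quantile process: draw a covariate and a uniform tau, then evaluate
        the interpolated conditional quantile function at that tau."""
        x = rng.choice(x_sample, size=n_draws)
        u = rng.uniform(taus[0], taus[-1], size=n_draws)
        b0 = np.interp(u, taus, beta0)
        b1 = np.interp(u, taus, beta1)
        return b0 + b1 * x

    x_obs = rng.normal(2.0, 0.5, size=5000)        # observed covariate sample
    y_sim = simulate_marginal(x_obs)
    print("simulated marginal quantiles:", np.percentile(y_sim, [10, 50, 90]))
    ```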

  20. Optimisation and evaluation of pre-design models for offshore wind turbines with jacket support structures and their influence on integrated load simulations

    NASA Astrophysics Data System (ADS)

    Schafhirt, S.; Kaufer, D.; Cheng, P. W.

    2014-12-01

    In recent years many advanced load simulation tools, allowing aero-servo-hydro-elastic analyses of an entire offshore wind turbine, have been developed and verified. Nowadays, even an offshore wind turbine with a complex support structure such as a jacket can be analysed. However, the computational effort rises significantly with an increasing level of detail. This is especially true for offshore wind turbines with lattice support structures, since those models naturally have more nodes and elements than simpler monopile structures. During the design process multiple load simulations are needed to obtain an optimal solution. For pre-design tasks it is therefore crucial to apply load simulations that keep simulation quality and computational effort in balance. The paper introduces a reference wind turbine model consisting of the REpower 5M wind turbine and a jacket support structure with a high level of detail. In total, twelve variations of this reference model are derived and presented. The main focus is on simplifying the models of the support structure and the foundation. The reference model and the simplified models are simulated with the coupled simulation tool Flex5-Poseidon and analysed with regard to frequencies, fatigue loads, and ultimate loads. A model has been found that achieves an adequate increase in simulation speed while keeping the results within an acceptable range of the reference results.

  1. Modelling and study of active vibration control for off-road vehicle

    NASA Astrophysics Data System (ADS)

    Zhang, Junwei; Chen, Sizhong

    2014-05-01

    Owing to their special working characteristics and structure, engineering machines typically do not have a conventional suspension system. Consequently, operators have to endure severe vibrations, which are detrimental both to their health and to the productivity of the loader. Based on displacement control, an active damping method is developed for a skid-steer loader. In this paper, the whole hydraulic system for the active damping method is modelled, including the swash plate dynamics model, proportional valve model, piston accumulator model, pilot-operated check valve model, relief valve model, pump loss model, and cylinder model. A road excitation model is developed specifically for the skid-steer loader. The response of chassis vibration acceleration to road excitation is verified through simulation. The simulation results for passive accumulator damping are compared with measurements, and the comparison shows close agreement. Building on this, a parallel PID controller and a track PID controller with acceleration feedback are introduced into the simulation model, and the simulation results are compared with passive accumulator damping. The active damping methods with PID controllers prove better at reducing chassis vibration acceleration and pitch movement. Finally, experimental testing of the active damping method is proposed as future work.

  2. United States Air Force Graduate Student Research Program. 1989 Program Technical Report. Volume 1

    DTIC Science & Technology

    1989-12-01

    Analysis is required to supplement the experimental observations, which requires the formulation of a realistic model of the physical problem...RECOMMENDATION: a. From our point of view, the research team considered the NASTRAN model correct due to the vibrational frequencies, but we are still...structure of the program was understood, attempts were made to change the model from a thunderstorm simulation

  3. Selecting materialized views using random algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Hao, Zhongxiao; Liu, Chi

    2007-04-01

    The data warehouse is a repository of information collected from multiple, possibly heterogeneous, autonomous distributed databases. The information stored in the data warehouse is in the form of views referred to as materialized views. The selection of the materialized views is one of the most important decisions in designing a data warehouse. Materialized views are stored in the data warehouse for the purpose of efficiently implementing on-line analytical processing queries. The first issue for the user to consider is query response time. In this paper we therefore develop algorithms to select a set of views to materialize in the data warehouse in order to minimize the total view maintenance cost under the constraint of a given query response time; we call this the query_cost view_selection problem. First, the cost graph and cost model of the query_cost view_selection problem are presented. Second, methods for selecting materialized views using randomized algorithms are presented. The genetic algorithm is applied to the materialized view selection problem, but as the genetic process evolves, producing legal solutions becomes more and more difficult, so many solutions are eliminated and the time needed to produce solutions grows. Therefore, an improved algorithm is presented in this paper, which combines simulated annealing with the genetic algorithm for the purpose of solving the query_cost view_selection problem. Finally, simulation experiments are conducted to test the functionality and efficiency of our algorithms. The experiments show that the given methods can provide near-optimal solutions in limited time and work well in practical cases. Randomized algorithms will become invaluable tools for data warehouse evolution.
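
    As a toy illustration of the randomized search described above, the sketch below uses plain simulated annealing (without the genetic component) to pick a set of views that minimizes maintenance cost subject to a query-response-time limit; the per-view savings and costs are made up for the example.

    ```python
    import math
    import random

    random.seed(0)

    # Hypothetical per-view statistics: (query-time saving, maintenance cost)
    VIEWS = [(40, 12), (25, 9), (60, 20), (15, 4), (35, 14), (50, 18)]
    BASE_QUERY_TIME = 200          # response time with no materialized views
    QUERY_TIME_LIMIT = 90          # query-response-time constraint

    def query_time(sel):
        return BASE_QUERY_TIME - sum(s * v[0] for s, v in zip(sel, VIEWS))

    def maintenance_cost(sel):
        return sum(s * v[1] for s, v in zip(sel, VIEWS))

    def energy(sel):
        """Objective: maintenance cost, heavily penalized when the
        query-response-time constraint is violated."""
        penalty = 1000 if query_time(sel) > QUERY_TIME_LIMIT else 0
        return maintenance_cost(sel) + penalty

    def anneal(n_iter=5000, t0=50.0, cooling=0.999):
        sel = [random.randint(0, 1) for _ in VIEWS]
        best, t = list(sel), t0
        for _ in range(n_iter):
            cand = list(sel)
            cand[random.randrange(len(cand))] ^= 1     # flip one view in/out
            delta = energy(cand) - energy(sel)
            if delta < 0 or random.random() < math.exp(-delta / t):
                sel = cand
                if energy(sel) < energy(best):
                    best = list(sel)
            t *= cooling
        return best

    best = anneal()
    print(best, "maintenance:", maintenance_cost(best), "query time:", query_time(best))
    ```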

  4. Model simulation studies to clarify the effect on saccadic eye movements of initial condition velocities set by the Vestibular Ocular Reflex (VOR)

    NASA Technical Reports Server (NTRS)

    Nam, M. H.; Winters, J. M.; Stark, L.

    1981-01-01

    Voluntary active head rotations produced vestibulo-ocular reflex eye movements (VOR) with the subject viewing a fixation target. When this target jumped, the sizes of the refixation saccades were a function of the ongoing initial velocity of the eye. Saccades made against the VOR were larger in magnitude. Simulation of a reciprocally innervated model of eye movement provided results comparable to the experimental data. Most of the experimental effect appeared to be due to linear summation for saccades of 5 and 10 degree magnitude. For small saccades of 2.5 degrees, peripheral nonlinear interaction of state variables in the neuromuscular plant also played a role, as proven by comparable behavior in the simulated model with known controller signals.

  5. Issues in visual support to real-time space system simulation solved in the Systems Engineering Simulator

    NASA Technical Reports Server (NTRS)

    Yuen, Vincent K.

    1989-01-01

    The Systems Engineering Simulator has addressed the major issues in providing visual data to its real-time man-in-the-loop simulations. Out-the-window views and CCTV views are provided by three scene systems to give the astronauts their real-world views. To expand the window coverage for the Space Station Freedom workstation a rotating optics system is used to provide the widest field of view possible. To provide video signals to as many viewpoints as possible, windows and CCTVs, with a limited amount of hardware, a video distribution system has been developed to time-share the video channels among viewpoints at the selection of the simulation users. These solutions have provided the visual simulation facility for real-time man-in-the-loop simulations for the NASA space program.

  6. THE VIEWING ANGLES OF BROAD ABSORPTION LINE VERSUS UNABSORBED QUASARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiPompeo, M. A.; Brotherton, M. S.; De Breuck, C.

    2012-06-10

    It was recently shown that there is a significant difference in the radio spectral index distributions of broad absorption line (BAL) quasars and unabsorbed quasars, with an overabundance of BAL quasars with steeper radio spectra. This result suggests that source orientation does play into the presence or absence of BAL features. In this paper, we provide more quantitative analysis of this result based on Monte Carlo simulations. While the relationship between viewing angle and spectral index does indeed contain a lot of scatter, the spectral index distributions are different enough to overcome that intrinsic variation. Utilizing two different models of the relationship between spectral index and viewing angle, the simulations indicate that the difference in spectral index distributions can be explained by allowing BAL quasar viewing angles to extend about 10 degrees farther from the radio jet axis than non-BAL sources, though both can be seen at small angles. These results show that orientation cannot be the only factor determining whether BAL features are present, but it does play a role.

  7. Simulating Fragmentation and Fluid-Induced Fracture in Disordered Media Using Random Finite-Element Meshes

    DOE PAGES

    Bishop, Joseph E.; Martinez, Mario J.; Newell, Pania

    2016-11-08

    Fracture and fragmentation are extremely nonlinear multiscale processes in which microscale damage mechanisms emerge at the macroscale as new fracture surfaces. Numerous numerical methods have been developed for simulating fracture initiation, propagation, and coalescence. In this paper, we present a computational approach for modeling pervasive fracture in quasi-brittle materials based on random close-packed Voronoi tessellations. Each Voronoi cell is formulated as a polyhedral finite element containing an arbitrary number of vertices and faces. Fracture surfaces are allowed to nucleate only at the intercell faces. Cohesive softening tractions are applied to new fracture surfaces in order to model the energy dissipated during fracture growth. The randomly seeded Voronoi cells provide a regularized discrete random network for representing fracture surfaces. The potential crack paths within the random network are viewed as instances of realizable crack paths within the continuum material. Mesh convergence of fracture simulations is viewed in a weak, or distributional, sense. The explicit facet representation of fractures within this approach is advantageous for modeling contact on new fracture surfaces and fluid flow within the evolving fracture network. Finally, applications of interest include fracture and fragmentation in quasi-brittle materials and geomechanical applications such as hydraulic fracturing, engineered geothermal systems, compressed-air energy storage, and carbon sequestration.
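
    A minimal sketch of the geometric starting point: generate seed points and extract the Voronoi facets shared by neighbouring cells, which play the role of admissible fracture surfaces. For simplicity the seeds here are uniformly random rather than random close-packed, and scipy.spatial.Voronoi stands in for the polyhedral finite-element machinery described in the paper.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(3)

    # Random (uniform, not close-packed) seed points in a unit cube; each
    # Voronoi cell would become one polyhedral finite element, with potential
    # fracture surfaces restricted to the shared cell facets.
    seeds = rng.random((200, 3))
    vor = Voronoi(seeds)

    # ridge_points lists the pairs of seeds whose cells share a facet -- these
    # facets form the discrete network of admissible crack surfaces.
    print("cells:", len(seeds), "candidate fracture facets:", len(vor.ridge_points))
    ```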

  8. Simulating Fragmentation and Fluid-Induced Fracture in Disordered Media Using Random Finite-Element Meshes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bishop, Joseph E.; Martinez, Mario J.; Newell, Pania

    Fracture and fragmentation are extremely nonlinear multiscale processes in which microscale damage mechanisms emerge at the macroscale as new fracture surfaces. Numerous numerical methods have been developed for simulating fracture initiation, propagation, and coalescence. In this paper, we present a computational approach for modeling pervasive fracture in quasi-brittle materials based on random close-packed Voronoi tessellations. Each Voronoi cell is formulated as a polyhedral finite element containing an arbitrary number of vertices and faces. Fracture surfaces are allowed to nucleate only at the intercell faces. Cohesive softening tractions are applied to new fracture surfaces in order to model the energy dissipated during fracture growth. The randomly seeded Voronoi cells provide a regularized discrete random network for representing fracture surfaces. The potential crack paths within the random network are viewed as instances of realizable crack paths within the continuum material. Mesh convergence of fracture simulations is viewed in a weak, or distributional, sense. The explicit facet representation of fractures within this approach is advantageous for modeling contact on new fracture surfaces and fluid flow within the evolving fracture network. Finally, applications of interest include fracture and fragmentation in quasi-brittle materials and geomechanical applications such as hydraulic fracturing, engineered geothermal systems, compressed-air energy storage, and carbon sequestration.

  9. A 3D virtual reality simulator for training of minimally invasive surgery.

    PubMed

    Mi, Shao-Hua; Hou, Zeng-Gunag; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin

    2014-01-01

    For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides real-time computation of force and a force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views has been developed. Moreover, the simulator is provided with a human-machine interaction module that gives doctors the sense of touch during surgery training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
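
    The instrument representation mentioned above, a multi-body mass-spring chain, can be sketched with a handful of point masses connected by linear springs and integrated explicitly; the stiffness, damping, and time step below are arbitrary illustration values, not the simulator's parameters.

    ```python
    import numpy as np

    def step_mass_spring_chain(pos, vel, dt=1e-3, k=200.0, c=2.0, rest=0.01, mass=0.005):
        """One semi-implicit Euler step for a chain of point masses (a crude
        stand-in for a catheter/guide wire): linear springs between neighbours
        plus viscous damping. pos and vel are (n, 3) arrays."""
        force = -c * vel                               # damping
        seg = pos[1:] - pos[:-1]                       # segment vectors
        length = np.linalg.norm(seg, axis=1, keepdims=True)
        f_spring = k * (length - rest) * seg / np.maximum(length, 1e-12)
        force[:-1] += f_spring                         # pull node i toward node i+1
        force[1:] -= f_spring                          # equal and opposite reaction
        vel = vel + dt * force / mass
        pos = pos + dt * vel
        pos[0] = 0.0                                   # proximal end held fixed
        vel[0] = 0.0
        return pos, vel

    # 20-node straight wire, slightly stretched, relaxing over 1000 steps
    pos = np.stack([np.linspace(0, 0.25, 20), np.zeros(20), np.zeros(20)], axis=1)
    vel = np.zeros_like(pos)
    for _ in range(1000):
        pos, vel = step_mass_spring_chain(pos, vel)
    print("tip position after relaxation:", pos[-1])
    ```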

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Steven Karl; Determan, John C.

    Dynamic System Simulation (DSS) models of fissile solution systems have been developed and verified against a variety of historical configurations. DSS techniques have been applied specifically to subcritical accelerator-driven systems using fissile solution fuels of uranium. Initial DSS models were developed in DESIRE, a specialized simulation scripting language. In order to tailor the DSS models to specifically meet the needs of system designers, they were converted to a Visual Studio implementation, and one of these was subsequently converted to National Instruments' LabVIEW for human factors engineering and operator training. Specific operational characteristics of subcritical accelerator-driven systems have been examined using a DSS model tailored to this particular class using fissile fuel.

  11. Modeling and Simulation of the ARES UPPER STAGE Transportation, Lifting, Stacking and Mating Operations Within the Vehicle Assembly Building at KSC

    NASA Technical Reports Server (NTRS)

    Kromis, Phillip A.

    2010-01-01

    This viewgraph presentation describes the modeling and simulation of the Ares Upper Stage transportation, lifting, stacking, and mating operations within the Vehicle Assembly Building (VAB) at Kennedy Space Center (KSC). An aerial view of the KSC shuttle launch complex, two views of the Delmia process control layout, and an upper stage move subroutine and breakdown are shown. An overhead image of the VAB and the turning basin, along with the Pegasus barge at the turning basin, are also shown. The presentation also shows the actual design and the removal of the mid-section spring tensioners, the removal of the aft rear and forward tensioner tie downs, and the removal of the aft hold down post and mount. The upper stage leaving the Pegasus barge, the upper stage arriving at the transfer aisle, upper stage receiving/inspection in the transfer aisle, and an overhead view of upper stage receiving/inspection in the transfer aisle are depicted. Five views of the actual connection of the cabling to the upper stage aft lifting hardware are shown. The upper stage transporter forward connector, two views of the rotation from horizontal to vertical, the disconnection of the rear bolt ring cabling, the lowering of the upper stage to the inspection stand, the disconnection of the rear bolt ring from the upper stage, the lifting of the upper stage and inspection of the aft flange, and the transfer of the upper stage in an integrated stack are shown. Six views of the mating of the upper stage to the first stage are depicted. The preparation, inspection, and removal of the forward dome are shown. The upper stage mated on the integrated stack and crawler is also shown. The presentation concludes with a Rapid Upper Limb Assessment (RULA) utilizing male and female models to assess risk factors to the upper extremities of humans in an actual physical environment.

  12. Study of City Landscape Heritage Using Lidar Data and 3d-City Models

    NASA Astrophysics Data System (ADS)

    Rubinowicz, P.; Czynska, K.

    2015-04-01

    In contemporary town planning, protection of the urban landscape is a significant issue. It especially concerns cities where urban structures are the result of ages of evolution and the layering of historical development. Specific panoramas and other strategic views with historic city dominants can be an important part of the cultural heritage and genius loci. On the other hand, protection of such expositions introduces limitations for future city development. Digital Earth observation techniques create new possibilities for more accurate urban studies, monitoring of urbanization processes and measuring of city landscape parameters. The paper examines possibilities of applying Lidar data and digital 3D-city models for: a) evaluation of strategic city views, b) mapping landscape absorption limits, and c) determining protection zones where urbanization and building heights should be limited. In reference to this goal, the paper introduces a method of computational analysis of the city landscape called Visual Protection Surface (VPS). The method allows emulating a virtual surface above the city that protects a selection of strategic views. The surface defines the maximum height of buildings in such a way that no new facility can be seen in any of the selected views. The research also includes analyses of the quality of the simulations according to the form and precision of the input data: an airborne Lidar / DSM model and more advanced 3D-city models (including semantics of the geometry, as in the CityGML format). The outcome can support professional planning of tall building development. The VPS method has been implemented in a computer program developed by the authors (C++). Simulations were carried out for the city of Dresden as an example.
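
    The height-limit idea can be illustrated along a single protected sightline: a new building at a given distance from the viewpoint may not rise above the ray from the observer's eye to the top of the protected landmark. The flat-terrain geometry and all numbers below are illustrative assumptions, not the VPS implementation.

    ```python
    import numpy as np

    def vps_height_limit(d_grid_m, eye_z_m, landmark_dist_m, landmark_top_m):
        """Maximum building height along one protected sightline: a building at
        horizontal distance d may not rise above the ray from the observer's eye
        to the top of the protected landmark (simple flat-terrain sketch)."""
        slope = (landmark_top_m - eye_z_m) / landmark_dist_m
        return eye_z_m + slope * d_grid_m

    d = np.linspace(100.0, 3000.0, 30)                 # distances from viewpoint [m]
    limit = vps_height_limit(d, eye_z_m=1.6, landmark_dist_m=2000.0, landmark_top_m=90.0)
    print("allowed height at 500 m / 2500 m:",
          limit[np.searchsorted(d, 500)], limit[np.searchsorted(d, 2500)])
    ```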

  13. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    NASA Astrophysics Data System (ADS)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  14. Virtual world reconstruction using the modeling and simulation extended vector product prototype

    DOT National Transportation Integrated Search

    1997-05-30

    The MS Extended Vector Product (MSEVP) prototype being developed is an extended vector product format-based product containing a continuous surface representation and a consistent view of elevation across the thematic coverages contained within a dat...

  15. A multi-view face recognition system based on cascade face detector and improved Dlib

    NASA Astrophysics Data System (ADS)

    Zhou, Hongjun; Chen, Pei; Shen, Wei

    2018-03-01

    In this research, we present a framework for a multi-view face detection and recognition system based on a cascade face detector and an improved Dlib. The method aims to solve the problems of low efficiency and low accuracy in multi-view face recognition, to build a multi-view face recognition system, and to identify a suitable monitoring scheme. For face detection, Haar-like features are extracted from the training samples and used to train a cascade classifier with the AdaBoost algorithm. Next, for face recognition, we propose an improved distance model based on Dlib to improve the accuracy of multi-view face recognition. Furthermore, we applied the proposed method to recognizing face images taken from different viewing directions, including horizontal, looking-down, and looking-up views, and investigated a suitable monitoring scheme. The method works well for multi-view face recognition, and simulations and tests show satisfactory experimental results.
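
    A minimal detection front-end in the spirit of the cascade detector described above can be assembled with OpenCV's bundled Haar cascade; a full multi-view system would combine several cascades (frontal, profile) and feed the detected crops to a Dlib-based recognizer. The cascade file and parameters below are common defaults, not values from the paper.

    ```python
    import cv2

    # Standard frontal-face Haar cascade shipped with opencv-python.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect_faces(image_path):
        """Return bounding boxes (x, y, w, h) of detected faces in an image."""
        img = cv2.imread(image_path)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                        minSize=(40, 40))

    # Example usage (the path is a placeholder):
    # for (x, y, w, h) in detect_faces("frame.jpg"):
    #     print("face at", x, y, w, h)
    ```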

  16. Effects of camera location on the reconstruction of 3D flare trajectory with two cameras

    NASA Astrophysics Data System (ADS)

    Özsaraç, Seçkin; Yeşilkaya, Muhammed

    2015-05-01

    Flares are used as valuable electronic warfare assets for the battle against infrared guided missiles. The trajectory of the flare is one of the most important factors that determine the effectiveness of the countermeasure. Reconstruction of the three-dimensional (3D) position of a point seen by multiple cameras is a common problem. Camera placement, camera calibration, corresponding pixel determination between the images of different cameras, and also the triangulation algorithm affect the performance of 3D position estimation. In this paper, we specifically investigate the effects of camera placement on flare trajectory estimation performance by simulations. Firstly, the 3D trajectory of a flare and also of the aircraft that dispenses the flare are generated with simple motion models. Then, we place two virtual ideal pinhole camera models at different locations. Assuming the cameras are tracking the aircraft perfectly, the view vectors of the cameras are computed. Afterwards, using the view vector of each camera and also the 3D position of the flare, image plane coordinates of the flare on both cameras are computed using the field of view (FOV) values. To increase the fidelity of the simulation, we have used two sources of error. One is used to model the uncertainties in the determination of the camera view vectors, i.e., the orientations of the cameras are measured with noise. The second noise source is used to model the imperfections of the corresponding pixel determination of the flare between the two cameras. Finally, the 3D position of the flare is estimated by triangulation using the corresponding pixel indices, the view vectors and also the FOV of the cameras. All the processes mentioned so far are repeated for different relative camera placements so that the optimum estimation error performance is found for the given aircraft and flare trajectories.
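
    For reference, a common way to triangulate a 3D point from two tracking cameras is the midpoint method: find the closest points on the two view rays and average them. The sketch below checks the geometry on a synthetic, noise-free configuration; positions and units are arbitrary, and this is not necessarily the triangulation used in the paper.

    ```python
    import numpy as np

    def triangulate_midpoint(c1, d1, c2, d2):
        """Midpoint triangulation: given two camera centres c1, c2 and view
        rays d1, d2 toward the target, return the point midway between the
        closest points of the two (generally skew) rays."""
        d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
        b = c2 - c1
        a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
        denom = a11 * a22 - a12 * a12            # ~0 for (nearly) parallel rays
        t1 = (a22 * (b @ d1) - a12 * (b @ d2)) / denom
        t2 = (a12 * (b @ d1) - a11 * (b @ d2)) / denom
        p1, p2 = c1 + t1 * d1, c2 + t2 * d2
        return 0.5 * (p1 + p2)

    # Synthetic check: true flare position seen from two camera locations
    flare = np.array([500.0, 200.0, 1500.0])
    cam1, cam2 = np.array([0.0, 0.0, 0.0]), np.array([1000.0, 0.0, 0.0])
    est = triangulate_midpoint(cam1, flare - cam1, cam2, flare - cam2)
    print("estimate:", est)   # matches the true position for noise-free rays
    ```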

  17. A Virtual Sky with Extragalactic H I and CO Lines for the Square Kilometre Array and the Atacama Large Millimeter/Submillimeter Array

    NASA Astrophysics Data System (ADS)

    Obreschkow, D.; Klöckner, H.-R.; Heywood, I.; Levrier, F.; Rawlings, S.

    2009-10-01

    We present a sky simulation of the atomic H I emission line and the first 10 12C16O rotational emission lines of molecular gas in galaxies beyond the Milky Way. The simulated sky field has a comoving diameter of 500 h^-1 Mpc; hence, the actual field of view depends on the (user-defined) maximal redshift z_max; e.g., for z_max = 10, the field of view yields ~4 × 4 deg^2. For all galaxies, we estimate the line fluxes, line profiles, and angular sizes of the H I and CO emission lines. The galaxy sample is complete for galaxies with cold hydrogen masses above 10^8 M_sun. This sky simulation builds on a semi-analytic model of the cosmic evolution of galaxies in a Λ cold dark matter (ΛCDM) cosmology. The evolving CDM distribution was adopted from the Millennium Simulation, an N-body CDM simulation in a cubic box with a side length of 500 h^-1 Mpc. This side length limits the coherence scale of our sky simulation: it is long enough to allow the extraction of the baryon acoustic oscillations in the galaxy power spectrum, yet the position and amplitude of the first acoustic peak will be imperfectly defined. This sky simulation is a tangible aid to the design and operation of future telescopes, such as the Square Kilometre Array, Large Millimeter Telescope, and Atacama Large Millimeter/Submillimeter Array. The results presented in this paper have been restricted to a graphical representation of the simulated sky and fundamental dN/dz analyses for peak flux density limited and total flux limited surveys of H I and CO. A key prediction is that H I will be harder to detect at redshifts z ≳ 2 than predicted by a no-evolution model. The future verification or falsification of this prediction will allow us to qualify the semi-analytic models.

  18. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  19. Sharing and reusing cardiovascular anatomical models over the Web: a step towards the implementation of the virtual physiological human project.

    PubMed

    Gianni, Daniele; McKeever, Steve; Yu, Tommy; Britten, Randall; Delingette, Hervé; Frangi, Alejandro; Hunter, Peter; Smith, Nicolas

    2010-06-28

    Sharing and reusing anatomical models over the Web offers a significant opportunity to progress the investigation of cardiovascular diseases. However, the current sharing methodology suffers from the limitations of static model delivery (i.e. embedding static links to the models within Web pages) and of a disaggregated view of the model metadata produced by publications and cardiac simulations in isolation. In the context of euHeart--a research project targeting the description and representation of cardiovascular models for disease diagnosis and treatment purposes--we aim to overcome the above limitations with the introduction of euHeartDB, a Web-enabled database for anatomical models of the heart. The database implements a dynamic sharing methodology by managing data access and by tracing all applications. In addition to this, euHeartDB establishes a knowledge link with the physiome model repository by linking geometries to CellML models embedded in the simulation of cardiac behaviour. Furthermore, euHeartDB uses the exFormat--a preliminary version of the interoperable FieldML data format--to effectively promote reuse of anatomical models, and currently incorporates Continuum Mechanics, Image Analysis, Signal Processing and System Identification Graphical User Interface (CMGUI), a rendering engine, to provide three-dimensional graphical views of the models populating the database. Currently, euHeartDB stores 11 cardiac geometries developed within the euHeart project consortium.

  20. A graphical modeling tool for evaluating nitrogen loading to and nitrate transport in ground water in the mid-Snake region, south-central Idaho

    USGS Publications Warehouse

    Clark, David W.; Skinner, Kenneth D.; Pollock, David W.

    2006-01-01

    A flow and transport model was created with a graphical user interface to simplify the evaluation of nitrogen loading and nitrate transport in the mid-Snake region in south-central Idaho. This model and interface package, the Snake River Nitrate Scenario Simulator, uses the U.S. Geological Survey's MODFLOW 2000 and MOC3D models. The interface, which is enabled for use with geographic information systems (GIS), was created using ESRI's royalty-free MapObjects LT software. The interface lets users view initial nitrogen-loading conditions (representing conditions as of 1998), alter the nitrogen loading within selected zones by specifying a multiplication factor and applying it to the initial condition, run the flow and transport model, and view a graphical representation of the modeling results. The flow and transport model of the Snake River Nitrate Scenario Simulator was created by rediscretizing and recalibrating a clipped portion of an existing regional flow model. The new subregional model was recalibrated with newly available water-level data and spring and ground-water nitrate concentration data for the study area. An updated nitrogen input GIS layer controls the application of nitrogen to the flow and transport model. Users can alter the nitrogen application to the flow and transport model by altering the nitrogen load in predefined spatial zones contained within similar political, hydrologic, and size-constrained boundaries.

  1. Reconciling Simulated and Observed Views of Clouds: MODIS, ISCCP, and the Limits of Instrument Simulators

    NASA Technical Reports Server (NTRS)

    Ackerman, Steven A.; Hemler, Richard S.; Hofman, Robert J. Patrick; Pincus, Robert; Platnick, Steven

    2011-01-01

    The properties of clouds that may be observed by satellite instruments, such as optical depth and cloud top pressure, are only loosely related to the way clouds are represented in models of the atmosphere. One way to bridge this gap is through "instrument simulators," diagnostic tools that map the model representation to synthetic observations so that differences between simulator output and observations can be interpreted unambiguously as model error. But simulators may themselves be restricted by limited information available from the host model or by internal assumptions. This paper considers the extent to which instrument simulators are able to capture essential differences between MODIS and ISCCP, two similar but independent estimates of cloud properties. The authors review the measurements and algorithms underlying these two cloud climatologies, introduce a MODIS simulator, and detail data sets developed for comparison with global models using ISCCP and MODIS simulators. In nature MODIS observes less mid-level cloudiness than ISCCP, consistent with the different methods used to determine cloud top pressure; aspects of this difference are reproduced by the simulators running in a climate model. But stark differences between MODIS and ISCCP observations of total cloudiness and the distribution of cloud optical thickness can be traced to different approaches to marginal pixels, which MODIS excludes and ISCCP treats as homogeneous. These pixels, which likely contain broken clouds, cover about 15% of the planet and contain almost all of the optically thinnest clouds observed by either instrument. Instrument simulators cannot reproduce these differences because the host model does not consider unresolved spatial scales and so cannot produce broken pixels. Nonetheless, MODIS and ISCCP observations are consistent for all but the optically thinnest clouds, and models can be robustly evaluated using instrument simulators by excluding ambiguous observations.

  2. Impact of finite receiver-aperture size in a non-line-of-sight single-scatter propagation model.

    PubMed

    Elshimy, Mohamed A; Hranilovic, Steve

    2011-12-01

    In this paper, a single-scatter propagation model is developed that expands the classical model by considering a finite receiver-aperture size for non-line-of-sight communication. The expanded model overcomes some of the difficulties with the classical model, most notably, inaccuracies in scenarios with short range and low elevation angle where significant scattering takes place near the receiver. The developed model does not approximate the receiver aperture as a point, but uses its dimensions for both field-of-view and solid-angle computations. To verify the model, a Monte Carlo simulation of photon transport in a turbid medium is applied. Simulation results for temporal responses and path losses are presented at a wavelength of 260 nm that lies in the solar-blind ultraviolet region.
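
    The point-receiver issue mentioned above comes down to how the receiver's solid angle is computed from a scattering point. A minimal check of the on-axis case compares the exact solid angle of a finite circular aperture with the area/d^2 point approximation; the aperture radius and distances below are arbitrary, and this is only an illustration of the effect, not the paper's model.

    ```python
    import numpy as np

    def solid_angle_disk_on_axis(r_ap, d):
        """Exact solid angle of a circular aperture of radius r_ap seen on-axis
        from distance d: 2*pi*(1 - d / sqrt(d^2 + r_ap^2))."""
        return 2.0 * np.pi * (1.0 - d / np.hypot(d, r_ap))

    def solid_angle_point_approx(r_ap, d):
        """Small-aperture (point-receiver) approximation: area / d^2."""
        return np.pi * r_ap**2 / d**2

    r = 0.05                                   # 5 cm aperture radius
    for d in (0.1, 0.5, 2.0, 10.0):            # scatterer-to-receiver distance [m]
        exact = solid_angle_disk_on_axis(r, d)
        approx = solid_angle_point_approx(r, d)
        print(f"d = {d:5.1f} m: exact {exact:.4e} sr, approx {approx:.4e} sr, "
              f"error {100 * (approx - exact) / exact:.1f}%")
    ```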

  3. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization. Furthermore, the study addresses the form of the identified and evaluated conditions, being either identified challenges or tangible design criteria.

  4. Simulation of an Asynchronous Machine by using a Pseudo Bond Graph

    NASA Astrophysics Data System (ADS)

    Romero, Gregorio; Felez, Jesus; Maroto, Joaquin; Martinez, M. Luisa

    2008-11-01

    For engineers, computer simulation is a basic tool, since it enables them to understand how systems work without actually needing to see them. They can learn how systems behave in different circumstances and optimize their design with considerably less cost in terms of time and money than if they had to carry out tests on a physical system. However, if computer simulation is to be reliable, it is essential for the simulation model to be validated. There is a wide range of commercial brands on the market offering products for electrical-domain simulation (SPICE, LabVIEW, PSCAD, Dymola, Simulink, Simplorer, ...). These are powerful tools, but they require the engineer to have a thorough knowledge of the electrical field. This paper shows an alternative methodology to simulate an asynchronous machine using the multidomain Bond Graph technique and to apply it in any program that permits the simulation of models based on this technique; no extraordinary knowledge of the technique or of the electrical field is required to understand the process.

  5. Evaluation of Resuspension from Propeller Wash in DoD Harbors

    DTIC Science & Technology

    2016-09-01

    [Acronym-list and abstract fragments, garbled in the source; recoverable content follows.] Acronyms: FANS, Finite-Analytic Navier-Stokes solver; FOV, field of view; ICP-MS, inductively coupled plasma mass spectrometry. A propeller wash model (1984) and the Finite-Analytic Navier-Stokes (FANS) model (Chen et al., 2003) were set up to simulate and evaluate flow velocities and the resuspension potential of propeller wash, with the propeller wash model applied to a tugboat and the FANS model to a DDG.

  6. The analysis of delays in simulator digital computing systems. Volume 1: Formulation of an analysis approach using a central example simulator model

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.

    1980-01-01

    The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.

  7. Utilizing In Situ Directional Hyperspectral Measurements to Validate Bio-Indicator Simulations for a Corn Crop Canopy

    NASA Technical Reports Server (NTRS)

    Cheng, Yen-Ben; Middleton, Elizabeth M.; Huemmrich, Karl F.; Zhang, Qingyuan; Campbell, Petya K. E.; Corp, Lawrence A.; Russ, Andrew L.; Kustas, William P.

    2010-01-01

    Two radiative transfer canopy models, SAIL and the two-layer Markov-Chain Canopy Reflectance Model (MCRM), were coupled with in situ leaf optical properties to simulate canopy-level spectral band-ratio vegetation indices, with the focus on the photochemical reflectance index (PRI) in a cornfield. In situ hyperspectral measurements were made at both leaf and canopy levels. Leaf optical properties were obtained from both sunlit and shaded leaves. Canopy reflectance was acquired for eight relative azimuth angles (ψ) at three view zenith angles (θ_v), and later used to validate model outputs. Field observations of PRI for sunlit leaves exhibited lower values than for shaded leaves, indicating higher light stress. Canopy PRI showed clear sensitivity to viewing geometry, as a function of both θ_v and ψ. Overall, simulations from MCRM agreed better with in situ values than those from SAIL. When using only sunlit leaves as input, the MCRM-simulated PRI values showed satisfactory correlation and RMSE compared to in situ values. However, the performance of the MCRM model improved significantly after defining a lower canopy layer comprised of shaded leaves beneath the upper sunlit leaf layer. Four other widely used band-ratio vegetation indices were also studied and compared with the PRI results. MCRM generated satisfactory simulations for these other four indices when using only sunlit leaves as input; but unlike PRI, adding shaded leaves did not improve the performance of MCRM. These results support the hypothesis that the PRI is sensitive to physiological dynamics while the others detect static factors related to canopy structure. A sensitivity analysis was performed on MCRM in order to better understand the effects of structure-related parameters on the PRI simulations. Leaf area index (LAI) showed the most significant impact on MCRM-simulated PRI among the parameters studied. This research shows the importance of hyperspectral and narrow-band sensor studies, and especially the necessity of including the green wavelengths (e.g., 531 nm) on satellites proposing to monitor carbon dynamics of terrestrial ecosystems.
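
    As a concrete illustration of the band-ratio indices discussed above, the short sketch below computes PRI and NDVI from a narrow-band reflectance spectrum, using the band centres commonly quoted for these indices (531/570 nm and 670/800 nm); the synthetic spectrum, band width, and exact band positions are illustrative assumptions, not those of the study.

        import numpy as np

        def band(reflectance, wavelengths, center, width=5.0):
            # Mean reflectance in a narrow band around `center` (nm).
            sel = np.abs(wavelengths - center) <= width / 2
            return reflectance[..., sel].mean(axis=-1)

        def pri(refl, wl):
            # Photochemical reflectance index, commonly (R531 - R570) / (R531 + R570).
            r531, r570 = band(refl, wl, 531), band(refl, wl, 570)
            return (r531 - r570) / (r531 + r570)

        def ndvi(refl, wl):
            # Normalized difference vegetation index with typical red/NIR bands.
            red, nir = band(refl, wl, 670), band(refl, wl, 800)
            return (nir - red) / (nir + red)

        # Toy spectrum: wavelengths 400-900 nm with a crude red-edge shape.
        wl = np.arange(400, 901, 1.0)
        refl = 0.05 + 0.45 / (1 + np.exp(-(wl - 715) / 10))
        print(f"PRI = {pri(refl, wl):+.3f}, NDVI = {ndvi(refl, wl):.3f}")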

  8. Mimoza: web-based semantic zooming and navigation in metabolic networks.

    PubMed

    Zhukova, Anna; Sherman, David J

    2015-02-26

    The complexity of genome-scale metabolic models makes them quite difficult for human users to read, since they contain thousands of reactions that must be included for accurate computer simulation. Interestingly, hidden similarities between groups of reactions can be discovered and generalized to reveal higher-level patterns. The web-based navigation system Mimoza allows a human expert to explore metabolic network models in a semantically zoomable manner: the most general view represents the compartments of the model; the next view shows the generalized versions of reactions and metabolites in each compartment; and the most detailed view represents the initial network with the generalization-based layout (where similar metabolites and reactions are placed next to each other). It allows a human expert to grasp the general structure of the network and analyze it in a top-down manner. Mimoza can be installed standalone, used on-line at http://mimoza.bordeaux.inria.fr/, or installed in a Galaxy server for use in workflows. Mimoza views can be embedded in web pages or downloaded as COMBINE archives.

  9. DYNAMIC MODELING AND SIMULATION FOR ENVIRONMENTALLY BENIGN CLEANING AND RINSING. (R824732)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  10. ISOPRENE FORMS SECONDARY ORGANIC AEROSOL THROUGH CLOUD PROCESSING: MODEL SIMULATIONS (R831073)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  11. Minimizing systematic errors from atmospheric multiple scattering and satellite viewing geometry in coastal zone color scanner level IIA imagery

    NASA Technical Reports Server (NTRS)

    Martin, D. L.; Perry, M. J.

    1994-01-01

    Water-leaving radiances and phytoplankton pigment concentrations are calculated from coastal zone color scanner (CZCS) radiance measurements by removing atmospheric Rayleigh and aerosol radiances from the total radiance signal measured at the satellite. The single greatest source of error in CZCS atmospheric correction algorithms is the assumption that these Rayleigh and aerosol radiances are separable. Multiple-scattering interactions between Rayleigh and aerosol components cause systematic errors in calculated aerosol radiances, and the magnitude of these errors depends on aerosol type and optical depth and on satellite viewing geometry. A technique was developed which extends the results of previous radiative transfer modeling by Gordon and Castano to predict the magnitude of these systematic errors for simulated CZCS orbital passes in which the ocean is viewed through a modeled, physically realistic atmosphere. The simulated image mathematically duplicates the exact satellite, Sun, and pixel locations of an actual CZCS image. Errors in the aerosol radiance at 443 nm are calculated for a range of aerosol optical depths. When pixels in the simulated image exceed an error threshold, the corresponding pixels in the actual CZCS image are flagged and excluded from further analysis or from use in image compositing or compilation of pigment concentration databases. Studies based on time series analyses or compositing of CZCS imagery which do not address Rayleigh-aerosol multiple scattering should be interpreted cautiously, since the fundamental assumption used in their atmospheric correction algorithm is flawed.
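
    A minimal sketch of the classical separable correction that the paper identifies as flawed may help fix ideas; the band set matches CZCS, but all radiance, transmittance, and aerosol-extrapolation values below are illustrative placeholders, not calibrated quantities.

        import numpy as np

        # Classical CZCS-style decomposition (the assumption criticized above):
        #   Lt(lam) = Lr(lam) + La(lam) + t(lam) * Lw(lam)
        bands = np.array([443.0, 520.0, 550.0, 670.0])   # nm, CZCS visible bands
        Lt = np.array([9.2, 7.1, 6.3, 3.0])              # total radiance at sensor (arbitrary units)
        Lr = np.array([6.0, 4.1, 3.4, 1.6])              # modeled Rayleigh radiance
        t = np.array([0.85, 0.88, 0.89, 0.93])           # diffuse transmittance

        # Assume Lw ~ 0 at 670 nm, so La(670) = Lt - Lr there; then extrapolate the
        # aerosol radiance to shorter wavelengths with a placeholder power-law factor.
        La_670 = Lt[-1] - Lr[-1]
        epsilon = (bands[-1] / bands) ** 1.0
        La = epsilon * La_670

        Lw = (Lt - Lr - La) / t                          # water-leaving radiance estimate
        for lam, lw in zip(bands, Lw):
            print(f"{lam:5.0f} nm : Lw ~ {lw:5.2f}")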

  12. Programming of a flexible computer simulation to visualize pharmacokinetic-pharmacodynamic models.

    PubMed

    Lötsch, J; Kobal, G; Geisslinger, G

    2004-01-01

    Teaching pharmacokinetic-pharmacodynamic (PK/PD) models can be made more effective using computer simulations. We propose the programming of educational PK or PK/PD computer simulations as an alternative to the use of pre-built simulation software. This approach has the advantage of adaptability to non-standard or complicated PK or PK/PD models. Simplicity of the programming procedure was achieved by selecting the LabVIEW programming environment. An intuitive user interface to visualize the time courses of drug concentrations or effects can be obtained with pre-built elements. The environment uses a wiring analogy that resembles electrical circuit diagrams rather than abstract programming code. The goal of high interactivity of the simulation was attained by allowing the program to run in continuously repeating loops, which makes the program respond flexibly to user input. The programming is described with the aid of a two-compartment PK simulation. Examples of more sophisticated simulation programs are also given, in which the PK/PD simulation shows drug input, concentrations in plasma and at the effect site, and the effects themselves as a function of time. A multi-compartmental model of morphine, including metabolite kinetics and effects, is also included. The programs are available for download from the World Wide Web at http://www.klinik.uni-frankfurt.de/zpharm/klin/PKPDsimulation/content.html. For pharmacokineticists who only program occasionally, this offers the possibility of building a computer simulation, together with a flexible interactive simulation algorithm, for clinical pharmacological teaching in the field of PK/PD models.
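
    The paper's simulations are built graphically in LabVIEW; purely as a text-based analogue, the sketch below integrates a standard two-compartment IV-bolus PK model with a fixed-step scheme. The rate constants, central volume, and dose are placeholder values, not parameters from the article.

        import numpy as np

        k10, k12, k21 = 0.3, 0.5, 0.2    # 1/h: elimination and inter-compartment rate constants
        V1 = 10.0                        # L, central volume of distribution
        dose = 100.0                     # mg, IV bolus into the central compartment

        dt, t_end = 0.01, 24.0           # h
        n = int(t_end / dt)
        A1, A2 = dose, 0.0               # drug amounts in central / peripheral compartments
        c_hist = [A1 / V1]               # central concentration over time

        for i in range(1, n + 1):
            dA1 = (-(k10 + k12) * A1 + k21 * A2) * dt
            dA2 = (k12 * A1 - k21 * A2) * dt
            A1, A2 = A1 + dA1, A2 + dA2
            c_hist.append(A1 / V1)

        print(f"C(1 h) = {c_hist[int(1 / dt)]:.2f} mg/L, C(12 h) = {c_hist[int(12 / dt)]:.3f} mg/L")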

  13. Segmental Analysis of Cardiac Short-Axis Views Using Lagrangian Radial and Circumferential Strain.

    PubMed

    Ma, Chi; Wang, Xiao; Varghese, Tomy

    2016-11-01

    Accurate description of myocardial deformation in the left ventricle is a three-dimensional problem, requiring three normal strain components along its natural axes, that is, longitudinal, radial, and circumferential strains. Although longitudinal strains are best estimated from long-axis views, radial and circumferential strains are best depicted in short-axis views. An algorithm that utilizes a polar grid for short-axis views, previously developed in our laboratory for a Lagrangian description of tissue deformation, is utilized for radial and circumferential displacement and strain estimation. Deformations of the myocardial wall, obtained from numerical simulations with ANSYS and from a finite-element analysis-based canine heart model, were used as the input to a frequency-domain ultrasound simulation program to generate radiofrequency echo signals. Clinical in vivo data were also acquired from a healthy volunteer. Local displacements estimated along and perpendicular to the ultrasound beam propagation direction are then transformed into radial and circumferential displacements and strains using the polar grid, based on a pre-determined centroid location. Lagrangian strain variations demonstrate good agreement with the ideal strain when compared with Eulerian results. Lagrangian radial and circumferential strain estimation results are also demonstrated for experimental data from a healthy volunteer. Lagrangian radial and circumferential strain tracking provides accurate results with the assistance of the polar grid, as demonstrated using both numerical simulations and an in vivo study. © The Author(s) 2015.
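
    A schematic sketch of the coordinate transform at the heart of such a polar-grid approach (not the authors' algorithm): Cartesian displacement estimates are projected onto radial and circumferential unit vectors about a pre-determined centroid, and a radial strain is taken by finite differences along a ray. The displacement field and grid below are synthetic.

        import numpy as np

        ny, nx = 128, 128
        y, x = np.mgrid[0:ny, 0:nx].astype(float)
        cy, cx = 64.0, 64.0                      # pre-determined centroid (placeholder)

        # Synthetic displacement field: a small uniform radial contraction.
        r = np.hypot(x - cx, y - cy) + 1e-9
        theta = np.arctan2(y - cy, x - cx)
        ux = -0.02 * (x - cx)                    # lateral component (perpendicular to beam)
        uy = -0.02 * (y - cy)                    # axial component (along the beam)

        # Project onto the local radial / circumferential unit vectors.
        u_r = ux * np.cos(theta) + uy * np.sin(theta)
        u_t = -ux * np.sin(theta) + uy * np.cos(theta)

        # Crude radial strain along one ray (d u_r / d r); a full implementation would
        # accumulate frame-to-frame displacements for a Lagrangian description.
        ray = slice(64, 128)
        du = np.gradient(u_r[64, ray])
        dr = np.gradient(r[64, ray])
        print("mean radial strain on ray:", float(np.mean(du / dr)))   # ~ -0.02 by construction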

  14. Segmental Analysis of Cardiac Short-Axis Views Using Lagrangian Radial and Circumferential Strain

    PubMed Central

    Ma, Chi; Wang, Xiao; Varghese, Tomy

    2016-01-01

    Accurate description of myocardial deformation in the left ventricle is a three-dimensional problem, requiring three normal strain components along its natural axes, that is, longitudinal, radial, and circumferential strains. Although longitudinal strains are best estimated from long-axis views, radial and circumferential strains are best depicted in short-axis views. An algorithm that utilizes a polar grid for short-axis views, previously developed in our laboratory for a Lagrangian description of tissue deformation, is utilized for radial and circumferential displacement and strain estimation. Deformations of the myocardial wall, obtained from numerical simulations with ANSYS and from a finite-element analysis-based canine heart model, were used as the input to a frequency-domain ultrasound simulation program to generate radiofrequency echo signals. Clinical in vivo data were also acquired from a healthy volunteer. Local displacements estimated along and perpendicular to the ultrasound beam propagation direction are then transformed into radial and circumferential displacements and strains using the polar grid, based on a pre-determined centroid location. Lagrangian strain variations demonstrate good agreement with the ideal strain when compared with Eulerian results. Lagrangian radial and circumferential strain estimation results are also demonstrated for experimental data from a healthy volunteer. Lagrangian radial and circumferential strain tracking provides accurate results with the assistance of the polar grid, as demonstrated using both numerical simulations and an in vivo study. PMID:26578642

  15. Analysis of the diffraction effects for a multi-view autostereoscopic three-dimensional display system based on shutter parallax barriers with full resolution

    NASA Astrophysics Data System (ADS)

    Meng, Yang; Yu, Zhongyuan; Jia, Fangda; Zhang, Chunyu; Wang, Ye; Liu, Yumin; Ye, Han; Chen, Laurence Lujun

    2017-10-01

    A multi-view autostereoscopic three-dimensional (3D) system is built by using a 2D display screen and a customized parallax-barrier shutter (PBS) screen. The shutter screen is controlled dynamically by an address-driving matrix circuit and is placed in front of the display screen at a certain location. The system can achieve the densest viewpoints due to its special optical and geometric design, which is based on the concept of "eye space". By using time-division multiplexing, the resolution in 3D mode is not reduced compared to 2D mode. Diffraction effects may play an important role in 3D display imaging quality, especially for small screens such as the iPhone screen. For a small screen, diffraction effects may contribute to crosstalk between binocular views and to non-uniform image brightness. Therefore, diffraction effects are analyzed and considered in a one-dimensional shutter-screen model of the 3D display, in which a numerical simulation of light from display pixels on the display screen through the parallax-barrier slits to each viewing zone in eye space is performed. The simulation results provide guidance on the critical screen size above which the impact of diffraction effects is negligible, and below which diffraction effects must be taken into account. Finally, the simulation results are compared to the corresponding experimental measurements and observations, with discussion.
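
    The kind of one-dimensional pixel-to-slit-to-viewer simulation described above can be sketched with a simple Huygens-Fresnel sum; the wavelength, display-to-barrier gap, slit width, and viewing distance below are assumed values, and obliquity factors are neglected.

        import numpy as np

        lam = 550e-9                 # m, green light
        k = 2 * np.pi / lam
        gap = 1.0e-3                 # m, display-to-barrier gap (placeholder)
        L = 0.40                     # m, barrier-to-viewer distance (placeholder)
        slit_w = 60e-6               # m, slit aperture (placeholder)
        x_pixel = 0.0                # pixel centred behind the slit

        xs = np.linspace(-slit_w / 2, slit_w / 2, 2001)        # secondary sources in the slit
        xv = np.linspace(-0.03, 0.03, 601)                     # viewing-plane coordinates

        r1 = np.sqrt(gap**2 + (xs - x_pixel) ** 2)             # pixel -> slit sample
        r2 = np.sqrt(L**2 + (xv[:, None] - xs[None, :]) ** 2)  # slit sample -> viewing plane
        field = np.sum(np.exp(1j * k * (r1[None, :] + r2)) / np.sqrt(r1[None, :] * r2), axis=1)
        intensity = np.abs(field) ** 2
        intensity /= intensity.max()

        print("relative intensity at the viewing-plane centre:",
              float(intensity[np.argmin(np.abs(xv))]))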

  16. Bidirectional reflectance function in coastal waters: modeling and validation

    NASA Astrophysics Data System (ADS)

    Gilerson, Alex; Hlaing, Soe; Harmel, Tristan; Tonizzo, Alberto; Arnone, Robert; Weidemann, Alan; Ahmed, Samir

    2011-11-01

    The current operational algorithm for the correction of bidirectional effects from the satellite ocean color data is optimized for typical oceanic waters. However, versions of bidirectional reflectance correction algorithms, specifically tuned for typical coastal waters and other case 2 conditions, are particularly needed to improve the overall quality of those data. In order to analyze the bidirectional reflectance distribution function (BRDF) of case 2 waters, a dataset of typical remote sensing reflectances was generated through radiative transfer simulations for a large range of viewing and illumination geometries. Based on this simulated dataset, a case 2 water focused remote sensing reflectance model is proposed to correct above-water and satellite water leaving radiance data for bidirectional effects. The proposed model is first validated with a one year time series of in situ above-water measurements acquired by collocated multi- and hyperspectral radiometers which have different viewing geometries installed at the Long Island Sound Coastal Observatory (LISCO). Match-ups and intercomparisons performed on these concurrent measurements show that the proposed algorithm outperforms the algorithm currently in use at all wavelengths.

  17. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  18. Solid rocket booster thermal radiation model, volume 1

    NASA Technical Reports Server (NTRS)

    Watson, G. H.; Lee, A. L.

    1976-01-01

    A solid rocket booster (SRB) thermal radiation model, capable of defining the influence of the plume flowfield structure on the magnitude and distribution of thermal radiation leaving the plume, was prepared and documented. Radiant heating rates may be calculated for a single SRB plume or for the dual SRB plumes astride the space shuttle. The plumes may be gimbaled in the yaw and pitch planes. Space shuttle surface geometries are simulated with combinations of quadric surfaces. The effect of surface shading is included. The computer program also has the capability to calculate view factors between the SRB plumes and space shuttle surfaces as well as surface-to-surface view factors.

  19. Great Red Spot Rotation (animation)

    NASA Image and Video Library

    2017-12-11

    Winds around Jupiter's Great Red Spot are simulated in this JunoCam view that has been animated using a model of the winds there. The wind model, called a velocity field, was derived from data collected by NASA's Voyager spacecraft and Earth-based telescopes. NASA's Juno spacecraft acquired the original, static view during passage over the spot on July 10, 2017. Citizen scientists Gerald Eichstädt and Justin Cowart turned the JunoCam data into a color image mosaic. Juno scientists Shawn Ewald and Andrew Ingersoll applied the velocity data to the image to produce a looping animation. An animation is available at https://photojournal.jpl.nasa.gov/catalog/PIA22178

  20. Effect of field of view and monocular viewing on angular size judgements in an outdoor scene

    NASA Technical Reports Server (NTRS)

    Denz, E. A.; Palmer, E. A.; Ellis, S. R.

    1980-01-01

    Observers typically overestimate the angular size of distant objects. Significantly, overestimations are greater in outdoor settings than in aircraft visual-scene simulators. The effect of field of view and monocular and binocular viewing conditions on angular size estimation in an outdoor field was examined. Subjects adjusted the size of a variable triangle to match the angular size of a standard triangle set at three greater distances. Goggles were used to vary the field of view from 11.5 deg to 90 deg for both monocular and binocular viewing. In addition, an unrestricted monocular and binocular viewing condition was used. It is concluded that neither restricted fields of view similar to those present in visual simulators nor the restriction of monocular viewing causes a significant loss in depth perception in outdoor settings. Thus, neither factor should significantly affect the depth realism of visual simulators.

  1. Virgil Gus Grissom's Visit to LaRC

    NASA Image and Video Library

    1963-02-22

    Astronaut Virgil "Gus" Grissom at the controls of the Visual Docking Simulator. From A.W. Vogeley, "Piloted Space-Flight Simulation at Langley Research Center," Paper presented at the American Society of Mechanical Engineers 1966 Winter Meeting, New York, NY, November 27-December 1, 1966. "This facility was [later known as the Visual-Optical Simulator.] It presents to the pilot an out-the-window view of his target in correct 6 degrees of freedom motion. The scene is obtained by a television camera pick-up viewing a small-scale gimbaled model of the target." "For docking studies, the docking target picture was projected onto the surface of a 20-foot-diameter sphere and the pilot could, effectively, maneuver into contract. this facility was used in a comparison study with the Rendezvous Docking Simulator - one of the few comparison experiments in which conditions were carefully controlled and a reasonable sample of pilots used. All pilots preferred the more realistic RDS visual scene. The pilots generally liked the RDS angular motion cues although some objected to the false gravity cues that these motions introduced. Training time was shorter on the RDS, but final performance on both simulators was essentially equal. " "For station-keeping studies, since close approach is not required, the target was presented to the pilot through a virtual-image system which projects his view to infinity, providing a more realistic effect. In addition to the target, the system also projects a star and horizon background. "

  2. PROTO-PLASM: parallel language for adaptive and scalable modelling of biosystems.

    PubMed

    Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto

    2008-09-13

    This paper discusses the design goals and the first developments of PROTO-PLASM, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the PROTO-PLASM platform is still in its infancy. Its computational framework--language, model library, integrated development environment and parallel engine--intends to provide patient-specific computational modelling and simulation of organs and biosystem, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. PROTO-PLASM may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a PROTO-PLASM program. Here we exemplify the basic functionalities of PROTO-PLASM, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions.

  3. Proto-Plasm: parallel language for adaptive and scalable modelling of biosystems

    PubMed Central

    Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto

    2008-01-01

    This paper discusses the design goals and the first developments of Proto-Plasm, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the Proto-Plasm platform is still in its infancy. Its computational framework—language, model library, integrated development environment and parallel engine—intends to provide patient-specific computational modelling and simulation of organs and biosystem, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. Proto-Plasm may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a Proto-Plasm program. Here we exemplify the basic functionalities of Proto-Plasm, by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions. PMID:18559320

  4. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction-diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, and attempts to capitalize on the multi-core architectures used in high-performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to eliminate the overhead of synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process-based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.

  5. Modeling of a Micro-Electronic Mechanical Systems (MEMS) Deformable Mirror for Simulation and Characterization

    DTIC Science & Technology

    2016-09-01

    [Front-matter residue from the source document; recoverable items: Section II, Model Design; Figure 10, Experimental Optical Layout for the Boston DM Characterization; Figure 11, Side View Showing the Curved Surface on a DM.] Abstract fragment: ... of different methods for deposition, patterning, and etching until the desired design of the device is achieved. While a large number of devices

  6. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitative modelling approach, as a possible analysis and simulation tool that bridges the gap between social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready and waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.
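
    As a purely hypothetical illustration of what operationalizing a latent construct inside a systems dynamics model can look like (not the article's model), the sketch below rescales a measured questionnaire score to a dimensionless driver of a single stock-flow structure; all names and parameters are placeholders.

        # Hypothetical single-stock system dynamics sketch: a latent construct,
        # "motivation", is operationalized by rescaling a Likert-type score to [0, 1]
        # before it drives the inflow to a "skill" stock.
        def rescale(score, lo=1.0, hi=7.0):
            # Map a measured score onto [0, 1] so it can act as a dimensionless driver.
            return (score - lo) / (hi - lo)

        motivation = rescale(5.5)          # measured construct, placeholder value
        learning_rate = 0.4                # 1/week, placeholder parameter
        forgetting_rate = 0.05             # 1/week, placeholder parameter

        dt, weeks = 0.1, 30.0
        skill = 0.1                        # stock: normalized skill level
        for _ in range(int(weeks / dt)):
            inflow = learning_rate * motivation * (1.0 - skill)   # saturating learning
            outflow = forgetting_rate * skill
            skill += (inflow - outflow) * dt

        print(f"skill after {weeks:.0f} weeks: {skill:.2f}")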

  7. A computational approach for coupled 1D and 2D/3D CFD modelling of pulse Tube cryocoolers

    NASA Astrophysics Data System (ADS)

    Fang, T.; Spoor, P. S.; Ghiaasiaan, S. M.

    2017-12-01

    The physics behind Stirling-type cryocoolers is complicated. One-dimensional (1D) simulation tools offer limited detail and accuracy, in particular for cryocoolers that have non-linear configurations. Multi-dimensional computational fluid dynamics (CFD) methods are useful but are computationally expensive for simulating cryocooler systems in their entirety. In view of the fact that some components of a cryocooler, e.g., inertance tubes and compliance tanks, can be modelled as 1D components with little loss of critical information, a coupled 1D-2D/3D model was developed. Accordingly, one-dimensional-like components are represented by specifically developed routines. These routines can be coupled to CFD codes and provide boundary conditions for 2D/3D CFD simulations. The developed coupled model, while preserving sufficient flow-field detail, is two orders of magnitude faster than equivalent 2D/3D CFD models. The predictions show good agreement with experimental data and with the 2D/3D CFD model.

  8. An atomic and molecular fluid model for efficient edge-plasma transport simulations at high densities

    NASA Astrophysics Data System (ADS)

    Rognlien, Thomas; Rensink, Marvin

    2016-10-01

    Transport simulations for the edge plasma of tokamaks and other magnetic fusion devices require the coupling of the plasma with recycling or injected neutral gas. Various neutral models are used for this purpose, e.g., atomic fluid models, Monte Carlo particle models, transition/escape probability methods, and semi-analytic models. While the Monte Carlo method is generally viewed as the most accurate, it is time consuming, and it becomes even more demanding for simulations of devices with the high densities and size typical of fusion power plants, because the neutral collisional mean free path becomes very small. Here we examine the behavior of an extended fluid neutral model for hydrogen that includes both atoms and molecules and easily accommodates nonlinear neutral-neutral collision effects. In addition to the strong charge exchange between hydrogen atoms and ions, elastic scattering is included among all species. Comparisons are made with the DEGAS 2 Monte Carlo code. Work performed for U.S. DoE by LLNL under Contract DE-AC52-07NA27344.

  9. [Three-dimensional finite element modeling and biomechanical simulation for evaluating and improving postoperative internal instrumentation of neck-thoracic vertebral tumor en bloc resection].

    PubMed

    Qinghua, Zhao; Jipeng, Li; Yongxing, Zhang; He, Liang; Xuepeng, Wang; Peng, Yan; Xiaofeng, Wu

    2015-04-07

    To evaluate, using three-dimensional finite element modeling and biomechanical simulation, the stability and stress conduction of two postoperative internal fixation models - multilevel posterior instrumentation (MPI) and MPI with anterior instrumentation (MPAI) - after neck-thoracic vertebral tumor en bloc resection. Mimics software and computed tomography (CT) images were used to establish a three-dimensional (3D) model of vertebrae C5-T2 and to simulate C7 en bloc vertebral resection for the MPI and MPAI models. The data and images were then transferred into the ANSYS finite element system, and a 20 N distributed load (simulating body weight) and a 1 N·m torque at the neutral point were applied to simulate vertebral displacement and stress conduction and distribution in each motion mode, i.e., flexion, extension, bending, and rotation. The displacement of the two adjacent vertebral bodies in the MPI and MPAI models was less than that of the intact vertebral model, indicating better stability, with no significant difference between the two. In terms of reducing the stress-shielding effect, however, MPI was slightly better than MPAI. From a biomechanical point of view, both internal instrumentations after neck-thoracic tumor en bloc resection can achieve excellent stability, with no significant differences between them; but with better stress conduction, MPI is more advantageous for postoperative reconstruction.

  10. Veiling glare reduction methods compared for ophthalmic applications

    NASA Technical Reports Server (NTRS)

    Buchele, D. R.

    1981-01-01

    Veiling glare in ocular viewing was simulated by viewing the retina of an eye model through a sheet of light-scattering material lit from the front. Four methods of glare reduction were compared, namely, optical scanning, polarized light, viewing and illumination paths either coaxial or intersecting at the object, and closed circuit TV. Photographs show the effect of these methods on visibility. Polarized light was required to eliminate light specularly reflected from the instrument optics. The greatest glare reduction was obtained when the first three methods were utilized together. Glare reduction using TV was limited by nonuniform distribution of scattered light over the image.

  11. Lipid Clustering Correlates with Membrane Curvature as Revealed by Molecular Simulations of Complex Lipid Bilayers

    PubMed Central

    Koldsø, Heidi; Shorthouse, David; Hélie, Jean; Sansom, Mark S. P.

    2014-01-01

    Cell membranes are complex multicomponent systems, which are highly heterogeneous in their lipid distribution and composition. To date, most molecular simulations have focussed on relatively simple lipid compositions, helping to inform our understanding of in vitro experimental studies. Here we describe simulations of a complex asymmetric plasma membrane model, which contains seven different lipid species including the glycolipid GM3 in the outer leaflet and the anionic lipid phosphatidylinositol 4,5-bisphosphate (PIP2) in the inner leaflet. Plasma membrane models consisting of 1500 lipids and resembling the in vivo composition were constructed and simulations were run for 5 µs. In these simulations the most striking feature was the formation of nano-clusters of GM3 within the outer leaflet. In simulations of protein interactions within a plasma membrane model, GM3, PIP2, and cholesterol all formed favorable interactions with the model α-helical protein. A larger scale simulation of a model plasma membrane containing 6000 lipid molecules revealed correlations between curvature of the bilayer surface and clustering of lipid molecules. In particular, the concave (when viewed from the extracellular side) regions of the bilayer surface were locally enriched in GM3. In summary, these simulations explore the nanoscale dynamics of model bilayers which mimic the in vivo lipid composition of mammalian plasma membranes, revealing emergent nanoscale membrane organization which may be coupled both to fluctuations in local membrane geometry and to interactions with proteins. PMID:25340788

  12. Lipid clustering correlates with membrane curvature as revealed by molecular simulations of complex lipid bilayers.

    PubMed

    Koldsø, Heidi; Shorthouse, David; Hélie, Jean; Sansom, Mark S P

    2014-10-01

    Cell membranes are complex multicomponent systems, which are highly heterogeneous in their lipid distribution and composition. To date, most molecular simulations have focussed on relatively simple lipid compositions, helping to inform our understanding of in vitro experimental studies. Here we describe simulations of a complex asymmetric plasma membrane model, which contains seven different lipid species including the glycolipid GM3 in the outer leaflet and the anionic lipid phosphatidylinositol 4,5-bisphosphate (PIP2) in the inner leaflet. Plasma membrane models consisting of 1500 lipids and resembling the in vivo composition were constructed and simulations were run for 5 µs. In these simulations the most striking feature was the formation of nano-clusters of GM3 within the outer leaflet. In simulations of protein interactions within a plasma membrane model, GM3, PIP2, and cholesterol all formed favorable interactions with the model α-helical protein. A larger scale simulation of a model plasma membrane containing 6000 lipid molecules revealed correlations between curvature of the bilayer surface and clustering of lipid molecules. In particular, the concave (when viewed from the extracellular side) regions of the bilayer surface were locally enriched in GM3. In summary, these simulations explore the nanoscale dynamics of model bilayers which mimic the in vivo lipid composition of mammalian plasma membranes, revealing emergent nanoscale membrane organization which may be coupled both to fluctuations in local membrane geometry and to interactions with proteins.

  13. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  14. Photo-consistency registration of a 4D cardiac motion model to endoscopic video for image guidance of robotic coronary artery bypass

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Rueckert, Daniel; Edwards, Eddie

    2009-02-01

    The aim of the work described in this paper is the registration of a 4D preoperative motion model of the heart to the video view of the patient through the intraoperative endoscope. The heart motion is cyclical and can be modelled using multiple reconstructions of cardiac-gated coronary CT. We propose the use of photoconsistency between the two views through the da Vinci endoscope to align the preoperative heart surface model from CT. The temporal alignment from the video to the CT model could in principle be obtained from the ECG signal. We propose averaging of the photoconsistency over the cardiac cycle to improve the registration compared to a single view. Though there is considerable motion of the heart, after correct temporal alignment we suggest that the remaining motion should be close to rigid. Results are presented for simulated renderings and for real video of a beating heart phantom. We found much smoother sections at the minimum when using multiple phases for the registration; furthermore, convergence was found to be better when more phases were used.
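
    A hedged sketch of a photo-consistency cost of the general kind described above (not the authors' implementation): model surface points are projected into each camera view with an assumed pinhole model, and the per-point intensity variance across views is averaged; a good rigid alignment should make the views agree and so minimise this cost. Averaging the cost over cardiac phases gives the multi-phase variant the abstract argues for. All cameras, images, and poses below are synthetic placeholders.

        import numpy as np

        def project(points, K, R, t):
            # Pinhole projection of Nx3 points with intrinsics K and camera pose (R, t).
            cam = points @ R.T + t
            uv = cam @ K.T
            return uv[:, :2] / uv[:, 2:3]

        def sample(image, uv):
            # Nearest-neighbour intensity lookup, clamped to the image border.
            h, w = image.shape
            u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
            v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
            return image[v, u]

        def photoconsistency_cost(points, views, pose_R, pose_t):
            # Mean per-point intensity variance of the posed model points across views.
            samples = []
            for image, K, R_cam, t_cam in views:
                world = points @ pose_R.T + pose_t        # candidate model-to-world pose
                samples.append(sample(image, project(world, K, R_cam, t_cam)))
            return float(np.mean(np.var(np.stack(samples, axis=0), axis=0)))

        # Tiny synthetic demo with two near-identical camera views.
        rng = np.random.default_rng(0)
        pts = rng.normal(size=(200, 3)) + np.array([0.0, 0.0, 5.0])
        K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
        img = rng.uniform(size=(480, 640))
        views = [(img, K, np.eye(3), np.zeros(3)),
                 (img, K, np.eye(3), np.array([0.01, 0.0, 0.0]))]
        print("cost:", photoconsistency_cost(pts, views, np.eye(3), np.zeros(3)))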

  15. Pixel-level tunable liquid crystal lenses for auto-stereoscopic display

    NASA Astrophysics Data System (ADS)

    Li, Kun; Robertson, Brian; Pivnenko, Mike; Chu, Daping; Zhou, Jiong; Yao, Jun

    2014-02-01

    Mobile video and gaming are now widely used, and the delivery of a glasses-free 3D experience is of both research and development interest. The key drawbacks of a conventional 3D display based on a static lenticular lenslet array and parallax barriers are low resolution, limited viewing angle and reduced brightness, mainly because multiple pixels are needed for each object point. This study describes the concept and performance of pixel-level cylindrical liquid crystal (LC) lenses, which are designed to steer light to the left and right eye sequentially to form stereo parallax. The width of the LC lenses can be as small as 20-30 μm, so that the associated auto-stereoscopic display will have the same resolution as the 2D display panel in use. Such a thin sheet of tunable LC lens array can be applied directly on existing mobile displays, and can deliver a 3D viewing experience while maintaining 2D viewing capability. Transparent electrodes were laser-patterned to achieve single-pixel lens resolution, and a highly birefringent LC material was used to realise a large diffraction angle for a wide field of view. Simulation was carried out to model the intensity profile at the viewing plane and optimise the lens array based on the measured LC phase profile. The measured viewing angle and intensity profile were compared with the simulation results.

  16. Diffraction effects incorporated design of a parallax barrier for a high-density multi-view autostereoscopic 3D display.

    PubMed

    Yoon, Ki-Hyuk; Ju, Heongkyu; Kwon, Hyunkyung; Park, Inkyu; Kim, Sung-Kyu

    2016-02-22

    We present the optical characteristics of the view images provided by a high-density multi-view autostereoscopic 3D display (HD-MVA3D) with a parallax barrier (PB). Diffraction effects, which become very important in a display system that uses a PB, are considered in a one-dimensional model of the 3D display, in which a numerical simulation of light from display panel pixels through the PB slits to the viewing zone is performed. The simulation results are then compared to the corresponding experimental measurements, with discussion. We demonstrate that, as a main parameter for evaluating view image quality, the Fresnel number can be used to determine the PB slit aperture for the best performance of the display system. It is revealed that a set of display parameters which gives a Fresnel number of ∼ 0.7 offers maximized brightness of the view images, while that corresponding to a Fresnel number of 0.4 ∼ 0.5 offers minimized image crosstalk. The compromise between brightness and crosstalk enables optimization of the relative magnitude of the brightness to the crosstalk and leads to the choice of a display parameter set for the HD-MVA3D with a PB which satisfies the condition that the Fresnel number lies between 0.4 and 0.7.
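
    A small helper for the design rule quoted above, using one common convention for the Fresnel number of a slit, N_F = a^2/(λL), with a the slit half-width and L the relevant propagation distance (taken here, as an assumption, to be the panel-to-barrier gap); the wavelength, gap, and candidate slit widths are placeholders, while the 0.4-0.7 window comes from the abstract.

        def fresnel_number(slit_width_m, wavelength_m, distance_m):
            # N_F = a^2 / (lambda * L), with a the slit half-width.
            a = slit_width_m / 2.0
            return a**2 / (wavelength_m * distance_m)

        lam = 550e-9          # m, green light
        gap = 1.0e-3          # m, assumed panel-to-barrier gap
        for w_um in (20, 30, 40, 50, 60):
            nf = fresnel_number(w_um * 1e-6, lam, gap)
            print(f"slit {w_um:3d} um -> N_F = {nf:.2f}")
        # Per the abstract, N_F near 0.7 maximises view brightness and N_F of 0.4-0.5
        # minimises crosstalk, so a design would pick a slit width landing in 0.4-0.7.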

  17. Probative value of absolute and relative judgments in eyewitness identification.

    PubMed

    Clark, Steven E; Erickson, Michael A; Breneman, Jesse

    2011-10-01

    It is well-accepted that eyewitness identification decisions based on relative judgments are less accurate than identification decisions based on absolute judgments. However, the theoretical foundation for this view has not been established. In this study relative and absolute judgments were compared through simulations of the WITNESS model (Clark, Appl Cogn Psychol 17:629-654, 2003) to address the question: Do suspect identifications based on absolute judgments have higher probative value than suspect identifications based on relative judgments? Simulations of the WITNESS model showed a consistent advantage for absolute judgments over relative judgments for suspect-matched lineups. However, simulations of same-foils lineups showed a complex interaction based on the accuracy of memory and the similarity relationships among lineup members.
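
    The contrast between the two decision rules can be illustrated with a toy signal-detection simulation; this is not the WITNESS model itself, and the match distributions and criteria are placeholders. Probative value is summarized here as the ratio of guilty-suspect to innocent-suspect identification rates.

        import numpy as np

        rng = np.random.default_rng(1)
        n_trials, lineup_size = 100_000, 6
        d_prime = 1.0                     # mean match advantage of the culprit (placeholder)
        c_abs, c_rel = 0.8, 0.5           # decision criteria (placeholders)

        def simulate(culprit_present):
            # "Absolute" rule: identify the best-matching member only if its match value
            # exceeds a criterion. "Relative" rule: identify the best member if its
            # advantage over the next-best member exceeds a criterion.
            match = rng.normal(0.0, 1.0, (n_trials, lineup_size))
            if culprit_present:
                match[:, 0] += d_prime            # position 0 = suspect (culprit if present)
            best = match.argmax(axis=1)
            top = match.max(axis=1)
            second = np.sort(match, axis=1)[:, -2]
            abs_id = (best == 0) & (top > c_abs)
            rel_id = (best == 0) & ((top - second) > c_rel)
            return abs_id.mean(), rel_id.mean()

        abs_tp, rel_tp = simulate(True)           # culprit-present (guilty-suspect) lineups
        abs_fp, rel_fp = simulate(False)          # culprit-absent (innocent-suspect) lineups
        print(f"absolute rule: probative value ~ {abs_tp / abs_fp:.1f}")
        print(f"relative rule: probative value ~ {rel_tp / rel_fp:.1f}")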

  18. Numerical Simulations of Homogeneous Turbulence Using Lagrangian-Averaged Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Mohseni, Kamran; Shkoller, Steve; Kosovic, Branko; Marsden, Jerrold E.; Carati, Daniele; Wray, Alan; Rogallo, Robert

    2000-01-01

    The Lagrangian-averaged Navier-Stokes (LANS) equations are numerically evaluated as a turbulence closure. They are derived from a novel Lagrangian averaging procedure on the space of all volume-preserving maps and can be viewed as a numerical algorithm which removes the energy content from the small scales (smaller than some a priori fixed spatial scale alpha) using a dispersive rather than dissipative mechanism, thus maintaining the crucial features of the large scale flow. We examine the modeling capabilities of the LANS equations for decaying homogeneous turbulence, ascertain their ability to track the energy spectrum of fully resolved direct numerical simulations (DNS), compare the relative energy decay rates, and compare LANS with well-accepted large eddy simulation (LES) models.

  19. Virtual Reality Calibration for Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Kim, W.

    1994-01-01

    A virtual reality calibration technique of matching a virtual environment of simulated graphics models in 3-D geometry and perspective with actual camera views of the remote site task environment has been developed to enable high-fidelity preview/predictive displays with calibrated graphics overlay on live video.

  20. Physiologically based modeling of hepatic and gastrointestinal biotransformation in fish

    EPA Science Inventory

    In fish, as in mammals, the liver is generally viewed as the principal site of chemical biotransformation. For waterborne exposures, such as those conducted in support of standardized BCF testing, the effects of hepatic metabolism on chemical accumulation can be simulated using rela...

  1. LAGRANGIAN PARTICLE DISPERSION MODELING OF THE FUMIGATION PROCESS USING LARGE-EDDY SIMULATION. (R828178)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  2. Integrated view of internal friction in unfolded proteins from single-molecule FRET, contact quenching, theory, and simulations

    PubMed Central

    Soranno, Andrea; Holla, Andrea; Dingfelder, Fabian; Nettels, Daniel; Makarov, Dmitrii E.; Schuler, Benjamin

    2017-01-01

    Internal friction is an important contribution to protein dynamics at all stages along the folding reaction. Even in unfolded and intrinsically disordered proteins, internal friction has a large influence, as demonstrated with several experimental techniques and in simulations. However, these methods probe different facets of internal friction and have been applied to disparate molecular systems, raising questions regarding the compatibility of the results. To obtain an integrated view, we apply here the combination of two complementary experimental techniques, simulations, and theory to the same system: unfolded protein L. We use single-molecule Förster resonance energy transfer (FRET) to measure the global reconfiguration dynamics of the chain, and photoinduced electron transfer (PET), a contact-based method, to quantify the rate of loop formation between two residues. This combination enables us to probe unfolded-state dynamics on different length scales, corresponding to different parts of the intramolecular distance distribution. Both FRET and PET measurements show that internal friction dominates unfolded-state dynamics at low denaturant concentration, and the results are in remarkable agreement with recent large-scale molecular dynamics simulations using a new water model. The simulations indicate that intrachain interactions and dihedral angle rotation correlate with the presence of internal friction, and theoretical models of polymer dynamics provide a framework for interrelating the contribution of internal friction observed in the two types of experiments and in the simulations. The combined results thus provide a coherent and quantitative picture of internal friction in unfolded proteins that could not be attained from the individual techniques. PMID:28223518

  3. Integrated view of internal friction in unfolded proteins from single-molecule FRET, contact quenching, theory, and simulations.

    PubMed

    Soranno, Andrea; Holla, Andrea; Dingfelder, Fabian; Nettels, Daniel; Makarov, Dmitrii E; Schuler, Benjamin

    2017-03-07

    Internal friction is an important contribution to protein dynamics at all stages along the folding reaction. Even in unfolded and intrinsically disordered proteins, internal friction has a large influence, as demonstrated with several experimental techniques and in simulations. However, these methods probe different facets of internal friction and have been applied to disparate molecular systems, raising questions regarding the compatibility of the results. To obtain an integrated view, we apply here the combination of two complementary experimental techniques, simulations, and theory to the same system: unfolded protein L. We use single-molecule Förster resonance energy transfer (FRET) to measure the global reconfiguration dynamics of the chain, and photoinduced electron transfer (PET), a contact-based method, to quantify the rate of loop formation between two residues. This combination enables us to probe unfolded-state dynamics on different length scales, corresponding to different parts of the intramolecular distance distribution. Both FRET and PET measurements show that internal friction dominates unfolded-state dynamics at low denaturant concentration, and the results are in remarkable agreement with recent large-scale molecular dynamics simulations using a new water model. The simulations indicate that intrachain interactions and dihedral angle rotation correlate with the presence of internal friction, and theoretical models of polymer dynamics provide a framework for interrelating the contribution of internal friction observed in the two types of experiments and in the simulations. The combined results thus provide a coherent and quantitative picture of internal friction in unfolded proteins that could not be attained from the individual techniques.

  4. Understanding and coping with extremism in an online collaborative environment: A data-driven modeling

    PubMed Central

    Rudas, Csilla; Surányi, Olivér; Yasseri, Taha; Török, János

    2017-01-01

    The Internet has provided us with great opportunities for large scale collaborative public good projects. Wikipedia is a predominant example of such projects where conflicts emerge and get resolved through bottom-up mechanisms leading to the emergence of the largest encyclopedia in human history. Disaccord arises whenever editors with different opinions try to produce an article reflecting a consensual view. The debates are mainly heated by editors with extreme views. Using a model of common value production, we show that the consensus can only be reached if groups with extreme views can actively take part in the discussion and if their views are also represented in the common outcome, at least temporarily. We show that banning problematic editors mostly hinders the consensus as it delays discussion and thus the whole consensus building process. To validate the model, relevant quantities are measured both in simulations and Wikipedia, which show satisfactory agreement. We also consider the role of direct communication between editors both in the model and in Wikipedia data (by analyzing the Wikipedia talk pages). While the model suggests that in certain conditions there is an optimal rate of “talking” vs “editing”, it correctly predicts that in the current settings of Wikipedia, more activity in talk pages is associated with more controversy. PMID:28323867
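
    For readers unfamiliar with this style of agent-based consensus simulation, the sketch below is a generic bounded-confidence illustration with an optional rule that bans extreme agents; it is not the authors' common-value production model, and every parameter and update rule is a placeholder.

        import numpy as np

        rng = np.random.default_rng(2)

        def run(ban_extremists, n_agents=200, steps=4000, tol=0.25, mu=0.1):
            # Agents hold opinions in [0, 1]; a shared "article" opinion and the editing
            # agent move toward each other when they are within a tolerance, otherwise
            # the agent shifts only slightly. Optionally, extreme agents are banned.
            opinions = rng.uniform(0, 1, n_agents)
            article = 0.5
            active = np.ones(n_agents, dtype=bool)
            for _ in range(steps):
                if ban_extremists:
                    active &= np.abs(opinions - 0.5) < 0.45    # remove extreme editors
                i = rng.choice(np.flatnonzero(active))
                if abs(opinions[i] - article) < tol:           # edit accepted: both move
                    article += mu * (opinions[i] - article)
                    opinions[i] += mu * (article - opinions[i])
                else:                                          # conflict: agent moves slightly
                    opinions[i] += mu * 0.5 * (article - opinions[i])
            return article, opinions[active].std()

        for ban in (False, True):
            art, spread = run(ban)
            print(f"ban={ban}: article opinion {art:.2f}, residual opinion spread {spread:.2f}")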

  5. Contributions of numerical simulation data bases to the physics, modeling and measurement of turbulence

    NASA Technical Reports Server (NTRS)

    Moin, Parviz; Spalart, Philippe R.

    1987-01-01

    The use of simulation data bases for the examination of turbulent flows is an effective research tool. Studies of the structure of turbulence have been hampered by the limited number of probes and the impossibility of measuring all desired quantities. Also, flow visualization is confined to the observation of passive markers with limited field of view and contamination caused by time-history effects. Computer flow fields are a new resource for turbulence research, providing all the instantaneous flow variables in three-dimensional space. Simulation data bases also provide much-needed information for phenomenological turbulence modeling. Three dimensional velocity and pressure fields from direct simulations can be used to compute all the terms in the transport equations for the Reynolds stresses and the dissipation rate. However, only a few, geometrically simple flows have been computed by direct numerical simulation, and the inventory of simulation does not fully address the current modeling needs in complex turbulent flows. The availability of three-dimensional flow fields also poses challenges in developing new techniques for their analysis, techniques based on experimental methods, some of which are used here for the analysis of direct-simulation data bases in studies of the mechanics of turbulent flows.

  6. Simulation of photons from plasmas for the applications to display devices

    NASA Astrophysics Data System (ADS)

    Lee, Hae June; Yoon, Hyun Jin; Lee, Jae Koo

    2007-07-01

    Numerical modeling of the photon transport of ultraviolet (UV) and visible light is presented for plasma-based display devices. The transport of UV light, which undergoes resonance trapping by ground-state atoms, is solved by using the Holstein equation. After the UV light is converted to visible light at the phosphor surfaces, the visible light follows complicated paths inside the cell and is finally emitted toward the viewing window after some power loss within the cell. A three-dimensional ray trace of the visible light is calculated with a radiosity model. These photon simulations strengthen plasma discharge modeling for the application to display devices.
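
    The radiosity step mentioned above amounts to solving a small linear system, B = E + diag(ρ) F B; the sketch below does this for a made-up three-patch cell with equal-area diffuse patches, placeholder view factors F, reflectances ρ, and phosphor emission E, and is only meant to illustrate the method rather than the paper's cell geometry.

        import numpy as np

        # Row i of F: fraction of light leaving patch i that reaches patch j; the row
        # remainder (1 - row sum) is taken to escape through the viewing window.
        # Equal-area patches are assumed so the simple radiosity form applies.
        F = np.array([[0.0, 0.5, 0.3],
                      [0.4, 0.0, 0.4],
                      [0.3, 0.4, 0.0]])
        rho = np.array([0.8, 0.8, 0.3])    # diffuse reflectances (placeholders)
        E = np.array([1.0, 1.0, 0.0])      # phosphor emission of visible light (placeholders)

        # Solve (I - diag(rho) F) B = E for the radiosity B of each patch.
        B = np.linalg.solve(np.eye(3) - np.diag(rho) @ F, E)

        escape_fraction = 1.0 - F.sum(axis=1)
        print("patch radiosities:", np.round(B, 3))
        print("power escaping toward the viewer:", float(np.sum(B * escape_fraction)))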

  7. Truncated disc surface brightness profiles produced by flares

    NASA Astrophysics Data System (ADS)

    Borlaff, Alejandro; Eliche-Moral, M. Carmen; Beckman, John; Font, Joan

    2017-03-01

    Previous studies have discarded the possibility that flares in galactic discs may explain the truncations frequently observed in highly inclined galaxies (Kregel et al. 2002). However, no study has systematically analysed this hypothesis using realistic models for the disc, the flare and the bulge. We derive edge-on and face-on surface brightness profiles for a series of realistic galaxy models with flared discs that sample a wide range of structural and photometric parameters across the Hubble Sequence, in accordance with observations. The surface brightness profile of each galaxy model has been simulated for edge-on and face-on views to find out whether or not the flared disc produces a significant truncation of the disc in the edge-on view compared to the face-on view. In order to simulate realistic images of disc galaxies, we have considered the observational distribution of the photometric parameters as a function of morphological type for three mass bins (10 < log10(M/M⊙) < 10.7, 10.7 < log10(M/M⊙) < 11 and log10(M/M⊙) > 11) and four morphological type bins (S0-Sa, Sb-Sbc, Sc-Scd and Sd-Sdm). For each mass bin, we have restricted the photometric and structural parameters of each modelled galaxy to their characteristic observational ranges (μ_0,disc, μ_eff,bulge, B/T, M_abs, r_eff, n_bulge, h_R,disc) and the flare in the disc (h_z,disc/h_R,disc, ∂h_z,disc/∂R; see de Grijs & Peletier 1997, Graham 2001, López-Corredoira et al. 2002, Yoachim & Dalcanton 2006, Bizyaev et al. 2014, Mosenkov et al. 2015). Contrary to previous claims, the simulations show that realistic flared discs can be responsible for the truncations observed in many edge-on systems, while preserving the profile of the non-flared analogous model in the face-on view. These breaks reproduce the properties of the weak-to-intermediate breaks observed in many real Type-II galaxies in the diagram relating the radial location of the break (R_brkII), in units of the inner disc scale-length, to the break strength S (Laine et al. 2014). Radial variation of the scale-height of the disc (flaring) can therefore explain the existence of many breaks in edge-on galaxies, especially those with low break strengths, S = log10(h_o/h_i) ~ [-0.3, -0.1].

  8. Adjustment and validation of a simulation tool for CSP plants based on parabolic trough technology

    NASA Astrophysics Data System (ADS)

    García-Barberena, Javier; Ubani, Nora

    2016-05-01

    This work presents the validation process carried out for a simulation tool specifically designed for the energy yield assessment of concentrating solar power plants based on parabolic trough (PT) technology. The validation has been carried out by comparing the model estimations with real data collected from a commercial CSP plant. In order to adjust the model parameters used for the simulation, 12 different days were selected from one year of operational data measured at the real plant. The 12 days were simulated and the estimations compared with the measured data, focusing on the most important variables from the simulation point of view: temperatures, pressures and mass flow of the solar field, gross power, parasitic power, and net power delivered by the plant. Based on these 12 days, the key parameters of the model were fixed and a whole-year simulation was performed. The results obtained for the complete year showed very good agreement for the total gross and net electricity production, with a bias of 1.47% and 2.02%, respectively. The results prove that the simulation software describes the real operation of the power plant with great accuracy and correctly reproduces its transient behavior.

  9. SutraGUI, a graphical-user interface for SUTRA, a model for ground-water flow with solute or energy transport

    USGS Publications Warehouse

    Winston, Richard B.; Voss, Clifford I.

    2004-01-01

    This report describes SutraGUI, a flexible graphical user interface (GUI) that supports two-dimensional (2D) and three-dimensional (3D) simulation with the U.S. Geological Survey (USGS) SUTRA ground-water-flow and transport model (Voss and Provost, 2002). SutraGUI allows the user to create SUTRA ground-water models graphically. SutraGUI provides all of the graphical functionality required for setting up and running SUTRA simulations that range from basic to sophisticated, but it is also possible for advanced users to apply programmable features within Argus ONE to meet the unique demands of particular ground-water modeling projects. SutraGUI is a public-domain computer program designed to run with the proprietary Argus ONE package, which provides 2D Geographic Information System (GIS) and meshing support. For 3D simulation, GIS and meshing support is provided by programming contained within SutraGUI. When preparing a 3D SUTRA model, the model and all of its features are viewed within Argus ONE in 2D projection. For 2D models, SutraGUI is only slightly changed in functionality from the previous 2D-only version (Voss and others, 1997) and it provides visualization of simulation results. In 3D, only model preparation is supported by SutraGUI, and 3D simulation results may be viewed in SutraPlot (Souza, 1999) or Model Viewer (Hsieh and Winston, 2002). A comprehensive online Help system is included in SutraGUI. For 3D SUTRA models, the 3D model domain is conceptualized as bounded on the top and bottom by 2D surfaces. The 3D domain may also contain internal surfaces extending across the model that divide the domain into tabular units, which can represent hydrogeologic strata or other features intended by the user. These surfaces can be non-planar and non-horizontal. The 3D mesh is defined by one or more 2D meshes at different elevations that coincide with these surfaces. If the nodes in the 3D mesh are vertically aligned, only a single 2D mesh is needed. For non-aligned meshes, two or more 2D meshes of similar connectivity are used. Between each set of 2D meshes (and model surfaces), the vertical space in the 3D mesh is evenly divided into a user-specified number of layers of finite elements. Boundary conditions may be specified for 3D models in SutraGUI using a variety of geometric shapes that may be located freely within the 3D model domain. These shapes include points, lines, sheets, and solids. These are represented by 2D contours (within the vertically projected Argus ONE view) with user-defined elevations. In addition, boundary conditions may be specified for 3D models as points, lines, and areas that are located exactly within the surfaces that define the model top and the bottoms of the tabular units. Aquifer properties may be specified separately for each tabular unit. If the aquifer properties vary vertically within a unit, SutraGUI provides the Sutra_Z function that can be used to specify such variation.

  10. Modeling digital breast tomosynthesis imaging systems for optimization studies

    NASA Astrophysics Data System (ADS)

    Lau, Beverly Amy

    Digital breast tomosynthesis (DBT) is a new imaging modality for breast imaging. In tomosynthesis, multiple images of the compressed breast are acquired at different angles, and the projection view images are reconstructed to yield images of slices through the breast. One of the main problems to be addressed in the development of DBT is determining the parameter settings that yield images ideal for the detection of cancer. Since it would be unethical to irradiate women multiple times to explore potentially optimal geometries for tomosynthesis, a computer simulation is the appropriate way to generate projection images. Existing tomosynthesis models have modeled scatter and the detector without accounting for the oblique angles of incidence that tomosynthesis introduces. Moreover, these models frequently use geometry-specific physical factors measured from real systems, which severely limits the robustness of their algorithms for optimization. The goal of this dissertation was to design the framework for a computer simulation of tomosynthesis that would produce images sensitive to changes in acquisition parameters, so that an optimization study would be feasible. A computer physics simulation of the tomosynthesis system was developed. The x-ray source was modeled as a polychromatic spectrum based on published spectral data, and the inverse-square law was applied. Scatter was applied using a convolution method with angle-dependent scatter point spread functions (sPSFs), followed by scaling using an angle-dependent scatter-to-primary ratio (SPR). Monte Carlo simulations were used to generate sPSFs for a 5-cm breast with a 1-cm air gap. Detector effects were included through geometric propagation of the image onto layers of the detector, which were blurred using depth-dependent detector point-spread functions (PRFs). Depth-dependent PRFs were calculated every 5 microns through a 200-micron-thick CsI detector using Monte Carlo simulations. Electronic noise was added as Gaussian noise as a last step of the model. The sPSFs and detector PRFs were verified to match published data, and the noise power spectrum (NPS) of simulated flat-field images was shown to match empirically measured data from a digital mammography unit. A novel anthropomorphic software breast phantom was developed for 3D imaging simulation. Projection view images of the phantom were shown to have similar structure to real breasts in the spatial frequency domain, using the power-law exponent beta to quantify tissue complexity. The physics simulation and computer breast phantom were used together, following the methods of a published study of real tomosynthesis images of real breasts. The simulation model and 3D numerical breast phantoms were able to reproduce the trends in the experimental data. This result demonstrates the ability of the tomosynthesis physics model to generate images sensitive to changes in acquisition parameters.
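    The scatter step described above can be sketched compactly: convolve the primary image with a normalized scatter PSF and rescale the result to the desired scatter-to-primary ratio. The following Python fragment is a minimal illustration of that idea; the function name, the stand-in kernel, and the SPR value are assumptions, not the dissertation's actual angle-dependent kernels.

```python
import numpy as np
from scipy.signal import fftconvolve

def add_scatter(primary, spsf, spr):
    """Add scatter to a primary image by convolving with a normalized scatter
    PSF, then scaling the scatter field so its integral equals `spr` times
    the integral of the primary image."""
    kernel = spsf / spsf.sum()
    scatter = fftconvolve(primary, kernel, mode="same")
    scatter *= spr * primary.sum() / scatter.sum()
    return primary + scatter

# Hypothetical usage: one projection angle with its own kernel and SPR.
rng = np.random.default_rng(0)
primary = rng.uniform(0.5, 1.0, (256, 256))   # stand-in primary image
w = np.hanning(31)
spsf = np.outer(w, w)                         # stand-in broad scatter kernel
image_with_scatter = add_scatter(primary, spsf, spr=0.4)
print(image_with_scatter.shape)
```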

  11. Just-in-time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
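    A minimal sketch of the co-scheduling idea follows: poll the simulation's output directory and, whenever a new history file appears, submit a separate visualization job so the running model is never interrupted. The directory layout, file pattern, and batch command are hypothetical placeholders, not Bellerophon's actual interfaces.

```python
import subprocess
import time
from pathlib import Path

def watch_and_render(output_dir, submit_cmd, poll_s=60):
    """Poll a simulation output directory; when new history files appear,
    submit a co-scheduled visualization job for each one."""
    seen = set()
    while True:
        for f in sorted(Path(output_dir).glob("*.nc")):
            if f not in seen:
                seen.add(f)
                # submit_cmd might be e.g. ["sbatch", "render_plots.sh"];
                # both the script and the batch system are hypothetical here.
                subprocess.run([*submit_cmd, str(f)], check=True)
        time.sleep(poll_s)

# Example (would run indefinitely):
# watch_and_render("/scratch/run01/history", ["sbatch", "render_plots.sh"])
```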

  12. Psycho-Motor and Error Enabled Simulations: Modeling Vulnerable Skills in the Pre-Mastery Phase

    DTIC Science & Technology

    2016-04-01

    participants. Multiple abstracts and posters were created for surgical conferences attended. These works concentrated on data from pre and post ... analyzed to give every participant a perspective of the smallest difference in stiffness they could differentiate. Based on the results the tests were ... camera was affixed to a post mounted to this station's table to capture a close-up view of the participant's placement of needles on the simulation

  13. Prism users guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weirs, V. Gregory

    2012-03-01

    Prism is a ParaView plugin that simultaneously displays simulation data and material model data. This document describes its capabilities and how to use them. A demonstration of Prism is given in the first section. The second section contains more detailed notes on less obvious behavior. The third and fourth sections are specifically for Alegra and CTH users. They tell how to generate the simulation data and SESAME files and how to handle aspects of Prism use particular to each of these codes.

  14. Preliminary design of the HARMONI science software

    NASA Astrophysics Data System (ADS)

    Piqueras, Laure; Jarno, Aurelien; Pécontal-Rousset, Arlette; Loupias, Magali; Richard, Johan; Schwartz, Noah; Fusco, Thierry; Sauvage, Jean-François; Neichel, Benoît; Correia, Carlos M.

    2016-08-01

    This paper introduces the science software of HARMONI. The Instrument Numerical Model simulates the instrument from the optical point of view and provides synthetic exposures simulating detector readouts from data-cubes containing astrophysical scenes. The Data Reduction Software converts raw-data frames into a fully calibrated, scientifically usable data cube. We present the functionalities and the preliminary design of this software, describe some of the methods and algorithms used and highlight the challenges that we will have to face.

  15. Dynamics of Gas Near the Galactic Centre

    NASA Astrophysics Data System (ADS)

    Jenkins, A.; Binney, J.

    1994-10-01

    We simulate the flow of gas in the Binney et al. model of the bar at the centre of the Milky Way. We argue that the flow of a clumpy interstellar medium is most realistically simulated by a sticky-particle scheme, and investigate two such schemes. In both schemes orbits close to the cusped orbit rapidly become depopulated. This depopulation places a lower limit on the pattern speed, since it implies that in the (l, v) plane the cusped orbit lies significantly inside the peak of the H I terminal-velocity envelope at l ≈ 2°. We find that the size of the central molecular disc and the magnitudes of the observed forbidden velocities constrain the eccentricity of the Galactic bar to values similar to that arbitrarily assumed by Binney et al. We study the accretion by the nuclear disc of matter shed by dying bulge stars. We estimate that mass loss by the bulge can replenish the H I in the nuclear disc within two bar rotation periods, in good agreement with the predictions of the simulations. When accretion of gas from the bulge is included, fine-scale irregular structure persists in the nuclear disc. This structure gives rise to features in longitude-velocity plots which depend significantly on viewing angle, and consequently to asymmetries in longitude. These asymmetries are, however, much less pronounced than those in the observational plots. We conclude that the addition of hydrodynamics to the Binney et al. model does not resolve some important discrepancies between theory and observation. The model's basic idea does, however, have high a priori probability and has enjoyed some significant successes, while a number of potentially important physical processes - most notably the self-gravity of the interstellar gas - are neglected in the present simulations. In view of the deficiencies of our simulations and the interesting parallels we do observe between simulated and observational longitude-velocity plots, we believe it would be premature to reject the Binney et al. model prior to exploring high-quality three-dimensional simulations that include self-gravitating stars and gas. Key words: accretion, accretion discs - ISM: kinematics and dynamics - ISM: structure - Galaxy: centre - Galaxy: kinematics and dynamics - radio lines: ISM.

  16. Storm Water Management Model User’s Manual Version 5.1 - manual

    EPA Science Inventory

    SWMM 5 provides an integrated environment for editing study area input data, running hydrologic, hydraulic and water quality simulations, and viewing the results in a variety of formats. These include color-coded drainage area and conveyance system maps, time series graphs and ta...

  17. INTRA- AND INTERANNUAL VARIABILITY OF ECOSYSTEM PROCESSES IN SHORTGRASS STEPPE: NEW MODEL, VERIFICATION, SIMULATIONS. (R824993)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  18. C/sup 3/ and combat simulation - a survey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, S.A. Jr.

    1983-01-04

    This article looks at the overlap between C3 and combat simulation from the point of view of the developer of combat simulations and models. In this context, there are two different questions. The first is: how, and to what extent, should specific models of the C3 processes be incorporated in simulations of combat? Here the key point is the assessment of impact. In which types or levels of combat does C3 play a role sufficiently intricate and closely coupled with combat performance that it would significantly affect combat results? Conversely, when is C3 a known factor or modifier which can be simply accommodated without a specific detailed model being made for it? The second question is the inverse one: in the development of future C3 systems, what role should combat simulation play? Obviously, simulation of the operation of the hardware, software and other parts of the C3 system would be useful in its design and specification, but this is not combat simulation. When is it necessary to encase the C3 simulation model in a combat model which has enough detail to be considered a simulation itself? How should this outer combat model be scoped out as to the components needed? In order to build a background for answering these questions, a two-pronged approach will be taken. First, a framework for C3 modeling will be developed, in which the various types of modeling which can be done to include or encase C3 in a combat model are organized. This framework will hopefully be useful in describing the particular assumptions made in specific models in terms of what could be done in a more general way. Then a few specific models will be described, concentrating on the C3 portion of the simulations, or what could be interpreted as the C3 assumptions.

  19. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2016-06-01

    The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal external rotations and anterior posterior displacements for a new and experimentally simulated specimen for patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and the past studies was observed when the ligament load was removed and the medial lateral displacement was constrained. The model is sensitive to ±5 % change in kinematics, frictional, force and stiffness coefficients and insensitive to time step.

  20. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2018-06-01

    The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal external rotations and anterior posterior displacements for a new and experimentally simulated specimen for patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and the past studies was observed when the ligament load was removed and the medial lateral displacement was constrained. The model is sensitive to ±5 % change in kinematics, frictional, force and stiffness coefficients and insensitive to time step.

  1. Modeling of a production system using the multi-agent approach

    NASA Astrophysics Data System (ADS)

    Gwiazda, A.; Sękala, A.; Banaś, W.

    2017-08-01

    A method that allows for the analysis of complex systems is multi-agent simulation. Multi-agent simulation (agent-based modeling and simulation, ABMS) is the modeling of complex systems consisting of independent agents. In a model of a production system, the manufactured pieces may be one type of agent, set apart from other agent types such as machine tools, conveyors or reorientation stands; magazines and buffers are also agents. More generally, the agents in a model can be single individuals, but collective entities can also be defined as agents, and hierarchical structures are allowed, meaning that a single agent may belong to a certain class. Depending on the needs, an agent may also represent a natural or physical resource. From a technical point of view, an agent is a bundle of data and rules describing its behavior in different situations. Agents can be autonomous or non-autonomous, and decisions have to be made about the types and classes of agents, the class sizes and the types of connections between elements of the system. Multi-agent modeling is a very flexible technique for building models that can be adapted to any research problem analyzed from different points of view. One of the major problems associated with the organization of production is the spatial organization of the production process; a second is optimal scheduling. For this purpose a multi-agent approach can be used. In this regard, the model of the production process refers to the design and scheduling of the production space for four different elements. The program system was developed in the NetLogo environment, and elements of artificial intelligence were also used. The main agent type represents the manufactured pieces that, according to previously assumed rules, generate the technological route and allow the schedule of the line to be printed. Machine lines, reorientation stands, conveyors and transport devices are represented by the other agent types utilized in the described simulation. The article presents the idea of an integrated program approach and shows the resulting production layout as a virtual model, developed in the NetLogo multi-agent program environment.
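    The NetLogo implementation itself is not reproduced here; as a rough illustration of the agent structure described above, the following Python sketch defines piece and machine agents and lets the pieces greedily request machines along their technological routes. All class names, operations and cycle times are invented for the example.

```python
class Piece:
    """Agent for a manufactured piece: carries its own technological route."""
    def __init__(self, pid, operations):
        self.pid = pid
        self.route = list(operations)  # remaining operations
        self.free_at = 0               # time the piece finishes its last operation

class Machine:
    """Agent for a machine tool that performs one kind of operation."""
    def __init__(self, name, operation, cycle_time):
        self.name, self.operation, self.cycle_time = name, operation, cycle_time
        self.free_at = 0

def schedule(pieces, machines):
    """Greedy dispatch: repeatedly assign the next operation of the
    earliest-available piece to the earliest-available capable machine."""
    log = []
    while any(p.route for p in pieces):
        p = min((p for p in pieces if p.route), key=lambda p: p.free_at)
        op = p.route.pop(0)
        m = min((m for m in machines if m.operation == op), key=lambda m: m.free_at)
        start = max(p.free_at, m.free_at)
        end = start + m.cycle_time
        p.free_at = m.free_at = end
        log.append((p.pid, m.name, op, start, end))
    return sorted(log, key=lambda e: e[3])

pieces = [Piece(i, ["cut", "drill", "paint"]) for i in range(3)]
machines = [Machine("M1", "cut", 2), Machine("M2", "drill", 3), Machine("M3", "paint", 1)]
for row in schedule(pieces, machines):
    print(row)
```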

  2. Progress in Development of the ITER Plasma Control System Simulation Platform

    NASA Astrophysics Data System (ADS)

    Walker, Michael; Humphreys, David; Sammuli, Brian; Ambrosino, Giuseppe; de Tommasi, Gianmaria; Mattei, Massimiliano; Raupp, Gerhard; Treutterer, Wolfgang; Winter, Axel

    2017-10-01

    We report on progress made and expected uses of the Plasma Control System Simulation Platform (PCSSP), the primary test environment for development of the ITER Plasma Control System (PCS). PCSSP will be used for verification and validation of the ITER PCS Final Design for First Plasma, to be completed in 2020. We discuss the objectives of PCSSP, its overall structure, selected features, application to existing devices, and expected evolution over the lifetime of the ITER PCS. We describe an archiving solution for simulation results, methods for incorporating physics models of the plasma and physical plant (tokamak, actuator, and diagnostic systems) into PCSSP, and defining characteristics of models suitable for a plasma control development environment such as PCSSP. Applications of PCSSP simulation models including resistive plasma equilibrium evolution are demonstrated. PCSSP development supported by ITER Organization under ITER/CTS/6000000037. Resistive evolution code developed under General Atomics' Internal funding. The views and opinions expressed herein do not necessarily reflect those of the ITER Organization.

  3. Monte Carlo modeling of spatially complex wrist tissue for the optimization of optical pulse oximeters

    NASA Astrophysics Data System (ADS)

    Robinson, Mitchell; Butcher, Ryan; Coté, Gerard L.

    2017-02-01

    Monte Carlo modeling of photon propagation has been used in the examination of particular areas of the body to further enhance the understanding of light propagation through tissue. This work seeks to improve upon established simulation methods through more accurate representations of the simulated tissues in the wrist as well as of the characteristics of the light source. The Monte Carlo simulation program was developed using Matlab. Generation of the different tissue domains, such as muscle, vasculature, and bone, was performed in SolidWorks, where each domain was saved as a separate .stl file that was read into the program. The light source was altered to account for both the viewing angle of the simulated LED and the nominal diameter of the source. It is believed that the use of these more accurate models generates results that more closely match those seen in vivo, and can be used to better guide the design of optical wrist-worn measurement devices.
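    The core sampling loop of such a photon Monte Carlo can be sketched in a few lines: exponential step lengths between interactions, an absorption test based on the single-scattering albedo, and Henyey-Greenstein scattering. The sketch below runs in a single unbounded homogeneous medium with invented optical properties; the .stl tissue domains, boundary handling and detector geometry of the actual model are omitted.

```python
import numpy as np

def hg_cos_theta(g, rng):
    """Sample the scattering angle cosine from the Henyey-Greenstein phase function."""
    if abs(g) < 1e-6:
        return 2.0 * rng.random() - 1.0
    s = (1 - g * g) / (1 - g + 2 * g * rng.random())
    return (1 + g * g - s * s) / (2 * g)

def propagate(n_photons=10_000, mu_a=0.3, mu_s=10.0, g=0.9, seed=1):
    """Trace photons through an unbounded homogeneous tissue-like medium and
    return the mean path length travelled before absorption."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    path_lengths = []
    for _ in range(n_photons):
        direction = np.array([0.0, 0.0, 1.0])
        travelled = 0.0
        while True:
            travelled += -np.log(rng.random()) / mu_t     # exponential free path
            if rng.random() < mu_a / mu_t:                # absorbed
                path_lengths.append(travelled)
                break
            ct = hg_cos_theta(g, rng)                     # scatter: new direction
            st = np.sqrt(max(0.0, 1 - ct * ct))
            phi = 2 * np.pi * rng.random()
            if abs(direction[2]) > 0.99999:               # nearly along z-axis
                direction = np.array([st * np.cos(phi), st * np.sin(phi),
                                      ct * np.sign(direction[2])])
            else:                                         # general rotation
                u, v, w = direction
                denom = np.sqrt(1 - w * w)
                direction = np.array([
                    st * (u * w * np.cos(phi) - v * np.sin(phi)) / denom + u * ct,
                    st * (v * w * np.cos(phi) + u * np.sin(phi)) / denom + v * ct,
                    -st * np.cos(phi) * denom + w * ct,
                ])
    return float(np.mean(path_lengths))

print(propagate())
```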

  4. Proactive action preparation: seeing action preparation as a continuous and proactive process.

    PubMed

    Pezzulo, Giovanni; Ognibene, Dimitri

    2012-07-01

    In this paper, we aim to elucidate the processes that occur during action preparation from both a conceptual and a computational point of view. We first introduce the traditional, serial model of goal-directed action and discuss from a computational viewpoint its subprocesses occurring during the two phases of covert action preparation and overt motor control. Then, we discuss recent evidence indicating that these subprocesses are highly intertwined at representational and neural levels, which undermines the validity of the serial model and points instead to a parallel model of action specification and selection. Within the parallel view, we analyze the case of delayed choice, arguing that action preparation can be proactive, and preparatory processes can take place even before decisions are made. Specifically, we discuss how prior knowledge and prospective abilities can be used to maximize utility even before deciding what to do. To support our view, we present a computational implementation of (an approximated version of) proactive action preparation, showing its advantages in a simulated tennis-like scenario.

  5. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
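    The reduced-variance (Fano factor) criterion mentioned above is easy to illustrate numerically. The sketch below compares the counting statistics of a homogeneous Poisson process with those of a heavy-tailed renewal process used purely as a stand-in; it is not one of the fractional models analyzed in the paper, and all parameter choices are arbitrary.

```python
import numpy as np

def fano_factor(event_times, window, t_max):
    """Fano factor Var[N]/E[N] of counts in non-overlapping windows."""
    edges = np.arange(0.0, t_max, window)
    counts, _ = np.histogram(event_times, bins=edges)
    return counts.var() / counts.mean()

rng = np.random.default_rng(0)
t_max = 1e5

# Baseline: homogeneous Poisson process (Fano factor close to 1).
poisson_times = np.cumsum(rng.exponential(1.0, size=int(t_max)))

# Heavy-tailed renewal process (infinite-variance gaps) as a crude stand-in
# for a process with "nonclassical" super-Poissonian counting statistics.
gaps = rng.pareto(1.5, size=int(t_max)) + 1.0
renewal_times = np.cumsum(gaps / gaps.mean())   # rescaled to unit mean rate

for name, t in [("Poisson", poisson_times), ("heavy-tailed renewal", renewal_times)]:
    t = t[t < t_max]
    print(name, fano_factor(t, window=100.0, t_max=t_max))
```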

  6. Use of Linear Perspective Scene Cues in a Simulated Height Regulation Task

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Warren, R.

    1984-01-01

    As part of a long-term effort to quantify the effects of visual scene cuing and non-visual motion cuing in flight simulators, an experimental study of the pilot's use of linear perspective cues in a simulated height-regulation task was conducted. Six test subjects performed a fixed-base tracking task with a visual display consisting of a simulated horizon and a perspective view of a straight, infinitely-long roadway of constant width. Experimental parameters were (1) the central angle formed by the roadway perspective and (2) the display gain. The subject controlled only the pitch/height axis; airspeed, bank angle, and lateral track were fixed in the simulation. The average RMS height error score for the least effective display configuration was about 25% greater than the score for the most effective configuration. Overall, larger and more highly significant effects were observed for the pitch and control scores. Model analysis was performed with the optimal control pilot model to characterize the pilot's use of visual scene cues, with the goal of obtaining a consistent set of independent model parameters to account for display effects.

  7. Reducing Errors in Satellite Simulated Views of Clouds with an Improved Parameterization of Unresolved Scales

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Marchand, R.; Ackerman, T. P.

    2016-12-01

    Satellite instrument simulators have emerged as a means to reduce errors in model evaluation by producing simulated or pseudo-retrievals from model fields, which account for limitations in the satellite retrieval process. Because of the mismatch in resolved scales between satellite retrievals and large-scale models, model cloud fields must first be downscaled to scales consistent with satellite retrievals. This downscaling is analogous to that required for model radiative transfer calculations. The assumption is often made, in both model radiative transfer codes and satellite simulators, that the unresolved clouds follow maximum-random overlap with horizontally homogeneous cloud condensate amounts. We examine errors in simulated MISR and CloudSat retrievals that arise from these assumptions by applying the MISR and CloudSat simulators to cloud-resolving model (CRM) output generated by the Super-parameterized Community Atmosphere Model (SP-CAM). Errors are quantified by comparing simulated retrievals performed directly on the CRM fields with those obtained by first averaging the CRM fields to approximately 2-degree resolution, applying a "subcolumn generator" to regenerate pseudo-resolved cloud and precipitation condensate fields, and then applying the MISR and CloudSat simulators to the regenerated condensate fields. We show that the errors due to both the assumption of maximum-random overlap and that of homogeneous condensate are significant (relative to uncertainties in the observations and other simulator limitations). The treatment of precipitation is particularly problematic for CloudSat-simulated radar reflectivity. We introduce an improved subcolumn generator for use with the simulators, and show that these errors can be greatly reduced by replacing the maximum-random overlap assumption with the more realistic generalized overlap and by incorporating a simple parameterization of subgrid-scale cloud and precipitation condensate heterogeneity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND NO. SAND2016-7485 A
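    For readers unfamiliar with subcolumn generators, the sketch below produces binary cloud masks under the maximum-random overlap assumption that the abstract criticizes (the baseline, not the improved generator it introduces). The recursion follows the commonly used rank-based formulation; the profile values and function name are illustrative.

```python
import numpy as np

def subcolumns_max_random(cloud_frac, n_sub, rng):
    """Generate binary cloud masks of shape (n_sub, n_lev) that reproduce the
    layer cloud fractions under the maximum-random overlap assumption."""
    n_lev = len(cloud_frac)
    mask = np.zeros((n_sub, n_lev), dtype=bool)
    for j in range(n_sub):
        rank = rng.random()
        for k in range(n_lev):
            if k > 0 and rank <= 1.0 - cloud_frac[k - 1]:
                # Layer above was clear in this subcolumn: redraw the rank,
                # restricted to the clear part (random overlap). Otherwise
                # keep the rank (maximum overlap with the cloudy layer above).
                rank = rng.random() * (1.0 - cloud_frac[k - 1])
            mask[j, k] = rank > 1.0 - cloud_frac[k]
        # Homogeneous condensate would then be assigned wherever mask is True.
    return mask

rng = np.random.default_rng(42)
cf = np.array([0.0, 0.3, 0.3, 0.0, 0.5, 0.2])
cols = subcolumns_max_random(cf, n_sub=20000, rng=rng)
print(cols.mean(axis=0))   # should be close to cf
```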

  8. Sensitivity of an Elekta iView GT a-Si EPID model to delivery errors for pre-treatment verification of IMRT fields.

    PubMed

    Herwiningsih, Sri; Hanlon, Peta; Fielding, Andrew

    2014-12-01

    A Monte Carlo model of an Elekta iViewGT amorphous silicon electronic portal imaging device (a-Si EPID) has been validated for pre-treatment verification of clinical IMRT treatment plans. The simulations used the BEAMnrc and DOSXYZnrc Monte Carlo codes to predict the response of the iViewGT a-Si EPID model. The predicted EPID images were compared to measured images obtained by delivering a photon beam from an Elekta Synergy linac to the Elekta iViewGT a-Si EPID. The a-Si EPID was used with no additional build-up material. Frame-averaged EPID images were acquired and processed using in-house software. The agreement between the predicted and measured images was analyzed using the gamma analysis technique with acceptance criteria of 3%/3 mm. The predicted EPID images for four clinical IMRT treatment plans showed good agreement with the measured EPID signal: three prostate IMRT plans had an average gamma pass rate of more than 95.0%, and a spinal IMRT plan had an average gamma pass rate of 94.3%. During the period of this work a routine MLC calibration was performed and one of the IMRT treatments was re-measured with the EPID; a change in the gamma pass rate for one field was observed. This motivated a series of experiments to investigate the sensitivity of the method by introducing delivery errors (MLC position and dosimetric overshoot) into the simulated EPID images. The method was found to be sensitive to 1 mm leaf position errors and 10% overshoot errors.
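    The gamma analysis used for the comparison can be sketched as follows: for each reference pixel, search a neighbourhood of the evaluated image for the point minimizing the combined dose-difference/distance metric, and count the fraction with gamma <= 1. This brute-force 2D version with a global 3%/3 mm criterion is illustrative only; a clinical analysis would use the centre's validated tools.

```python
import numpy as np

def gamma_pass_rate(ref, evl, pixel_mm, dose_tol=0.03, dta_mm=3.0, cutoff=0.1):
    """Global 2D gamma analysis (3%/3 mm by default), brute-force search.
    Returns the percentage of above-threshold reference pixels with gamma <= 1."""
    dose_norm = dose_tol * ref.max()
    r = int(np.ceil(dta_mm / pixel_mm))
    ny, nx = ref.shape
    passed = tested = 0
    for i in range(ny):
        for j in range(nx):
            if ref[i, j] < cutoff * ref.max():
                continue
            tested += 1
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        g2 = ((evl[ii, jj] - ref[i, j]) / dose_norm) ** 2 \
                             + (di * di + dj * dj) * pixel_mm ** 2 / dta_mm ** 2
                        best = min(best, g2)
            passed += best <= 1.0
    return 100.0 * passed / max(tested, 1)

# Sanity check: identical images pass everywhere.
img = np.random.default_rng(0).random((64, 64))
print(gamma_pass_rate(img, img.copy(), pixel_mm=1.0))   # -> 100.0
```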

  9. Dynamic simulation of road vehicle door window regulator mechanism of cross arm type

    NASA Astrophysics Data System (ADS)

    Miklos, I. Zs; Miklos, C.; Alic, C.

    2017-01-01

    The paper presents issues related to the dynamic simulation of a motor-driven operating mechanism of cross-arm type for the manipulation of road vehicle door windows, using Autodesk Inventor Professional software. The dynamic simulation of the mechanism involves 3D modelling, kinematic coupling, drive motion parameters and external loads, as well as the graphical display of the kinematic and kinetostatic results for the various elements and kinematic couplings of the mechanism under real operating conditions. Also, based on the results, an analysis of the mechanism components has been carried out using the finite element method.

  10. Thermodynamic analysis of shark skin texture surfaces for microchannel flow

    NASA Astrophysics Data System (ADS)

    Yu, Hai-Yan; Zhang, Hao-Chun; Guo, Yang-Yu; Tan, He-Ping; Li, Yao; Xie, Gong-Nan

    2016-09-01

    Studies of shark-skin-textured surfaces for flow drag reduction provide inspiration to researchers overcoming technical challenges in actual production applications. In this paper, three kinds of infinite parallel-plate flow models with microstructures inspired by shark skin were established, namely a blade model, a wedge model and a smooth model, according to the cross-sectional shape of the microstructure. Simulation was carried out using FLUENT, which simplified the computation compared with direct numerical simulation. To get the best performance from the simulation, the shear-stress transport k-omega turbulence model was chosen. Since the drag reduction mechanism is generally discussed from a kinetic point of view, which cannot directly explain the cause of the losses, a drag reduction rate was established based on the second law of thermodynamics. Considering abrasion and fabrication precision in practical applications, three kinds of abraded geometry models were also constructed and tested, and the microstructure with the best performance according to the drag reduction rate was identified as the most suitable for manufacturing. It is also believed that bionic shark skin surfaces with mechanical abrasion may draw more attention from industrial designers and gain wide application through their drag-reducing characteristics.

  11. Benchmarking Model Variants in Development of a Hardware-in-the-Loop Simulation System

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia M.; Kratz, Jonathan L.; Culley, Dennis E.; Thomas, George L.

    2016-01-01

    Distributed engine control architecture presents a significant increase in complexity over traditional implementations when viewed from the perspective of system simulation and hardware design and test. Even if the overall function of the control scheme remains the same, the hardware implementation can have a significant effect on the overall system performance due to differences in the creation and flow of data between control elements. A Hardware-in-the-Loop (HIL) simulation system is under development at NASA Glenn Research Center that enables the exploration of these hardware-dependent issues. The system is based on, but not limited to, the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k). This paper describes the step-by-step conversion from the self-contained baseline model to the hardware-in-the-loop model, and the validation of each step. As the control model hardware fidelity was improved during HIL system development, benchmarking simulations were performed to verify that engine system performance characteristics remained the same. The results demonstrate that the goal of the effort was achieved: the new HIL configurations have functionality and performance similar to the baseline C-MAPSS40k system.

  12. A brief introduction to computer-intensive methods, with a view towards applications in spatial statistics and stereology.

    PubMed

    Mattfeldt, Torsten

    2011-04-01

    Computer-intensive methods may be defined as data analytical procedures involving a huge number of highly repetitive computations. We mention resampling methods with replacement (bootstrap methods), resampling methods without replacement (randomization tests) and simulation methods. The resampling methods are based on simple and robust principles and are largely free from distributional assumptions. Bootstrap methods may be used to compute confidence intervals for a scalar model parameter and for summary statistics from replicated planar point patterns, and for significance tests. For some simple models of planar point processes, point patterns can be simulated by elementary Monte Carlo methods. The simulation of models with more complex interaction properties usually requires more advanced computing methods. In this context, we mention simulation of Gibbs processes with Markov chain Monte Carlo methods using the Metropolis-Hastings algorithm. An alternative to simulations on the basis of a parametric model consists of stochastic reconstruction methods. The basic ideas behind the methods are briefly reviewed and illustrated by simple worked examples in order to encourage novices in the field to use computer-intensive methods. © 2010 The Authors Journal of Microscopy © 2010 Royal Microscopical Society.
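    As a concrete illustration of the bootstrap idea, the sketch below computes a percentile confidence interval for the mean nearest-neighbour distance of a small planar point pattern. It treats the distances as exchangeable purely to show the resampling mechanics, ignoring their spatial dependence; all names and parameter values are illustrative.

```python
import numpy as np

def bootstrap_ci(data, stat=np.mean, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a scalar statistic."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    boot = np.array([stat(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Example: nearest-neighbour distances of a small planar point pattern.
rng = np.random.default_rng(1)
pts = rng.random((50, 2))
d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)                      # one summary value per point
print(bootstrap_ci(nn))                 # 95% CI for the mean NN distance
```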

  13. XCAT/DRASIM: a realistic CT/human-model simulation package

    NASA Astrophysics Data System (ADS)

    Fung, George S. K.; Stierstorfer, Karl; Segars, W. Paul; Taguchi, Katsuyuki; Flohr, Thomas G.; Tsui, Benjamin M. W.

    2011-03-01

    The aim of this research is to develop a complete CT/human-model simulation package by integrating the 4D eXtended CArdiac-Torso (XCAT) phantom, a computer-generated NURBS-surface-based phantom that provides a realistic model of human anatomy and of respiratory and cardiac motions, with the DRASIM (Siemens Healthcare) CT-data simulation program. Unlike other CT simulation tools, which are based on simple mathematical primitives or voxelized phantoms, this new simulation package has the advantage of utilizing a realistic model of human anatomy and physiological motion without voxelization, together with accurate modeling of the characteristics of clinical Siemens CT systems. First, we incorporated the 4D XCAT anatomy and motion models into DRASIM by implementing a new library that consists of functions to read in the NURBS surfaces of anatomical objects and their overlapping order and material properties in the XCAT phantom. Second, we incorporated an efficient ray-tracing algorithm for line-integral calculation in DRASIM by computing the intersection points of the rays cast from the x-ray source to the detector elements through the NURBS surfaces of the multiple XCAT anatomical objects along the ray paths. Third, we evaluated the integrated simulation package by performing a number of sample simulations of multiple x-ray projections from different views followed by image reconstruction. The initial simulation results were found to be promising by qualitative evaluation. In conclusion, we have developed a unique CT/human-model simulation package which has great potential as a tool in the design and optimization of CT scanners and in the development of scanning protocols and image reconstruction methods for improving CT image quality and reducing radiation dose.
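    Once the surface-intersection points along a ray are known, the line integral reduces to a weighted sum of segment lengths. The following fragment illustrates that final accumulation step only; the intersection distances and attenuation coefficients are invented, and the NURBS intersection code itself is not shown.

```python
import numpy as np

def ray_line_integral(hit_distances, segment_mu):
    """Attenuation line integral along one ray.

    hit_distances : sorted distances (from the source) at which the ray
                    crosses object surfaces, including entry and exit.
    segment_mu    : linear attenuation coefficient (1/cm) of the material
                    between consecutive intersection points
                    (len == len(hit_distances) - 1).
    Returns sum(mu_i * path_length_i), the exponent in I = I0 * exp(-integral).
    """
    d = np.diff(np.asarray(hit_distances, dtype=float))
    return float(np.dot(np.asarray(segment_mu, dtype=float), d))

# Hypothetical ray crossing soft tissue, bone, soft tissue (distances in cm).
hits = [10.0, 14.0, 15.5, 20.0]
mu = [0.20, 0.50, 0.20]
integral = ray_line_integral(hits, mu)
print(integral, np.exp(-integral))      # transmitted fraction for this ray
```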

  14. Multimode simulations of a wide field of view double-Fourier far-infrared spatio-spectral interferometer

    NASA Astrophysics Data System (ADS)

    Bracken, Colm P.; Lightfoot, John; O'Sullivan, Creidhe; Murphy, J. Anthony; Donohoe, Anthony; Savini, Giorgio; Juanola-Parramon, Roser; The Fisica Consortium, On Behalf Of

    2018-01-01

    In the absence of 50-m class space-based observatories, subarcsecond astronomy spanning the full far-infrared wavelength range will require space-based long-baseline interferometry. Long baselines of up to tens of meters are necessary to achieve the subarcsecond resolution demanded by the science goals. Also, practical observing times demand a field of view approaching an arcminute (1') or so, not achievable with a single on-axis coherent detector. This paper is concerned with an application of the end-to-end instrument simulator PyFIInS, developed as part of the FISICA project under funding from the European Commission's Seventh Framework Programme for Research and Technological Development (FP7). Predicted results of wide-field-of-view spatio-spectral interferometry, obtained through simulations of a long-baseline, double-Fourier, far-infrared interferometer concept, are presented and analyzed. It is shown how such an interferometer, illuminated by a multimode detector, can recover a large field of view at subarcsecond angular resolution, resulting in image quality similar to that achieved by illuminating the system with an array of coherent detectors. Through careful analysis, the importance of accounting for the correct number of higher-order optical modes is demonstrated, as well as of accounting for both orthogonal polarizations. Given that it is very difficult to manufacture waveguide and feed structures at sub-mm wavelengths, the larger multimode design is recommended over the array of smaller single-mode detectors. A brief note is provided in the conclusion addressing a more elegant approach to modeling far-infrared interferometers, which holds promise for improving the computational efficiency of the simulations presented here.

  15. User’s guide and reference to Ash3d: a three-dimensional model for Eulerian atmospheric tephra transport and deposition

    USGS Publications Warehouse

    Mastin, Larry G.; Randall, Michael J.; Schwaiger, Hans F.; Denlinger, Roger P.

    2013-01-01

    Ash3d is a three-dimensional Eulerian atmospheric model for tephra transport, dispersal, and deposition, written by the authors to study and forecast hazards of volcanic ash clouds and tephra fall. In this report, we explain how to set up simulations using both a web interface and an ASCII input file, and how to view and interpret model output. We also summarize the architecture of the model and some of its properties.

  16. Simulation of image formation in x-ray coded aperture microscopy with polycapillary optics.

    PubMed

    Korecki, P; Roszczynialski, T P; Sowa, K M

    2015-04-06

    In x-ray coded aperture microscopy with polycapillary optics (XCAMPO), the microstructure of the focusing polycapillary optics is used as a coded aperture and enables depth-resolved x-ray imaging at a resolution better than the focal spot dimensions. Improvements in resolution and the development of 3D encoding procedures require a simulation model that can predict the outcome of XCAMPO experiments. In this work we introduce a model of image formation in XCAMPO which enables the calculation of XCAMPO datasets for arbitrary positions of the object relative to the focal plane and which can incorporate optics imperfections. In the model, the exit surface of the optics is treated as a micro-structured x-ray source that illuminates a periodic object. This makes it possible to express the intensity of XCAMPO images as a convolution series and to perform simulations by means of fast Fourier transforms. For non-periodic objects, the model can be applied by enforcing artificial periodicity and setting the spatial period larger than the field of view. Simulations are verified by comparison with experimental data.

  17. A Dynamic Simulation Model of Organizational Culture and Business Strategy Effects on Performance

    NASA Astrophysics Data System (ADS)

    Trivellas, Panagiotis; Reklitis, Panagiotis; Konstantopoulos, Nikolaos

    2007-12-01

    In the past two decades, the organizational culture literature has gained tremendous interest from both academics and practitioners. This is based not only on the suggestion that culture is related to performance, but also on the view that it is subject to direct managerial control and can be manipulated in a desired direction. In the present paper, we adopt the Competing Values Framework (CVF) to operationalise organizational culture and Porter's typology to conceptualize business strategy (cost leadership, innovative and marketing differentiation, and focus). Although the simulation of social events is a quite difficult task, since there are so many considerations (not all well understood) involved, in the present study we developed a dynamic model to simulate the effects of organizational culture and strategy on financial performance. Data obtained from a six-year survey in the banking sector of a European developing economy were used to develop the proposed dynamic model.

  18. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  19. Simulator sickness analysis of 3D video viewing on passive 3D TV

    NASA Astrophysics Data System (ADS)

    Brunnström, K.; Wang, K.; Andrén, B.

    2013-03-01

    The MPEG 3DV project is working on the next-generation video encoding standard, and as part of this process a call for proposals of encoding algorithms was issued. To evaluate these algorithms, a large-scale subjective test was performed involving laboratories all over the world. For the participating laboratories it was optional to administer a slightly modified Simulator Sickness Questionnaire (SSQ) from Kennedy et al. (1993) before and after the test. Here we report the results from one laboratory (Acreo), located in Sweden. The videos were shown on a 46-inch film-pattern-retarder 3D TV, with the viewers using passive polarized glasses to view the stereoscopic 3D video content. There were 68 viewers participating in this investigation, with ages ranging from 16 to 72 and one third of them female. The questionnaire was filled in before and after the test, with viewing times ranging from 30 minutes to about one and a half hours, comparable to a feature-length movie. The SSQ consists of 16 different symptoms that have been identified as important indicators of simulator sickness. When analyzing the individual symptoms, it was found that Fatigue, Eye-strain, Difficulty Focusing and Difficulty Concentrating were significantly worse after the test than before. The SSQ was also analyzed according to the model suggested by Kennedy et al. (1993). All in all, this investigation shows a statistically significant increase in symptoms after viewing 3D video, especially those related to the visual or oculomotor system.

  20. Stabilization of Model Membrane Systems by Disaccharides. Quasielastic Neutron Scattering Experiments and Atomistic Simulations

    NASA Astrophysics Data System (ADS)

    Doxastakis, Emmanouil; Garcia Sakai, Victoria; Ohtake, Satoshi; Maranas, Janna K.; de Pablo, Juan J.

    2006-03-01

    Trehalose, a disaccharide of glucose, is often used for the stabilization of cell membranes in the absence of water. This work studies the effects of trehalose on model membrane systems as they undergo a melting transition using a combination of experimental methods and atomistic molecular simulations. Quasielastic neutron scattering experiments on selectively deuterated samples provide the incoherent dynamic structure over a wide time range. Elastic scans probing the lipid tail dynamics display clear evidence of a main melting transition that is significantly lowered in the presence of trehalose. Lipid headgroup mobility is considerably restricted at high temperatures and directly associated with the dynamics of the sugar in the mixture. Molecular simulations provide a detailed overview of the dynamics and their spatial and time dependence. The combined simulation and experimental methodology offers a unique, molecular view of the physics of systems commonly employed in cryopreservation and lyophilization processes.

  1. Modeling Being "Lost": Imperfect Situation Awareness

    NASA Technical Reports Server (NTRS)

    Middleton, Victor E.

    2011-01-01

    Being "lost" is an exemplar of imperfect Situation Awareness/Situation Understanding (SA/SU) -- information/knowledge that is uncertain, incomplete, and/or just wrong. Being "lost" may be a geo-spatial condition - not knowing/being wrong about where to go or how to get there. More broadly, being "lost" can serve as a metaphor for uncertainty and/or inaccuracy - not knowing/being wrong about how one fits into a larger world view, what one wants to do, or how to do it. This paper discusses using agent based modeling (ABM) to explore imperfect SA/SU, simulating geo-spatially "lost" intelligent agents trying to navigate in a virtual world. Each agent has a unique "mental map" -- its idiosyncratic view of its geo-spatial environment. Its decisions are based on this idiosyncratic view, but behavior outcomes are based on ground truth. Consequently, the rate and degree to which an agent's expectations diverge from ground truth provide measures of that agent's SA/SU.
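    A toy version of this idea is sketched below: an agent navigates a grid using its own, partially wrong, mental map of obstacles, while movement outcomes are governed by ground truth; the number of times the map is contradicted serves as a crude SA/SU divergence measure. The grid size, error rate and greedy movement rule are all invented for illustration and are not the paper's actual agent-based model.

```python
import numpy as np

def simulate_lost_agent(size=15, error_rate=0.2, seed=3):
    """Agent navigates a grid toward a goal using its own 'mental map' of
    obstacles; movement outcomes come from ground truth. The number of
    'surprises' (mental map contradicted by ground truth) measures SA error."""
    rng = np.random.default_rng(seed)
    truth = rng.random((size, size)) < 0.25            # True = blocked (ground truth)
    truth[0, 0] = truth[-1, -1] = False
    flip = rng.random((size, size)) < error_rate       # cells the agent is wrong about
    belief = np.where(flip, ~truth, truth)             # idiosyncratic mental map
    pos, goal = (0, 0), (size - 1, size - 1)
    surprises = steps = 0
    while pos != goal and steps < 4 * size * size:
        steps += 1
        # greedy step: neighbour believed free that is closest to the goal
        cands = [(pos[0] + d[0], pos[1] + d[1])
                 for d in ((1, 0), (0, 1), (-1, 0), (0, -1))]
        cands = [c for c in cands
                 if 0 <= c[0] < size and 0 <= c[1] < size and not belief[c]]
        if not cands:
            break
        nxt = min(cands, key=lambda c: abs(goal[0] - c[0]) + abs(goal[1] - c[1]))
        if truth[nxt]:                                  # ground truth disagrees
            surprises += 1
            belief[nxt] = True                          # agent corrects its mental map
        else:
            pos = nxt
    return steps, surprises

print(simulate_lost_agent())
```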

  2. SU-E-J-234: Application of a Breathing Motion Model to ViewRay Cine MR Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Connell, D. P.; Thomas, D. H.; Dou, T. H.

    2015-06-15

    Purpose: A respiratory motion model previously used to generate breathing-gated CT images was used with cine MR images, and the accuracy and predictive ability of the in-plane models were evaluated. Methods: Sagittal-plane cine MR images of a patient undergoing treatment on a ViewRay MRI/radiotherapy system were acquired before and during treatment. Images were acquired at 4 frames/second with 3.5 × 3.5 mm resolution and a slice thickness of 5 mm. The first cine frame was deformably registered to the following frames. The superior/inferior component of the tumor centroid position was used as a breathing surrogate. Deformation vectors and surrogate measurements were used to determine the motion model parameters. Model error was evaluated, and subsequent treatment cines were predicted from breathing surrogate data. A simulated CT cine was created by generating breathing-gated volumetric images at 0.25-second intervals along the measured breathing trace, selecting a sagittal slice and downsampling to the resolution of the MR cines. A motion model was built using the first half of the simulated cine data, and model accuracy and the error in predicting the remaining frames of the cine were evaluated. Results: The mean difference between model-predicted and deformably registered lung tissue positions for the 28-second preview MR cine acquired before treatment was 0.81 +/- 0.30 mm. The model was used to predict two minutes of the subsequent treatment cine with a mean accuracy of 1.59 +/- 0.63 mm. Conclusion: In-plane motion models were built using MR cine images and evaluated for accuracy and the ability to predict future respiratory motion from breathing surrogate measurements. Examination of long-term predictive ability is ongoing. The technique was applied to simulated CT cines for further validation, and the authors are currently investigating the use of in-plane models to update pre-existing volumetric motion models used for the generation of breathing-gated CT planning images.
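    A heavily simplified form of such a surrogate-driven model is a per-voxel linear fit of displacement against the breathing surrogate, as sketched below on synthetic data. The linear form, array shapes and function names are assumptions made for illustration; the model referenced in the abstract may use a richer parameterization (for example, one that also includes the surrogate's time derivative).

```python
import numpy as np

def fit_motion_model(displacements, surrogate):
    """Fit a per-voxel linear model  d(t) = d0 + alpha * s(t)  by least squares.

    displacements : (n_frames, n_voxels) displacement component from registration
    surrogate     : (n_frames,) breathing surrogate (e.g. SI tumour centroid)
    Returns (d0, alpha), each of shape (n_voxels,).
    """
    A = np.column_stack([np.ones_like(surrogate), surrogate])   # design matrix
    coef, *_ = np.linalg.lstsq(A, displacements, rcond=None)
    return coef[0], coef[1]

def predict(d0, alpha, surrogate_new):
    """Predict displacements for new surrogate samples."""
    return d0[None, :] + np.outer(surrogate_new, alpha)

# Hypothetical data: 100 training frames, 500 voxels, sinusoidal breathing.
rng = np.random.default_rng(0)
s = np.sin(np.linspace(0, 20 * np.pi, 100))
true_alpha = rng.normal(0, 2.0, 500)
d = np.outer(s, true_alpha) + rng.normal(0, 0.1, (100, 500))
d0, alpha = fit_motion_model(d, s)
pred = predict(d0, alpha, np.array([0.5, -0.5]))
print(pred.shape)          # (2, 500) predicted voxel displacements
```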

  3. Damping capacity of a sealed squeeze film bearing

    NASA Technical Reports Server (NTRS)

    Dede, M. M.; Dogan, M.; Holmes, R.

    1984-01-01

    The advantages of incorporating an open-ended or weakly-sealed squeeze-film bearing in a flexible support structure simulating an aero-engine assembly were examined. Attention is given to empirically modelling the hydrodynamics of the more usual tightly-sealed squeeze-film bearing, with a view to assessing its damping performance.

  4. A PHYSICAL MODEL FOR THE SIMULATION OF BIOTURBATION AND ITS COMPARISON TO EXPERIMENTS WITH OLIGOCHAETES. (R825513C011)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  5. SIMULATION MODEL RESULTS OF LYME DISEASE INFECTIVITY RATES IN VERTEBRATE HOSTS UNDER CLIMATE CHANGE SCENARIOS. (R824995)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. Virtual DRI dataset development

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Teaney, Brian P.; May, Christopher; Maurer, Tana; Nelson, Michael B.; Pham, Justin R.

    2017-05-01

    The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military analysis community for sensor design, trade studies, and field performance prediction. This paper analyzes the results of perception tests performed to compare the results of a field DRI (detection, recognition, and identification) test performed in 2009 with current Soldier performance when viewing the same imagery in a laboratory environment and when viewing simulated imagery of the same data set. The purpose of the experiment is to build a robust data set for use in the virtual prototyping of infrared sensors. This data set will provide a strong foundation relating model predictions, field DRI results and simulated imagery.

  7. Appropriate Arrangement of Nori Aquafarming Grounds in the Ariake Sea on the Basis of Convective Dispersion Simulation Model

    NASA Astrophysics Data System (ADS)

    Tabata, Toshinori; Hiramatsu, Kazuaki; Harada, Masayoshi; Shiraishi, Hideto; Shuto, Toshio

    This study investigated the appropriate arrangement of nori aquafarming grounds from the viewpoint of nori growth in the coastal waters of the Ariake Sea. Databases of the sea-bed topography and nori aquafarming grounds were constructed using GIS. The tidal currents and salinity in the Ariake Sea were then simulated using a two-dimensional depth-integrated model, which was developed by integrating the three-dimensional continuity, momentum, and diffusion equations. A wetting and drying scheme was also introduced to account for the appearance and disappearance of tidal flats. The velocities and directions of the simulated tidal currents, the salinity, and the appearance of tidal land were in good agreement with observed data. Five scenarios considered by the Fukuoka Prefectural Government were introduced in the simulation model to identify the most appropriate arrangement. An experimental formula for nitrogen assimilation kinetics in the nori body was introduced to evaluate the simulation results for the five scenarios. The scenarios with a reduced density of aquafarming grounds showed increased nori growth, suggesting that the arrangement of the aquafarming grounds affects nori growth. The simulation results were used to identify the most appropriate arrangement of aquafarming grounds.

  8. An Efficient Approach to Modeling the Topographic Control of Surface Hydrology for Regional and Global Climate Modeling.

    NASA Astrophysics Data System (ADS)

    Stieglitz, Marc; Rind, David; Famiglietti, James; Rosenzweig, Cynthia

    1997-01-01

    The current generation of land-surface models used in GCMs views the soil column as the fundamental hydrologic unit. While this may be effective in simulating such processes as the evolution of ground temperatures and the growth/ablation of a snowpack at the soil plot scale, it effectively ignores the role topography plays in the development of soil moisture heterogeneity and the subsequent impacts of this heterogeneity on watershed evapotranspiration and the partitioning of surface fluxes. This view also ignores the role topography plays in the timing of discharge and the partitioning of discharge into surface runoff and baseflow. In this paper an approach to land-surface modeling is presented that allows us to view the watershed as the fundamental hydrologic unit. The analytic form of the TOPMODEL equations is incorporated into the soil column framework, and the resulting model is used to predict the saturated fraction of the watershed and baseflow in a consistent fashion. Soil moisture heterogeneity, represented by saturated lowlands, subsequently impacts the partitioning of surface fluxes, including evapotranspiration and runoff. The approach is computationally efficient, allows for a greatly improved simulation of the hydrologic cycle, and is easily coupled into the existing framework of the current generation of single-column land-surface models. Because this approach uses the statistics of the topography rather than the details of the topography, it is compatible with the large spatial scales of today's regional and global climate models. Five years of meteorological and hydrological data from the Sleepers River watershed, located in the northeastern United States where winter snow cover is significant, were used to drive the new model. Site validation data were sufficient to evaluate model performance with regard to various aspects of the watershed water balance, including snowpack growth/ablation, the spring snowmelt hydrograph, storm hydrographs, and the seasonal development of watershed evapotranspiration and soil moisture.
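
    A minimal sketch (not from the paper) of the TOPMODEL-style bookkeeping referred to above: the saturated fraction of a watershed is computed from the distribution of the topographic index ln(a/tanB) and a mean water-table deficit, and baseflow follows an exponential recession. The decay parameter, deficit value, and the synthetic gamma-distributed index below are illustrative assumptions.

```python
import numpy as np

def saturated_fraction(topo_index, mean_deficit, f=2.5):
    """Fraction of watershed cells that are saturated under TOPMODEL-style assumptions.

    topo_index  : array of ln(a / tan(beta)) values, one per grid cell
    mean_deficit: watershed-mean water-table deficit z_bar (m)
    f           : assumed decay parameter of transmissivity with depth (1/m)
    """
    lam_bar = topo_index.mean()
    # A cell is saturated when its local deficit z_i = z_bar - (lam_i - lam_bar)/f <= 0
    local_deficit = mean_deficit - (topo_index - lam_bar) / f
    return np.mean(local_deficit <= 0.0)

def baseflow(mean_deficit, q0=1e-3, f=2.5):
    """Exponential baseflow recession, Q_b = Q0 * exp(-f * z_bar)."""
    return q0 * np.exp(-f * mean_deficit)

# Example with a synthetic topographic-index distribution (placeholder data)
rng = np.random.default_rng(0)
lam = rng.gamma(shape=3.0, scale=2.0, size=10_000)
print(saturated_fraction(lam, mean_deficit=0.5), baseflow(0.5))
```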

  9. A Co-modeling Method Based on Component Features for Mechatronic Devices in Aero-engines

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Zhao, Haocen; Ye, Zhifeng

    2017-08-01

    Data-fused and user-friendly design of aero-engine accessories is required because of their structural complexity and stringent reliability requirements. This paper gives an overview of a typical aero-engine control system and the development process of the key mechatronic devices used. Several essential aspects of modeling and simulation in the process are investigated. Considering the limitations of a single theoretic model, a feature-based co-modeling methodology is suggested to satisfy the design requirements and compensate for the diversity of component sub-models for these devices. As an example, a stepper-motor-controlled Fuel Metering Unit (FMU) is modeled in view of the component physical features using two different software tools. An interface is suggested to integrate the single-discipline models into the synthesized one. Performance simulation of this device using the co-model and parameter optimization for its key components are discussed. Comparison between delivery testing and the simulation shows that the co-model for the FMU has high accuracy and a clear superiority over a single model. Together with its compatible interface with the engine mathematical model, the feature-based co-modeling methodology is proven to be an effective technical measure in the development process of the device.

  10. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  11. The Photochemical Reflectance Index from Directional Cornfield Reflectances: Observations and Simulations

    NASA Technical Reports Server (NTRS)

    Cheng, Yen-Ben; Middleton, Elizabeth M.; Zhang, Qingyuan; Corp, Lawrence A.; Dandois, Jonathan; Kustas, William P.

    2012-01-01

    The two-layer Markov chain Analytical Canopy Reflectance Model (ACRM) was linked with in situ hyperspectral leaf optical properties to simulate the Photochemical Reflectance Index (PRI) for a corn crop canopy at three different growth stages. This is an extended study following a successful demonstration of PRI simulations for a cornfield previously conducted at an early vegetative growth stage. Consistent with previous in situ studies, sunlit leaves exhibited lower PRI values than shaded leaves. Since sunlit (shaded) foliage dominates the canopy in the reflectance hotspot (coldspot), the canopy PRI derived from field hyperspectral observations displayed sensitivity to both view zenith angle and relative azimuth angle at all growth stages. Consequently, sunlit and shaded canopy sectors were most differentiated when viewed along the azimuth matching the solar principal plane. These directional PRI responses associated with sunlit/shaded foliage were successfully reproduced by the ACRM. As before, the simulated PRI values from the current study were closer to in situ values when both sunlit and shaded leaves were utilized as model input data in a two-layer mode, instead of a one-layer mode with sunlit leaves only. Model performance as judged by correlation between in situ and simulated values was strongest for the mature corn crop (r = 0.87, RMSE = 0.0048), followed by the early vegetative stage (r = 0.78; RMSE = 0.0051) and the early senescent stage (r = 0.65; RMSE = 0.0104). Since the benefit of including shaded leaves in the scheme varied across different growth stages, a further analysis was conducted to investigate how variable fractions of sunlit/shaded leaves affect the canopy PRI values expected for a cornfield, with implications for remote sensing monitoring options. Simulations of the sunlit to shaded canopy ratio near 50/50 +/- 10 (e.g., 60/40), matching field observations at all growth stages, were examined. Our results suggest the importance of the sunlit/shaded fraction and canopy structure in understanding and interpreting PRI.
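
    The PRI discussed above is conventionally computed from reflectance at 531 nm (xanthophyll-sensitive band) and 570 nm (reference band). The sketch below assumes a simple nearest-band lookup on a hyperspectral reflectance array; the synthetic spectrum is only a placeholder, not data from the study.

```python
import numpy as np

def pri(wavelengths, reflectance, band_signal=531.0, band_ref=570.0):
    """Photochemical Reflectance Index: (R531 - R570) / (R531 + R570).

    wavelengths : 1-D array of band centers (nm)
    reflectance : reflectance values, last axis aligned with wavelengths
    """
    i_sig = np.argmin(np.abs(wavelengths - band_signal))
    i_ref = np.argmin(np.abs(wavelengths - band_ref))
    r_sig = reflectance[..., i_sig]
    r_ref = reflectance[..., i_ref]
    return (r_sig - r_ref) / (r_sig + r_ref)

# Example with a crude synthetic canopy spectrum (400-1000 nm, 1 nm steps)
wl = np.arange(400, 1001, 1.0)
spectrum = 0.05 + 0.4 / (1.0 + np.exp(-(wl - 700) / 10))  # simple red-edge shape
print(pri(wl, spectrum))
```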

  12. Challenges in Visual Analysis of Ensembles

    DOE PAGES

    Crossno, Patricia

    2018-04-12

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  13. Design and realization of retina-like three-dimensional imaging based on a MOEMS mirror

    NASA Astrophysics Data System (ADS)

    Cao, Jie; Hao, Qun; Xia, Wenze; Peng, Yuxin; Cheng, Yang; Mu, Jiaxing; Wang, Peng

    2016-07-01

    To balance the conflicting requirements of high-resolution, large-field-of-view, and real-time imaging, a retina-like imaging method based on time-of-flight (TOF) is proposed. Mathematical models of 3D imaging based on a MOEMS mirror are developed. Based on this method, we perform simulations of retina-like scanning properties, including compression of redundant information and rotation and scaling invariance. To validate the theory, we develop a prototype and conduct relevant experiments. The preliminary results agree well with the simulations.

  14. Hierarchical Modelling Of Mobile, Seeing Robots

    NASA Astrophysics Data System (ADS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1990-03-01

    This paper describes the implementation of a hierarchical robot simulation which supports the design of robots with vision and mobility. A seeing robot applies a classification expert system for visual identification of laboratory objects. The visual data acquisition algorithm used by the robot vision system has been developed to exploit multiple viewing distances and perspectives. Several different simulations have been run testing the visual logic in a laboratory environment. Much work remains to integrate the vision system with the rest of the robot system.

  15. Hierarchical modelling of mobile, seeing robots

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1990-01-01

    This paper describes the implementation of a hierarchical robot simulation which supports the design of robots with vision and mobility. A seeing robot applies a classification expert system for visual identification of laboratory objects. The visual data acquisition algorithm used by the robot vision system has been developed to exploit multiple viewing distances and perspectives. Several different simulations have been run testing the visual logic in a laboratory environment. Much work remains to integrate the vision system with the rest of the robot system.

  16. Engine-Level Simulation of Liquid Rocket Combustion Instabilities: Transcritical Combustion Simulations in Single Injector Configurations

    DTIC Science & Technology

    2012-03-01

    [Fragmentary record text.] The combustion chemistry uses a simple 1-step mechanism accounting for 4 species: CH4, O2, CO2, and H2O. A figure caption describes the multiblock grid for the CVRC experiment (overall and close-up views). The remaining fragments concern supercritical (and subcritical) fluid behavior and modeling (drops, streams, shear and mixing layers, jets, and sprays) and cite work on hydrogen shear-coaxial jet flames at supercritical pressure.

  17. Challenges in Visual Analysis of Ensembles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossno, Patricia

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  18. 19. Interior view showing flight simulator partition and rear overhead ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. Interior view showing flight simulator partition and rear overhead door, dock no. 493. View to south. - Offutt Air Force Base, Looking Glass Airborne Command Post, Nose Docks, On either side of Hangar Access Apron at Northwest end of Project Looking Glass Historic District, Bellevue, Sarpy County, NE

  19. Development of a positive corona from a long grounded wire in a growing thunderstorm field

    NASA Astrophysics Data System (ADS)

    Mokrov, M. S.; Raizer, Yu P.; Bazelyan, E. M.

    2013-11-01

    The properties of a non-stationary corona initiated from a long grounded wire suspended horizontally above the ground and coronating in a slowly varying thundercloud electric field are studied. A two-dimensional (2D) model of the corona is developed. On the basis of this model, characteristics of the corona produced by a lightning protection wire are calculated under thunderstorm conditions. The corona characteristics are also found by using approximate analytical and quasi-one-dimensional numerical models. The results of these models agree reasonably well with those obtained from the 2D simulation. This allows one to estimate the corona parameters without recourse to the cumbersome simulation. This work was performed with a view to study the efficiency of lightning protection wires later on.

  20. Computer-based simulation of the Bielschowsky head-tilt test using the SEE++ software system.

    PubMed

    Kaltofen, Thomas; Buchberger, Michael; Priglinger, Siegfried

    2008-01-01

    Latest measurements of the vestibulo-ocular reflex (VOR) allowed the integration of the simulation of the Bielschowsky head-tilt test (BHTT) into the SEE++ software system. SEE++ realizes a biomechanical model of the human eye in order to simulate eye motility disorders and strabismus surgeries. With the addition of the BHTT it can now also be used for differential-diagnostic simulations of complex disorders (e.g., superior oblique palsies). In order to simulate the BHTT in SEE++, the user can freely choose the desired head-tilt angle from -45 degrees to +45 degrees. The chosen angle is shown in the 3D view with a human body model and is also used in the calculation of the Hess-Lancaster test. The integration of the BHTT offers an additional improvement of the possibilities for simulating eye motility disorders. Moreover, SEE++ allows the creation of a video of the "virtual patient" while tilting the head from one side to the other, which shows dynamic changes in the simulated Hess-diagrams. Comparisons of simulation results with patient-measured data showed a good correlation between the simulated and the measured data. Further comparisons with patient data are planned.

  1. Adams-Based Rover Terramechanics and Mobility Simulator - ARTEMIS

    NASA Technical Reports Server (NTRS)

    Trease, Brian P.; Lindeman, Randel A.; Arvidson, Raymond E.; Bennett, Keith; VanDyke, Lauren P.; Zhou, Feng; Iagnemma, Karl; Senatore, Carmine

    2013-01-01

    The Mars Exploration Rovers (MERs), Spirit and Opportunity, far exceeded their original drive distance expectations and have traveled, at the time of this reporting, a combined 29 kilometers across the surface of Mars. The Rover Sequencing and Visualization Program (RSVP), the current program used to plan drives for MERs, is only a kinematic simulator of rover movement. Therefore, rover response to various terrains and soil types cannot be modeled. Although sandbox experiments attempt to model rover-terrain interaction, these experiments are time-intensive and costly, and they cannot be used within the tactical timeline of rover driving. Imaging techniques and hazard avoidance features on MER help to prevent the rover from traveling over dangerous terrains, but mobility issues have shown that these methods are not always sufficient. ARTEMIS, a dynamic modeling tool for MER, allows planned drives to be simulated before commands are sent to the rover. The deformable soils component of this model allows rover-terrain interactions to be simulated to determine if a particular drive path would take the rover over terrain that would induce hazardous levels of slip or sink. When used in the rover drive planning process, dynamic modeling reduces the likelihood of future mobility issues because high-risk areas could be identified before drive commands are sent to the rover, and drives planned over these areas could be rerouted. The ARTEMIS software consists of several components. These include a preprocessor, Digital Elevation Models (DEMs), Adams rover model, wheel and soil parameter files, MSC Adams GUI (commercial), MSC Adams dynamics solver (commercial), terramechanics subroutines (FORTRAN), a contact detection engine, a soil modification engine, and output DEMs of deformed soil. The preprocessor is used to define the terrain (from a DEM) and define the soil parameters for the terrain file. The Adams rover model is placed in this terrain. Wheel and soil parameter files can be altered in the respective text files. The rover model and terrain are viewed in Adams View, the GUI for ARTEMIS. The Adams dynamics solver calls terramechanics subroutines in FORTRAN containing the Bekker-Wong equations.

  2. A regularized vortex-particle mesh method for large eddy simulation

    NASA Astrophysics Data System (ADS)

    Spietz, H. J.; Walther, J. H.; Hejlesen, M. M.

    2017-11-01

    We present recent developments of the remeshed vortex particle-mesh method for simulating incompressible fluid flow. The presented method relies on a parallel, higher-order, FFT-based solver for the Poisson equation. Arbitrarily high order is achieved through regularization of singular Green's function solutions to the Poisson equation, and recently we have derived novel high-order solutions for a mixture of open and periodic domains. With this approach the simulated variables may formally be viewed as the approximate solution to the filtered Navier-Stokes equations; hence we use the method for Large Eddy Simulation by including a dynamic subfilter-scale model based on test filters compatible with the aforementioned regularization functions. Further, the subfilter-scale model uses Lagrangian averaging, which is a natural candidate in light of the Lagrangian nature of vortex particle methods. A multiresolution variation of the method is applied to simulate the benchmark problem of the flow past a square cylinder at Re = 22000, and the obtained results are compared to results from the literature.

  3. Knowledge Diffusion on Networks through the Game Strategy

    NASA Astrophysics Data System (ADS)

    Sun, Shu; Wu, Jiangning; Xuan, Zhaoguo

    In this paper, we develop a knowledge diffusion model in which agents decide whether to give their knowledge to others according to exchange strategies. A typical network, namely a small-world network, is used for modeling, in which agents with knowledge are viewed as the nodes of the network and the edges are viewed as the social relationships for knowledge transmission. Agents interact repeatedly with their neighbors, who have direct connections with them, and accordingly change their strategies by choosing the most beneficial neighbors to diffuse knowledge. Two kinds of knowledge transmission strategies are proposed for the theoretical model based on game theory and are then used in different simulations to examine the effect of the network structure on knowledge diffusion. The analyses yield two main observations: first, the simulation results run contrary to the intuition that agents maximize their benefit by only accepting knowledge and never sharing it; second, the number of agents that acquire knowledge and the corresponding knowledge stock turn out to be independent of the percentage of agents who choose to contribute their knowledge.
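
    A toy sketch of the kind of model described above, assuming a Watts-Strogatz small-world graph, a binary share/hoard strategy per agent, and a simple partial-transfer rule; none of these details are taken from the paper.

```python
import random
import networkx as nx

def simulate(n=200, k=4, p=0.1, share_fraction=0.5, steps=50, seed=1):
    """Toy knowledge-diffusion game on a small-world (Watts-Strogatz) network.

    Each agent either 'shares' or 'hoards'. In every step, a random agent
    interacts with a random neighbor; knowledge passes only from sharers.
    """
    random.seed(seed)
    g = nx.watts_strogatz_graph(n, k, p, seed=seed)
    shares = {i: random.random() < share_fraction for i in g}
    knowledge = {i: 1.0 if i == 0 else 0.0 for i in g}  # one seeded agent

    for _ in range(steps * n):
        a = random.randrange(n)
        neighbors = list(g[a])
        if not neighbors:
            continue
        b = random.choice(neighbors)
        if shares[b] and knowledge[b] > knowledge[a]:
            knowledge[a] += 0.5 * (knowledge[b] - knowledge[a])  # partial transfer

    informed = sum(1 for v in knowledge.values() if v > 0.01)
    return informed, sum(knowledge.values())

print(simulate())
```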

  4. Global exponential periodicity and stability of discrete-time complex-valued recurrent neural networks with time-delays.

    PubMed

    Hu, Jin; Wang, Jun

    2015-06-01

    In recent years, complex-valued recurrent neural networks have been developed and analysed in depth because they have good modelling performance for applications involving complex-valued elements. In implementing continuous-time dynamical systems for simulation or computational purposes, it is often necessary to utilize a discrete-time model that is an analogue of the continuous-time system. In this paper, we analyse a discrete-time complex-valued recurrent neural network model and obtain sufficient conditions for its global exponential periodicity and exponential stability. Simulation results for several numerical examples are presented to illustrate the theoretical results, and an application to associative memory is also given. Copyright © 2015 Elsevier Ltd. All rights reserved.
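
    A minimal numerical sketch of a discrete-time complex-valued recurrent update, assuming a tanh-type activation applied separately to real and imaginary parts and a leak term; the specific model form and parameters are illustrative, not the ones analysed in the paper.

```python
import numpy as np

def step(z, W, b, tau=0.9):
    """One discrete-time update of a complex-valued recurrent network.

    The activation is applied separately to real and imaginary parts
    (one common choice for complex-valued networks); tau is a leak term.
    """
    s = W @ z + b
    f = np.tanh(s.real) + 1j * np.tanh(s.imag)
    return tau * z + (1.0 - tau) * f

rng = np.random.default_rng(0)
n = 5
W = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / (2 * np.sqrt(n))
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z = np.zeros(n, dtype=complex)
for _ in range(200):          # iterate toward a fixed point or periodic orbit
    z = step(z, W, b)
print(np.round(z, 3))
```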

  5. Pececillo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, Neil; Jibben, Zechariah; Brady, Peter

    2017-06-28

    Pececillo is a proxy-app for the open source Truchas metal processing code (LA-CC-15-097). It implements many of the physics models used in Truchas: free-surface, incompressible Navier-Stokes fluid dynamics (e.g., water waves); heat transport, material phase change, view factor thermal radiation; species advection-diffusion; quasi-static, elastic/plastic solid mechanics with contact; electomagnetics (Maxwell's equations). The models are simplified versions that retain the fundamental computational complexity of the Truchas models while omitting many non-essential features and modeling capabilities. The purpose is to expose Truchas algorithms in a greatly simplified context where computer science problems related to parallel performance on advanced architectures can be moremore » easily investigated. While Pececillo is capable of performing simulations representative of typical Truchas metal casting, welding, and additive manufacturing simulations, it lacks many of the modeling capabilites needed for real applications.« less

  6. SU-E-I-58: Objective Models of Breast Shape Undergoing Mammography and Tomosynthesis Using Principal Component Analysis.

    PubMed

    Feng, Ssj; Sechopoulos, I

    2012-06-01

    To develop an objective model of the shape of the compressed breast undergoing mammographic or tomosynthesis acquisition. Automated thresholding and edge detection were performed on 984 anonymized digital mammograms (492 craniocaudal (CC) view mammograms and 492 mediolateral oblique (MLO) view mammograms) to extract the edge of each breast. Principal Component Analysis (PCA) was performed on these edge vectors to identify a limited set of parameters and eigenvectors that describe the principal modes of variation in breast shape. These parameters and eigenvectors comprise a model that can be used to describe the breast shapes present in acquired mammograms and to generate realistic models of breasts undergoing acquisition. Sample breast shapes were then generated from this model and evaluated. The mammograms in the database were previously acquired for a separate study and authorized for use in further research. The PCA successfully identified two principal components and their corresponding eigenvectors, forming the basis for the breast shape model. The simulated breast shapes generated from the model are reasonable approximations of clinically acquired mammograms. Using PCA, we have obtained models of the compressed breast undergoing mammographic or tomosynthesis acquisition based on objective analysis of a large image database. Up to now, the breast in the CC view has been approximated as a semi-circular tube, while there has been no objectively obtained model for the MLO view breast shape. Such models can be used for various breast imaging research applications, such as x-ray scatter estimation and correction, dosimetry estimates, and computer-aided detection and diagnosis. © 2012 American Association of Physicists in Medicine.
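
    A short sketch of the PCA workflow described above, assuming each breast edge has already been extracted and resampled to a fixed-length coordinate vector; the synthetic edge data stand in for the extracted mammogram edges.

```python
import numpy as np

def fit_shape_model(edges):
    """PCA model of compressed-breast edge shapes.

    edges : (n_images, n_points) array of resampled edge coordinates,
            one row per mammogram (illustrative representation).
    """
    mean = edges.mean(axis=0)
    u, s, vt = np.linalg.svd(edges - mean, full_matrices=False)
    return mean, vt, s / np.sqrt(len(edges) - 1)   # mean, eigenvectors, std devs

def sample_shape(mean, eigvecs, stds, n_components=2, rng=None):
    """Generate a plausible new edge by perturbing along the leading components."""
    rng = rng or np.random.default_rng()
    coeffs = rng.standard_normal(n_components) * stds[:n_components]
    return mean + coeffs @ eigvecs[:n_components]

# Synthetic stand-in for the extracted edge vectors
rng = np.random.default_rng(0)
edges = rng.standard_normal((492, 100)).cumsum(axis=1)
mean, vecs, stds = fit_shape_model(edges)
print(sample_shape(mean, vecs, stds, rng=rng).shape)
```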

  7. Videopanorama Frame Rate Requirements Derived from Visual Discrimination of Deceleration During Simulated Aircraft Landing

    NASA Technical Reports Server (NTRS)

    Furnstenau, Norbert; Ellis, Stephen R.

    2015-01-01

    In order to determine the required visual frame rate (FR) for minimizing prediction errors with out-the-window video displays at remote/virtual airport towers, thirteen active air traffic controllers viewed high dynamic fidelity simulations of landing aircraft and decided whether the aircraft would stop, as if able to make a turnoff, or whether a runway excursion would be expected. The viewing conditions and simulation dynamics replicated visual rates and environments of transport aircraft landing at small commercial airports. The required frame rate was estimated using Bayes inference on prediction errors by linear FR extrapolation of event probabilities conditional on predictions (stop, no-stop). Further estimates were obtained from exponential model fits to the parametric and non-parametric perceptual discriminabilities d' and A (average area under ROC curves) as functions of FR. Decision errors are biased towards a preference for overshoot and appear to be due to an illusory increase in speed at low frame rates. Both the Bayes and A extrapolations yield a frame rate requirement of 35 < FRmin < 40 Hz. When compared with published results [12] on shooter game scores, the model-based d'(FR) extrapolation exhibits the best agreement and indicates an even higher FRmin > 40 Hz for minimizing decision errors. Definitive recommendations require further experiments with FR > 30 Hz.
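
    A hedged sketch of the exponential-fit extrapolation described above, assuming a saturating form d'(FR) = d_inf * (1 - exp(-FR/tau)) and a 95%-of-asymptote criterion for FRmin; the frame rates and d' values below are made up for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def dprime_model(fr, d_inf, tau):
    """Saturating exponential for discriminability vs. frame rate (assumed form)."""
    return d_inf * (1.0 - np.exp(-fr / tau))

# Illustrative (made-up) frame rates and measured d' values
fr = np.array([7.5, 15.0, 30.0])
dp = np.array([0.9, 1.4, 1.8])

(d_inf, tau), _ = curve_fit(dprime_model, fr, dp, p0=(2.0, 10.0))

# Frame rate at which d' reaches 95% of its asymptote (one possible criterion)
fr_min = -tau * np.log(1.0 - 0.95)
print(f"d_inf={d_inf:.2f}, tau={tau:.1f} Hz, FR_min~{fr_min:.0f} Hz")
```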

  8. A simulation model for analysing brain structure deformations.

    PubMed

    Di Bona, Sergio; Lutzemberger, Ludovico; Salvetti, Ovidio

    2003-12-21

    Recent developments in medical software applications, from the simulation to the planning of surgical operations, have revealed the need for modelling human tissues and organs not only from a geometric point of view but also from a physical one, i.e. soft tissues, rigid body, viscoelasticity, etc. This has given rise to the term 'deformable objects', which refers to objects with a morphology and a physical and mechanical behaviour of their own that reflect their natural properties. In this paper, we propose a model, based upon physical laws, suitable for the realistic manipulation of geometric reconstructions of volumetric data taken from MR and CT scans. In particular, a physically based model of the brain is presented that is able to simulate the evolution of intra-cranial pathological phenomena of different natures, such as haemorrhages, neoplasms, haematomas, etc., and to describe the consequences of their volume expansions and the influences they have on the anatomical and neuro-functional structures of the brain.

  9. Neural dynamics of object-based multifocal visual spatial attention and priming: Object cueing, useful-field-of-view, and crowding

    PubMed Central

    Foley, Nicholas C.; Grossberg, Stephen; Mingolla, Ennio

    2015-01-01

    How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations? What factors underlie individual differences in the timing and frequency of such attentional shifts? How do transient and sustained spatial attentional mechanisms work and interact? How can volition, mediated via the basal ganglia, influence the span of spatial attention? A neural model is developed of how spatial attention in the where cortical stream coordinates view-invariant object category learning in the what cortical stream under free viewing conditions. The model simulates psychological data about the dynamics of covert attention priming and switching requiring multifocal attention without eye movements. The model predicts how “attentional shrouds” are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support invariant object category learning, and active surface-shroud resonances support conscious surface perception and recognition. Attentive competition between multiple objects and cues simulates reaction-time data from the two-object cueing paradigm. The relative strength of sustained surface-driven and fast-transient motion-driven spatial attention controls individual differences in reaction time for invalid cues. Competition between surface-driven attentional shrouds controls individual differences in detection rate of peripheral targets in useful-field-of-view tasks. The model proposes how the strength of competition can be mediated, though learning or momentary changes in volition, by the basal ganglia. A new explanation of crowding shows how the cortical magnification factor, among other variables, can cause multiple object surfaces to share a single surface-shroud resonance, thereby preventing recognition of the individual objects. PMID:22425615

  10. Neural dynamics of object-based multifocal visual spatial attention and priming: object cueing, useful-field-of-view, and crowding.

    PubMed

    Foley, Nicholas C; Grossberg, Stephen; Mingolla, Ennio

    2012-08-01

    How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations? What factors underlie individual differences in the timing and frequency of such attentional shifts? How do transient and sustained spatial attentional mechanisms work and interact? How can volition, mediated via the basal ganglia, influence the span of spatial attention? A neural model is developed of how spatial attention in the where cortical stream coordinates view-invariant object category learning in the what cortical stream under free viewing conditions. The model simulates psychological data about the dynamics of covert attention priming and switching requiring multifocal attention without eye movements. The model predicts how "attentional shrouds" are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support invariant object category learning, and active surface-shroud resonances support conscious surface perception and recognition. Attentive competition between multiple objects and cues simulates reaction-time data from the two-object cueing paradigm. The relative strength of sustained surface-driven and fast-transient motion-driven spatial attention controls individual differences in reaction time for invalid cues. Competition between surface-driven attentional shrouds controls individual differences in detection rate of peripheral targets in useful-field-of-view tasks. The model proposes how the strength of competition can be mediated, though learning or momentary changes in volition, by the basal ganglia. A new explanation of crowding shows how the cortical magnification factor, among other variables, can cause multiple object surfaces to share a single surface-shroud resonance, thereby preventing recognition of the individual objects. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Application of digital human modeling and simulation for vision analysis of pilots in a jet aircraft: a case study.

    PubMed

    Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati

    2012-01-01

    Ergonomic evaluation of visual demands becomes crucial for operators/users when rapid decision making is needed under extreme time constraints, as in the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of a pilot's vision in a jet aircraft in a virtual environment, to demonstrate how the vision analysis tools of digital human modeling software can be used effectively for such a study. Three (03) dynamic digital pilot models, representative of the smallest, average, and largest Indian pilot populations, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools like view cones, eye view windows, blind spot area, obscuration zone, reflection zone etc. were employed during evaluation of the visual fields. The vision analysis tool was also used for studying kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tool of digital human modeling software was found very effective in evaluating the position and alignment of different displays and controls in the workstation based upon their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.

  12. Analytical and experimental study of control effort associated with model reference adaptive control

    NASA Technical Reports Server (NTRS)

    Messer, R. S.; Haftka, R. T.; Cudney, H. H.

    1992-01-01

    Numerical simulation results presently obtained for the performance of model reference adaptive control (MRAC) are experimentally verified, with a view to accounting for differences between the plant and the reference model after the control function has been brought to bear. MRAC is both experimentally and analytically applied to a single-degree-of-freedom system, as well as analytically to a MIMO system having controlled differences between the reference model and the plant. The control effort is noted to be sensitive to differences between the plant and the reference model.
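
    A minimal sketch of MIT-rule model reference adaptive control for a first-order plant, to illustrate the kind of plant/reference-model mismatch study described above; the plant, reference model, adaptation gain, and regressor approximation are illustrative assumptions, not the systems used in the paper.

```python
import numpy as np

def simulate_mrac(gamma=0.5, dt=0.01, t_end=20.0):
    """MIT-rule model reference adaptive control of a first-order plant (sketch).

    Plant:     dy/dt  = -a*y  + b*u          (a, b unknown to the controller)
    Reference: dym/dt = -am*ym + am*r
    Law:       u = theta * r,  dtheta/dt = -gamma * e * ym   (MIT-rule approximation)
    """
    a, b, am = 1.0, 0.5, 2.0
    y = ym = theta = 0.0
    t = np.arange(0.0, t_end, dt)
    err = np.zeros_like(t)
    for i, ti in enumerate(t):
        r = 1.0 if (ti % 10) < 5 else -1.0          # square-wave reference
        u = theta * r
        y += dt * (-a * y + b * u)
        ym += dt * (-am * ym + am * r)
        e = y - ym
        theta += dt * (-gamma * e * ym)             # gradient-descent adaptation
        err[i] = e
    return err

print(abs(simulate_mrac()[-100:]).mean())           # tracking error near the end
```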

  13. Thin plate spline feature point matching for organ surfaces in minimally invasive surgery imaging

    NASA Astrophysics Data System (ADS)

    Lin, Bingxiong; Sun, Yu; Qian, Xiaoning

    2013-03-01

    Robust feature point matching for images with large view angle changes in Minimally Invasive Surgery (MIS) is a challenging task due to low texture and specular reflections in these images. This paper presents a new approach that can improve feature matching performance by exploiting the inherent geometric property of the organ surfaces. Recently, intensity based template image tracking using a Thin Plate Spline (TPS) model has been extended for 3D surface tracking with stereo cameras. The intensity based tracking is also used here for 3D reconstruction of internal organ surfaces. To overcome the small displacement requirement of intensity based tracking, feature point correspondences are used for proper initialization of the nonlinear optimization in the intensity based method. Second, we generate simulated images from the reconstructed 3D surfaces under all potential view positions and orientations, and then extract feature points from these simulated images. The obtained feature points are then filtered and re-projected to the common reference image. The descriptors of the feature points under different view angles are stored to ensure that the proposed method can tolerate a large range of view angles. We evaluate the proposed method with silicon phantoms and in vivo images. The experimental results show that our method is much more robust with respect to the view angle changes than other state-of-the-art methods.

  14. Limited view angle iterative CT reconstruction

    NASA Astrophysics Data System (ADS)

    Kisner, Sherman J.; Haneda, Eri; Bouman, Charles A.; Skatter, Sondre; Kourinny, Mikhail; Bedford, Simon

    2012-03-01

    Computed Tomography (CT) is widely used for transportation security to screen baggage for potential threats. For example, many airports use X-ray CT to scan the checked baggage of airline passengers. The resulting reconstructions are then used for both automated and human detection of threats. Recently, there has been growing interest in the use of model-based reconstruction techniques for application in CT security systems. Model-based reconstruction offers a number of potential advantages over more traditional direct reconstruction such as filtered backprojection (FBP). Perhaps one of the greatest advantages is the potential to reduce reconstruction artifacts when non-traditional scan geometries are used. For example, FBP tends to produce very severe streaking artifacts when applied to limited view data, which can adversely affect subsequent processing such as segmentation and detection. In this paper, we investigate the use of model-based reconstruction in conjunction with limited-view scanning architectures, and we illustrate the value of these methods using transportation security examples. The advantage of limited view architectures is that it has the potential to reduce the cost and complexity of a scanning system, but its disadvantage is that limited-view data can result in structured artifacts in reconstructed images. Our method of reconstruction depends on the formulation of both a forward projection model for the system, and a prior model that accounts for the contents and densities of typical baggage. In order to evaluate our new method, we use realistic models of baggage with randomly inserted simple simulated objects. Using this approach, we show that model-based reconstruction can substantially reduce artifacts and improve important metrics of image quality such as the accuracy of the estimated CT numbers.
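
    A toy sketch of model-based reconstruction with limited-view data: a regularized least-squares objective combining a forward-projection model with a simple smoothness prior, solved by gradient descent. The 1-D system matrix and quadratic prior are illustrative stand-ins for the CT geometry and baggage prior used in the paper.

```python
import numpy as np

def map_reconstruct(A, y, lam=0.1, n_iter=500, step=1e-3):
    """Toy model-based reconstruction: minimize ||A x - y||^2 + lam * ||D x||^2.

    A   : (n_rays, n_pixels) forward-projection (system) matrix
    y   : measured projection data
    D x : finite differences between neighboring pixels (simple smoothness prior)
    """
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(n_iter):
        grad_data = A.T @ (A @ x - y)
        grad_prior = np.zeros(n)
        grad_prior[:-1] += x[:-1] - x[1:]          # derivative of sum (x_i - x_{i+1})^2
        grad_prior[1:] += x[1:] - x[:-1]
        x -= step * (grad_data + lam * grad_prior)
    return x

# Tiny 1-D example with fewer "views" (rows of A) than unknowns
rng = np.random.default_rng(0)
x_true = np.concatenate([np.zeros(20), np.ones(10), np.zeros(20)])
A = rng.standard_normal((25, 50))
y = A @ x_true + 0.01 * rng.standard_normal(25)
print(np.round(map_reconstruct(A, y)[18:32], 2))
```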

  15. Robots with language.

    PubMed

    Parisi, Domenico

    2010-01-01

    Trying to understand human language by constructing robots that have language necessarily implies an embodied view of language, where the meaning of linguistic expressions is derived from the physical interactions of the organism with the environment. The paper describes a neural model of language according to which the robot's behaviour is controlled by a neural network composed of two sub-networks, one dedicated to the non-linguistic interactions of the robot with the environment and the other one to processing linguistic input and producing linguistic output. We present the results of a number of simulations using the model and we suggest how the model can be used to account for various language-related phenomena such as disambiguation, the metaphorical use of words, the pervasive idiomaticity of multi-word expressions, and mental life as talking to oneself. The model implies a view of the meaning of words and multi-word expressions as a temporal process that takes place in the entire brain and has no clearly defined boundaries. The model can also be extended to emotional words if we assume that an embodied view of language includes not only the interactions of the robot's brain with the external environment but also the interactions of the brain with what is inside the body.

  16. Efforts to integrate CMIP metadata and standards into NOAA-GFDL's climate model workflow

    NASA Astrophysics Data System (ADS)

    Blanton, C.; Lee, M.; Mason, E. E.; Radhakrishnan, A.

    2017-12-01

    Modeling centers participating in CMIP6 run model simulations, publish requested model output (conforming to community data standards), and document models and simulations using ES-DOC. GFDL developed workflow software implementing some best practices to meet these metadata and documentation requirements. The CMIP6 Data Request defines the variables that should be archived for each experiment and specifies their spatial and temporal structure. We used the Data Request's dreqPy python library to write GFDL model configuration files as an alternative to hand-crafted tables. There was also a largely successful effort to standardize variable names within the model to reduce the additional overhead of translating "GFDL to CMOR" variables at a later stage in the pipeline. The ES-DOC ecosystem provides tools and standards to create, publish, and view various types of community-defined CIM documents, most notably model and simulation documents. Although ES-DOC will automatically create simulation documents during publishing by harvesting NetCDF global attributes, the information must be collected, stored, and placed in the NetCDF files by the workflow. We propose to develop a GUI to collect the simulation document precursors. In addition, a new MIP for CMIP6, CPMIP (a comparison of the computational performance of climate models), is documented using machine and performance CIM documents. We used ES-DOC's pyesdoc python library to automatically create these machine and performance documents. We hope that these and similar efforts will become permanent features of the GFDL workflow to facilitate future participation in CMIP-like activities.

  17. Modeling and parameter identification of impulse response matrix of mechanical systems

    NASA Astrophysics Data System (ADS)

    Bordatchev, Evgueni V.

    1998-12-01

    A method is developed here for modeling, identification, and analysis of a mechanical system's dynamic characteristics in terms of the impulse response matrix, for the purpose of adaptive control. Two types of impulse response matrices are considered: (i) on displacement, which describes the space-coupled relationship between the vectors of force and simulated displacement, and (ii) on acceleration, which describes the space-coupled relationship between the vectors of force and measured acceleration. The idea of identification consists of: (a) obtaining the impulse response matrix on acceleration in practice by an 'impact-response' technique; (b) modeling and parameter estimation of each impulse response function on acceleration through the fundamental representation of the impulse response function on displacement as a sum of damped sine curves, applying linear and non-linear least squares methods; (c) simulating the impulse, which provides the additional possibility of calculating masses, damper constants, and spring constants. The damped natural frequencies are used as a priori information and are found through standard FFT analysis. The problem of double numerical integration is avoided by taking two derivatives of the fundamental dynamic model of a mechanical system as a linear combination of mass-damper-spring subsystems. The identified impulse response matrix on displacement represents the dynamic properties of the mechanical system. From the engineering point of view, this matrix can also be understood as a 'dynamic passport' of the mechanical system and can be used for dynamic certification and analysis of dynamic quality. In addition, the suggested approach mathematically reproduces the amplitude-frequency response matrix in a low-frequency band and at zero frequency. This allows the possibility of determining the matrix of static stiffness from dynamic testing over a time of 10-15 minutes. As a practical example, the dynamic properties, in terms of the impulse and frequency response matrices, of a lathe spindle are obtained, identified and investigated. The developed approach for modeling and parameter identification appears promising for a wide range of industrial applications, for example, rotary systems.
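
    A short sketch of the damped-sine identification step described above, assuming each impulse response is modeled as a sum of (amplitude, decay, frequency, phase) modes fitted by non-linear least squares; the two-mode synthetic response and the initial guess are illustrative, not measured data.

```python
import numpy as np
from scipy.optimize import least_squares

def damped_sines(t, params):
    """Impulse response modeled as a sum of damped sine terms.

    params is a flat array of (amplitude, decay, frequency_rad, phase) per mode.
    """
    h = np.zeros_like(t)
    for a, d, w, p in params.reshape(-1, 4):
        h += a * np.exp(-d * t) * np.sin(w * t + p)
    return h

# Synthetic "measured" response with two modes plus noise (illustrative only)
t = np.linspace(0.0, 1.0, 2000)
true = np.array([1.0, 3.0, 2 * np.pi * 40, 0.0,
                 0.4, 6.0, 2 * np.pi * 95, 0.5])
rng = np.random.default_rng(0)
h_meas = damped_sines(t, true) + 0.02 * rng.standard_normal(t.size)

# In practice, FFT peaks would seed the damped natural frequencies in the initial guess
x0 = np.array([0.8, 2.0, 2 * np.pi * 41, 0.0,
               0.3, 5.0, 2 * np.pi * 94, 0.3])
fit = least_squares(lambda p: damped_sines(t, p) - h_meas, x0)
print(np.round(fit.x.reshape(-1, 4), 2))
```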

  18. Creation and Validation of a Novel Mobile Simulation Laboratory for High Fidelity, Prehospital, Difficult Airway Simulation.

    PubMed

    Bischof, Jason J; Panchal, Ashish R; Finnegan, Geoffrey I; Terndrup, Thomas E

    2016-10-01

    Introduction Endotracheal intubation (ETI) is a complex clinical skill complicated by the inherent challenge of providing care in the prehospital setting. Literature reports a low success rate of prehospital ETI attempts, partly due to the care environment and partly to the lack of consistent standardized training opportunities of prehospital providers in ETI. Hypothesis/Problem The availability of a mobile simulation laboratory (MSL) to study clinically critical interventions is needed in the prehospital setting to enhance instruction and maintain proficiency. This report is on the development and validation of a prehospital airway simulator and MSL that mimics in situ care provided in an ambulance. The MSL was a Type 3 ambulance with four cameras allowing audio-video recordings of observable behaviors. The prehospital airway simulator is a modified airway mannequin with increased static tongue pressure and a rigid cervical collar. Airway experts validated the model in a static setting through ETI at varying tongue pressures with a goal of a Grade 3 Cormack-Lehane (CL) laryngeal view. Following completion of this development, the MSL was launched with the prehospital airway simulator to distant communities utilizing a single facilitator/driver. Paramedics were recruited to perform ETI in the MSL, and the detailed airway management observations were stored for further analysis. Nineteen airway experts performed 57 ETI attempts at varying tongue pressures demonstrating increased CL views at higher tongue pressures. Tongue pressure of 60 mm Hg generated 31% Grade 3/4 CL view and was chosen for the prehospital trials. The MSL was launched and tested by 18 paramedics. First pass success was 33% with another 33% failing to intubate within three attempts. The MSL created was configured to deliver, record, and assess intubator behaviors with a difficult airway simulation. The MSL created a reproducible, high fidelity, mobile learning environment for assessment of simulated ETI performance by prehospital providers. Bischof JJ , Panchal AR , Finnegan GI , Terndrup TE . Creation and validation of a novel mobile simulation laboratory for high fidelity, prehospital, difficult airway simulation. Prehosp Disaster Med. 2016;31(5):465-470.

  19. Two-state thermodynamics and the possibility of a liquid-liquid phase transition in supercooled TIP4P/2005 water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Rakesh S.; Debenedetti, Pablo G.; Biddle, John W.

    Water shows intriguing thermodynamic and dynamic anomalies in the supercooled liquid state. One possible explanation of the origin of these anomalies lies in the existence of a metastable liquid-liquid phase transition (LLPT) between two (high and low density) forms of water. While the anomalies are observed in experiments on bulk and confined water and by computer simulation studies of different water-like models, the existence of a LLPT in water is still debated. Unambiguous experimental proof of the existence of a LLPT in bulk supercooled water is hampered by fast ice nucleation which is a precursor of the hypothesized LLPT. Moreover, the hypothesized LLPT, being metastable, in principle cannot exist in the thermodynamic limit (infinite size, infinite time). Therefore, computer simulations of water models are crucial for exploring the possibility of the metastable LLPT and the nature of the anomalies. In this work, we present new simulation results in the NVT ensemble for one of the most accurate classical molecular models of water, TIP4P/2005. To describe the computed properties and explore the possibility of a LLPT, we have applied two-structure thermodynamics, viewing water as a non-ideal mixture of two interconvertible local structures (“states”). The results suggest the presence of a liquid-liquid critical point and are consistent with the existence of a LLPT in this model for the simulated length and time scales. We have compared the behavior of TIP4P/2005 with other popular water-like models, namely, mW and ST2, and with real water, all of which are well described by two-state thermodynamics. In view of the current debate involving different studies of TIP4P/2005, we discuss consequences of metastability and finite size in observing the liquid-liquid separation. We also address the relationship between the phenomenological order parameter of two-structure thermodynamics and the microscopic nature of the low-density structure.

  20. FluorMODgui V3.0: A graphic user interface for the spectral simulation of leaf and canopy chlorophyll fluorescence

    NASA Astrophysics Data System (ADS)

    Zarco-Tejada, P. J.; Miller, J. R.; Pedrós, R.; Verhoef, W.; Berger, M.

    2006-06-01

    The FluorMODgui Graphic User Interface (GUI) software package developed within the framework of the FluorMOD project 'Development of a Vegetation Fluorescence Canopy Model' is presented in this manuscript. The FluorMOD project was launched in 2002 by the European Space Agency (ESA) to advance the science of vegetation fluorescence simulation through the development and integration of leaf and canopy fluorescence models based on physical methods. The design of airborne or space missions dedicated to the measurement of solar-induced chlorophyll fluorescence using remote-sensing instruments requires physical methods for quantitative feasibility analysis and sensor specification studies. The FluorMODgui model developed as part of this project is designed to simulate the effects of chlorophyll fluorescence at leaf and canopy levels using atmospheric inputs, running the leaf model, FluorMODleaf, and the canopy model, FluorSAIL, independently, through a coupling scheme, and by a multiple iteration protocol to simulate changes in the viewing geometry and atmospheric characteristics. Inputs for the FluorMODleaf model are the number of leaf layers, chlorophyll a+b content, water equivalent thickness, dry matter content, fluorescence quantum efficiency, temperature, species type, and stoichiometry. Inputs for the FluorSAIL canopy model are MODTRAN-4 6-parameter spectra or measured direct horizontal irradiance and diffuse irradiance spectra, a soil reflectance spectrum, leaf reflectance and transmittance spectra, an excitation-fluorescence response matrix in upward and downward directions (all from FluorMODleaf), 2 PAR-dependent coefficients for the fluorescence response to light level, relative azimuth angle and viewing zenith angle, canopy leaf area index, leaf inclination distribution function, and a hot spot parameter. Outputs available in the 400-1000 nm spectral range from the graphical user interface, FluorMODgui, are the leaf spectral reflectance and transmittance, and the canopy reflectance, with and without fluorescence effects. In addition, solar and sky irradiance on the ground, radiance with and without fluorescence on the ground, and top-of-atmosphere (TOA) radiances for bare soil and for surroundings the same as the target are also produced. The models and documentation regarding the FluorMOD project can be downloaded at http://www.ias.csic.es/fluormod.

  1. Simulation of glioblastoma multiforme (GBM) tumor cells using ising model on the Creutz Cellular Automaton

    NASA Astrophysics Data System (ADS)

    Züleyha, Artuç; Ziya, Merdan; Selçuk, Yeşiltaş; Kemal, Öztürk M.; Mesut, Tez

    2017-11-01

    Computational models for tumors face difficulties due to the complexity of tumor nature and the capacities of computational tools; however, these models provide insight into the interactions between a tumor and its microenvironment. Moreover, computational models have the potential to help develop strategies for individualized cancer treatments. To model a solid brain tumor, glioblastoma multiforme (GBM), we present a two-dimensional Ising model applied on a Creutz cellular automaton (CCA). The aim of this study is to analyze avascular spherical solid tumor growth, considering transitions between non-tumor cells and cancer cells to be analogous to phase transitions in a physical system. The Ising model on the CCA algorithm provides a deterministic approach with discrete time steps and local interactions in position space to view tumor growth as a function of time. Our simulation results are given for a fixed tumor radius and are compatible with theoretical and clinical data.
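
    A minimal sketch of a 2D Ising lattice updated with Creutz's microcanonical demon rule, with the two spin states read as tumor/non-tumor cells; the lattice size, demon energy bound, seeding, and number of sweeps are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def creutz_sweep(spins, demon, J=1.0, demon_cap=16.0):
    """One sweep of Creutz's microcanonical (demon) update of a 2D Ising lattice.

    spins : LxL array of +/-1 (here read as non-tumor / tumor cells)
    demon : scalar demon energy, kept within [0, demon_cap]
    """
    L = spins.shape[0]
    for i in range(L):
        for j in range(L):
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * J * spins[i, j] * nn      # energy cost of flipping (i, j)
            if 0.0 <= demon - dE <= demon_cap:   # demon absorbs or supplies dE
                spins[i, j] *= -1
                demon -= dE
    return spins, demon

L = 32
spins = np.ones((L, L), dtype=int)               # start from an all-"healthy" lattice
spins[L//2 - 2:L//2 + 2, L//2 - 2:L//2 + 2] = -1 # small tumor seed in the center
demon = 8.0
for _ in range(100):
    spins, demon = creutz_sweep(spins, demon)
print("tumor cells:", int((spins == -1).sum()))
```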

  2. [Constructing 3-dimensional colorized digital dental model assisted by digital photography].

    PubMed

    Ye, Hong-qiang; Liu, Yu-shu; Liu, Yun-song; Ning, Jing; Zhao, Yi-jiao; Zhou, Yong-sheng

    2016-02-18

    To explore a method of constructing a universal 3-dimensional (3D) colorized digital dental model which can be displayed and edited in common 3D software (such as the Geomagic series), in order to improve the visual effect of digital dental models in 3D software. The morphological data of teeth and gingivae were obtained by an intra-oral scanning system (3Shape TRIOS), constructing 3D digital dental models. The 3D digital dental models were exported as STL files. Meanwhile, referring to the accredited photography guide of the American Academy of Cosmetic Dentistry (AACD), five selected digital photographs of patients' teeth and gingivae were taken by a digital single lens reflex camera (DSLR) with the same exposure parameters (except occlusal views) to capture the color data. In Geomagic Studio 2013, after the STL file of the 3D digital dental model was imported, the digital photographs were projected on the 3D digital dental model with the corresponding position and angle. The junctions of different photos were carefully trimmed to obtain continuous and natural color transitions. The 3D colorized digital dental model was then constructed and exported as an OBJ file or a WRP file, the latter being a special format for the Geomagic series software. For the purpose of evaluating the visual effect of the 3D colorized digital model, a rating scale on the color simulation effect, based on patients' evaluations, was used. Sixteen patients were recruited and their scores on colored and non-colored digital dental models were recorded. The data were analyzed using the McNemar-Bowker test in SPSS 20. A universal 3D colorized digital dental model with better color simulation was constructed based on intra-oral scanning and digital photography. For clinical application, the 3D colorized digital dental models, combined with 3D face images, were introduced into the 3D smile design of aesthetic rehabilitation, which could improve the patients' understanding of the esthetic digital design and the virtual prosthetic effect. A universal 3D colorized digital dental model with better color simulation can be constructed with the assistance of a 3D dental scanning system and digital photography. In clinical practice, communication between dentist and patients could be improved by the better visual perception afforded by colorized 3D digital dental models with a better color simulation effect.

  3. Simulated and Real Sheet-of-Light 3D Object Scanning Using a-Si:H Thin Film PSD Arrays.

    PubMed

    Contreras, Javier; Tornero, Josep; Ferreira, Isabel; Martins, Rodrigo; Gomes, Luis; Fortunato, Elvira

    2015-11-30

    A MATLAB/SIMULINK software simulation model (structure and component blocks) has been constructed in order to view and analyze the potential of the PSD (Position Sensitive Detector) array concept technology before it is further expanded or developed. This simulation allows changing most of its parameters, such as the number of elements in the PSD array, the direction of vision, the viewing/scanning angle, the object rotation, translation, sample/scan/simulation time, etc. In addition, results show for the first time the possibility of scanning an object in 3D when using an a-Si:H thin film 128 PSD array sensor and hardware/software system. Moreover, this sensor technology is able to perform these scans and render 3D objects at high speeds and high resolutions when using a sheet-of-light laser within a triangulation platform. As shown by the simulation, a substantial enhancement in 3D object profile image quality and realism can be achieved by increasing the number of elements of the PSD array sensor as well as by achieving an optimal position response from the sensor since clearly the definition of the 3D object profile depends on the correct and accurate position response of each detector as well as on the size of the PSD array.
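
    A toy sketch of the sheet-of-light triangulation geometry underlying such a scanner, assuming the laser sheet is offset from the camera axis by a fixed baseline so that range follows R = f*b/x from the PSD position reading; the geometry, dimensions, and readings below are illustrative, not taken from the sensor described in the paper.

```python
import numpy as np

def range_from_psd(x_det, baseline=0.10, focal=0.025):
    """Simple laser-triangulation range estimate from a PSD position reading.

    Assumes the laser sheet is parallel to the camera optical axis and offset
    by `baseline`; the spot displacement on the detector is then x = f*b/R,
    so R = f*b/x. All lengths in meters; the geometry is an assumption.
    """
    x_det = np.asarray(x_det, dtype=float)
    return focal * baseline / x_det

def scan_profile(x_readings, scan_positions):
    """Assemble a 3-D profile from PSD readings taken while the object moves
    under the light sheet (one column of points per scan step)."""
    ranges = range_from_psd(x_readings)
    return np.column_stack([scan_positions, np.zeros_like(ranges), ranges])

# One simulated scan line: detector positions (m) for a 128-element PSD array
x = np.linspace(5e-3, 1e-3, 128)
profile = scan_profile(x, np.arange(128) * 1e-3)
print(profile[:3])
```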

  4. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diakov, Victor; Cole, Wesley; Sullivan, Patrick

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, minimizing production costs while observing reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirements and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity expansion - to - production cost model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.

  5. The visualization of spatial uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srivastava, R.M.

    1994-12-31

    Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools; 3-D visualization tools allow them to present very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.

  6. Normalization of time-series satellite reflectance data to a standard sun-target-sensor geometry using a semi-empirical model

    NASA Astrophysics Data System (ADS)

    Zhao, Yongguang; Li, Chuanrong; Ma, Lingling; Tang, Lingli; Wang, Ning; Zhou, Chuncheng; Qian, Yonggang

    2017-10-01

    Time series of satellite reflectance data have been widely used to characterize environmental phenomena, describe trends in vegetation dynamics and study climate change. However, sensors with wide spatial coverage and high observation frequency are usually designed with a large field of view (FOV), which causes variations in the sun-target-sensor geometry within time-series reflectance data. In this study, on the basis of the semi-empirical kernel-driven BRDF model, a new semi-empirical model was proposed to normalize the sun-target-sensor geometry of remote sensing images. To evaluate the proposed model, bidirectional reflectances under different canopy growth conditions simulated by the Discrete Anisotropic Radiative Transfer (DART) model were used. The semi-empirical model was first fitted using all simulated bidirectional reflectances. The experimental results showed a good fit between the bidirectional reflectance estimated by the proposed model and the simulated values. Then, MODIS time-series reflectance data were normalized to a common sun-target-sensor geometry by the proposed model. The experimental results showed that the proposed model yielded good fits between the observed and estimated values. The noise-like fluctuations in the time-series reflectance data were also reduced after the sun-target-sensor normalization.
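    The normalization idea can be illustrated with a generic linear kernel-driven BRDF model of the form R = f_iso + f_vol*K_vol + f_geo*K_geo: the kernel weights are fitted by least squares and each observation is then adjusted to a common reference geometry. The sketch below assumes the kernel values (e.g., Ross-Thick and Li-Sparse) are computed elsewhere for each observation's geometry; it is a generic illustration, not the specific semi-empirical model proposed in the paper.

```python
"""Sketch of sun-target-sensor normalization with a linear kernel-driven
BRDF model.  Kernel values K_vol, K_geo are assumed to be supplied; the
fabricated data below are only for demonstration."""
import numpy as np

def fit_kernel_weights(refl, k_vol, k_geo):
    """Least-squares estimate of (f_iso, f_vol, f_geo) from N observations."""
    A = np.column_stack([np.ones_like(k_vol), k_vol, k_geo])
    coeffs, *_ = np.linalg.lstsq(A, refl, rcond=None)
    return coeffs

def normalize(refl, k_vol, k_geo, k_vol_ref, k_geo_ref, coeffs):
    """Adjust each observation to a common reference geometry (multiplicative)."""
    f_iso, f_vol, f_geo = coeffs
    modelled = f_iso + f_vol * k_vol + f_geo * k_geo
    reference = f_iso + f_vol * k_vol_ref + f_geo * k_geo_ref
    return refl * reference / modelled

# Fabricated example: 10 observations with varying geometry.
rng = np.random.default_rng(0)
k_vol, k_geo = rng.uniform(-0.2, 0.4, 10), rng.uniform(-1.0, 0.0, 10)
refl = 0.3 + 0.1 * k_vol + 0.05 * k_geo
coeffs = fit_kernel_weights(refl, k_vol, k_geo)
print(normalize(refl, k_vol, k_geo, 0.1, -0.5, coeffs))
```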

  7. A Teamwork-Oriented Air Traffic Control Simulator

    DTIC Science & Technology

    2006-06-01

    ...the software development methodology of this work; this chapter is viewed as the acquisition phase of this model... because the different controllers working in these phases usually... traditional operations such as scaling the airport and personalizing the working environment. 4. Pilot Specification...

  8. COMPARISON OF N2O EMISSIONS FROM SOILS AT THREE TEMPERATE AGRICULTURAL SITES: SIMULATIONS OF YEAR-ROUND MEASUREMENTS BY FOUR MODELS. (R824993)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  9. EXTENSION OF SELF-MODELING CURVE RESOLUTION TO MIXTURES OF MORE THAN THREE COMPONENTS. PART 3. ATMOSPHERIC AEROSOL DATA SIMULATION STUDIES. (R826238)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  10. Environmental fog/rain visual display system for aircraft simulators

    NASA Technical Reports Server (NTRS)

    Chase, W. D. (Inventor)

    1982-01-01

    An environmental fog/rain visual display system for aircraft simulators is described. The electronic elements of the system include a real-time digital computer, a calligraphic color display which simulates landing lights of selective intensity, and a color television camera for producing a moving color display of the airport runway as depicted on a model terrain board. The mechanical simulation elements of the system include an environmental chamber which can produce natural fog, nonhomogeneous fog, rain and fog combined, or rain only. A pilot looking through the aircraft windscreen sees, through the fog and/or rain generated in the environmental chamber, a viewing screen with the simulated color image of the airport runway on it, and thus observes a very realistic simulation of a runway as it would appear through actual fog and/or rain.

  11. The Matter Simulation (R)evolution

    PubMed Central

    2018-01-01

    To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime. PMID:29532014

  12. GF-7 Imaging Simulation and Dsm Accuracy Estimate

    NASA Astrophysics Data System (ADS)

    Yue, Q.; Tang, X.; Gao, X.

    2017-05-01

    The GF-7 satellite is a two-line-array stereo imaging satellite for surveying and mapping, scheduled for launch in 2018. Its resolution is about 0.8 m at the sub-satellite point, corresponding to a swath width of 20 km, and the viewing angles of its forward and backward cameras are 5 and 26 degrees. This paper proposes an imaging simulation method for GF-7 stereo images. WorldView-2 stereo images were used as the basic data for the simulation; that is, we did not use a DSM and DOM as basic data (an "ortho-to-stereo" method) but used a "stereo-to-stereo" method, which better reflects the differences in geometry and radiation at different looking angles. The drawback is that geometric error is introduced by two factors: the different looking angles of the basic and simulated images, and inaccurate or missing ground reference data. We generated a DSM from the WorldView-2 stereo images. The WorldView-2 DSM was used not only as the reference DSM to estimate the accuracy of the DSM generated from the simulated GF-7 stereo images, but also as "ground truth" to establish the relationship between WorldView-2 image points and simulated image points. The static MTF was simulated on the instantaneous focal-plane "image" by filtering. The SNR was simulated in the electronic sense: the digital value of a WorldView-2 image point was converted to radiance and used as the radiance seen by the simulated GF-7 camera. This radiance was converted to an electron count n according to the physical parameters of the GF-7 camera, and the noise electron count n1 was drawn as a random number between -√n and √n. The overall electron count obtained by the TDI CCD was accumulated and converted to the digital value of the simulated GF-7 image. Sinusoidal curves with different amplitudes, frequencies and initial phases were used as attitude curves. Geometric installation errors of the CCD tiles were also simulated, considering rotation and translation factors. Finally, an accuracy estimate was made for the DSM generated from the simulated images.
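    The radiometric chain described above (basic-image digital number, to radiance, to electrons, to a ±√n noise term, back to a simulated digital number) can be sketched in a few lines. The gain, offset, and conversion factors below are placeholders, not GF-7 or WorldView-2 calibration constants.

```python
"""Illustrative sketch of the DN -> radiance -> electrons -> noise -> DN chain.
All calibration-like constants are assumed values for demonstration."""
import numpy as np

rng = np.random.default_rng(42)

def simulate_dn(dn_basic, gain=0.05, offset=0.0,
                electrons_per_radiance=2.0e3, dn_per_electron=0.02):
    radiance = gain * dn_basic + offset              # DN of basic image -> radiance
    n = radiance * electrons_per_radiance            # radiance -> electron count n
    noise = rng.uniform(-np.sqrt(n), np.sqrt(n))     # noise drawn between -sqrt(n) and sqrt(n)
    return (n + noise) * dn_per_electron             # electrons -> simulated DN

dn_basic = np.full((4, 4), 500.0)                    # fabricated input patch
print(simulate_dn(dn_basic))
```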

  13. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks.

    PubMed

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-06-06

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem that how to select the minimum number of sensors to guarantee the full-view coverage for the given region of interest (ROI). To tackle this issue, we derive the constraint condition of the sensor positions for full-view neighborhood coverage with the minimum number of nodes around the point. Next, we prove that the full-view area coverage can be approximately guaranteed, as long as the regular hexagons decided by the virtual grid are seamlessly stitched. Then we present two solutions for camera sensor networks in two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward the deployment pattern algorithm (DPA) in the deterministic implementation. To reduce the redundancy in random deployment, we come up with a local neighboring-optimal selection algorithm (LNSA) for achieving the full-view coverage. Finally, extensive simulation results show the feasibility of our proposed solutions.
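    A deterministic deployment of the kind the DPA builds on starts from candidate positions arranged on a triangular/hexagonal lattice whose pitch comes from the coverage analysis. The sketch below only generates such lattice positions over a rectangular ROI with an arbitrary pitch; it does not reproduce the paper's optimal grid length or camera orientation assignment.

```python
"""Sketch of candidate camera positions on a triangular/hexagonal lattice.
The pitch L is an arbitrary placeholder, not the theoretically optimal value."""
import numpy as np

def hex_lattice(width, height, L):
    """Return candidate camera positions covering a width x height ROI."""
    positions = []
    row_height = L * np.sqrt(3) / 2.0     # vertical spacing between lattice rows
    y, row = 0.0, 0
    while y <= height:
        x0 = (L / 2.0) if row % 2 else 0.0   # offset every other row
        xs = np.arange(x0, width + 1e-9, L)
        positions.extend((x, y) for x in xs)
        y += row_height
        row += 1
    return np.array(positions)

cams = hex_lattice(width=100.0, height=60.0, L=10.0)
print(len(cams), "candidate positions; first few:\n", cams[:4])
```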

  14. Node Scheduling Strategies for Achieving Full-View Area Coverage in Camera Sensor Networks

    PubMed Central

    Wu, Peng-Fei; Xiao, Fu; Sha, Chao; Huang, Hai-Ping; Wang, Ru-Chuan; Xiong, Nai-Xue

    2017-01-01

    Unlike conventional scalar sensors, camera sensors at different positions can capture a variety of views of an object. Based on this intrinsic property, a novel model called full-view coverage was proposed. We study the problem of how to select the minimum number of sensors that guarantees full-view coverage of a given region of interest (ROI). To tackle this issue, we derive the constraint condition on the sensor positions for full-view neighborhood coverage with the minimum number of nodes around a point. Next, we prove that full-view area coverage can be approximately guaranteed as long as the regular hexagons defined by the virtual grid are seamlessly stitched. We then present two solutions for camera sensor networks under two different deployment strategies. By computing the theoretically optimal length of the virtual grids, we put forward a deployment pattern algorithm (DPA) for the deterministic implementation. To reduce redundancy in random deployment, we propose a local neighboring-optimal selection algorithm (LNSA) for achieving full-view coverage. Finally, extensive simulation results show the feasibility of the proposed solutions. PMID:28587304

  15. New media simulation stories in nursing education: a quasi-experimental study exploring learning outcomes.

    PubMed

    Webb-Corbett, Robin; Schwartz, Melissa Renee; Green, Bob; Sessoms, Andrea; Swanson, Melvin

    2013-04-01

    New media simulation stories are short multimedia presentations that combine simulation, digital technology, and story branching to depict a variety of healthcare-related scenarios. The purpose of this study was to explore whether learning outcomes were enhanced if students viewed the results of both correct and incorrect nursing actions demonstrated through new media simulation stories. A convenience sample of 109 undergraduate nursing students in a family-centered maternity course participated in the study. Study findings suggest that students who viewed both correct and incorrect depictions of maternity nursing actions scored better on tests than did those students who viewed only correct nursing actions.

  16. ICME — A Mere Coupling of Models or a Discipline of Its Own?

    NASA Astrophysics Data System (ADS)

    Bambach, Markus; Schmitz, Georg J.; Prahl, Ulrich

    Technically, ICME — Integrated computational materials engineering — is an approach for solving advanced engineering problems related to the design of new materials and processes by combining individual materials and process models. To date, the combination of models is mainly achieved by manual transformation of the output of a simulation to form the input to a subsequent one. This subsequent simulation is either performed at a different length scale or constitutes a subsequent step along the process chain. Is ICME thus just a synonym for the coupling of simulations? In fact, most ICME publications up to now are examples of the joint application of selected models and software codes to a specific problem. However, from a systems point of view, the coupling of individual models and/or software codes across length scales and along material processing chains leads to highly complex meta-models. Their viability has to be ensured by joint efforts from science, industry, software developers and independent organizations. This paper identifies some developments that seem necessary to make future ICME simulations viable, sustainable and broadly accessible and accepted. The main conclusion is that ICME is not merely a multi-disciplinary subject but a discipline of its own, for which a generic structural framework has to be elaborated and established.

  17. Relativistic MHD simulations of collision-induced magnetic dissipation in Poynting-flux-dominated jets/outflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Wei

    2015-07-21

    The question of the energy composition of the jets/outflows in high-energy astrophysical systems (e.g. GRBs, AGNs) is taken up first: matter-flux-dominated (MFD, σ < 1) and/or Poynting-flux-dominated (PFD, σ > 1)? The standard fireball internal shock (IS) model and the dissipative photosphere model are MFD, while the ICMART (Internal-Collision-induced MAgnetic Reconnection and Turbulence) model is PFD. Motivated by the ICMART model and other relevant problems, such as the "jets in a jet" model of AGNs, the author investigates the models from the points of view of EMF energy dissipation efficiency, relativistic outflow generation, and σ evolution, and simulates collisions between high-σ blobs to mimic the interactions inside PFD jets/outflows using a 3D SRMHD code that solves the conservative form of the ideal MHD equations. σ_b,f is calculated from the simulation results (threshold = 1). The efficiency obtained from this hybrid method is similar to the efficiency obtained from the energy evolution of the simulations (35.2%). The efficiency is nearly σ independent, which is also confirmed by the hybrid method. σ_b,i - σ_b,f shows an interesting linear relationship. Results of several parameter studies of the EMF energy dissipation efficiency are shown.

  18. Transmutation of All German Transuranium under Nuclear Phase Out Conditions – Is This Feasible from Neutronic Point of View?

    PubMed Central

    Merk, Bruno; Litskevich, Dzianis

    2015-01-01

    The German government has decided on the nuclear phase-out, but a strategy for the management of the highly radioactive waste has not yet been defined. Partitioning and Transmutation (P&T) could be considered as a technological option for the management of highly radioactive waste; therefore, a broad study has been conducted. In the study group, objectives for P&T and the boundary conditions of the phase-out have been discussed. The fulfillment of the given objectives is analyzed from a neutronics point of view using simulations of a molten salt reactor with a fast neutron spectrum. It is shown that the efficient transmutation of all existing transuranium isotopes would be possible, from a neutronics point of view, within a time frame of about 60 years. For this task, three reactors of a mostly new technology would have to be developed, and a twofold life cycle consisting of a transmuter operation phase and a deep-burn phase would be required. A basic insight into the optimization of the duration of the deep-burn phase is given. Furthermore, a detailed balance of the different isotopic inventories is given to allow a deeper understanding of the processes during transmutation in the molten salt fast reactor. The effect of modeling and simulation is investigated based on three different modeling strategies and two different code versions. PMID:26717509

  19. Transmutation of All German Transuranium under Nuclear Phase Out Conditions - Is This Feasible from Neutronic Point of View?

    PubMed

    Merk, Bruno; Litskevich, Dzianis

    2015-01-01

    The German government has decided on the nuclear phase-out, but a strategy for the management of the highly radioactive waste has not yet been defined. Partitioning and Transmutation (P&T) could be considered as a technological option for the management of highly radioactive waste; therefore, a broad study has been conducted. In the study group, objectives for P&T and the boundary conditions of the phase-out have been discussed. The fulfillment of the given objectives is analyzed from a neutronics point of view using simulations of a molten salt reactor with a fast neutron spectrum. It is shown that the efficient transmutation of all existing transuranium isotopes would be possible, from a neutronics point of view, within a time frame of about 60 years. For this task, three reactors of a mostly new technology would have to be developed, and a twofold life cycle consisting of a transmuter operation phase and a deep-burn phase would be required. A basic insight into the optimization of the duration of the deep-burn phase is given. Furthermore, a detailed balance of the different isotopic inventories is given to allow a deeper understanding of the processes during transmutation in the molten salt fast reactor. The effect of modeling and simulation is investigated based on three different modeling strategies and two different code versions.

  20. FY17 ISCR Scholar End-of-Assignment Report - Robbie Sadre

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadre, R.

    2017-10-20

    Throughout this internship assignment, I did various tasks that contributed towards the starting of the SASEDS (Safe Active Scanning for Energy Delivery Systems) and CES-21 (California Energy Systems for the 21st Century) projects in the SKYFALL laboratory. The goal of the SKYFALL laboratory is to perform modeling and simulation verification of transmission power system devices, while integrating with high-performance computing. The first thing I needed to do was acquire official online LabVIEW training from National Instruments. Through these online tutorial modules, I learned the basics of LabVIEW, gaining experience in connecting to NI devices through the DAQmx API as well as LabVIEW basic programming techniques (structures, loops, state machines, front panel GUI design, etc.).

  1. Systematic analysis of signaling pathways using an integrative environment.

    PubMed

    Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard

    2007-01-01

    Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment with comprehensive capabilities. The environment should combine tools for pathway design, visualization and simulation with a knowledge base concerning signaling pathways. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization and simulation, and a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure and its functioning. The system is designed with a client-server architecture: it contains a pathway design environment and a simulation environment as upper layers, with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework and was also useful for defining the structure of the knowledge base. A sensitivity analysis of this specific pathway was performed, providing simulation data, and the model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us to perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.

  2. Modelling Public Security Operations: Evaluation of the Holistic Security Ecosystem (HSE) Proof-of-Concept

    DTIC Science & Technology

    2012-12-01

    ...basis for constructing such simulations, and could be adapted to other experiments at relatively low cost. Perspectives: the lessons... systems (such as culture). This seven-dimensional framework advocates that systems be viewed from the physical, individual, functional, structural, normative, social, and informational dimensions. The human factors include modelling stress, trust, risk factors, and cultural factors.

  3. Research Area 3 - Mathematical Sciences: Multiscale Modeling of the Mechanics of Advanced Energetic Materials Relevant to Detonation Prediction

    DTIC Science & Technology

    2015-08-24

    ...new energetic materials with enhanced energy release rates and reduced sensitivity to unintentional detonation. The following results have been... The views, opinions and/or findings contained in this report are those of the... multiscale modeling, molecular simulations, detonation prediction.

  4. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    NASA Astrophysics Data System (ADS)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization noise), and environmental effects (radiation hits with selectable angular distributions and 4-layer atmospheric turbulence model for ground based sensors). We have developed an accurate flash Light Detection and Ranging (LIDAR) model that supports reconstruction of 3-dimensional information on the RSO. PROXOR™ contains many important imaging effects such as intra-frame smear, realized by oversampling the image in time and capturing target motion and jitter during the integration time.
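    The sensor-chain steps listed above (oversampled rendering, PSF blur, aggregation to the focal-plane sampling, then shot noise, read noise, blooming limit and quantization) can be summarized in a short sketch. This is only an illustration of the generic pipeline, not the PROXOR™ implementation; the Gaussian blur stands in for a full aberrated PSF and all numeric parameters are assumed.

```python
"""Illustrative sensor-chain sketch: oversampled scene -> stand-in PSF ->
binning to focal-plane resolution -> shot/read noise, clipping, quantization."""
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)

def sense(scene_hi, oversample=4, psf_sigma=2.0,
          read_noise_e=5.0, full_well=5e4, bits=12):
    blurred = gaussian_filter(scene_hi, psf_sigma)          # stand-in for the optical PSF
    h, w = blurred.shape
    binned = blurred.reshape(h // oversample, oversample,
                             w // oversample, oversample).sum(axis=(1, 3))
    electrons = rng.poisson(binned).astype(float)           # shot noise
    electrons += rng.normal(0.0, read_noise_e, electrons.shape)  # read noise
    electrons = np.clip(electrons, 0, full_well)            # crude stand-in for blooming limit
    return np.round(electrons / full_well * (2**bits - 1))  # quantization to DN

scene = np.zeros((256, 256))
scene[120:136, 120:136] = 200.0                             # fabricated point-like target
print(sense(scene).max())
```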

  5. A system dynamics approach to analyze laboratory test errors.

    PubMed

    Guo, Shijing; Roudsari, Abdul; Garcez, Artur d'Avila

    2015-01-01

    Although much research has been carried out to analyze laboratory test errors during the last decade, a systemic view is still lacking, especially one that traces errors through the test process and evaluates potential interventions. This study applies system dynamics modeling to laboratory errors in order to trace the laboratory error flows and to simulate the system behaviors while changing internal variable values. A change in a variable may reflect a change in demand or a proposed intervention. A review of the literature on laboratory test errors is given and serves as the main data source for the system dynamics model. Three "what if" scenarios were selected for testing the model, and system behaviors were observed and compared under the different scenarios over a period of time. The results suggest that system dynamics modeling can potentially help in understanding laboratory errors, observing model behaviors, and providing risk-free simulation experiments for possible strategies.

  6. OpenKIM - Building a Knowledgebase of Interatomic Models

    NASA Astrophysics Data System (ADS)

    Bierbaum, Matthew; Tadmor, Ellad; Elliott, Ryan; Wennblom, Trevor; Alemi, Alexander; Chen, Yan-Jiun; Karls, Daniel; Ludvik, Adam; Sethna, James

    2014-03-01

    The Knowledgebase of Interatomic Models (KIM) is an effort by the computational materials community to provide a standard interface for the development, characterization, and use of interatomic potentials. The KIM project has developed an API between simulation codes and interatomic models written in several different languages including C, Fortran, and Python. This interface is already supported in popular simulation environments such as LAMMPS and ASE, giving quick access to over a hundred compatible potentials that have been contributed so far. To compare and characterize models, we have developed a computational processing pipeline which automatically runs a series of tests for each model in the system, such as phonon dispersion relations and elastic constant calculations. To view the data from these tests, we created a rich set of interactive visualization tools located online. Finally, we created a Web repository to store and share these potentials, tests, and visualizations, which can be found at https://openkim.org along with further information.
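    As a concrete illustration of the ASE side of this interface, the sketch below attaches a KIM model to an ASE `Atoms` object and evaluates its potential energy. It assumes the kim-api, `ase`, and `kimpy` packages are installed; the model identifier shown is only an example and must correspond to a model actually installed on the local system.

```python
"""Minimal sketch of using an OpenKIM model through ASE's KIM calculator.
Assumes kim-api, ase and kimpy are installed; the model name is an example
identifier that must be installed locally (replace with any available model)."""
from ase.build import bulk
from ase.calculators.kim import KIM

# Example KIM model identifier (assumption -- substitute an installed model).
model = "EAM_Dynamo_ErcolessiAdams_1994_Al__MO_123629422045_005"

atoms = bulk("Al", "fcc", a=4.05)        # simple fcc aluminum cell
atoms.calc = KIM(model)                  # attach the KIM-backed calculator
print("potential energy (eV):", atoms.get_potential_energy())
```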

  7. Paint by Particle

    NASA Image and Video Library

    2017-12-08

    NASA models and supercomputing have created a colorful new view of aerosol movement. Satellites, balloon-borne instruments and ground-based devices make 30 million observations of the atmosphere each day. Yet these measurements still give an incomplete picture of the complex interactions within the membrane surrounding Earth. Enter climate models. Through mathematical experiments, modelers can move Earth forward or backward in time to create a dynamic portrait of the planet. Researchers from NASA Goddard’s Global Modeling and Assimilation Office recently ran a simulation of the atmosphere that captured how winds whip aerosols around the world. Such simulations allow scientists to better understand how these tiny particulates travel in the atmosphere and influence weather and climate. In the visualization below, covering August 2006 to April 2007, watch as dust and sea salt swirl inside cyclones, carbon bursts from fires, sulfate streams from volcanoes—and see how these aerosols paint the modeled world. Credit: NASA/Goddard Space Flight Center

  8. Complex Dynamics of Wetland Ecosystem with Nonlinear Harvesting: Application to Chilika Lake in Odisha, India

    NASA Astrophysics Data System (ADS)

    Upadhyay, Ranjit Kumar; Tiwari, S. K.; Roy, Parimita

    2015-06-01

    In this paper, an attempt has been made to study the spatial and temporal dynamical interactions among the species of wetland ecosystem through a mathematical model. The model represents the population dynamics of phytoplankton, zooplankton and fish species found in Chilika lake, Odisha, India. Nonlinear stability analysis of both the temporal and spatial models has been carried out. Maximum sustainable yield and optimal harvesting policy have been studied for a nonspatial model system. Numerical simulation has been performed to figure out the parameters responsible for the complex dynamics of the wetland system. Significant outcomes of our numerical findings and their interpretations from an ecological point of view are provided in this paper. Numerical simulation of spatial model exhibits some interesting and beautiful patterns. We have also pointed out the parameters that are responsible for the good health of wetland ecosystem.

  9. Self-organized phenomena of pedestrian counterflow through a wide bottleneck in a channel

    NASA Astrophysics Data System (ADS)

    Dong, Li-Yun; Lan, Dong-Kai; Li, Xiang

    2016-09-01

    The pedestrian counterflow through a bottleneck in a channel shows a variety of flow patterns due to self-organization. In order to reveal the underlying mechanism, a cellular automaton model was proposed by incorporating the floor field and the view field, which reflect the global information of the studied area and the local interactions with others. The presented model can well reproduce typical collective behaviors, such as lane formation. Numerical simulations were performed in the case of a wide bottleneck, and typical flow patterns in different density ranges were identified as rarefied flow, laminar flow, interrupted bidirectional flow, oscillatory flow, intermittent flow, and choked flow. The effects of several parameters, such as the size of the view field and the width of the opening, on the bottleneck flow are also analyzed in detail. The view field plays a vital role in reproducing the self-organized phenomena of pedestrians. Numerical results showed that the presented model can capture key characteristics of bottleneck flows. Project supported by the National Basic Research Program of China (Grant No. 2012CB725404) and the National Natural Science Foundation of China (Grant Nos. 11172164 and 11572184).

  10. Simulation and visualization of fundamental optics phenomenon by LabVIEW

    NASA Astrophysics Data System (ADS)

    Lyu, Bohan

    2017-08-01

    Most instructors teach complex phenomena with equations and static illustrations, without interactive multimedia, and students usually memorize the phenomena by taking notes. However, notes and complex formulas alone do not let users visualize the behavior of a photonic system. LabVIEW is a good tool for automatic measurement; moreover, the simplicity of coding in LabVIEW makes it suitable not only for automatic measurement but also for the simulation and visualization of fundamental optics phenomena. In this paper, five simple optics phenomena are discussed and simulated with LabVIEW: Snell's law, Hermite-Gaussian beam transverse modes, square and circular aperture diffraction, polarized waves and the Poincare sphere, and finally the Fabry-Perot etalon in the spectral domain.
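    Two of the listed phenomena reduce to one-line textbook formulas that are easy to reproduce outside LabVIEW as well. The sketch below (in Python rather than LabVIEW, purely for illustration) evaluates Snell's law and the ideal Fabry-Perot (Airy) transmission function; all parameter values are arbitrary.

```python
"""Textbook formulas for two of the phenomena above: Snell's law and the
Airy transmission of an ideal Fabry-Perot etalon.  Parameters are arbitrary."""
import numpy as np

def snell(theta_i_deg, n1=1.0, n2=1.5):
    """Refraction angle in degrees; nan beyond total internal reflection."""
    s = n1 / n2 * np.sin(np.radians(theta_i_deg))
    return np.degrees(np.arcsin(s)) if abs(s) <= 1 else float("nan")

def fabry_perot_T(wavelength, n=1.0, d=1e-3, R=0.9, theta=0.0):
    """Airy transmission for cavity thickness d and mirror reflectance R."""
    delta = 4 * np.pi * n * d * np.cos(theta) / wavelength   # round-trip phase
    F = 4 * R / (1 - R) ** 2                                  # coefficient of finesse
    return 1.0 / (1.0 + F * np.sin(delta / 2) ** 2)

print(snell(30.0))                              # ~19.47 degrees for glass
wl = np.linspace(1549.9e-9, 1550.1e-9, 5)
print(fabry_perot_T(wl))                        # transmission across a narrow band
```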

  11. Collaborative testing of turbulence models

    NASA Astrophysics Data System (ADS)

    Bradshaw, P.

    1992-12-01

    This project, funded by AFOSR, ARO, NASA, and ONR, was run by the writer with Profs. Brian E. Launder, University of Manchester, England, and John L. Lumley, Cornell University. Statistical data on turbulent flows, from lab. experiments and simulations, were circulated to modelers throughout the world. This is the first large-scale project of its kind to use simulation data. The modelers returned their predictions to Stanford, for distribution to all modelers and to additional participants ('experimenters')--over 100 in all. The object was to obtain a consensus on the capabilities of present-day turbulence models and identify which types most deserve future support. This was not completely achieved, mainly because not enough modelers could produce results for enough test cases within the duration of the project. However, a clear picture of the capabilities of various modeling groups has appeared, and the interaction has been helpful to the modelers. The results support the view that Reynolds-stress transport models are the most accurate.

  12. A new 3D maser code applied to flaring events

    NASA Astrophysics Data System (ADS)

    Gray, M. D.; Mason, L.; Etoka, S.

    2018-06-01

    We set out the theory and discretization scheme for a new finite-element computer code, written specifically for the simulation of maser sources. The code was used to compute fractional inversions at each node of a 3D domain for a range of optical thicknesses. Saturation behaviour of the nodes with regard to location and optical depth was broadly as expected. We have demonstrated via formal solutions of the radiative transfer equation that the apparent size of the model maser cloud decreases as expected with optical depth as viewed by a distant observer. Simulations of rotation of the cloud allowed the construction of light curves for a number of observable quantities. Rotation of the model cloud may be a reasonable model for quasi-periodic variability, but cannot explain periodic flaring.

  13. Electron-phonon interaction within classical molecular dynamics

    DOE PAGES

    Tamm, A.; Samolyuk, G.; Correa, A. A.; ...

    2016-07-14

    Here, we present a model for nonadiabatic classical molecular dynamics simulations that captures with high accuracy the wave-vector q dependence of the phonon lifetimes, in agreement with quantum mechanics calculations. It is based on a local view of the e-ph interaction where individual atom dynamics couples to electrons via a damping term that is obtained as the low-velocity limit of the stopping power of a moving ion in a host. The model is parameter free, as its components are derived from ab initio-type calculations, is readily extended to the case of alloys, and is adequate for large-scale molecular dynamics computer simulations. We also show how this model removes some oversimplifications of the traditional ionic damped dynamics commonly used to describe situations beyond the Born-Oppenheimer approximation.
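    The coupling described above enters the equations of motion as a friction force proportional to each atom's velocity. The minimal sketch below shows only that bookkeeping in a single velocity update; the constant `gamma` is a hand-picked placeholder, whereas in the paper the coefficient is derived from the low-velocity limit of the electronic stopping power rather than chosen freely.

```python
"""Minimal sketch of an electronic-friction (damping) term in a velocity
update.  `gamma` is an assumed placeholder, not the ab initio-derived value."""
import numpy as np

def velocity_step(v, f, mass, dt, gamma):
    """One explicit velocity update with force f and e-ph friction -gamma*v."""
    a = (f - gamma * v) / mass      # friction opposes the atom's velocity
    return v + a * dt

# Fabricated single-atom example in arbitrary units.
v = np.array([1.0, 0.0, 0.0])
f = np.array([0.0, 0.1, 0.0])
print(velocity_step(v, f, mass=1.0, dt=0.001, gamma=0.05))
```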

  14. A new attempt using LabVIEW into a computational experiment of plasma focus device

    NASA Astrophysics Data System (ADS)

    Kim, Myungkyu

    2017-03-01

    The simulation program for plasma focus devices based on S. Lee's model was first developed about 30 years ago and is widely used to date. Originally the program was written in GW-BASIC and later converted to Visual Basic embedded in Microsoft Excel. The use of Excel, which is well known to researchers, is a key advantage of this program, but it also has disadvantages: displaying several data sets in the same graph is awkward, calculation is slow, and displaying and computing with smaller time steps is cumbersome. To overcome these points, LabVIEW, made by National Instruments and based on a graphical environment, is used for the simulation. Furthermore, it is coupled with the data acquisition of the experiment: once an experiment is performed, the data are transferred directly to the simulation program, which then analyzes them and makes predictions for the next shot. The mass swept factor (fm) and current factor (fc) can be easily found using this program. This paper describes the detailed functions and usage of the program and compares its results with those of the existing one.

  15. Welfare Gains from Financial Liberalization

    PubMed Central

    Townsend, Robert M.; Ueda, Kenichi

    2010-01-01

    Financial liberalization has been a controversial issue, as empirical evidence for growth enhancing effects is mixed. Here, we find sizable welfare gains from liberalization (cost to repression), though the gain in economic growth is ambiguous. We take the view that financial liberalization is a government policy that alters the path of financial deepening, while financial deepening is endogenously chosen by agents given a policy and occurs in transition towards a distant steady state. This history-dependent view necessitates the use of simulation analysis based on a growth model. Our application is a specific episode: Thailand from 1976 to 1996. PMID:20806055

  16. Fundamental Mechanisms, Predictive Modeling, and Novel Aerospace Applications of Plasma Assisted Combustion

    DTIC Science & Technology

    2013-10-22

    Front views at T0 = 500 K (ϕ = 0.3) and T0 = 300 K (ϕ = 0.0), 200 Torr... DBD discharges: 20 kV, 10 kHz, ICCD gate 50 ns, P = 20 Torr... Non-diffusive hybrid scheme for simulation of filamentary discharges... Avalanche to streamer transition... Specific deposited discharge energy and energy deposited in first pulse, C2H2:O2:Ar = 17:83:900 (φ = 0.5)... Ignition delay time in C2H2:O2:Ar.

  17. Response of a Bell–Bloom Magnetometer to a Magnetic Field of Arbitrary Direction

    PubMed Central

    Ding, Zhichao; Yuan, Jie; Long, Xingwu

    2018-01-01

    The response of a Bell–Bloom magnetometer to a magnetic field of arbitrary direction is investigated theoretically and experimentally. A theoretical model is built from a macroscopic view to simulate the magnetometer frequency response to an external magnetic field of arbitrary direction. Based on the simulation results, the magnetometer characteristics, including the signal phase and amplitude at resonance, the linewidth, and the magnetometer sensitivity, are analyzed, and the dependencies of these characteristics on the external magnetic field direction are obtained and verified by experiment. PMID:29724059

  18. Multi-ray medical ultrasound simulation without explicit speckle modelling.

    PubMed

    Tuzer, Mert; Yazıcı, Abdulkadir; Türkay, Rüştü; Boyman, Michael; Acar, Burak

    2018-05-04

    To develop a medical ultrasound (US) simulation method using T1-weighted magnetic resonance images (MRI) as the input that offers a compromise between low-cost ray-based and high-cost realistic wave-based simulations. The proposed method uses a novel multi-ray image formation approach with a virtual phased array transducer probe. A domain model is built from the input MR images. Multiple virtual acoustic rays emerge from each element of the linear transducer array. Reflected and transmitted acoustic energy at discrete points along each ray is computed independently. Simulated US images are computed by fusion of the reflected energy along multiple rays from multiple transducers, while phase delays due to differences in distances to transducers are taken into account. A preliminary implementation using GPUs is presented. Preliminary results show that the multi-ray approach is capable of automatically generating viewpoint-dependent realistic US images with an inherent Rician-distributed speckle pattern. The proposed simulator can reproduce shadowing artefacts and demonstrates frequency dependence apt for practical training purposes. We have also presented preliminary results towards the utilization of the method for real-time simulations. The proposed method offers a low-cost, near-real-time, wave-like simulation of realistic US images from input MR data. It can further be improved to cover pathological findings using an improved domain model, without any algorithmic updates. Such a domain model would require lesion segmentation or manual embedding of virtual pathologies for training purposes.
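    The per-interface energy bookkeeping that ray-based ultrasound simulation relies on can be sketched with the standard intensity reflection coefficient for an acoustic impedance mismatch. The example below marches one ray through a fabricated sequence of tissue impedances; it is a generic illustration (no attenuation, beamforming, or speckle), not the authors' multi-ray fusion method.

```python
"""Sketch of ray-based echo generation: at each boundary along a ray the
incident energy splits according to the impedance mismatch.  Impedance
values are rough illustrative numbers in MRayl."""
import numpy as np

def reflection_coefficient(z1, z2):
    """Intensity reflection coefficient at a boundary between z1 and z2."""
    return ((z2 - z1) / (z2 + z1)) ** 2

def march_ray(impedances, energy0=1.0):
    """Return the reflected energy recorded at each interface along one ray."""
    echoes, energy = [], energy0
    for z1, z2 in zip(impedances[:-1], impedances[1:]):
        R = reflection_coefficient(z1, z2)
        echoes.append(energy * R)      # energy sent back toward the transducer
        energy *= (1.0 - R)            # remainder transmitted deeper
    return np.array(echoes)

# Fabricated ray through fat -> liver -> bone-like interfaces.
print(march_ray([1.38, 1.65, 1.65, 7.8]))
```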

  19. Simulation of laser beam reflection at the sea surface

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Repasi, Endre

    2011-05-01

    A 3D simulation of the reflection of a Gaussian shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for both the calculation of images of SWIR (short wave infrared) imaging sensor and for determination of total detected power of reflected laser light for a bistatic configuration of laser source and receiver at different atmospheric conditions. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of laser light reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind driven gravity waves. The propagation model for water waves is applied for sea surface animation. To predict the view of a camera in the spectral band SWIR the sea surface radiance must be calculated. This is done by considering the emitted sea surface radiance and the reflected sky radiance, calculated by MODTRAN. Additionally, the radiances of laser light specularly reflected at the wind-roughened sea surface are modeled in the SWIR band considering an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). This BRDF model considers the statistical slope statistics of waves and accounts for slope-shadowing of waves that especially occurs at flat incident angles of the laser beam and near horizontal detection angles of reflected irradiance at rough seas. Simulation results are presented showing the variation of the detected laser power dependent on the geometric configuration of laser, sensor and wind characteristics.

  20. Ultra-low dose CT attenuation correction for PET/CT: analysis of sparse view data acquisition and reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Rui, Xue; Cheng, Lishui; Long, Yong; Fu, Lin; Alessio, Adam M.; Asma, Evren; Kinahan, Paul E.; De Man, Bruno

    2015-09-01

    For PET/CT systems, PET image reconstruction requires corresponding CT images for anatomical localization and attenuation correction. In the case of PET respiratory gating, multiple gated CT scans can offer phase-matched attenuation and motion correction, at the expense of increased radiation dose. We aim to minimize the dose of the CT scan, while preserving adequate image quality for the purpose of PET attenuation correction by introducing sparse view CT data acquisition. We investigated sparse view CT acquisition protocols resulting in ultra-low dose CT scans designed for PET attenuation correction. We analyzed the tradeoffs between the number of views and the integrated tube current per view for a given dose using CT and PET simulations of a 3D NCAT phantom with lesions inserted into liver and lung. We simulated seven CT acquisition protocols with {984, 328, 123, 41, 24, 12, 8} views per rotation at a gantry speed of 0.35 s. One standard dose and four ultra-low dose levels, namely, 0.35 mAs, 0.175 mAs, 0.0875 mAs, and 0.04375 mAs, were investigated. Both the analytical Feldkamp, Davis and Kress (FDK) algorithm and the Model Based Iterative Reconstruction (MBIR) algorithm were used for CT image reconstruction. We also evaluated the impact of sinogram interpolation to estimate the missing projection measurements due to sparse view data acquisition. For MBIR, we used a penalized weighted least squares (PWLS) cost function with an approximate total-variation (TV) regularizing penalty function. We compared a tube pulsing mode and a continuous exposure mode for sparse view data acquisition. Global PET ensemble root-mean-squares-error (RMSE) and local ensemble lesion activity error were used as quantitative evaluation metrics for PET image quality. With sparse view sampling, it is possible to greatly reduce the CT scan dose when it is primarily used for PET attenuation correction with little or no measurable effect on the PET image. For the four ultra-low dose levels simulated, sparse view protocols with 41 and 24 views best balanced the tradeoff between electronic noise and aliasing artifacts. In terms of lesion activity error and ensemble RMSE of the PET images, these two protocols, when combined with MBIR, are able to provide results that are comparable to the baseline full dose CT scan. View interpolation significantly improves the performance of FDK reconstruction but was not necessary for MBIR. With the more technically feasible continuous exposure data acquisition, the CT images show an increase in azimuthal blur compared to tube pulsing. However, this blurring generally does not have a measurable impact on PET reconstructed images. Our simulations demonstrated that ultra-low-dose CT-based attenuation correction can be achieved at dose levels on the order of 0.044 mAs with little impact on PET image quality. Highly sparse 41- or 24-view ultra-low dose CT scans are feasible for PET attenuation correction, providing the best tradeoff between electronic noise and view aliasing artifacts. The continuous exposure acquisition mode could potentially be implemented in current commercially available scanners, thus enabling sparse view data acquisition without requiring x-ray tubes capable of operating in a pulsing mode.
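    The sinogram view interpolation mentioned above can be approximated, in its simplest form, by linearly interpolating each detector channel along the view-angle direction to refill the dense angular grid before analytic reconstruction. The sketch below is that generic approximation, not the interpolation scheme actually used in the study.

```python
"""Generic sketch of sparse-to-dense view (angular) interpolation of a
sinogram prior to FDK-type reconstruction.  Data are fabricated."""
import numpy as np

def interpolate_views(sparse_sino, sparse_angles, dense_angles):
    """Linearly interpolate each detector channel along the view angle."""
    n_det = sparse_sino.shape[1]
    dense = np.empty((dense_angles.size, n_det))
    for det in range(n_det):
        dense[:, det] = np.interp(dense_angles, sparse_angles, sparse_sino[:, det])
    return dense

# Fabricated example: 41 measured views interpolated back to 984 views.
sparse_angles = np.linspace(0, 2 * np.pi, 41, endpoint=False)
dense_angles = np.linspace(0, 2 * np.pi, 984, endpoint=False)
sparse_sino = np.sin(sparse_angles)[:, None] * np.ones((1, 128))
print(interpolate_views(sparse_sino, sparse_angles, dense_angles).shape)
```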

  1. Ultra-low dose CT attenuation correction for PET/CT: analysis of sparse view data acquisition and reconstruction algorithms

    PubMed Central

    Rui, Xue; Cheng, Lishui; Long, Yong; Fu, Lin; Alessio, Adam M.; Asma, Evren; Kinahan, Paul E.; De Man, Bruno

    2015-01-01

    For PET/CT systems, PET image reconstruction requires corresponding CT images for anatomical localization and attenuation correction. In the case of PET respiratory gating, multiple gated CT scans can offer phase-matched attenuation and motion correction, at the expense of increased radiation dose. We aim to minimize the dose of the CT scan, while preserving adequate image quality for the purpose of PET attenuation correction by introducing sparse view CT data acquisition. Methods: We investigated sparse view CT acquisition protocols resulting in ultra-low dose CT scans designed for PET attenuation correction. We analyzed the tradeoffs between the number of views and the integrated tube current per view for a given dose using CT and PET simulations of a 3D NCAT phantom with lesions inserted into liver and lung. We simulated seven CT acquisition protocols with {984, 328, 123, 41, 24, 12, 8} views per rotation at a gantry speed of 0.35 seconds. One standard dose and four ultra-low dose levels, namely, 0.35 mAs, 0.175 mAs, 0.0875 mAs, and 0.04375 mAs, were investigated. Both the analytical FDK algorithm and the Model Based Iterative Reconstruction (MBIR) algorithm were used for CT image reconstruction. We also evaluated the impact of sinogram interpolation to estimate the missing projection measurements due to sparse view data acquisition. For MBIR, we used a penalized weighted least squares (PWLS) cost function with an approximate total-variation (TV) regularizing penalty function. We compared a tube pulsing mode and a continuous exposure mode for sparse view data acquisition. Global PET ensemble root-mean-squares-error (RMSE) and local ensemble lesion activity error were used as quantitative evaluation metrics for PET image quality. Results: With sparse view sampling, it is possible to greatly reduce the CT scan dose when it is primarily used for PET attenuation correction with little or no measurable effect on the PET image. For the four ultra-low dose levels simulated, sparse view protocols with 41 and 24 views best balanced the tradeoff between electronic noise and aliasing artifacts. In terms of lesion activity error and ensemble RMSE of the PET images, these two protocols, when combined with MBIR, are able to provide results that are comparable to the baseline full dose CT scan. View interpolation significantly improves the performance of FDK reconstruction but was not necessary for MBIR. With the more technically feasible continuous exposure data acquisition, the CT images show an increase in azimuthal blur compared to tube pulsing. However, this blurring generally does not have a measurable impact on PET reconstructed images. Conclusions: Our simulations demonstrated that ultra-low-dose CT-based attenuation correction can be achieved at dose levels on the order of 0.044 mAs with little impact on PET image quality. Highly sparse 41- or 24-view ultra-low dose CT scans are feasible for PET attenuation correction, providing the best tradeoff between electronic noise and view aliasing artifacts. The continuous exposure acquisition mode could potentially be implemented in current commercially available scanners, thus enabling sparse view data acquisition without requiring x-ray tubes capable of operating in a pulsing mode. PMID:26352168

  2. Simulation of an oil film at the sea surface and its radiometric properties in the SWIR

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Van Eijk, Alexander M. J.

    2017-10-01

    The knowledge of the optical contrast of an oil layer on the sea under various surface roughness conditions is of great interest for oil slick monitoring techniques. This paper presents a 3D simulation of a dynamic sea surface contaminated by a floating oil film. The simulation considers the damping influence of oil on the ocean waves and its physical properties. It calculates the radiance contrast of the sea surface polluted by the oil film in relation to a clean sea surface for the SWIR spectral band. Our computer simulation combines the 3D simulation of a maritime scene (open clear sea/clear sky) with an oil film at the sea surface. The basic geometry of a clean sea surface is modeled by a composition of smooth wind driven gravity waves. Oil on the sea surface attenuates the capillary and short gravity waves modulating the wave power density spectrum of these waves. The radiance of the maritime scene is calculated in the SWIR spectral band with the emitted sea surface radiance and the specularly reflected sky radiance as components. Wave hiding and shadowing, especially occurring at low viewing angles, are considered. The specular reflection of the sky radiance at the clean sea surface is modeled by an analytical statistical bidirectional reflectance distribution function (BRDF) of the sea surface. For oil at the sea surface, a specific BRDF is used influenced by the reduced surface roughness, i.e., the modulated wave density spectrum. The radiance contrast of an oil film in relation to the clean sea surface is calculated for different viewing angles, wind speeds, and oil types characterized by their specific physical properties.

  3. Analyzing and Visualizing Cosmological Simulations with ParaView

    NASA Astrophysics Data System (ADS)

    Woodring, Jonathan; Heitmann, Katrin; Ahrens, James; Fasel, Patricia; Hsu, Chung-Hsing; Habib, Salman; Pope, Adrian

    2011-07-01

    The advent of large cosmological sky surveys—ushering in the era of precision cosmology—has been accompanied by ever larger cosmological simulations. The analysis of these simulations, which currently encompass tens of billions of particles and up to a trillion particles in the near future, is often as daunting as carrying out the simulations in the first place. Therefore, the development of very efficient analysis tools combining qualitative and quantitative capabilities is a matter of some urgency. In this paper, we introduce new analysis features implemented within ParaView, a fully parallel, open-source visualization toolkit, to analyze large N-body simulations. A major aspect of ParaView is that it can live and operate on the same machines and utilize the same parallel power as the simulation codes themselves. In addition, data movement is a serious bottleneck now and will become even more of an issue in the future; an interactive visualization and analysis tool that can handle data in situ is fast becoming essential. The new features in ParaView include particle readers and a very efficient halo finder that identifies friends-of-friends halos and determines common halo properties, including spherical overdensity properties. In combination with many other functionalities already existing within ParaView, such as histogram routines or interfaces to programming languages like Python, this enhanced version enables fast, interactive, and convenient analyses of large cosmological simulations. In addition, development paths are available for future extensions.
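    For readers unfamiliar with the friends-of-friends idea behind the halo finder, a minimal serial version links particles closer than a chosen linking length and reports connected components as halos. The sketch below does exactly that with a k-d tree; it only illustrates the concept and is not ParaView's parallel implementation, which additionally computes spherical-overdensity properties.

```python
"""Minimal serial friends-of-friends sketch: particles within a linking
length are joined; connected components are reported as halos."""
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def fof_halos(positions, linking_length):
    tree = cKDTree(positions)
    pairs = np.array(list(tree.query_pairs(linking_length)))   # linked pairs
    n = len(positions)
    if pairs.size == 0:
        return np.arange(n)                                    # every particle isolated
    adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    _, labels = connected_components(adj, directed=False)
    return labels                                              # halo id per particle

# Two fabricated clumps of 50 particles each.
rng = np.random.default_rng(3)
pos = np.vstack([rng.normal(0, 0.1, (50, 3)), rng.normal(5, 0.1, (50, 3))])
print("halos found:", len(np.unique(fof_halos(pos, linking_length=0.3))))
```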

  4. Applying Signal-Detection Theory to the Study of Observer Accuracy and Bias in Behavioral Assessment

    ERIC Educational Resources Information Center

    Lerman, Dorothea C.; Tetreault, Allison; Hovanetz, Alyson; Bellaci, Emily; Miller, Jonathan; Karp, Hilary; Mahmood, Angela; Strobel, Maggie; Mullen, Shelley; Keyl, Alice; Toupard, Alexis

    2010-01-01

    We evaluated the feasibility and utility of a laboratory model for examining observer accuracy within the framework of signal-detection theory (SDT). Sixty-one individuals collected data on aggression while viewing videotaped segments of simulated teacher-child interactions. The purpose of Experiment 1 was to determine if brief feedback and…
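    Within the signal-detection framework referred to above, observer accuracy and bias are typically summarized by the sensitivity index d' and a criterion measure computed from hit and false-alarm rates. The sketch below shows the standard textbook computation; the correction for rates of 0 or 1 is one common convention and is not necessarily the one used by the authors.

```python
"""Standard signal-detection indices (d' and criterion c) from hit and
false-alarm counts.  The 0/1-rate correction used here is an assumption."""
from scipy.stats import norm

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    # Small-sample correction to avoid infinite z-scores at rates of 0 or 1.
    hr = (hits + 0.5) / (n_signal + 1)
    far = (false_alarms + 0.5) / (n_noise + 1)
    d_prime = norm.ppf(hr) - norm.ppf(far)
    criterion = -0.5 * (norm.ppf(hr) + norm.ppf(far))
    return d_prime, criterion

print(sdt_indices(hits=40, misses=10, false_alarms=5, correct_rejections=45))
```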

  5. LARGE EDDY SIMULATION OF TURBULENT FLOW OVER MARGINALLY RESOLVED THREE BLUFF BODIES USING AN IMMERSED BOUNDARY METHOD AND LAGRANGIAN DYNAMIC EDDY-VISCOSITY MODELS. (R828771C004)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  6. EnsembleGraph: Interactive Visual Analysis of Spatial-Temporal Behavior for Ensemble Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Qingya; Guo, Hanqi; Che, Limei

    We present a novel visualization framework—EnsembleGraph—for analyzing ensemble simulation data, in order to help scientists understand behavior similarities between ensemble members over space and time. A graph-based representation is used to visualize individual spatiotemporal regions with similar behaviors, which are extracted by hierarchical clustering algorithms. A user interface with multiple linked views is provided, which enables users to explore, locate, and compare regions that have similar behaviors between ensemble members, and then to investigate and analyze the selected regions in detail. The driving application of this paper is the study of regional emission influences on tropospheric ozone, based on ensemble simulations conducted with different anthropogenic emission absences using the MOZART-4 (model of ozone and related tracers, version 4) model. We demonstrate the effectiveness of our method by visualizing the MOZART-4 ensemble simulation data and evaluating the relative regional emission influences on tropospheric ozone concentrations. Positive feedback from domain experts and two case studies demonstrate the efficiency of our method.

  7. The de-correlation of westerly winds and westerly-wind stress over the Southern Ocean during the Last Glacial Maximum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Wei; Lu, Jian; Leung, Lai-Yung R.

    2015-02-22

    This paper investigates the changes of the Southern Westerly Winds (SWW) and Southern Ocean (SO) upwelling between the Last Glacial Maximum (LGM) and preindustrial (PI) in the PMIP3/CMIP5 simulations, highlighting the role of the Antarctic sea ice in modulating the wind stress effect on the ocean. In particular, a discrepancy may occur between the changes in SWW and westerly wind stress, caused primarily by an equatorward expansion of winter Antarctic sea ice that undermines the wind stress in driving the liquid ocean. Such a discrepancy may reflect the LGM condition in reality, given that the model simulating this condition has the most credible simulation of modern SWW and Antarctic sea ice. The effect of wind stress on the SO upwelling is further explored via the wind-induced Ekman pumping, which is reduced under the LGM condition in all models, in part by the sea-ice “capping” effect present in the models.

  8. Development of an immersive virtual reality head-mounted display with high performance.

    PubMed

    Wang, Yunqi; Liu, Weiqi; Meng, Xiangxiang; Fu, Hanyi; Zhang, Daliang; Kang, Yusi; Feng, Rui; Wei, Zhonglun; Zhu, Xiuqing; Jiang, Guohua

    2016-09-01

    To resolve the contradiction between large field of view and high resolution in immersive virtual reality (VR) head-mounted displays (HMDs), an HMD monocular optical system with a large field of view and high resolution was designed. The system was fabricated by adopting aspheric technology with CNC grinding and a high-resolution LCD as the image source. With this monocular optical system, an HMD binocular optical system with a wide-range continuously adjustable interpupillary distance was achieved in the form of partially overlapping fields of view (FOV) combined with a screw adjustment mechanism. A fast image processor-centered LCD driver circuit and an image preprocessing system were also built to address binocular vision inconsistency in the partially overlapping FOV binocular optical system. The distortions of the HMD optical system with a large field of view were measured. Meanwhile, the optical distortions in the display and the trapezoidal distortions introduced during image processing were corrected by a calibration model for reverse rotations and translations. A high-performance not-fully-transparent VR HMD device with high resolution (1920×1080) and large FOV [141.6°(H)×73.08°(V)] was developed. The full field-of-view average value of angular resolution is 18.6  pixels/degree. With the device, high-quality VR simulations can be completed under various scenarios, and the device can be utilized for simulated trainings in aeronautics, astronautics, and other fields with corresponding platforms. The developed device has positive practical significance.

  9. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

    State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease in computing time. With a single, multicore compute node (bottom result), the computing time indicated an 81.8% decrease relative to using serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
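
    Because the Monte Carlo replicates are embarrassingly parallel, the workflow can be pictured with a short sketch; the toy encroachment model below is an assumption standing in for a SyncroSim run, and Python's multiprocessing.Pool stands in for the high-throughput and high-performance schedulers compared in the study.

        import numpy as np
        from multiprocessing import Pool

        def run_replicate(seed, n_cells=10_000, n_steps=50, p_encroach=0.02):
            """Toy state-and-transition replicate: 0 = sagebrush, 1 = juniper-encroached."""
            rng = np.random.default_rng(seed)
            state = np.zeros(n_cells, dtype=np.int8)
            for _ in range(n_steps):
                state[rng.random(n_cells) < p_encroach] = 1   # stochastic transition
            return state.mean()                               # fraction encroached

        if __name__ == "__main__":
            seeds = range(100)                # 100 Monte Carlo replicates
            with Pool() as pool:              # one replicate per worker process
                fractions = pool.map(run_replicate, seeds)
            print(f"mean encroached fraction: {np.mean(fractions):.3f}")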

  10. Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements

    NASA Astrophysics Data System (ADS)

    Arntsen, B.

    2017-12-01

    The finite-difference technique for numerical modeling of seismic waves is still important and for some areas extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques such as reverse-time migration and more elaborate Full-Waveform Inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only easily available for simplistic media. A possible alternative to theoretical error analysis is provided by comparing finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is the accurate knowledge of the model, within measurement precision, and the location of sources and receivers. We use a model made of PVC immersed in water and containing horizontal and tilted interfaces together with several spherical objects to generate ultrasonic pressure reflection measurements. The physical dimensions of the model are of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. The results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
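
    A minimal one-dimensional acoustic finite-difference scheme conveys the kind of simulation being benchmarked; the grid spacing, velocity, and Ricker source below are illustrative assumptions and are unrelated to the PVC scale model of the study.

        import numpy as np

        # Grid and medium (illustrative values only).
        nx, dx, nt, dt, c = 1001, 5.0, 2000, 0.0005, 2000.0    # m, s, m/s
        assert c * dt / dx <= 1.0, "CFL condition for the explicit scheme"

        def ricker(t, f0=25.0, t0=0.04):
            """Ricker wavelet source time function."""
            a = (np.pi * f0 * (t - t0)) ** 2
            return (1.0 - 2.0 * a) * np.exp(-a)

        p_prev = np.zeros(nx)        # pressure at t - dt
        p_curr = np.zeros(nx)        # pressure at t
        src = nx // 2

        for it in range(nt):
            lap = np.zeros(nx)       # second-order centred Laplacian
            lap[1:-1] = (p_curr[2:] - 2 * p_curr[1:-1] + p_curr[:-2]) / dx**2
            # Leapfrog update of p_tt = c^2 p_xx + source.
            p_next = 2 * p_curr - p_prev + (c * dt) ** 2 * lap
            p_next[src] += dt**2 * ricker(it * dt)
            p_prev, p_curr = p_curr, p_next

        print("max |p| at final step:", np.abs(p_curr).max())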

  11. Deep Part Load Flow Analysis in a Francis Model turbine by means of two-phase unsteady flow simulations

    NASA Astrophysics Data System (ADS)

    Conrad, Philipp; Weber, Wilhelm; Jung, Alexander

    2017-04-01

    Hydropower plants are indispensable to stabilize the grid by reacting quickly to changes of the energy demand. However, an extension of the operating range towards high and deep part load conditions without fatigue of the hydraulic components is desirable to increase their flexibility. In this paper a model sized Francis turbine at low discharge operating conditions (Q/QBEP = 0.27) is analyzed by means of computational fluid dynamics (CFD). Unsteady two-phase simulations for two Thoma-number conditions are conducted. Stochastic pressure oscillations, observed on the test rig at low discharge, require sophisticated numerical models together with small time steps, large grid sizes and long simulation times to cope with these fluctuations. In this paper the BSL-EARSM model (Explicit Algebraic Reynolds Stress) was applied as a compromise between scale resolving and two-equation turbulence models with respect to computational effort and accuracy. Simulation results are compared to pressure measurements showing reasonable agreement in resolving the frequency spectra and amplitude. Inner blade vortices were predicted successfully in shape and size. Surface streamlines in blade-to-blade view are presented, giving insights to the formation of the inner blade vortices. The acquired time dependent pressure fields can be used for quasi-static structural analysis (FEA) for fatigue calculations in the future.

  12. Projected strengthening of Amazonian dry season by constrained climate model simulations

    NASA Astrophysics Data System (ADS)

    Boisier, Juan P.; Ciais, Philippe; Ducharne, Agnès; Guimberteau, Matthieu

    2015-07-01

    The vulnerability of Amazonian rainforest, and the ecological services it provides, depends on an adequate supply of dry-season water, either as precipitation or stored soil moisture. How the rain-bearing South American monsoon will evolve across the twenty-first century is thus a question of major interest. Extensive savanization, with its loss of forest carbon stock and uptake capacity, is an extreme although very uncertain scenario. We show that the contrasting rainfall projections simulated for Amazonia by 36 global climate models (GCMs) can be reproduced with empirical precipitation models, calibrated with historical GCM data as functions of the large-scale circulation. A set of these simple models was therefore calibrated with observations and used to constrain the GCM simulations. In agreement with the current hydrologic trends, the resulting projection towards the end of the twenty-first century is for a strengthening of the monsoon seasonal cycle, and a dry-season lengthening in southern Amazonia. With this approach, the increase in the area subjected to lengthy--savannah-prone--dry seasons is substantially larger than the GCM-simulated one. Our results confirm the dominant picture shown by the state-of-the-art GCMs, but suggest that these impacts can be significantly underestimated under the 'model democracy' view.
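
    The "empirical precipitation model calibrated as a function of the large-scale circulation" can be pictured as a simple regression exercise; the synthetic circulation indices, the ordinary-least-squares fit, and the assumed circulation shift below are illustrative stand-ins, not the authors' statistical model.

        import numpy as np

        # Synthetic stand-ins: dry-season rainfall (mm) driven by two circulation indices.
        rng = np.random.default_rng(2)
        n_years = 60
        circulation = rng.normal(size=(n_years, 2))
        rain_obs = 300 + circulation @ np.array([40.0, -25.0]) + rng.normal(0, 20, n_years)

        # Calibrate the empirical model by ordinary least squares.
        X = np.column_stack([np.ones(n_years), circulation])
        coef, *_ = np.linalg.lstsq(X, rain_obs, rcond=None)

        # Apply the calibrated model to an assumed GCM-projected circulation shift.
        circ_future = circulation + np.array([0.5, -0.3])
        rain_future = np.column_stack([np.ones(n_years), circ_future]) @ coef
        print(f"projected dry-season rainfall change: {rain_future.mean() - rain_obs.mean():+.1f} mm")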

  13. Humber-in-a-Box : Gamification to Communicate Coastal Flood Risk in the Face of Rising Seas

    NASA Astrophysics Data System (ADS)

    Skinner, C. J.; van Rij, J. D.

    2015-12-01

    Humber-in-a-Box is an immersive visualisation of the Humber Estuary (on the east coast of the UK), designed to communicate coastal flood risk in the face of rising seas. It is designed for use in a busy festival-like setting. The user views the environment via an Oculus Rift Virtual Reality (VR) headset and is able to explore using an XBOX controller. A live simulation of tidal flows on a modelled version of the estuary can be viewed on a box in the centre of a virtual room. Using the controller, the user is able to raise sea levels and see what happens as the tide levels adjust. Humber-in-a-Box uses a numerical model built with data used for published research. The hydraulic component of the CAESAR-Lisflood model code was incorporated into the UNITY-3D gaming engine, and the model uses recorded tidal stage data, bathymetry and elevations to build the virtual environment and drive the simulation. Present day flood defences are incorporated into the model, and in conjunction with modelling tidal flows, this provides a better representation of future flood risk than simpler linear models. The user is able to raise and lower sea levels between -10 m and 100 m, in 1 m increments, and can reset the simulation to present day levels with one button click. Humber-in-a-Box has been showcased at several outreach events and has proven to be very popular and effective in an environment where time with each user is pressured and information needs to be exchanged quickly. It has also been used in teaching at undergraduate level, although the full potential of this is yet to be explored. A non-interactive version of the application is available on YouTube, which is designed for use with Google Cardboard and similar kit.

  14. AOD trends during 2001-2010 from observations and model simulations

    NASA Astrophysics Data System (ADS)

    Pozzer, Andrea; de Meij, Alexander; Yoon, Jongmin; Astitha, Marina

    2016-04-01

    The trend of aerosol optical depth (AOD) between 2001 and 2010 is estimated globally and regionally from remotely sensed observations by the MODIS (Moderate Resolution Imaging Spectroradiometer), MISR (Multi-angle Imaging SpectroRadiometer) and SeaWIFS (Sea-viewing Wide Field-of-view Sensor) satellite sensors. The resulting trends have been compared to model results from the EMAC (ECHAM5/MESSy Atmospheric Chemistry) model {[1]}. Although interannual variability is applied only to anthropogenic and biomass-burning emissions, the model is able to quantitatively reproduce the AOD trends as observed by MODIS, while some discrepancies are found when compared to MISR and SeaWIFS. An additional numerical simulation with the same model was performed, neglecting any temporal change in the emissions, i.e. with no interannual variability for any emission source. It is shown that decreasing AOD trends over the US and Europe are due to the decrease in the (anthropogenic) emissions. On the contrary, over the Sahara Desert and the Middle East region, the meteorological/dynamical changes in the last decade play a major role in driving the AOD trends. Further, over Southeast Asia, both meteorology and emission changes are equally important in defining AOD trends {[2]}. Finally, decomposing the regional AOD trends into individual aerosol components reveals that the soluble components are the most dominant contributors to the total AOD, as their influence on the total AOD is enhanced by the aerosol water content. {[1]}: Jöckel, P., Kerkweg, A., Pozzer, A., Sander, R., Tost, H., Riede, H., Baumgaertner, A., Gromov, S., and Kern, B.: Development cycle 2 of the Modular Earth Submodel System (MESSy2), Geosci. Model Dev., 3, 717-752, doi:10.5194/gmd-3-717-2010, 2010. {[2]}: Pozzer, A., de Meij, A., Yoon, J., Tost, H., Georgoulias, A. K., and Astitha, M.: AOD trends during 2001-2010 from observations and model simulations, Atmos. Chem. Phys., 15, 5521-5535, doi:10.5194/acp-15-5521-2015, 2015.

  15. Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2014-10-01

    The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including: the optional waveband; various orbit heights and viewing angles; system design characteristics, e.g. pixel size and filter transmission; and atmospheric effects, e.g. different cloud types, climate zones and seasons. In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (launch of a long-range ballistic missile (BM)).

  16. Dynamically downscaled climate simulations over North America: Methods, evaluation, and supporting documentation for users

    USGS Publications Warehouse

    Hostetler, S.W.; Alder, J.R.; Allan, A.M.

    2011-01-01

    We have completed an array of high-resolution simulations of present and future climate over Western North America (WNA) and Eastern North America (ENA) by dynamically downscaling global climate simulations using a regional climate model, RegCM3. The simulations are intended to provide long time series of internally consistent surface and atmospheric variables for use in climate-related research. In addition to providing high-resolution weather and climate data for the past, present, and future, we have developed an integrated data flow and methodology for processing, summarizing, viewing, and delivering the climate datasets to a wide range of potential users. Our simulations were run over 50- and 15-kilometer model grids in an attempt to capture more of the climatic detail associated with processes such as topographic forcing than can be captured by general circulation models (GCMs). The simulations were run using output from four GCMs. All simulations span the present (for example, 1968-1999), common periods of the future (2040-2069), and two simulations continuously cover 2010-2099. The trace gas concentrations in our simulations were the same as those of the GCMs: the IPCC 20th century time series for 1968-1999 and the A2 time series for simulations of the future. We demonstrate that RegCM3 is capable of producing present day annual and seasonal climatologies of air temperature and precipitation that are in good agreement with observations. Important features of the high-resolution climatology of temperature, precipitation, snow water equivalent (SWE), and soil moisture are consistently reproduced in all model runs over WNA and ENA. The simulations provide a potential range of future climate change for selected decades and display common patterns of the direction and magnitude of changes. As expected, there are some model to model differences that limit interpretability and give rise to uncertainties. Here, we provide background information about the GCMs and the RegCM3, a basic evaluation of the model output and examples of simulated future climate. We also provide information needed to access the web applications for visualizing and downloading the data, and give complete metadata that describe the variables in the datasets.

  17. Conditional Stochastic Models in Reduced Space: Towards Efficient Simulation of Tropical Cyclone Precipitation Patterns

    NASA Astrophysics Data System (ADS)

    Dodov, B.

    2017-12-01

    Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (in terms of downscaled reanalysis 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs is then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models chosen according to the TC characteristics at a given moment in time are concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with a non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with observed regional climate and visually indistinguishable from high resolution NWP output. The framework is used to simulate a catalog of 10000 typhoon seasons implemented in a flood risk model for Japan.
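
    A heavily simplified sketch of simulating fields in a reduced principal-component space: project patterns onto leading components, fit a Gaussian to the scores, sample new scores, and reconstruct. The synthetic patterns and the single unconditional Gaussian are assumptions; the framework described above conditions its covariance models on storm stage and central pressure.

        import numpy as np

        # Synthetic "observed" precipitation patterns: 500 snapshots on a 20x20 grid.
        rng = np.random.default_rng(3)
        fields = rng.gamma(shape=2.0, scale=5.0, size=(500, 400))

        # Reduce to the leading principal components via an SVD of the anomalies.
        mean = fields.mean(axis=0)
        U, s, Vt = np.linalg.svd(fields - mean, full_matrices=False)
        k = 10                                     # retained components
        scores = U[:, :k] * s[:k]                  # PC scores of each snapshot

        # Fit a Gaussian to the scores and draw new samples in the reduced space.
        cov = np.cov(scores, rowvar=False)
        new_scores = rng.multivariate_normal(np.zeros(k), cov, size=5)

        # Reconstruct synthetic precipitation patterns from the sampled scores.
        synthetic = np.clip(mean + new_scores @ Vt[:k], 0.0, None)
        print(synthetic.shape)                     # (5, 400) -> five 20x20 patterns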

  18. Visualization and Analysis of Climate Simulation Performance Data

    NASA Astrophysics Data System (ADS)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and solutions that greatly aided our understanding. The software employed is based on Avizo Green, ParaView and SimVis, as well as our own software extensions.

  19. An assessment of teenagers' perceptions of dental fluorosis using digital simulation and web-based testing.

    PubMed

    Edwards, Maura; Macpherson, Lorna M D; Simmons, David R; Harper Gilmour, W; Stephen, Kenneth W

    2005-08-01

    To develop a new model to establish teenagers' perceptions of the aesthetic impact of fluorosis, in the context of overall facial appearance. This web-based model was used to compare different degrees of fluorosis at any one distance, while also comparing the same level of fluorosis at different 'distances'. A 14-year-old subject was used as the model face. Different degrees of fluorosis were 'built-up' on this subject's teeth using digital simulation. A web-based questionnaire showed 30 photographs, displaying four levels of fluorosis, in addition to fluorosis-free, at five different 'distances'. The closest images were shown with and without retractors, while the more distant pictures showed more of the subject's face. Teenage pupils (n = 217) were then asked to grade the acceptability of the appearances and indicate if they would wish treatment for each such appearance. At any one distance, acceptability fell as fluorosis level increased. When the same degree of fluorosis was compared at different distances, acceptability improved as the teeth were viewed from further away. Pictures taken without retractors had higher acceptability than those taken with retractors in place. Teenagers can discriminate between various degrees of fluorosis. However, more distant viewing of fluorosed teeth, within the overall context of the face, improves acceptability of the appearance.

  20. Functional resilience of microbial ecosystems in soil: How important is a spatial analysis?

    NASA Astrophysics Data System (ADS)

    König, Sara; Banitz, Thomas; Centler, Florian; Frank, Karin; Thullner, Martin

    2015-04-01

    Microbial life in soil is exposed to fluctuating environmental conditions influencing the performance of microbially mediated ecosystem services such as biodegradation of contaminants. However, as this environment is typically very heterogeneous, spatial aspects can be expected to play a major role in the ability to recover from a stress event. To determine key processes for functional resilience, simple scenarios with varying stress intensities were simulated within a microbial simulation model and the biodegradation rate in the recovery phase was monitored. Parameters including microbial growth and dispersal rates were varied over a typical range to consider microorganisms with varying properties. Besides an aggregated temporal monitoring, the explicit observation of the spatio-temporal dynamics proved essential to understand the recovery process. For a mechanistic understanding of the model system, scenarios were also simulated with selected processes switched off. Results of the mechanistic and the spatial view show that the key factors for functional recovery with respect to biodegradation after a simple stress event depend on the location of the observed habitats. The limiting factors near unstressed areas are spatial processes - the mobility of the bacteria as well as substrate diffusion; the longer the distance to the unstressed region, the more important growth becomes. Furthermore, recovery depends on the stress intensity - after a low stress event the spatial configuration has no influence on the key factors for functional resilience. To confirm these results, we repeated the stress scenarios but this time including an additional dispersal network representing a fungal network in soil. The system benefits from an increased spatial performance due to the higher mobility of the degrading microorganisms. However, this effect appears only in scenarios where the spatial distribution of the stressed area plays a role. With these simulations we show that spatial aspects play a main role in recovery after a severe stress event in a highly heterogeneous environment such as soil, and thus that the exact distribution of the stressed area is relevant. In consequence, a spatial-mechanistic view is necessary for examining functional resilience, as the aggregated temporal view alone could not have led to these conclusions. Further research should explore the importance of a spatial view for quantifying the recovery of the ecosystem service also after more complex stress regimes.
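
    A toy grid model conveys the interplay of growth and dispersal during recovery: biomass in a stressed patch is removed, and the mean biomass (a crude proxy for the biodegradation function) is tracked as neighbouring cells re-colonize. The logistic-growth and dispersal parameters are illustrative assumptions, not the parameters of the study's model.

        import numpy as np

        def laplacian(b):
            """Discrete Laplacian with no-flux edges (edge padding)."""
            p = np.pad(b, 1, mode="edge")
            return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * b

        # 50x50 habitat at carrying capacity; a stress event wipes out a central block.
        biomass = np.ones((50, 50))
        biomass[15:35, 15:35] = 0.0

        growth, dispersal, dt = 1.0, 0.5, 0.1      # illustrative rates
        history = []
        for _ in range(2000):
            biomass += dt * (growth * biomass * (1 - biomass) + dispersal * laplacian(biomass))
            history.append(biomass.mean())

        # First time step at which 95% of the pre-stress function is restored.
        recovery = next((i for i, v in enumerate(history) if v > 0.95), None)
        print("steps to recover 95% of pre-stress function:", recovery)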

  1. Hardware-in-the-Loop Power Extraction Using Different Real-Time Platforms (PREPRINT)

    DTIC Science & Technology

    2008-07-01

    engine controller (FADEC). Incorporating various transient subsystem level models into a complex modeling tool can be a challenging process when each...used can also be modified or replaced as appropriate. In its current configuration, the generic turbine engine model’s FADEC runs primarily on a...simulation in real-time, two platforms were tested: dSPACE and National Instruments’ (NI) LabVIEW Real-Time. For both dSPACE and NI, the engine and FADEC

  2. Using computer graphics to design Space Station Freedom viewing

    NASA Technical Reports Server (NTRS)

    Goldsberry, Betty S.; Lippert, Buddy O.; Mckee, Sandra D.; Lewis, James L., Jr.; Mount, Francis E.

    1993-01-01

    Viewing requirements were identified early in the Space Station Freedom program for both direct viewing via windows and indirect viewing via cameras and closed-circuit television (CCTV). These requirements reside in NASA Program Definition and Requirements Document (PDRD), Section 3: Space Station Systems Requirements. Currently, analyses are addressing the feasibility of direct and indirect viewing. The goal of these analyses is to determine the optimum locations for the windows, cameras, and CCTV's in order to meet established requirements, to adequately support space station assembly, and to operate on-board equipment. PLAID, a three-dimensional computer graphics program developed at NASA JSC, was selected for use as the major tool in these analyses. PLAID provides the capability to simulate the assembly of the station as well as to examine operations as the station evolves. This program has been used successfully as a tool to analyze general viewing conditions for many Space Shuttle elements and can be used for virtually all Space Station components. Additionally, PLAID provides the ability to integrate an anthropometric scale-modeled human (representing a crew member) with interior and exterior architecture.

  3. Discrete Fracture Network Modeling and Simulation of Subsurface Transport for the Topopah Springs and Lava Flow Aquifers at Pahute Mesa, FY 15 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makedonska, Nataliia; Kwicklis, Edward Michael; Birdsell, Kay Hanson

    This progress report for fiscal year 2015 (FY15) describes the development of discrete fracture network (DFN) models for Pahute Mesa. DFN models will be used to upscale parameters for simulations of subsurface flow and transport in fractured media in Pahute Mesa. The research focuses on modeling of groundwater flow and contaminant transport using DFNs generated according to fracture characteristics observed in the Topopah Spring Aquifer (TSA) and the Lava Flow Aquifer (LFA). This work will improve the representation of radionuclide transport processes in large-scale, regulatory-focused models with a view to reduce pessimistic bounding approximations and provide more realistic contaminant boundary calculations that can be used to describe the future extent of contaminated groundwater. Our goal is to refine a modeling approach that can translate parameters to larger-scale models that account for local-scale flow and transport processes, which tend to attenuate migration.

  4. Dynamic modeling of brushless dc motors for aerospace actuation

    NASA Technical Reports Server (NTRS)

    Demerdash, N. A.; Nehl, T. W.

    1980-01-01

    A discrete time model for simulation of the dynamics of samarium cobalt-type permanent magnet brushless dc machines is presented. The simulation model includes modeling of the interaction between these machines and their attached power conditioners. These are transistorized conditioner units. This model is part of an overall discrete-time analysis of the dynamic performance of electromechanical actuators, which was conducted as part of prototype development of such actuators studied and built for NASA-Johnson Space Center as a prospective alternative to hydraulic actuators presently used in shuttle orbiter applications. The resulting numerical simulations of the various machine and power conditioner current and voltage waveforms gave excellent correlation to the actual waveforms collected from actual hardware experimental testing. These results, numerical and experimental, are presented here for machine motoring, regeneration and dynamic braking modes. Application of the resulting model to the determination of machine current and torque profiles during closed-loop actuator operation were also analyzed and the results are given here. These results are given in light of an overall view of the actuator system components. The applicability of this method of analysis to design optimization and trouble-shooting in such prototype development is also discussed in light of the results at hand.

  5. The Science of Transportation Analysis and Simulation

    NASA Astrophysics Data System (ADS)

    Gleibe, John

    2010-03-01

    Transportation Science focuses on methods developed to model and analyze the interaction between human behavior and transportation systems. From the human behavioral, or demand, perspective, we are interested in how persons and households organize their activities across space and time, with travel viewed as an enabling activity. We have a particular interest in how to model the range of responses to public policy and transportation system changes, which leads to the consideration of both short- and long-term decision-making, interpersonal dependencies, and non-transportation-related opportunities and constraints, including household budgets, land use systems and economic systems. This has led to the development of complex structural econometric modeling systems as well as agent-based simulations. From the transportation systems, or supply, perspective we are interested in the level of service provided by transportation facilities, be they auto, transit or multi-modal systems. This has led to the development of network models and equilibrium concepts as well as hybrid simulation systems based on concepts borrowed from physics, such as fluid flow models, and cellular automata-type models. In this presentation, we review a representative sample of these methods and their use in transportation planning and public policy analysis.

  6. Conceptual Hierarchies in a Flat Attractor Network

    PubMed Central

    O’Connor, Christopher M.; Cree, George S.; McRae, Ken

    2009-01-01

    The structure of people’s conceptual knowledge of concrete nouns has traditionally been viewed as hierarchical (Collins & Quillian, 1969). For example, superordinate concepts (vegetable) are assumed to reside at a higher level than basic-level concepts (carrot). A feature-based attractor network with a single layer of semantic features developed representations of both basic-level and superordinate concepts. No hierarchical structure was built into the network. In Experiment and Simulation 1, the graded structure of categories (typicality ratings) is accounted for by the flat attractor network. Experiment and Simulation 2 show that, as with basic-level concepts, such a network predicts feature verification latencies for superordinate concepts (vegetable). In Experiment and Simulation 3, counterintuitive results regarding the temporal dynamics of similarity in semantic priming are explained by the model. By treating both types of concepts the same in terms of representation, learning, and computations, the model provides new insights into semantic memory. PMID:19543434
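
    The notion of a flat attractor network, in which a noisy feature vector settles into a learned pattern without any built-in hierarchy, can be illustrated by a minimal Hopfield-style sketch; the random binary feature patterns and the synchronous update rule are assumptions and are far simpler than the model reported in the paper.

        import numpy as np

        rng = np.random.default_rng(4)
        n_features, n_concepts = 200, 10

        # Each concept is a random +/-1 semantic feature vector (illustrative only).
        patterns = rng.choice([-1, 1], size=(n_concepts, n_features))

        # A single flat layer of pairwise feature weights (Hebbian learning, no hierarchy).
        W = (patterns.T @ patterns) / n_features
        np.fill_diagonal(W, 0.0)

        # Present a degraded cue for concept 0 and let the network settle.
        state = patterns[0].copy()
        flip = rng.choice(n_features, size=40, replace=False)
        state[flip] *= -1                          # corrupt 20% of the features
        for _ in range(20):                        # synchronous updates
            state = np.sign(W @ state).astype(int)
            state[state == 0] = 1

        print("overlap with stored concept:", int(state @ patterns[0]), "/", n_features)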

  7. Moon night sky brightness simulation for the Xinglong station

    NASA Astrophysics Data System (ADS)

    Yao, Song; Zhang, Hao-Tong; Yuan, Hai-Long; Zhao, Yong-Heng; Dong, Yi-Qiao; Bai, Zhong-Rui; Deng, Li-Cai; Lei, Ya-Juan

    2013-10-01

    Using a sky brightness monitor at the Xinglong station of National Astronomical Observatories, Chinese Academy of Sciences, we collected data from 22 dark clear nights and 90 moon nights. We first measured the sky brightness variation with time for dark nights and found a clear correlation between sky brightness and human activity. Then with a modified sky brightness model of moon nights and data from these nights, we derived the typical value for several important parameters in the model. With these results, we calculated the sky brightness distribution under a given moon condition for the Xinglong station. Furthermore, we simulated the sky brightness distribution of a moon night for a telescope with a 5° field of view (such as LAMOST). These simulations will be helpful for determining the limiting magnitude and exposure time, as well as planning the survey for LAMOST during moon nights.

  8. Equivalent circuit and characteristic simulation of a brushless electrically excited synchronous wind power generator

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Zhang, Fengge; Guan, Tao; Yu, Siyang

    2017-09-01

    A brushless electrically excited synchronous generator (BEESG) with a hybrid rotor is a novel electrically excited synchronous generator. The BEESG proposed in this paper is composed of a conventional stator with two different sets of windings with different pole numbers, and a hybrid rotor with powerful coupling capacity. The pole number of the rotor is different from those of the stator windings. Thus, an analysis method different from that applied to conventional generators should be applied to the BEESG. In view of this problem, the equivalent circuit and electromagnetic torque expression of the BEESG are derived on the basis of electromagnetic relation of the proposed generator. The generator is simulated and tested experimentally using the established equivalent circuit model. The experimental and simulation data are then analyzed and compared. Results show the validity of the equivalent circuit model.

  9. The co-development of looking dynamics and discrimination performance

    PubMed Central

    Perone, Sammy; Spencer, John P.

    2015-01-01

    The study of looking dynamics and discrimination forms the backbone of developmental science, and these are central processes in theories of infant cognition. Looking dynamics and discrimination change dramatically across the first year of life. Surprisingly, developmental changes in looking and discrimination have not been studied together. Recent simulations of a dynamic neural field (DNF) model of infant looking and memory suggest that looking and discrimination do change together over development and arise from a single neurodevelopmental mechanism. We probe this claim by measuring looking dynamics and discrimination along continuous, metrically organized dimensions in 5-, 7-, and 10-month-old infants (N = 119). The results showed that looking dynamics and discrimination changed together over development and are linked within individuals. Quantitative simulations of a DNF model provide insights into the processes that underlie developmental change in looking dynamics and discrimination. Simulation results support the view that these changes might arise from a single neurodevelopmental mechanism. PMID:23957821

  10. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to actualize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large scale data efficiently and comprehensively. SCV is an information model for processing semantic information for physics and engineering. In the model of SCV, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV has been proposed in order to describe scientific perceptions in more detail. Additionally, the prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.

  11. DART: Recent Advances in Remote Sensing Data Modeling With Atmosphere, Polarization, and Chlorophyll Fluorescence

    NASA Technical Reports Server (NTRS)

    Gastellu-Etchegorry, Jean-Phil; Lauret, Nicolas; Yin, Tiangang; Landier, Lucas; Kallel, Abdelaziz; Malenovsky, Zbynek; Bitar, Ahmad Al; Aval, Josselin; Benhmida, Sahar; Qi, Jianbo

    2017-01-01

    To better understand the life-essential cycles and processes of our planet and to further develop remote sensing (RS) technology, there is an increasing need for models that simulate the radiative budget (RB) and RS acquisitions of urban and natural landscapes using physical approaches and considering the three-dimensional (3-D) architecture of Earth surfaces. Discrete anisotropic radiative transfer (DART) is one of the most comprehensive physically based 3-D models of Earth-atmosphere radiative transfer, covering the spectral domain from ultraviolet to thermal infrared wavelengths. It simulates the optical 3-D RB and optical signals of proximal, aerial, and satellite imaging spectrometers and laser scanners, for any urban and/or natural landscapes and for any experimental and instrumental configurations. It is freely available for research and teaching activities. In this paper, we briefly introduce DART theory and present recent advances in simulated sensors (LiDAR and cameras with finite field of view) and modeling mechanisms (atmosphere, specular reflectance with polarization and chlorophyll fluorescence). A case study demonstrating a novel application of DART to investigate urban landscapes is also presented.

  12. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  13. School physics teacher class management, laboratory practice, student engagement, critical thinking, cooperative learning and use of simulations effects on student performance

    NASA Astrophysics Data System (ADS)

    Riaz, Muhammad

    The purpose of this study was to examine how class management, laboratory practice, student engagement, critical thinking, cooperative learning, and use of simulations in physics class predicted the percentage of students achieving a grade point average of B or higher and their academic performance as reported by teachers in secondary school physics classes. The target population consisted of secondary school physics teachers who were members of Science, Technology, Engineering, and Mathematics Teachers of New York City (STEMteachersNYC) and the American Modeling Teachers Association (AMTA). They used simulations in their physics classes in the 2013 and 2014 school years. Subjects for this study were volunteers. A survey was constructed based on a literature review. Eighty-two physics teachers completed the survey about instructional practice in physics. All respondents were anonymous. Classroom management was the only predictor of the percent of students achieving a grade point average of B or higher in high school physics class. Cooperative learning, use of simulations, and student engagement were predictors of teachers' views of student academic performance in high school physics class. All other variables -- class management, laboratory practice, critical thinking, and teacher self-efficacy -- were not predictors of teachers' views of student academic performance in high school physics class. The implications of these findings were discussed and recommendations for physics teachers to improve student learning were presented.

  14. Theta EEG dynamics of the error-related negativity.

    PubMed

    Trujillo, Logan T; Allen, John J B

    2007-03-01

    The error-related negativity (ERN) is a response-locked brain potential (ERP) occurring 80-100ms following response errors. This report contrasts three views of the genesis of the ERN, testing the classic view that time-locked phasic bursts give rise to the ERN against the view that the ERN arises from a pure phase-resetting of ongoing theta (4-7Hz) EEG activity and the view that the ERN is generated - at least in part - by a phase-resetting and amplitude enhancement of ongoing theta EEG activity. Time-domain ERP analyses were augmented with time-frequency investigations of phase-locked and non-phase-locked spectral power, and inter-trial phase coherence (ITPC) computed from individual EEG trials, examining time courses and scalp topographies. Simulations based on the assumptions of the classic, pure phase-resetting, and phase-resetting plus enhancement views, using parameters from each subject's empirical data, were used to contrast the time-frequency findings that could be expected if one or more of these hypotheses adequately modeled the data. Error responses produced larger amplitude activity than correct responses in time-domain ERPs immediately following responses, as expected. Time-frequency analyses revealed that significant error-related post-response increases in total spectral power (phase- and non-phase-locked), phase-locked power, and ITPC were primarily restricted to the theta range, with this effect located over midfrontocentral sites, with a temporal distribution from approximately 150-200ms prior to the button press and persisting up to 400ms post-button press. The increase in non-phase-locked power (total power minus phase-locked power) was larger than phase-locked power, indicating that the bulk of the theta event-related dynamics were not phase-locked to response. Results of the simulations revealed a good fit for data simulated according to the phase-locking with amplitude enhancement perspective, and a poor fit for data simulated according to the classic view and the pure phase-resetting view. Error responses produce not only phase-locked increases in theta EEG activity, but also increases in non-phase-locked theta, both of which share a similar topography. The findings are thus consistent with the notion advanced by Luu et al. [Luu P, Tucker DM, Makeig S. Frontal midline theta and the error-related negativity; neurophysiological mechanisms of action regulation. Clin Neurophysiol 2004;115:1821-35] that the ERN emerges, at least in part, from a phase-resetting and phase-locking of ongoing theta-band activity, in the context of a general increase in theta power following errors.
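
    Inter-trial phase coherence (ITPC), as used above, has a compact definition: the magnitude of the average unit-length phase vector across trials at each time-frequency point. The sketch below, with synthetic single-trial data and a complex Morlet wavelet at 6 Hz, illustrates that quantity under stated assumptions; it is not the authors' analysis pipeline.

        import numpy as np

        fs, n_trials, n_samples = 250, 100, 500          # 2 s epochs at 250 Hz
        t = np.arange(n_samples) / fs
        rng = np.random.default_rng(5)

        # Synthetic trials: a 6 Hz burst, partially phase-locked to t = 1 s, in noise.
        jitter = rng.normal(0, 0.02, n_trials)
        trials = np.array([np.sin(2 * np.pi * 6 * (t - 1.0 - j))
                           * np.exp(-((t - 1.0) / 0.15) ** 2)
                           + rng.normal(0, 1.0, n_samples) for j in jitter])

        # Complex Morlet wavelet at 6 Hz (7 cycles).
        f0, n_cyc = 6.0, 7
        wt = np.arange(-1, 1, 1 / fs)
        sigma = n_cyc / (2 * np.pi * f0)
        wavelet = np.exp(2j * np.pi * f0 * wt) * np.exp(-wt**2 / (2 * sigma**2))

        # Convolve each trial, keep only the phase, and average the unit phase vectors.
        analytic = np.array([np.convolve(tr, wavelet, mode="same") for tr in trials])
        itpc = np.abs(np.mean(analytic / np.abs(analytic), axis=0))
        print("peak ITPC near the event:", round(float(itpc[200:300].max()), 2))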

  15. Rectal Carcinoma Model: A Novel Simulation in Pathology Training.

    PubMed

    Pongpaibul, Ananya; Chiravirakul, Prattana; Leksrisakul, Piyawadee; Silakorn, Phadungsak; Chumtap, Wangcha; Chongpipatchaipron, Somchai; Jaitrong, Peerasak; Jitvichai, Ekachai

    2017-06-01

    Until now, the apprenticeship training model has been used to train pathology residents. Pathology residents are trained using patient specimens that are received during the course of normal daily pathology service. However, this training method could result in inconsistency in knowledge and experience among trainees because of variation in specimens that are received for analysis. The use of simulated specimens in pathology residency training could help ensure that all pathology residents receive consistent knowledge and experience. The aim of this study was to develop a prototype rectal carcinoma model to be used as a simulation tool and to evaluate its effectiveness in pathology training. Five units of a prototype rectal carcinoma model were produced in latex rubber. The model was used as a simulation tool for training 12 pathology residents and 7 pathologist assistants. Pretesting and posttesting of each participant was conducted by a multiple-choice question test. A questionnaire was also given to study participants to elicit their views regarding the fidelity of the model and the model's efficacy and usefulness relative to the gross examination technique. Among the 19 participants, the mean pretest score was 79.24% and the mean posttest score was 88.54% (P = 0.045). The fidelity of the model was rated as moderate to marked by all participants. Most participants (94.74%) rated the model's efficacy and usefulness relative to the gross examination technique as being moderate to marked. The rectal carcinoma model introduced in this study was found to be an effective simulation tool for pathology training. The model had good fidelity in appearance and good efficacy as well as usefulness relative to the gross examination technique.

  16. Vernier caliper and micrometer computer models using Easy Java Simulation and its pedagogical design features—ideas for augmenting learning with real instruments

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Tiang Ning, Hwee

    2014-09-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a simple two-dimensional view for learning from pen and paper questions and the real world; (2) hints, answers, different scale options and the inclusion of zero error; (3) assessment for learning feedback. The initial positive feedback from Singaporean students and educators indicates that these tools could be successfully shared and implemented in learning communities. Educators are encouraged to change the source code for these computer models to suit their own purposes; they have creative commons attribution licenses for the benefit of all.

  17. A comprehensive model of the spatio-temporal stem cell and tissue organisation in the intestinal crypt.

    PubMed

    Buske, Peter; Galle, Jörg; Barker, Nick; Aust, Gabriela; Clevers, Hans; Loeffler, Markus

    2011-01-06

    We introduce a novel dynamic model of stem cell and tissue organisation in murine intestinal crypts. Integrating the molecular, cellular and tissue level of description, this model links a broad spectrum of experimental observations encompassing spatially confined cell proliferation, directed cell migration, multiple cell lineage decisions and clonal competition. Using computational simulations we demonstrate that the model is capable of quantitatively describing and predicting the dynamic behaviour of the intestinal tissue during steady state as well as after cell damage and following selective gain or loss of gene function manipulations affecting Wnt- and Notch-signalling. Our simulation results suggest that reversibility and flexibility of cellular decisions are key elements of robust tissue organisation of the intestine. We predict that the tissue should be able to fully recover after complete elimination of cellular subpopulations including subpopulations deemed to be functional stem cells. This challenges current views of tissue stem cell organisation.

  18. Spreading of correlations in the Falicov-Kimball model

    NASA Astrophysics Data System (ADS)

    Herrmann, Andreas J.; Antipov, Andrey E.; Werner, Philipp

    2018-04-01

    We study dynamical properties of the one- and two-dimensional Falicov-Kimball model using lattice Monte Carlo simulations. In particular, we calculate the spreading of charge correlations in the equilibrium model and after an interaction quench. The results show a reduction of the light-cone velocity with interaction strength at low temperature, while the phase velocity increases. At higher temperature, the initial spreading is determined by the Fermi velocity of the noninteracting system and the maximum range of the correlations decreases with increasing interaction strength. Charge order correlations in the disorder potential enhance the range of the correlations. We also use the numerically exact lattice Monte Carlo results to benchmark the accuracy of equilibrium and nonequilibrium dynamical cluster approximation calculations. It is shown that the bias introduced by the mapping to a periodized cluster is substantial, and that from a numerical point of view, it is more efficient to simulate the lattice model directly.

  19. Assessment of BRDF effect of Kunlun Mountain glacier on Tibetan Plateau as a potential pseudo-invariant calibration site

    NASA Astrophysics Data System (ADS)

    Wang, Ling; Hu, Xiuqing; Chen, Lin

    2017-09-01

    Calibration is a critical step to ensure data quality and to meet the requirement of quantitative remote sensing in a broad range of scientific applications. One of the least expensive and increasingly popular methods of on-orbit calibration is the use of pseudo invariant calibration sites (PICS). A spatially homogeneous and temporally stable area of 34 km2 in size around the center of Kunlun Mountain (KLM) over the Tibetan Plateau (TP) was identified by our previous study. The spatial and temporal coefficient of variation (CV) of this region was better than 4% for the reflective solar bands. In this study, the BRDF impacts of the KLM glacier on MODIS-observed TOA reflectance in band 1 (659 nm) are examined. The BRDF impact of the KLM glacier with respect to the view zenith angle is studied using the observations at a fixed solar zenith angle, and the effect with respect to the sun zenith angle is studied based on the observations collected at the same view angle. Then, two widely used BRDF models are applied to our test data to simulate the variations of TOA reflectance due to the changes in viewing geometry. The first one is the Ross-Li model, which has been used to produce the MODIS global BRDF albedo data product. The second one is a snow surface BRDF model, which has been used to characterize the bidirectional reflectance of Antarctic snow. Finally, the accuracy and effectiveness of these two different BRDF models are tested by comparing the model-simulated TOA reflectance with the observed one. The results show that variations of the reflectances at a fixed solar zenith angle are close to the Lambertian pattern, while those at a fixed sensor zenith angle are strongly anisotropic. A decrease in solar zenith angle from 50° to 20° causes an increase in reflectance of approximately 50%. The snow surface BRDF model performs much better than the Ross-Li BRDF model in reproducing the bidirectional reflectance of the KLM glacier. The RMSE of the snow surface BRDF model is 3.60%, which is only half of the RMSE when using the Ross-Li model.
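
    Kernel-driven BRDF models such as Ross-Li are linear in their coefficients, so once the volumetric and geometric kernel values have been evaluated for each sun-view geometry, fitting and scoring reduce to least squares and an RMSE. The sketch below assumes the kernel values are already available (random placeholders are used); evaluating the actual Ross-Thick and Li-Sparse kernels, or the snow BRDF model, is omitted.

        import numpy as np

        rng = np.random.default_rng(6)
        n_obs = 120

        # Placeholder inputs: observed TOA reflectance plus pre-computed kernel values
        # K_vol and K_geo for each sun-view geometry (stand-ins for the real kernels).
        k_vol = rng.uniform(-0.2, 0.6, n_obs)
        k_geo = rng.uniform(-1.5, 0.0, n_obs)
        reflectance = 0.85 + 0.10 * k_vol + 0.05 * k_geo + rng.normal(0, 0.02, n_obs)

        # Fit the linear kernel-driven model R = f_iso + f_vol*K_vol + f_geo*K_geo.
        A = np.column_stack([np.ones(n_obs), k_vol, k_geo])
        coef, *_ = np.linalg.lstsq(A, reflectance, rcond=None)

        # RMSE of the fitted model against the observations.
        rmse = np.sqrt(np.mean((A @ coef - reflectance) ** 2))
        print(f"f_iso={coef[0]:.3f}  f_vol={coef[1]:.3f}  f_geo={coef[2]:.3f}  RMSE={rmse:.4f}")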

  20. Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao, Yanfei, E-mail: ymao@ucair.med.utah.edu; Yu, Zhicong; Zeng, Gengsheng L.

    2015-09-15

    Purpose: This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to the rotational SPECT, a stationary system has a benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is to provide sufficient projection view-angles. Methods: A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of limited number of view-angles and data truncation were carefully evaluated in the paper. Results: Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. Conclusions: The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac SPECT system with segmented slant-hole collimators. The proposed collimator consists of combined parallel and slant holes, and the image on the detector is not reduced in size.
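
    The maximum-likelihood expectation-maximization (MLEM) reconstruction mentioned above follows a standard multiplicative update, x <- x / (A^T 1) * A^T(y / Ax). The sketch below uses a tiny random system matrix as a stand-in for the geometric projector of the 14-view system; the sizes and sparsity are assumptions.

        import numpy as np

        rng = np.random.default_rng(7)
        n_pixels, n_bins = 32 * 32, 14 * 64          # image voxels, projection bins

        # Placeholder system matrix A (bin x pixel) and a random "true" activity map.
        A = rng.random((n_bins, n_pixels)) * (rng.random((n_bins, n_pixels)) < 0.05)
        x_true = rng.gamma(2.0, 1.0, n_pixels)
        y = rng.poisson(A @ x_true)                  # noisy projection data

        # MLEM iterations: x <- x / (A^T 1) * A^T ( y / (A x) )
        sens = A.T @ np.ones(n_bins)                 # sensitivity image
        x = np.ones(n_pixels)
        for _ in range(30):
            ratio = y / np.clip(A @ x, 1e-12, None)
            x *= (A.T @ ratio) / np.clip(sens, 1e-12, None)

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))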

  1. Local Homing Navigation Based on the Moment Model for Landmark Distribution and Features

    PubMed Central

    Lee, Changmin; Kim, DaeEun

    2017-01-01

    For local homing navigation, an agent is supposed to return home based on the surrounding environmental information. According to the snapshot model, the home snapshot and the current view are compared to determine the homing direction. In this paper, we propose a novel homing navigation method using the moment model. The suggested moment model also follows the snapshot theory in comparing the home snapshot and the current view, but it defines a moment of landmark inertia as the sum, over landmark particles, of the product of each particle's feature value with the square of its distance. The method thus uses the range values of landmarks in the surrounding view together with their visual features. The center of the moment can be estimated as the reference point, which is the unique convergence point in the moment potential from any view. The homing vector can easily be extracted from the centers of the moment measured at the current position and at the home location. The method effectively guides the homing direction in real environments as well as in the simulation environment. In this paper, we take a holistic approach, using all pixels in the panoramic image as landmarks and the RGB color intensities as the visual features in the moment model, in which a set of three moment functions is encoded to determine the homing vector. We also tested visual homing, that is, the moment model with only visual features, but the suggested moment model with both the visual feature and the landmark distance shows superior performance. We demonstrate homing performance with various methods classified by the status of the feature, the distance, and the coordinate alignment. PMID:29149043
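
    Since the moment model is defined by a sum of feature values weighted by squared distances, its center (the point minimizing that sum) is simply the feature-weighted centroid of the landmarks, and the homing vector follows from the difference of centers. A small sketch is given below; the array names and the two-dimensional layout are assumptions for illustration, not the paper's implementation.

        import numpy as np

        def moment_center(positions, features):
            """Moment of landmark inertia and its feature-weighted center.

            positions : (N, 2) landmark positions relative to the agent
            features  : (N,) feature value of each landmark particle (e.g., one color channel)
            """
            p = np.asarray(positions, dtype=float)
            w = np.asarray(features, dtype=float)
            moment = np.sum(w * np.sum(p ** 2, axis=1))      # sum of feature * squared distance
            center = (w[:, None] * p).sum(axis=0) / w.sum()  # point minimizing the moment
            return center, moment

        # Hypothetical homing vector: difference of the centers measured at the current
        # position and at home (after coordinate alignment of the two views).
        # v_home = moment_center(home_landmarks, home_feat)[0] - moment_center(cur_landmarks, cur_feat)[0]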

  2. A new potential for the numerical simulations of electrolyte solutions on a hypersphere

    NASA Astrophysics Data System (ADS)

    Caillol, Jean-Michel

    1993-12-01

    We propose a new way of performing numerical simulations of the restricted primitive model of electrolytes—and related models—on a hypersphere. In this new approach, the system is viewed as a single component fluid of charged bihard spheres constrained to move at the surface of a four dimensional sphere. A charged bihard sphere is defined as the rigid association of two antipodal charged hard spheres of opposite signs. These objects interact via a simple analytical potential obtained by solving the Poisson-Laplace equation on the hypersphere. This new technique of simulation enables a precise determination of the chemical potential of the charged species in the canonical ensemble by a straightforward application of Widom's insertion method. Comparisons with previous simulations demonstrate the efficiency and the reliability of the method.
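
    The chemical potential determination mentioned above relies on Widom's test-particle insertion, where the excess chemical potential is obtained from the Boltzmann-averaged energy of inserting a ghost particle, mu_ex = -kT ln <exp(-dU/kT)>. A minimal sketch of that estimator is shown below; the configuration container and energy callback are placeholders, not the paper's hypersphere potential.

        import numpy as np

        def widom_excess_mu(configs, insertion_energy, kT, n_trials=1000, rng=None):
            """Estimate the excess chemical potential by Widom test-particle insertion.

            configs          : iterable of stored equilibrium configurations
            insertion_energy : callable(config, rng) -> energy change dU of inserting a
                               randomly placed ghost particle (placeholder for the
                               bihard-sphere potential on the hypersphere)
            """
            rng = np.random.default_rng(rng)
            boltzmann_factors = []
            for cfg in configs:
                for _ in range(n_trials):
                    dU = insertion_energy(cfg, rng)
                    boltzmann_factors.append(np.exp(-dU / kT))
            return -kT * np.log(np.mean(boltzmann_factors))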

  3. TU-EF-204-09: A Preliminary Method of Risk-Informed Optimization of Tube Current Modulation for Dose Reduction in CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Y; Liu, B; Kalra, M

    Purpose: X-rays from CT scans can increase cancer risk to patients. The Lifetime Attributable Risk of Cancer Incidence for adult patients has been investigated and shown to decrease with patient age; however, a newer risk model shows an increasing risk trend for several radiosensitive organs in middle-aged patients. This study investigates the feasibility of a general method for optimizing tube current modulation (TCM) functions to minimize risk by reducing radiation dose to radiosensitive organs. Methods: Organ-based TCM has been investigated in the literature for eye lens dose and breast dose. Adopting the concept of organ-based TCM, this study seeks to find an optimized tube current that minimizes the total risk to breasts and lungs by reducing dose to these organs. The contribution of each CT view to organ dose is determined through view-by-view simulation of the CT scan using a GPU-based fast Monte Carlo code, ARCHER. A linear programming problem is established for tube current optimization, with the Monte Carlo results as weighting factors at each view. A pre-determined dose is used as the upper dose bound, and the tube current of each view is optimized to minimize the total risk. Results: An optimized tube current is found to minimize the total risk to lungs and breasts: compared to a fixed current, the risk is reduced by 13%, with breast dose reduced by 38% and lung dose reduced by 7%. The average tube current is maintained during optimization to maintain image quality. In addition, dose to other organs in the chest region is only slightly affected, with a relative change in dose smaller than 10%. Conclusion: Optimized tube current plans can be generated to minimize cancer risk to lungs and breasts while maintaining image quality. In the future, various risk models and a greater number of projections per rotation will be simulated on phantoms of different gender and age. National Institutes of Health R01EB015478.
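
    The optimization described above is a linear program: each view's tube current is a decision variable, the Monte Carlo dose contributions serve as per-view weighting factors, and constraints preserve the average current (image quality) while capping organ dose. A hedged sketch using scipy.optimize.linprog is shown below; the weight arrays, risk coefficients, and bounds are illustrative assumptions rather than the study's actual formulation.

        import numpy as np
        from scipy.optimize import linprog

        def optimize_tube_current(w_lung, w_breast, risk_lung, risk_breast,
                                  mean_current, dose_cap):
            """Per-view tube currents m_i >= 0 minimizing a weighted lung/breast risk.

            w_lung, w_breast : (n_views,) organ dose per unit tube current at each view
            risk_*           : scalar coefficients converting organ dose to risk
            mean_current     : required average tube current (image-quality surrogate)
            dose_cap         : upper bound on the dose delivered to each organ
            """
            w_lung, w_breast = np.asarray(w_lung, float), np.asarray(w_breast, float)
            n = w_lung.size
            c = risk_lung * w_lung + risk_breast * w_breast   # total risk to minimize
            A_eq, b_eq = np.ones((1, n)) / n, [mean_current]  # keep the average current
            A_ub = np.vstack([w_lung, w_breast])              # per-organ dose caps
            b_ub = [dose_cap, dose_cap]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                          bounds=[(0, None)] * n, method="highs")
            return res.x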

  4. GoPhast: a graphical user interface for PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2006-01-01

    GoPhast is a graphical user interface (GUI) for the USGS model PHAST. PHAST simulates multicomponent, reactive solute transport in three-dimensional, saturated, ground-water flow systems. PHAST can model both equilibrium and kinetic geochemical reactions. PHAST is derived from HST3D (flow and transport) and PHREEQC (geochemical calculations). The flow and transport calculations are restricted to constant fluid density and constant temperature. The complexity of the input required by PHAST makes manual construction of its input files tedious and error-prone. GoPhast streamlines the creation of the input file and helps reduce errors. GoPhast allows the user to define the spatial input for the PHAST flow and transport data file by drawing points, lines, or polygons on top, front, and side views of the model domain. These objects can have up to two associated formulas that define their extent perpendicular to the view plane, allowing the objects to be three-dimensional. Formulas are also used to specify the values of spatial data (data sets) both globally and for individual objects. Objects can be used to specify the values of data sets independent of the spatial and temporal discretization of the model. Thus, the grid and simulation periods for the model can be changed without respecifying spatial data pertaining to the hydrogeologic framework and boundary conditions. This report describes the operation of GoPhast and demonstrates its use with examples. GoPhast runs on Windows 2000, Windows XP, and Linux operating systems.

  5. Efficient simulation of intensity profile of light through subpixel-matched lenticular lens array for two- and four-view auto-stereoscopic liquid-crystal display.

    PubMed

    Chang, Yia-Chung; Tang, Li-Chuan; Yin, Chun-Yi

    2013-01-01

    Both an analytical formula and an efficient numerical method for simulating the accumulated intensity profile of light refracted through a lenticular lens array placed on top of a liquid-crystal display (LCD) are presented. The influence of light refracted through adjacent lenses is examined in the two-view and four-view systems. Our simulation results are in good agreement with those obtained by the commercial software ASAP, but our method is much more efficient. The proposed method allows one to adjust the design parameters and simulate the performance of a subpixel-matched auto-stereoscopic LCD more efficiently and easily.

  6. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  7. Characterization of Surface Reflectance Variation Effects on Remote Sensing

    NASA Technical Reports Server (NTRS)

    Pearce, W. A.

    1984-01-01

    The use of Monte Carlo radiative transfer codes to simulate the effects on remote sensing in visible and infrared wavelengths of variables which affect classification is examined. These variables include detector viewing angle, atmospheric aerosol size distribution, aerosol vertical and horizontal distribution (e.g., finite clouds), the form of the bidirectional ground reflectance function, and horizontal variability of reflectance type and reflectivity (albedo). These simulations are used to characterize the sensitivity of observables (intensity and polarization) to variations in the underlying physical parameters both to improve algorithms for the removal of atmospheric effects and to identify techniques which can improve classification accuracy. It was necessary to revise and validate the simulation codes (CTRANS, ARTRAN, and the Mie scattering code) to improve efficiency and accommodate a new operational environment, and to build the basic software tools for acquisition and off-line manipulation of simulation results. Initial calculations compare cases in which increasing amounts of aerosol are shifted into the stratosphere, maintaining a constant optical depth. In the case of moderate aerosol optical depth, the effect on the spread function is to scale it linearly as would be expected from a single scattering model. Varying the viewing angle appears to provide the same qualitative effect as modifying the vertical optical depth (for Lambertian ground reflectance).

  8. Space Shuttle flying qualities and flight control system assessment study, phase 2

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Johnston, D. E.; Mcruer, D. T.

    1983-01-01

    A program of flying qualities experiments as part of the Orbiter Experiments Program (OEX) is defined. Phase 1, published as CR-170391, reviewed flying qualities criteria and shuttle data. The review of applicable experimental and shuttle data to further define the OEX plan is continued. An unconventional feature of this approach is the use of pilot strategy model identification to relate flight and simulator results. Instrumentation, software, and data analysis techniques for pilot model measurements are examined. The relationship between shuttle characteristics and superaugmented aircraft is established. STS flights 1 through 4 are reviewed from the point of view of flying qualities. A preliminary plan for a coordinated program of inflight and simulator research is presented.

  9. Enhanced 630nm equatorial airglow emission observed by Limb Viewing Hyper Spectral Imager (LiVHySI) onboard YOUTHSAT-1

    NASA Astrophysics Data System (ADS)

    Bisht, R. S.; Thapa, N.; Babu, P. N.

    2016-04-01

    The Earth's airglow layer, when observed in the limb-view mode, appears as a double layer. LiVHySI onboard YOUTHSAT (inclination 98.73°, apogee 817 km, launched by the Indian Space Research Organization in April 2011) is an Earth's-limb-viewing camera measuring airglow emissions in the spectral window of 550-900 nm. The total altitude coverage is about 500 km, with a command-selectable lowest altitude. During a few of the orbits we observed the double-layer structure and obtained the absolute spectral intensity and altitude profile for the 630 nm airglow emission. Our night-time observations of the upper atmosphere above the dip equator, carried out on 3 May 2011, show a prominent 630 nm double-layer structure. The upper airglow layer consists of the 630 nm atomic oxygen O(1D) emission line and the lower layer consists of OH(9-3) Meinel band emission at 630 nm. The volume emission rate as a function of altitude is simulated for our observational epoch and the modeled limb intensity distribution is compared with the observations. The observations are in good agreement with the simulated intensity distribution.

  10. Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Golovkin, Igor; Sebald, James

    2017-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and LMJ. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling for laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to conveniently assess beam capture on target and beam clearance of diagnostic components, as well as plans for future developments.
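
    The flux calculation described above couples element-to-element view factors with power-balance (radiosity-style) equations. A generic sketch of that linear solve is shown below; the view-factor matrix, emissivities, and source term are placeholder inputs and do not reproduce VISRAD's internal formulation.

        import numpy as np

        def solve_power_balance(F, emissivity, emitted_flux):
            """Solve the radiosity-style power balance B = E + (1 - eps) * F @ B.

            F            : (n, n) element-to-element view-factor matrix
            emissivity   : (n,) surface emissivities
            emitted_flux : (n,) directly emitted (or deposited) flux per element
            Returns the total outgoing flux B for each surface element.
            """
            eps = np.asarray(emissivity, dtype=float)
            rho = 1.0 - eps                          # diffuse reflectivity
            n = F.shape[0]
            A = np.eye(n) - rho[:, None] * F         # (I - diag(rho) F) B = E
            return np.linalg.solve(A, np.asarray(emitted_flux, dtype=float))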

  11. SMC: SCENIC Model Control

    NASA Technical Reports Server (NTRS)

    Srivastava, Priyaka; Kraus, Jeff; Murawski, Robert; Golden, Bertsel, Jr.

    2015-01-01

    NASA's Space Communications and Navigation (SCaN) program manages three active networks: the Near Earth Network, the Space Network, and the Deep Space Network. These networks simultaneously support NASA missions and provide communications services to customers worldwide. To efficiently manage these resources and their capabilities, a team of student interns at the NASA Glenn Research Center is developing a distributed system to model the SCaN networks. Once complete, the system shall provide a platform that enables users to perform capacity modeling of current and prospective missions with finer-grained control of information between several simulation and modeling tools. This will enable the SCaN program to access a holistic view of its networks and simulate the effects of modifications in order to provide NASA with decisional information. The development of this capacity modeling system is managed by NASA's Strategic Center for Education, Networking, Integration, and Communication (SCENIC). Three primary third-party software tools offer their unique abilities in different stages of the simulation process. MagicDraw provides UML/SysML modeling, AGI's Systems Tool Kit simulates the physical transmission parameters and de-conflicts scheduled communication, and Riverbed Modeler (formerly OPNET) simulates communication protocols and packet-based networking. SCENIC developers are building custom software extensions to integrate these components in an end-to-end space communications modeling platform. A central control module acts as the hub for report-based messaging between client wrappers. Backend databases provide information related to mission parameters and ground station configurations, while the end user defines scenario-specific attributes for the model. The eight SCENIC interns are working under the direction of their mentors to complete an initial version of this capacity modeling system during the summer of 2015. The intern team is composed of four students in Computer Science, two in Computer Engineering, one in Electrical Engineering, and one studying Space Systems Engineering.

  12. Solar tower cavity receiver aperture optimization based on transient optical and thermo-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Schöttl, Peter; Bern, Gregor; van Rooyen, De Wet; Heimsath, Anna; Fluri, Thomas; Nitz, Peter

    2017-06-01

    A transient simulation methodology for cavity receivers for Solar Tower Central Receiver Systems with molten salt as heat transfer fluid is described. Absorbed solar radiation is modeled with ray tracing and a sky discretization approach to reduce computational effort. Solar radiation re-distribution in the cavity as well as thermal radiation exchange are modeled based on view factors, which are also calculated with ray tracing. An analytical approach is used to represent convective heat transfer in the cavity. Heat transfer fluid flow is simulated with a discrete tube model, where the boundary conditions at the outer tube surface mainly depend on inputs from the previously mentioned modeling aspects. A specific focus is put on the integration of optical and thermo-hydraulic models. Furthermore, aiming point and control strategies are described, which are used during the transient performance assessment. Eventually, the developed simulation methodology is used for the optimization of the aperture opening size of a PS10-like reference scenario with cavity receiver and heliostat field. The objective function is based on the cumulative gain of one representative day. Results include optimized aperture opening size, transient receiver characteristics and benefits of the implemented aiming point strategy compared to a single aiming point approach. Future work will include annual simulations, cost assessment and optimization of a larger range of receiver parameters.

  13. Transient modeling/analysis of hyperbolic heat conduction problems employing mixed implicit-explicit alpha method

    NASA Technical Reports Server (NTRS)

    Tamma, Kumar K.; D'Costa, Joseph F.

    1991-01-01

    This paper describes the evaluation of mixed implicit-explicit finite element formulations for hyperbolic heat conduction problems involving non-Fourier effects. In particular, mixed implicit-explicit formulations employing the alpha method proposed by Hughes et al. (1987, 1990) are described for the numerical simulation of hyperbolic heat conduction models, which involves time-dependent relaxation effects. Existing analytical approaches for modeling/analysis of such models involve complex mathematical formulations for obtaining closed-form solutions, while in certain numerical formulations the difficulties include severe oscillatory solution behavior (which often disguises the true response) in the vicinity of the thermal disturbances, which propagate with finite velocities. In view of these factors, the alpha method is evaluated to assess the control of the amount of numerical dissipation for predicting the transient propagating thermal disturbances. Numerical test models are presented, and pertinent conclusions are drawn for the mixed-time integration simulation of hyperbolic heat conduction models involving non-Fourier effects.
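
    For reference, the non-Fourier model being integrated is a Cattaneo-type hyperbolic heat equation, tau * T_tt + T_t = alpha * T_xx, in which thermal disturbances propagate with finite velocity. The sketch below advances it with a plain explicit central-difference step, as a stand-in to illustrate the model rather than the mixed implicit-explicit alpha method evaluated in the paper; grid parameters and boundary handling are illustrative assumptions.

        import numpy as np

        def cattaneo_step(T, T_prev, alpha, tau, dx, dt):
            """One explicit step of tau*T_tt + T_t = alpha*T_xx with fixed end temperatures."""
            lap = np.zeros_like(T)
            lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
            T_next = (2.0 * tau * T - (tau - 0.5 * dt) * T_prev
                      + dt ** 2 * alpha * lap) / (tau + 0.5 * dt)
            T_next[0], T_next[-1] = T[0], T[-1]   # Dirichlet boundaries held fixed
            return T_next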

  14. Reusable Rocket Engine Operability Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Komar, D. R.

    1998-01-01

    This paper describes the methodology, model, input data, and analysis results of a reusable launch vehicle engine operability study conducted with the goal of supporting design from an operations perspective. Paralleling performance analyses in schedule and method, this requires the use of metrics in a validated operations model useful for design, sensitivity, and trade studies. Operations analysis in this view is one of several design functions. An operations concept was developed given an engine concept and the predicted operations and maintenance processes incorporated into simulation models. Historical operations data at a level of detail suitable to model objectives were collected, analyzed, and formatted for use with the models, the simulations were run, and results collected and presented. The input data used included scheduled and unscheduled timeline and resource information collected into a Space Transportation System (STS) Space Shuttle Main Engine (SSME) historical launch operations database. Results reflect upon the importance not only of reliable hardware but upon operations and corrective maintenance process improvements.

  15. Analysis and modeling of photomask edge effects for 3D geometries and the effect on process window

    NASA Astrophysics Data System (ADS)

    Miller, Marshal A.; Neureuther, Andrew R.

    2009-03-01

    Simulation was used to explore boundary layer models for 1D and 2D patterns that would be appropriate for fast CAD modeling of physical effects during design. FDTD simulation was used to compare rigorous thick mask modeling to a thin mask approximation (TMA). When features are large, edges can be viewed as independent and modeled as separate from one another, but for small mask features, edges experience cross-talk. For attenuating phase-shift masks, interaction distances as large as 150 nm were observed. Polarization effects are important for accurate EMF models. Due to polarization effects, the edge perturbations in line ends become different from those of a perpendicular edge. For a mask designed to be real, the 90° transmission created at edges produces an asymmetry through focus, which is also polarization dependent. Thick mask fields are calculated using TEMPEST and Panoramic Technologies software. Fields are then analyzed in the near field and as on-wafer CDs to examine deviations from the TMA.

  16. Description of waste pretreatment and interfacing systems dynamic simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

    The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high level and low level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low level and high level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both the recycling and the retrieval schedule appear to have a significant impact on overall tank usage.

  17. A simulation study of the impact of the public-private partnership strategy on the performance of transport infrastructure.

    PubMed

    Huang, Zhengfeng; Zheng, Pengjun; Ma, Yanqiang; Li, Xuan; Xu, Wenjun; Zhu, Wanlu

    2016-01-01

    The choice of investment strategy has a great impact on the performance of transport infrastructure. Positive projects such as the "Subway plus Property" model in Hong Kong have created sustainable financial profits for the public transport projects. Owing to a series of public debt and other constraints, public-private partnership (PPP) was introduced as an innovative investment model to address this issue and help develop transport infrastructure. Yet, few studies provide a deeper understanding of relationships between PPP strategy and the performance of such transport projects (particularly the whole transport system). This paper defines the research scope as a regional network of freeway. With a popular PPP model, travel demand prediction method, and relevant parameters as input, agents in a simulation framework can simulate the choice of PPP freeway over time. The simulation framework can be used to analyze the relationship between the PPP strategy and performance of the regional freeway network. This study uses the Freeway Network of Yangtze River Delta (FN-YRD) in China as the context. The results demonstrate the value of using simulation models of complex transportation systems to help decision makers choose the right PPP projects. Such a tool is viewed as particularly important given the ongoing transformation of functions of the Chinese transportation sector, including franchise rights of transport projects, and freeway charging mechanism.

  18. The effects of a dynamic graphical model during simulation-based training of console operation skill

    NASA Technical Reports Server (NTRS)

    Farquhar, John D.; Regian, J. Wesley

    1993-01-01

    LOADER is a Windows-based simulation of a complex procedural task. The task requires subjects to execute long sequences of console-operation actions (e.g., button presses, switch actuations, dial rotations) to accomplish specific goals. The LOADER interface is a graphical computer-simulated console which controls railroad cars, tracks, and cranes in a fictitious railroad yard. We hypothesized that acquisition of LOADER performance skill would be supported by the representation of a dynamic graphical model linking console actions to goal and goal states in the 'railroad yard'. Twenty-nine subjects were randomly assigned to one of two treatments (i.e., dynamic model or no model). During training, both groups received identical text-based instruction in an instructional-window above the LOADER interface. One group, however, additionally saw a dynamic version of the bird's-eye view of the railroad yard. After training, both groups were tested under identical conditions. They were asked to perform the complete procedure without guidance and without access to either type of railroad yard representation. Results indicate that rather than becoming dependent on the animated rail yard model, subjects in the dynamic model condition apparently internalized the model, as evidenced by their performance after the model was removed.

  19. 3D modeling of satellite spectral images, radiation budget and energy budget of urban landscapes

    NASA Astrophysics Data System (ADS)

    Gastellu-Etchegorry, J. P.

    2008-12-01

    DART EB is a model that is being developed for simulating the 3D (three-dimensional) energy budget of urban and natural scenes, possibly with topography and atmosphere. It simulates all non-radiative energy mechanisms (heat conduction, turbulent momentum and heat fluxes, water reservoir evolution, etc.). It uses the DART (Discrete Anisotropic Radiative Transfer) model for simulating radiative mechanisms: the 3D radiative budget of 3D scenes and their remote sensing images expressed in terms of reflectance or brightness temperature values, for any atmosphere, wavelength, sun/view direction, altitude and spatial resolution. It uses an innovative multispectral approach (ray tracing, exact kernel, discrete ordinate techniques) over the whole optical domain. This paper presents two major and recent improvements of DART for adapting it to urban canopies: (1) simulation of the geometry and optical characteristics of urban elements (houses, etc.), and (2) modeling of thermal infrared emission by vegetation and urban elements. The new DART version was used in the context of the CAPITOUL project. For that, districts of the Toulouse urban database (AutoCAD format) were translated into DART scenes. This allowed us to simulate visible, near infrared and thermal infrared satellite images of Toulouse districts. Moreover, the 3D radiation budget was used by DART EB for simulating the time evolution of a number of geophysical quantities of various surface elements (roads, walls, roofs). Results were successfully compared with ground measurements of the CAPITOUL project.

  20. Land surface modeling in convection permitting simulations

    NASA Astrophysics Data System (ADS)

    van Heerwaarden, Chiel; Benedict, Imme

    2017-04-01

    The next generation of weather and climate models permits convection, albeit at a grid spacing that is not sufficient to resolve all details of the clouds. Whereas much attention is being devoted to the correct simulation of convective clouds and associated precipitation, the role of the land surface has received far less interest. In our view, convection permitting simulations pose a set of problems that need to be solved before accurate weather and climate prediction is possible. The heart of the problem lies in the direct runoff and in the nonlinearity of the surface stress as a function of soil moisture. In coarse resolution simulations, where convection is not permitted, precipitation that reaches the land surface is uniformly distributed over the grid cell. Subsequently, a fraction of this precipitation is intercepted by vegetation or leaves the grid cell via direct runoff, whereas the remainder infiltrates into the soil. As soon as we move to convection permitting simulations, this precipitation often falls locally in large amounts. If the same land-surface model is used as in simulations with parameterized convection, this leads to an increase in direct runoff. Furthermore, spatially non-uniform infiltration leads to a very different surface stress when scaled up to the coarse resolution of simulations without convection. Based on large-eddy simulation of realistic convection events over a large domain, this study presents a quantification of the errors made at the land surface in convection permitting simulation. It compares the magnitude of these errors to those made in the convection itself due to the coarse resolution of the simulation. We find that convection permitting simulations have less evaporation than simulations with parameterized convection, resulting in an unrealistic drying of the atmosphere. We present solutions to resolve this problem.

  1. Cyberpsychology: a human-interaction perspective based on cognitive modeling.

    PubMed

    Emond, Bruno; West, Robert L

    2003-10-01

    This paper argues for the relevance of cognitive modeling and cognitive architectures to cyberpsychology. From a human-computer interaction point of view, cognitive modeling can have benefits both for theory and model building, and for the design and evaluation of sociotechnical systems usability. Cognitive modeling research applied to human-computer interaction has two complementary objectives: (1) to develop theories and computational models of human interactive behavior with information and collaborative technologies, and (2) to use the computational models as building blocks for the design, implementation, and evaluation of interactive technologies. From the perspective of building theories and models, cognitive modeling offers the possibility to anchor cyberpsychology theories and models into cognitive architectures. From the perspective of the design and evaluation of socio-technical systems, cognitive models can provide the basis for simulated users, which can play an important role in usability testing. As an example of the application of cognitive modeling to technology design, the paper presents a simulation of interactive behavior with five different adaptive menu algorithms: random, fixed, stacked, frequency based, and activation based. Results of the simulation indicate that fixed menu positions seem to offer the best support for classification-like tasks such as filing e-mails. This research is part of the Human-Computer Interaction and Broadband Visual Communication research programs at the National Research Council of Canada, in collaboration with the Carleton Cognitive Modeling Lab at Carleton University.

  2. Spacecraft Guidance, Navigation, and Control Visualization Tool

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.

  3. Evaluation of Students' Views about the Use of SCORM (Sharable Content Object Reference Model)-Compatible Materials in Physics Teaching

    ERIC Educational Resources Information Center

    Gonen, Selahattin; Basaran, Bulent

    2013-01-01

    In the present study, a web site including instructional materials such as Whiteboard Movies (WBM), simulations and animations and testing materials such as true-false, fill-in-the-blanks, puzzles, open-ended questions and multiple-choice questions was designed. The study was carried out with 76 students attending Dicle College (DC), Diyarbakir…

  4. Carbon tradeoffs of restoration and provision of endangered species habitat in a fire-maintained forest

    Treesearch

    Katherine L. Martin; Matthew D. Hurteau; Bruce A. Hungate; George W. Koch; Malcolm P. North

    2015-01-01

    Forests are a significant part of the global carbon cycle and are increasingly viewed as tools for mitigating climate change. Natural disturbances, such as fire, can reduce carbon storage. However, many forests and dependent species evolved with frequent fire as an integral ecosystem process. We used a landscape forest simulation model to evaluate the effects of...

  5. Impact of Surface Roughness on AMSR-E Sea Ice Products

    NASA Technical Reports Server (NTRS)

    Stroeve, Julienne C.; Markus, Thorsten; Maslanik, James A.; Cavalieri, Donald J.; Gasiewski, Albin J.; Heinrichs, John F.; Holmgren, Jon; Perovich, Donald K.; Sturm, Matthew

    2006-01-01

    This paper examines the sensitivity of Advanced Microwave Scanning Radiometer (AMSR-E) brightness temperatures (Tbs) to surface roughness by using a radiative transfer model to simulate AMSR-E Tbs as a function of the incidence angle at which the surface is viewed. The simulated Tbs are then used to examine the influence that surface roughness has on two operational sea ice algorithms, namely: 1) the National Aeronautics and Space Administration Team (NT) algorithm and 2) the enhanced NT algorithm, as well as the impact of roughness on the AMSR-E snow depth algorithm. Surface snow and ice data collected during the AMSR-Ice03 field campaign held in March 2003 near Barrow, AK, were used to force the radiative transfer model, and the resultant modeled Tbs are compared with airborne passive microwave observations from the Polarimetric Scanning Radiometer. Results indicate that passive microwave Tbs are very sensitive even to small variations in incidence angle, which can cause either an over- or underestimation of the true amount of sea ice in the pixel area viewed. For example, it is shown that if the sea ice areas modeled here were assumed to be completely smooth, sea ice concentrations were underestimated by nearly 14% using the NT sea ice algorithm and by 7% using the enhanced NT algorithm. A comparison of polarization ratios (PRs) at 10.7, 18.7, and 37 GHz indicates that each channel responds to different degrees of surface roughness and suggests that the PR at 10.7 GHz can be useful for identifying locations of heavily ridged or rubbled ice. Using the PR at 10.7 GHz to derive an "effective" viewing angle, which is used as a proxy for surface roughness, resulted in more accurate retrievals of sea ice concentration for both algorithms. The AMSR-E snow depth algorithm was found to be extremely sensitive to instrument calibration and sensor viewing angle, and it is concluded that more work is needed to investigate the sensitivity of the gradient ratio at 37 and 18.7 GHz to these factors to improve snow depth retrievals from spaceborne passive microwave sensors.
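
    The polarization ratio referred to above is conventionally formed from the vertically and horizontally polarized brightness temperatures of a channel as PR = (TbV - TbH) / (TbV + TbH). A minimal sketch is shown below; the array names are illustrative.

        import numpy as np

        def polarization_ratio(tb_v, tb_h):
            """Polarization ratio PR = (TbV - TbH) / (TbV + TbH) for one channel."""
            tb_v = np.asarray(tb_v, dtype=float)
            tb_h = np.asarray(tb_h, dtype=float)
            return (tb_v - tb_h) / (tb_v + tb_h)

        # Hypothetical usage for the channels discussed above (10.7, 18.7, 37 GHz):
        # pr_10 = polarization_ratio(tb10v, tb10h)   # sensitive to ridged/rubbled ice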

  6. 3D multiplayer virtual pets game using Google Card Board

    NASA Astrophysics Data System (ADS)

    Herumurti, Darlis; Riskahadi, Dimas; Kuswardayan, Imam

    2017-08-01

    Virtual Reality (VR) is a technology which allows a user to interact with a virtual environment that is generated and simulated by computer. This technology can give users the sensation of being inside the virtual environment. VR presents the virtual environment to the user directly rather than on a conventional screen, but it needs an additional device to show the view of the virtual environment, known as a Head Mounted Device (HMD). The Oculus Rift and Microsoft HoloLens are among the most famous HMD devices used in VR. In 2014, Google Card Board was introduced at the Google I/O developers conference. Google Card Board is a VR platform which allows users to enjoy VR in a simple and cheap way. In this research, we explore Google Card Board to develop a simulation game of raising a pet. Google Card Board is used to create the view of the VR environment. The view and control in the VR environment are built using the Unity game engine, and the simulation process is designed using a Finite State Machine (FSM). The FSM helps structure the process clearly, so that the simulation of raising a pet is described well. Raising a pet is a fun activity, but many conditions can make it difficult, e.g., environmental conditions, disease, and high cost. This research aims to explore and implement Google Card Board in a simulation of raising a pet.
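
    The pet-raising logic described above is organized as a finite state machine. A toy sketch of such an FSM is given below; the states, events, and transitions are invented for illustration and are not taken from the game's actual design.

        # Toy finite state machine for a virtual-pet loop (illustrative states/events only).
        TRANSITIONS = {
            ("idle", "gets_hungry"): "hungry",
            ("hungry", "fed"): "idle",
            ("idle", "gets_tired"): "sleeping",
            ("sleeping", "wakes_up"): "idle",
            ("hungry", "ignored_too_long"): "sick",
            ("sick", "treated"): "idle",
        }

        def step(state, event):
            """Return the next pet state; unknown events leave the state unchanged."""
            return TRANSITIONS.get((state, event), state)

        # Example: step("idle", "gets_hungry") -> "hungry"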

  7. Global motions exhibited by proteins in micro- to milliseconds simulations concur with anisotropic network model predictions

    NASA Astrophysics Data System (ADS)

    Gur, M.; Zomot, E.; Bahar, I.

    2013-09-01

    The Anton supercomputing technology recently developed for efficient molecular dynamics simulations permits us to examine micro- to milli-second events at full atomic resolution for proteins in explicit water and lipid bilayer. It also permits us to investigate to what extent the collective motions predicted by network models (that have found broad use in molecular biophysics) agree with those exhibited by full-atomic long simulations. The present study focuses on Anton trajectories generated for two systems: the bovine pancreatic trypsin inhibitor, and an archaeal aspartate transporter, GltPh. The former, a thoroughly studied system, helps benchmark the method of comparative analysis, and the latter provides new insights into the mechanism of function of glutamate transporters. The principal modes of motion derived from both simulations closely overlap with those predicted for each system by the anisotropic network model (ANM). Notably, the ANM modes define the collective mechanisms, or the pathways on conformational energy landscape, that underlie the passage between the crystal structure and substates visited in simulations. In particular, the lowest frequency ANM modes facilitate the conversion between the most probable substates, lending support to the view that easy access to functional substates is a robust determinant of evolutionarily selected native contact topology.

  8. Remembrance of phases past: An autoregressive method for generating realistic atmospheres in simulations

    NASA Astrophysics Data System (ADS)

    Srinath, Srikar; Poyneer, Lisa A.; Rudy, Alexander R.; Ammons, S. M.

    2014-08-01

    The advent of expensive, large-aperture telescopes and complex adaptive optics (AO) systems has strengthened the need for detailed simulation of such systems from the top of the atmosphere to control algorithms. The credibility of any simulation is underpinned by the quality of the atmosphere model used for introducing phase variations into the incident photons. Hitherto, simulations which incorporate wind layers have relied upon phase screen generation methods that tax the computation and memory capacities of the platforms on which they run. This places limits on parameters of a simulation, such as exposure time or resolution, thus compromising its utility. As aperture sizes and fields of view increase the problem will only get worse. We present an autoregressive method for evolving atmospheric phase that is efficient in its use of computation resources and allows for variability in the power contained in frozen flow or stochastic components of the atmosphere. Users have the flexibility of generating atmosphere datacubes in advance of runs where memory constraints allow to save on computation time or of computing the phase at each time step for long exposure times. Preliminary tests of model atmospheres generated using this method show power spectral density and rms phase in accordance with established metrics for Kolmogorov models.
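
    The core idea above is to evolve each spatial-frequency mode of the phase with an autoregressive update rather than storing very large pre-computed phase screens, blending a deterministic frozen-flow part with a stochastic innovation. The sketch below shows a first-order update in the Fourier domain; the memory coefficient, wind-shift phasor, and power spectrum are placeholder inputs, not the authors' calibrated values.

        import numpy as np

        def ar1_phase_step(phase_ft, alpha, wind_phasor, psd, rng):
            """One AR(1) update of the Fourier-domain atmospheric phase.

            phase_ft    : current phase screen in the Fourier domain (complex array)
            alpha       : per-mode memory coefficient in [0, 1); 1 ~ pure frozen flow
            wind_phasor : complex exponential applying the frozen-flow translation
            psd         : target spatial power spectrum (e.g., Kolmogorov) per mode
            """
            shape = phase_ft.shape
            noise = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2.0)
            innovation = np.sqrt((1.0 - alpha ** 2) * psd) * noise
            return alpha * wind_phasor * phase_ft + innovation

        # The real-space screen at each time step is the inverse FFT of phase_ft.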

  9. Optimization and Control of Agent-Based Models in Biology: A Perspective.

    PubMed

    An, G; Fitzpatrick, B G; Christley, S; Federico, P; Kanarek, A; Neilan, R Miller; Oremland, M; Salinas, R; Laubenbacher, R; Lenhart, S

    2017-01-01

    Agent-based models (ABMs) have become an increasingly important mode of inquiry for the life sciences. They are particularly valuable for systems that are not understood well enough to build an equation-based model. These advantages, however, are counterbalanced by the difficulty of analyzing and using ABMs, due to the lack of the type of mathematical tools available for more traditional models, which leaves simulation as the primary approach. As models become large, simulation becomes challenging. This paper proposes a novel approach to two mathematical aspects of ABMs, optimization and control, and it presents a few first steps outlining how one might carry out this approach. Rather than viewing the ABM as a model, it is to be viewed as a surrogate for the actual system. For a given optimization or control problem (which may change over time), the surrogate system is modeled instead, using data from the ABM and a modeling framework for which ready-made mathematical tools exist, such as differential equations, or for which control strategies can be explored more easily. Once the optimization problem is solved for the model of the surrogate, it is then lifted to the surrogate and tested. The final step is to lift the optimization solution from the surrogate system to the actual system. This program is illustrated with published work, using two relatively simple ABMs as a demonstration, Sugarscape and a consumer-resource ABM. Specific techniques discussed include dimension reduction and approximation of an ABM by difference equations as well as by systems of PDEs, related to certain specific control objectives. This demonstration illustrates the very challenging mathematical problems that need to be solved before this approach can be realistically applied to complex and large ABMs, current and future. The paper outlines a research program to address them.

  10. On the contribution of active galactic nuclei to the high-redshift metagalactic ionizing background

    NASA Astrophysics Data System (ADS)

    D'Aloisio, Anson; Upton Sanderbeck, Phoebe R.; McQuinn, Matthew; Trac, Hy; Shapiro, Paul R.

    2017-07-01

    Motivated by the claimed detection of a large population of faint active galactic nuclei (AGNs) at high redshift, recent studies have proposed models in which AGNs contribute significantly to the z > 4 H I ionizing background. In some models, AGNs are even the chief sources of reionization. If proved true, these models would make necessary a complete revision to the standard view that galaxies dominated the high-redshift ionizing background. It has been suggested that AGN-dominated models can better account for two recent observations that appear to be in conflict with the standard view: (1) large opacity variations in the z ˜ 5.5 H I Ly α forest, and (2) slow evolution in the mean opacity of the He II Ly α forest. Large spatial fluctuations in the ionizing background from the brightness and rarity of AGNs may account for the former, while the earlier onset of He II reionization in these models may account for the latter. Here we show that models in which AGN emissions source ≳50 per cent of the ionizing background generally provide a better fit to the observed H I Ly α forest opacity variations compared to standard galaxy-dominated models. However, we argue that these AGN-dominated models are in tension with constraints on the thermal history of the intergalactic medium (IGM). Under standard assumptions about the spectra of AGNs, we show that the earlier onset of He II reionization heats up the IGM well above recent temperature measurements. We further argue that the slower evolution of the mean opacity of the He II Ly α forest relative to simulations may reflect deficiencies in current simulations rather than favour AGN-dominated models as has been suggested.

  11. Modelling of Picosatellite Constellation-Based Network and Effects on Quality of Service

    DTIC Science & Technology

    2015-03-01

    The scenario simulation was built with the STK-QualNet Interface module; the analysis period was chosen as a two-day period from 6 June 2011 to 7 June 2011, and the resulting STK model was used in the data analysis and interpretation.

  12. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power for detecting misspecification while at the same time controlling the size of the test.

  13. Coordination control of flexible manufacturing systems

    NASA Astrophysics Data System (ADS)

    Menon, Satheesh R.

    This work represents one of the first attempts to develop a model-driven system for coordination control of Flexible Manufacturing Systems (FMS). The structure and activities of the FMS are modeled using a colored Petri net based system. This approach has the advantage of being able to model the concurrency inherent in the system. It provides a method for encoding the system state, the state transitions, and the feasible transitions at any given state. Further structural analysis (for detecting conflicting actions, deadlocks which might occur during operation, etc.) can be performed. The problem of implementing and testing the behavior of existing dynamic scheduling approaches in simulations of realistic situations is also addressed. A simulation architecture was proposed, and performance evaluation was carried out to establish the correctness of the model and the stability of the system from a structural (deadlocks) and temporal (boundedness of backlogs) point of view, and to collect statistics for performance measures such as machine and robot utilizations, average wait times, and idle times of resources. A real-time implementation architecture for the coordination controller was also developed and implemented in a software-simulated environment. Given the current technology of FMS control, the model-driven colored Petri net based approach promises to yield a very flexible control environment.

  14. Virtual wayfinding using simulated prosthetic vision in gaze-locked viewing.

    PubMed

    Wang, Lin; Yang, Liancheng; Dagnelie, Gislin

    2008-11-01

    To assess virtual maze navigation performance with simulated prosthetic vision in gaze-locked viewing, under the conditions of varying luminance contrast, background noise, and phosphene dropout. Four normally sighted subjects performed virtual maze navigation using simulated prosthetic vision in gaze-locked viewing, under five conditions of luminance contrast, background noise, and phosphene dropout. Navigation performance was measured as the time required to traverse a 10-room maze using a game controller, and the number of errors made during the trip. Navigation performance time (1) became stable after 6 to 10 trials, (2) remained similar on average at luminance contrast of 68% and 16% but had greater variation at 16%, (3) was not significantly affected by background noise, and (4) increased by 40% when 30% of phosphenes were removed. Navigation performance time and number of errors were significantly and positively correlated. Assuming that the simulated gaze-locked viewing conditions are extended to implant wearers, such prosthetic vision can be helpful for wayfinding in simple mobility tasks, though phosphene dropout may interfere with performance.

  15. Report for Task 8.4: Development of Control Room Layout Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, Robert

    Idaho National Laboratory (INL) has contracted Institutt for Energiteknikk (IFE) to support the development of an end-state vision for the US nuclear industry and, in particular, for a utility that is currently moving forward with a control room modernization project. This support includes the development of an Overview display, technical support in conducting an operational study, and the development of operational scenarios to be conducted using a full-scope simulator at the INL HSSL. Additionally, IFE will use the CREATE modelling tool to provide 3-D views of the potential end state after the completion of the digital upgrade project.

  16. Polar Ozone Loss Rates: Comparison Of Match Observations With Simulations Of 3-D Chemical Transport Model And Box Model

    NASA Astrophysics Data System (ADS)

    Tripathi, O. P.; Godin-Beekmann, S.; Lefevre, F.; Marchand, M.; Pazmino, A.; Hauchecorne, A.

    2005-12-01

    Model simulations of ozone loss rates during recent Arctic and Antarctic winters are compared with ozone loss rates observed with the match technique. The Arctic winters 1994/1995, 1999/2000, and 2002/2003 and the Antarctic winter 2003 were considered for the analysis. We use the high-resolution chemical transport model MIMOSA-CHIM and the REPROBUS box model for the calculation of ozone loss rates. Trajectory model calculations show that the ozone loss rates depend on the initialization fields: when the chemical fields are initialized with UCAM fields (University of Cambridge SLIMCAT model simulations), the loss rates are underestimated by a factor of two, whereas with UL (University of Leeds) initialization the modeled loss rates are in very good agreement with the match loss rates at lower levels. The study shows very good agreement between the MIMOSA-CHIM simulation and the match observations in the 1999/2000 winter at both levels, 450 and 500 K, except for a slight underestimation in March at 500 K; in January the agreement is very good. This is also true for 1994/1995 when the simulated ozone loss rates are considered in view of the ECMWF wind deficiency, assuming that match observations were not made on isolated trajectories. Sensitivity tests performed for the Arctic winter 1999/2000, in which the JCl2O2 value, the particle number density, and the heating rates were varied, show that our understanding of the particle number density and of the heating rate calculation mechanism needs to be improved. The Burkholder JCl2O2 value improved the comparison of the MIMOSA-CHIM model results with the observations (Tripathi et al., 2005). In the same study, the comparison results were also shown to improve when the heating rates and the number density were changed through NAT particle sedimentation.

  17. Organization of Lipids in the Tear Film: A Molecular-Level View

    PubMed Central

    Wizert, Alicja; Iskander, D. Robert; Cwiklik, Lukasz

    2014-01-01

    Biophysical properties of the tear film lipid layer are studied at the molecular level employing coarse grain molecular dynamics (MD) simulations with a realistic model of the human tear film. In this model, polar lipids are chosen to reflect the current knowledge on the lipidome of the tear film whereas typical Meibomian-origin lipids are included in the thick non-polar lipids subphase. Simulation conditions mimic those experienced by the real human tear film during blinks. Namely, thermodynamic equilibrium simulations at different lateral compressions are performed to model varying surface pressure, and the dynamics of the system during a blink is studied by non-equilibrium MD simulations. Polar lipids separate their non-polar counterparts from water by forming a monomolecular layer whereas the non-polar molecules establish a thick outermost lipid layer. Under lateral compression, the polar layer undulates and a sorting of polar lipids occurs. Moreover, formation of three-dimensional aggregates of polar lipids in both non-polar and water subphases is observed. We suggest that these three-dimensional structures are abundant under dynamic conditions caused by the action of eye lids and that they act as reservoirs of polar lipids, thus increasing stability of the tear film. PMID:24651175

  18. Sub-grid drag models for horizontal cylinder arrays immersed in gas-particle multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Avik; Sun, Xin; Sundaresan, Sankaran

    2013-09-08

    Immersed cylindrical tube arrays often are used as heat exchangers in gas-particle fluidized beds. In multiphase computational fluid dynamics (CFD) simulations of large fluidized beds, explicit resolution of small cylinders is computationally infeasible. Instead, the cylinder array may be viewed as an effective porous medium in coarse-grid simulations. The cylinders' influence on the suspension as a whole, manifested as an effective drag force, and on the relative motion between gas and particles, manifested as a correction to the gas-particle drag, must be modeled via suitable sub-grid constitutive relationships. In this work, highly resolved unit-cell simulations of flow around an array of horizontal cylinders, arranged in a staggered configuration, are filtered to construct sub-grid, or 'filtered', drag models, which can be implemented in coarse-grid simulations. The force on the suspension exerted by the cylinders is comprised of, as expected, a buoyancy contribution, and a kinetic component analogous to fluid drag on a single cylinder. Furthermore, the introduction of tubes also is found to enhance segregation at the scale of the cylinder size, which, in turn, leads to a reduction in the filtered gas-particle drag.

  19. "Replaying Life's Tape": Simulations, metaphors, and historicity in Stephen Jay Gould's view of life.

    PubMed

    Sepkoski, David

    2016-08-01

    In a famous thought experiment, Stephen Jay Gould asked whether, if one could somehow rewind the history of life back to its initial starting point, the same results would obtain when the "tape" was run forward again. This hypothetical experiment is generally understood as a metaphor supporting Gould's philosophy of evolutionary contingency, which he developed and promoted from the late 1980s until his death in 2002. However, there was a very literal, non-metaphorical inspiration for Gould's thought experiment: since the early 1970s, Gould, along with a group of other paleontologists, was actively engaged in attempts to model and reconstruct the history of life using computer simulations and database analysis. These simulation projects not only demonstrate the impact that computers had on data analysis in paleontology, but also shed light on the close relationship between models and empirical data in data-oriented science. In a sense, I will argue, the models developed by paleontologists through simulation and quantitative analysis of the empirical fossil record in the 1970s and beyond were literal attempts to "replay life's tape" by reconstructing the history of life as data. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. The PRo3D View Planner - interactive simulation of Mars rover camera views to optimise capturing parameters

    NASA Astrophysics Data System (ADS)

    Traxler, Christoph; Ortner, Thomas; Hesina, Gerd; Barnes, Robert; Gupta, Sanjeev; Paar, Gerhard

    2017-04-01

    High resolution Digital Terrain Models (DTM) and Digital Outcrop Models (DOM) are highly useful for geological analysis and mission planning in planetary rover missions. PRo3D, developed as part of the EU-FP7 PRoViDE project, is a 3D viewer in which orbital DTMs and DOMs derived from rover stereo imagery can be rendered in a virtual environment for exploration and analysis. It allows fluent navigation over planetary surface models and provides a variety of measurement and annotation tools to complete an extensive geological interpretation. A key aspect of the image collection during planetary rover missions is determining the optimal viewing positions of rover instruments from different positions ('wide baseline stereo'). For the collection of high quality panoramas and stereo imagery, the visibility of regions of interest from those positions, and the amount of common features shared by each stereo pair or image bundle, are crucial. The creation of a highly accurate and reliable 3D surface, in the form of an Ordered Point Cloud (OPC), of the planetary surface, with a low rate of error and a minimum of artefacts, is greatly enhanced by using images that share a high amount of features and a sufficient overlap for wide baseline stereo or target selection. To support users in the selection of adequate viewpoints, an interactive View Planner was integrated into PRo3D. The users choose from a set of different rovers and their respective instruments. PRo3D supports, for instance, the PanCam instrument of ESA's ExoMars 2020 rover mission or the Mastcam-Z camera of NASA's Mars2020 mission. The View Planner uses a DTM obtained from orbiter imagery, which can also be complemented with rover-derived DOMs as the mission progresses. The selected rover is placed onto a position on the terrain - interactively or using the current rover pose as known from the mission. The rover's base polygon and its local coordinate axes, and the chosen instrument's up- and forward vectors are visualised. The parameters of the instrument's pan and tilt unit (PTU) can be altered via the user interface, or alternatively calculated by selecting a target point on the visualised DTM. In the 3D view, the visible region of the planetary surface, resulting from these settings and the camera field-of-view, is visualised by a highlighted region with a red border, representing the instrument's footprint. The camera view is simulated and rendered in a separate window and PTU parameters can be interactively adjusted, allowing viewpoints, directions, and the expected image to be visualised in real time so that users can fine-tune these settings. In this way, ideal viewpoints and PTU settings for various rover models and instruments can efficiently be defined, resulting in optimal imagery of the regions of interest.
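
    The PTU aiming step described above can be illustrated with a small geometric sketch. The Python code below is not part of PRo3D; the local east-north-up frame convention, function names and field-of-view values are assumptions made for the example.

      # Minimal sketch (not PRo3D code): compute pan and tilt angles that aim a
      # mast-mounted camera at a target point selected on a DTM, assuming a local
      # east-north-up frame with the pan axis aligned with "up".
      import math

      def pan_tilt_to_target(camera_pos, target_pos):
          """Return (pan, tilt) in degrees for the vector from camera to target."""
          dx = target_pos[0] - camera_pos[0]   # east
          dy = target_pos[1] - camera_pos[1]   # north
          dz = target_pos[2] - camera_pos[2]   # up
          pan = math.degrees(math.atan2(dx, dy))       # 0 deg = north, clockwise
          horiz = math.hypot(dx, dy)
          tilt = math.degrees(math.atan2(dz, horiz))   # positive = above horizon
          return pan, tilt

      def footprint_half_angles(fov_h_deg, fov_v_deg):
          """Half-angles of the rectangular camera footprint on the terrain."""
          return fov_h_deg / 2.0, fov_v_deg / 2.0

      if __name__ == "__main__":
          mast = (0.0, 0.0, 2.0)        # camera 2 m above the rover origin
          target = (12.0, 30.0, -1.5)   # point picked on the DTM
          print(pan_tilt_to_target(mast, target))
          print(footprint_half_angles(34.0, 26.0))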

  1. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    Conroy, Michael; Mazzone, Rebecca; Little, William; Elfrey, Priscilla; Mann, David; Mabie, Kevin; Cuddy, Thomas; Loundermon, Mario; Spiker, Stephen; McArthur, Frank; hide

    2010-01-01

    The Distributed Observer network (DON) is a NASA-collaborative environment that leverages game technology to bring three-dimensional simulations to conventional desktop and laptop computers in order to allow teams of engineers working on design and operations, either individually or in groups, to view and collaborate on 3D representations of data generated by authoritative tools such as Delmia Envision, Pro/Engineer, or Maya. The DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3D visual environment. DON has been designed to enhance accessibility and user ability to observe and analyze visual simulations in real time. A variety of NASA mission segment simulations [Synergistic Engineering Environment (SEE) data, NASA Enterprise Visualization Analysis (NEVA) ground processing simulations, the DSS simulation for lunar operations, and the Johnson Space Center (JSC) TRICK tool for guidance, navigation, and control analysis] were experimented with. Desired functionalities, [i.e. Tivo-like functions, the capability to communicate textually or via Voice-over-Internet Protocol (VoIP) among team members, and the ability to write and save notes to be accessed later] were targeted. The resulting DON application was slated for early 2008 release to support simulation use for the Constellation Program and its teams. Those using the DON connect through a client that runs on their PC or Mac. This enables them to observe and analyze the simulation data as their schedule allows, and to review it as frequently as desired. DON team members can move freely within the virtual world. Preset camera points can be established, enabling team members to jump to specific views. This improves opportunities for shared analysis of options, design reviews, tests, operations, training, and evaluations, and improves prospects for verification of requirements, issues, and approaches among dispersed teams.

  2. Perceptual disturbances predicted in zero-g through three-dimensional modeling.

    PubMed

    Holly, Jan E

    2003-01-01

    Perceptual disturbances in zero-g and 1-g differ. For example, the vestibular coriolis (or "cross-coupled") effect is weaker in zero-g. In 1-g, blindfolded subjects rotating on-axis experience perceptual disturbances upon head tilt, but the effects diminish in zero-g. Head tilts during centrifugation in zero-g and 1-g are investigated here by means of three-dimensional modeling, using a model that was previously used to explain the zero-g reduction of the on-axis vestibular coriolis effect. The model's foundation comprises the laws of physics, including linear-angular interactions in three dimensions. Addressed is the question: In zero-g, will the vestibular coriolis effect be as weak during centrifugation as during on-axis rotation? Centrifugation in 1-g was simulated first, with the subject supine, head toward center. The most noticeable result concerned direction of head yaw. For clockwise centrifuge rotation, greater perceptual effects arose in simulations during yaw counterclockwise (as viewed from the top of the head) than for yaw clockwise. Centrifugation in zero-g was then simulated with the same "supine" orientation. The result: In zero-g the simulated vestibular coriolis effect was greater during centrifugation than during on-axis rotation. In addition, clockwise-counterclockwise differences did not appear in zero-g, in contrast to the differences that appear in 1-g.
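
    Three-dimensional models of this kind build on the cross-coupled ("vestibular coriolis") stimulus that arises when the head rotates about one axis while the body rotates about another. The Python sketch below evaluates only that textbook cross-product term, not the full perceptual model of the paper; the angular velocities chosen are illustrative.

      # Simplified sketch of the cross-coupled stimulus: the semicircular canals
      # sense a term proportional to the cross product of the two angular
      # velocities. This is a textbook illustration, not the author's model.
      import numpy as np

      def cross_coupled_stimulus(omega_chair, omega_head):
          """Cross-coupled angular acceleration term (rad/s^2) during head tilt."""
          return np.cross(omega_chair, omega_head)

      # Chair spinning about the vertical axis at 60 deg/s, head yawing at
      # 30 deg/s about a horizontal axis.
      omega_chair = np.radians([0.0, 0.0, 60.0])
      omega_head = np.radians([0.0, 30.0, 0.0])
      print(cross_coupled_stimulus(omega_chair, omega_head))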

  3. ALI: A CSSL/multiprocessor software interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makoui, A.; Karplus, W.J.

    ALI (A Language Interface) is a software package which translates simulation models expressed in one of the higher-level languages, CSSL-IV or ACSL, into sequences of instructions for each processor of a network of microprocessors. The partitioning of the source program among the processors is automatically accomplished. The code is converted into a data flow graph, analyzed and divided among the processors to minimize the overall execution time in the presence of interprocessor communication delays. This paper describes ALI from the user's point of view and includes a detailed example of the application of ALI to a specific dynamic system simulation.

  4. Wheat signature modeling and analysis for improved training statistics: Supplement. Simulated LANDSAT wheat radiances and radiance components

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Cicone, R. C.; Gleason, J. M.

    1976-01-01

    Simulated scanner system data values generated in support of LACIE (Large Area Crop Inventory Experiment) research and development efforts are presented. Synthetic inband (LANDSAT) wheat radiances and radiance components were computed and are presented for various wheat canopy and atmospheric conditions and scanner view geometries. Values include: (1) inband bidirectional reflectances for seven stages of wheat crop growth; (2) inband atmospheric features; and (3) inband radiances corresponding to the various combinations of wheat canopy and atmospheric conditions. Analyses of these data values are presented in the main report.

  5. Simulation of laser beam reflection at the sea surface modeling and validation

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Repasi, Endre

    2013-06-01

    A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for the pre-calculation of images for cameras operating in different spectral wavebands (visible, short wave infrared) for a bistatic configuration of laser source and receiver under different atmospheric conditions. In the visible waveband the calculated detected total power of reflected laser light from a 660nm laser source is compared with data collected in a field trial. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of a laser beam reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the view of a camera the sea surface radiance must be calculated for the specific waveband. Additionally, the radiances of laser light specularly reflected at the wind-roughened sea surface are modeled considering an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). Validation of simulation results is a prerequisite before applying the computer simulation to maritime laser applications. For validation purposes data (images and meteorological data) were selected from field measurements, using a 660nm cw-laser diode to produce laser beam reflection at the water surface and recording images with a TV camera. The validation is done by numerical comparison of the measured total laser power extracted from recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam.
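
    The statistical glint modelling referred to above can be sketched with a simple wind-dependent slope distribution. The Python example below uses a Cox-Munk-style isotropic slope variance as an assumption; it illustrates the kind of analytical sea-surface statistics involved, not the authors' exact BRDF.

      # Minimal sketch of a statistical sea-surface glint term: probability
      # density of the wave slopes needed for specular reflection, with an
      # isotropic slope variance that grows with wind speed (Cox-Munk style).
      import math

      def slope_variance(wind_speed_ms):
          # Empirical isotropic approximation; grows linearly with wind speed.
          return 0.003 + 0.00512 * wind_speed_ms

      def glint_slope_pdf(zx, zy, wind_speed_ms):
          """Gaussian PDF of surface slopes (zx, zy) for a given wind speed."""
          s2 = slope_variance(wind_speed_ms)
          return math.exp(-(zx**2 + zy**2) / s2) / (math.pi * s2)

      # Density of the flat facet slope that produces the specular glint for a
      # receiver placed in the mirror direction of the source.
      print(glint_slope_pdf(0.0, 0.0, wind_speed_ms=5.0))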

  6. [Model and analysis of spectropolarimetric BRDF of painted target based on GA-LM method].

    PubMed

    Chen, Chao; Zhao, Yong-Qiang; Luo, Li; Pan, Quan; Cheng, Yong-Mei; Wang, Kai

    2010-03-01

    Models based on microfacets were used to describe the spectropolarimetric BRDF (short for bidirectional reflectance distribution function) with experimental data. The spectropolarimetric BRDF values of targets were measured by comparison to a standard whiteboard, which was considered Lambertian and had a uniform reflectance of up to 98% at an arbitrary angle of view. The relationships between the measured spectropolarimetric BRDF values and the angles of view, as well as the wavelengths, which were in the range of 400-720 nm, were then analyzed in detail. The initial value needed as input to the LM optimization method was difficult to obtain and greatly impacted the results. Therefore, an optimization approach that combines a genetic algorithm (GA) and Levenberg-Marquardt (LM) was utilized to retrieve the parameters of the nonlinear models, with the initial values obtained using the GA. Simulated experiments were used to test the efficiency of the adopted optimization method and confirmed that it performs well and is able to retrieve the parameters of the nonlinear model efficiently. The correctness of the models was validated with real outdoor sampled data. The parameter retrieved from the DoP model is the refractive index of the measured targets. The refractive index of targets painted the same color but made of different materials was also obtained. The conclusion is drawn that the refractive indices of these two targets are very close, and that the slight difference can be attributed to the condition of the painted targets' surfaces rather than to the targets' material.
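
    The hybrid GA-LM strategy can be sketched as a global search that supplies the starting point for a local least-squares refinement. In the Python example below, scipy's differential evolution stands in for the genetic algorithm, and the one-line "model" is a hypothetical placeholder rather than the authors' microfacet pBRDF.

      # Sketch of the hybrid fitting strategy: a global evolutionary search
      # supplies the initial guess, which a Levenberg-Marquardt style local
      # optimizer then refines.
      import numpy as np
      from scipy.optimize import differential_evolution, least_squares

      def model(params, angles):
          n, scale = params   # hypothetical refractive index and scale factor
          return scale * ((n - 1.0) / (n + 1.0)) ** 2 * np.cos(np.radians(angles))

      def residuals(params, angles, measured):
          return model(params, angles) - measured

      angles = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
      measured = model([1.5, 0.8], angles) + np.random.normal(0, 1e-3, angles.size)

      bounds = [(1.0, 3.0), (0.1, 2.0)]
      coarse = differential_evolution(
          lambda p: np.sum(residuals(p, angles, measured) ** 2), bounds, seed=0)
      refined = least_squares(residuals, coarse.x, args=(angles, measured),
                              method="lm")
      print(coarse.x, refined.x)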

  7. SU-E-I-36: A KWIC and Dirty Look at Dose Savings and Perfusion Metrics in Simulated CT Neuro Perfusion Exams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, J; Martin, T; Young, S

    Purpose: CT neuro perfusion scans are one of the highest dose exams. Methods to reduce dose include decreasing the number of projections acquired per gantry rotation; however, conventional reconstruction of such scans leads to sampling artifacts. In this study we investigated a projection view-sharing reconstruction algorithm used in dynamic MRI – "K-space Weighted Image Contrast" (KWIC) – applied to simulated perfusion exams and evaluated dose savings and impacts on perfusion metrics. Methods: A FORBILD head phantom containing simulated time-varying objects was developed and a set of parallel-beam CT projection data was created. The simulated scans were 60 seconds long, with 1152 projections per turn and a rotation time of one second. No noise was simulated. 5mm, 10mm, and 50mm objects were modeled in the brain. A baseline, "full dose" simulation used all projections and reduced dose cases were simulated by downsampling the number of projections per turn from 1152 to 576 (50% dose), 288 (25% dose), and 144 (12.5% dose). KWIC was further evaluated at 72 projections per rotation (6.25%). One image per second was reconstructed using filtered backprojection (FBP) and KWIC. KWIC reconstructions utilized view cores of 36, 72, 144, and 288 views and 16, 8, 4, and 2 subapertures respectively. From the reconstructed images, time-to-peak (TTP), cerebral blood flow (CBF) and the FWHM of the perfusion curve were calculated and compared against reference values from the full-dose FBP data. Results: TTP, CBF, and the FWHM were unaffected by dose reduction (to 12.5%) and reconstruction method; however, image quality was improved when using KWIC. Conclusion: This pilot study suggests that KWIC preserves image quality and perfusion metrics when under-sampling projections and that the unique contrast weighting of KWIC could provide substantial dose savings for perfusion CT scans. Evaluation of KWIC in clinical CT data will be performed in the near future. R01 EB014922, NCI Grant U01 CA181156 (Quantitative Imaging Network), and Tobacco Related Disease Research Project grant 22RT-0131.
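
    As a simple illustration of two of the perfusion metrics compared above, the Python sketch below extracts TTP and FWHM from a synthetic time-attenuation curve sampled once per second; the curve and its parameters are illustrative, and the CBF deconvolution step is omitted.

      # Minimal sketch: time-to-peak (TTP) and FWHM of a time-attenuation curve
      # taken from reconstructed images (one sample per second). The gamma-
      # variate style curve below is synthetic.
      import numpy as np

      t = np.arange(0, 60.0, 1.0)                          # one image per second
      curve = (t / 10.0) ** 3 * np.exp(-(t - 10.0) / 4.0)  # synthetic enhancement

      ttp = t[np.argmax(curve)]

      half_max = curve.max() / 2.0
      above = np.where(curve >= half_max)[0]
      fwhm = t[above[-1]] - t[above[0]]                    # coarse, 1 s sampling

      print(f"TTP = {ttp:.1f} s, FWHM = {fwhm:.1f} s")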

  8. Mean-field methods in evolutionary duplication-innovation-loss models for the genome-level repertoire of protein domains.

    PubMed

    Angelini, A; Amato, A; Bianconi, G; Bassetti, B; Cosentino Lagomarsino, M

    2010-02-01

    We present a combined mean-field and simulation approach to different models describing the dynamics of classes formed by elements that can appear, disappear, or copy themselves. These models, related to a paradigm duplication-innovation model known as Chinese restaurant process, are devised to reproduce the scaling behavior observed in the genome-wide repertoire of protein domains of all known species. In view of these data, we discuss the qualitative and quantitative differences of the alternative model formulations, focusing in particular on the roles of element loss and of the specificity of empirical domain classes.

  9. Mean-field methods in evolutionary duplication-innovation-loss models for the genome-level repertoire of protein domains

    NASA Astrophysics Data System (ADS)

    Angelini, A.; Amato, A.; Bianconi, G.; Bassetti, B.; Cosentino Lagomarsino, M.

    2010-02-01

    We present a combined mean-field and simulation approach to different models describing the dynamics of classes formed by elements that can appear, disappear, or copy themselves. These models, related to a paradigm duplication-innovation model known as Chinese restaurant process, are devised to reproduce the scaling behavior observed in the genome-wide repertoire of protein domains of all known species. In view of these data, we discuss the qualitative and quantitative differences of the alternative model formulations, focusing in particular on the roles of element loss and of the specificity of empirical domain classes.
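
    The class dynamics described above can be sketched with a minimal stochastic simulation of duplication, innovation and loss events; the rates below are illustrative and are not fitted to genomic data.

      # Minimal stochastic simulation of a duplication-innovation-loss process of
      # the Chinese-restaurant type: at each step an element is duplicated
      # (enlarging its class), lost, or a brand-new class is innovated.
      import random
      from collections import Counter

      def simulate(steps=20000, p_dup=0.55, p_loss=0.35, p_new=0.10, seed=1):
          random.seed(seed)
          classes = [1]                                  # one class, one member
          for _ in range(steps):
              r = random.random()
              if r < p_new or not classes:
                  classes.append(1)                      # innovation
                  continue
              # pick a class with probability proportional to its size
              i = random.choices(range(len(classes)), weights=classes)[0]
              if r < p_new + p_dup:
                  classes[i] += 1                        # duplication
              else:
                  classes[i] -= 1                        # loss
                  if classes[i] == 0:
                      classes.pop(i)
          return classes

      sizes = simulate()
      print("number of classes:", len(sizes), "largest class:", max(sizes))
      print("smallest size counts:", dict(sorted(Counter(sizes).items())[:5]))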

  10. Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays

    NASA Astrophysics Data System (ADS)

    Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang

    2014-08-01

    We numerically analyzed the optical performance of an electric field driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a 9-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.

  11. MONET: multidimensional radiative cloud scene model

    NASA Astrophysics Data System (ADS)

    Chervet, Patrick

    1999-12-01

    All cloud fields exhibit variable structures (bulges) and heterogeneities in water distribution. With the development of multidimensional radiative models by the atmospheric community, it is now possible to describe horizontal heterogeneities of the cloud medium and to study their influence on radiative quantities. We have developed a complete radiative cloud scene generator, called MONET (French acronym for: MOdelisation des Nuages En Tridim.), to compute radiative cloud scenes from visible to infrared wavelengths for various viewing and solar conditions, different spatial scales, and various locations on the Earth. MONET is composed of two parts: a cloud medium generator (CSSM -- Cloud Scene Simulation Model) developed by the Air Force Research Laboratory, and a multidimensional radiative code (SHDOM -- Spherical Harmonic Discrete Ordinate Method) developed at the University of Colorado by Evans. MONET computes images for several scenarios defined by user inputs: date, location, viewing angles, wavelength, spatial resolution, meteorological conditions (atmospheric profiles, cloud types)... For the same cloud scene, we can output different viewing conditions and/or various wavelengths. Shadowing effects on clouds or the ground are taken into account. This code is useful for studying heterogeneity effects on satellite data for various cloud types and spatial resolutions, and for determining specifications of new imaging sensors.

  12. Payload Planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Johnson, Tameka J.

    1995-01-01

    A review of the evolution of the International Space Station (ISS) was performed for the purpose of understanding the project objectives. It was requested that an analysis of the current Office of Space Access and Technology (OSAT) Partnership Utilization Plan (PUP) traffic model be completed to monitor the process through which the scientific experiments called payloads are manifested for flight to the ISS. A viewing analysis of the ISS was also proposed to identify the capability to observe the United States Laboratory (US LAB) during the assembly sequence. Observations of the Drop-Tower experiment and nondestructive testing procedures were also performed to maximize the intern's technical experience. Contributions were made to the meeting in which the 1996 OSAT or Code X PUP traffic model was generated using the software tool Filemaker Pro. The current OSAT traffic model satisfies the requirement for manifesting and delivering the proposed payloads to the station. The current viewing capability of the station provides the ability to view the US LAB during the station assembly sequence. The Drop-Tower experiment successfully simulates the effect of microgravity and conveniently documents the results for later use. The non-destructive test proved effective in determining stress in various components tested.

  13. Venus - Computer Simulated Global View of Northern Hemisphere

    NASA Image and Video Library

    1996-03-14

    The northern hemisphere is displayed in this global view of the surface of Venus. NASA Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping were mapped onto a computer-simulated globe to create this image. http://photojournal.jpl.nasa.gov/catalog/PIA00252

  14. Effect of Clouds on Optical Imaging of the Space Shuttle During the Ascent Phase: A Statistical Analysis Based on a 3D Model

    NASA Technical Reports Server (NTRS)

    Short, David A.; Lane, Robert E., Jr.; Winters, Katherine A.; Madura, John T.

    2004-01-01

    Clouds are highly effective in obscuring optical images of the Space Shuttle taken during its ascent by ground-based and airborne tracking cameras. Because the imagery is used for quick-look and post-flight engineering analysis, the Columbia Accident Investigation Board (CAIB) recommended the return-to-flight effort include an upgrade of the imaging system to enable it to obtain at least three useful views of the Shuttle from lift-off to at least solid rocket booster (SRB) separation (NASA 2003). The lifetimes of individual cloud elements capable of obscuring optical views of the Shuttle are typically 20 minutes or less. Therefore, accurately observing and forecasting cloud obscuration over an extended network of cameras poses an unprecedented challenge for the current state of observational and modeling techniques. In addition, even the best numerical simulations based on real observations will never reach "truth." In order to quantify the risk that clouds would obscure optical imagery of the Shuttle, a 3D model to calculate probabilistic risk was developed. The model was used to estimate the ability of a network of optical imaging cameras to obtain at least N simultaneous views of the Shuttle from lift-off to SRB separation in the presence of an idealized, randomized cloud field.
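
    A back-of-the-envelope version of the risk question addressed by the 3D model is sketched below: if each of K camera sites were independently clear with probability p, the chance of at least N simultaneous useful views is a binomial tail. The independence assumption is a simplification; the actual model uses a spatially structured, idealized random cloud field.

      # Probability of obtaining at least N simultaneous clear views from K
      # camera sites, assuming independent per-site clear probability p.
      from math import comb

      def p_at_least_n_clear(k_sites, n_required, p_clear):
          return sum(comb(k_sites, m) * p_clear**m * (1 - p_clear)**(k_sites - m)
                     for m in range(n_required, k_sites + 1))

      print(p_at_least_n_clear(k_sites=6, n_required=3, p_clear=0.7))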

  15. Reliability of regional climate simulations

    NASA Astrophysics Data System (ADS)

    Ahrens, W.; Block, A.; Böhm, U.; Hauffe, D.; Keuler, K.; Kücken, M.; Nocke, Th.

    2003-04-01

    Quantification of uncertainty is becoming more and more a key issue for assessing the trustworthiness of future climate scenarios. In addition to the mean conditions, climate impact modelers focus in particular on extremes. Before generating such scenarios using e.g. dynamic regional climate models, a careful validation of present-day simulations should be performed to determine the range of errors for the quantities of interest under recent conditions as a raw estimate of their uncertainty in the future. Often, multiple aspects shall be covered together, and the required simulation accuracy depends on the user's demand. In our approach, a massively parallel regional climate model shall be used on the one hand to generate "long-term" high-resolution climate scenarios for several decades, and on the other hand to provide very high-resolution ensemble simulations of future dry spells or heavy rainfall events. To diagnose the model's performance for present-day simulations, we have recently developed and tested a first version of a validation and visualization chain for this model. It is, however, applicable in a much more general sense and could be used as a common test bed for any regional climate model aiming at this type of simulation. Depending on the user's interest, integrated quality measures can be derived for near-surface parameters using multivariate techniques and multidimensional distance measures in a first step. At this point, advanced visualization techniques have been developed and included to allow for visual data mining and to qualitatively identify dominating aspects and regularities. Univariate techniques that are especially designed to assess climatic aspects in terms of statistical properties can then be used to quantitatively diagnose the error contributions of the individual parameters used. Finally, a comprehensive in-depth diagnosis tool allows investigation of why the model produces the obtained near-surface results, to answer the question of whether the model performs well from the modeler's point of view. Examples will be presented for results obtained using this approach for assessing the risk of potential total agricultural yield loss under drought conditions in Northeast Brazil and for evaluating simulation results for a 10-year period for Europe. To support multi-run simulations and result evaluation, the model will be embedded into an already existing simulation environment that provides further postprocessing tools for sensitivity studies, behavioral analysis and Monte-Carlo simulations, but also for ensemble scenario analysis in one of the next steps.

  16. Multiscale modeling of a rectifying bipolar nanopore: explicit-water versus implicit-water simulations.

    PubMed

    Ható, Zoltán; Valiskó, Mónika; Kristóf, Tamás; Gillespie, Dirk; Boda, Dezsö

    2017-07-21

    In a multiscale modeling approach, we present computer simulation results for a rectifying bipolar nanopore at two modeling levels. In an all-atom model, we use explicit water to simulate ion transport directly with the molecular dynamics technique. In a reduced model, we use implicit water and apply the Local Equilibrium Monte Carlo method together with the Nernst-Planck transport equation. This hybrid method makes the fast calculation of ion transport possible at the price of lost details. We show that the implicit-water model is an appropriate representation of the explicit-water model when we look at the system at the device (i.e., input vs. output) level. The two models produce qualitatively similar behavior of the electrical current for different voltages and model parameters. Looking at the details of concentration and potential profiles, we find profound differences between the two models. These differences, however, do not influence the basic behavior of the model as a device because they do not influence the z-dependence of the concentration profiles which are the main determinants of current. These results then address an old paradox: how do reduced models, whose assumptions should break down in a nanoscale device, predict experimental data? Our simulations show that reduced models can still capture the overall device physics correctly, even though they get some important aspects of the molecular-scale physics quite wrong; reduced models work because they include the physics that is necessary from the point of view of device function. Therefore, reduced models can suffice for general device understanding and device design, but more detailed models might be needed for molecular level understanding.
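
    The reduced model's transport step rests on the Nernst-Planck flux relation; the one-dimensional Python sketch below evaluates J = -D (dc/dz + z c dphi/dz), with the potential expressed in kT/e units. The profiles and parameter values are illustrative, not those of the paper.

      # Minimal 1D Nernst-Planck flux along the pore axis, using finite
      # differences on illustrative concentration and potential profiles.
      import numpy as np

      z_axis = np.linspace(-5e-9, 5e-9, 101)             # pore axis (m)
      c = 100.0 + 50.0 * np.tanh(z_axis / 1e-9)          # concentration (mol/m^3)
      phi = -2.0 * np.tanh(z_axis / 1e-9)                # potential in kT/e units
      D, valence = 1.3e-9, +1                            # m^2/s, cation

      dc_dz = np.gradient(c, z_axis)
      dphi_dz = np.gradient(phi, z_axis)
      flux = -D * (dc_dz + valence * c * dphi_dz)        # mol m^-2 s^-1

      print("flux at pore centre:", flux[len(flux) // 2])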

  17. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.

  18. Pollen-proxies say cooler, climate models say warmer: resolving conflicting views of the Holocene climate of the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Russo, E.; Mauri, A.; Davis, B. A. S.; Cubasch, U.

    2017-12-01

    The evolution of the Mediterranean region's climate during the Holocene has been the subject of long-standing debate within the paleoclimate community. Conflicting hypotheses have emerged from the analysis of different climate reconstructions based on proxy records and climate model outputs. In particular, pollen-based reconstructions of cooler summer temperatures during the Holocene have been criticized based on a hypothesis that the Mediterranean vegetation is mainly limited by effective precipitation and not summer temperature. This criticism is important because climate models show warmer summer temperatures during the Holocene over the Mediterranean region, in direct contradiction of the pollen-based evidence. Here we investigate this problem using a high resolution model simulation of the climate of the Mediterranean region during the mid-to-late Holocene, which we compare against pollen-based reconstructions using two different approaches. In the first, we compare the simulated climate from the model directly with the climate derived from the pollen data. In the second, we compare the simulated vegetation from the model directly with the vegetation from the pollen data. Results show that the climate model is unable to simulate either the climate or the vegetation shown by the pollen data. The pollen data indicate an expansion in cool temperate vegetation in the mid-Holocene while the model suggests an expansion in warm arid vegetation. This suggests that the data-model discrepancy is more likely the result of bias in climate models, and not bias in the pollen-climate calibration transfer function.

  19. Developing R&D portfolio business validity simulation model and system.

    PubMed

    Yeo, Hyun Jin; Im, Kwang Hyuk

    2015-01-01

    R&D has been recognized as a critical method for gaining competitiveness, by companies as well as nations, through its value creation such as patent value and new products. Therefore, R&D is a burden for decision makers in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, since it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered because almost all previous studies are technology-oriented evaluations based on a single R&D technology. For that reason, we earlier proposed an R&D business-aspect evaluation model that consists of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify default and control parameters, integrated into one screen, to facilitate the evaluator's business validity work in each evaluation module.

  20. Developing R&D Portfolio Business Validity Simulation Model and System

    PubMed Central

    2015-01-01

    R&D has been recognized as a critical method for gaining competitiveness, by companies as well as nations, through its value creation such as patent value and new products. Therefore, R&D is a burden for decision makers in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, since it consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered because almost all previous studies are technology-oriented evaluations based on a single R&D technology. For that reason, we earlier proposed an R&D business-aspect evaluation model that consists of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify default and control parameters, integrated into one screen, to facilitate the evaluator's business validity work in each evaluation module. PMID:25893209

  1. Understanding resonance graphs using Easy Java Simulations (EJS) and why we use EJS

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Lee, Tat Leong; Chew, Charles; Wong, Darren; Tan, Samuel

    2015-03-01

    This paper reports a computer model simulation created using Easy Java Simulation (EJS) for learners to visualize how the steady-state amplitude of a driven oscillating system varies with the frequency of the periodic driving force. The simulation shows (N = 100) identical spring-mass systems being subjected to (1) a periodic driving force of equal amplitude but different driving frequencies, and (2) different amounts of damping. The simulation aims to create a visually intuitive way of understanding how the series of amplitude versus driving frequency graphs is obtained, by showing how the displacement of the system changes over time as it transits from the transient to the steady state. A suggested 'how to use' guide for the model is added to help educators and students in their teaching and learning, in which we explain the theoretical steady-state equation, the time conditions at which the model begins to record the maximum amplitudes so that they closely match the theoretical equation, and the steps for collecting runs with different degrees of damping. We also discuss two of the design features in our computer model: displaying the instantaneous oscillation together with the achieved steady-state amplitudes, and the explicit world-view overlay with the scientific representation for runs with different degrees of damping. Three advantages of using EJS include: (1) open source codes and creative commons attribution licenses for scaling up of interactively engaging educational practices; (2) the models made can run on almost any device, including Android and iOS; and (3) it allows the redefinition of physics educational practices through computer modeling.
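
    The resonance curves the simulation lets students discover follow the standard steady-state amplitude of a driven, damped oscillator, A(w) = (F0/m) / sqrt((w0^2 - w^2)^2 + (b*w/m)^2). The stand-alone Python sketch below reproduces that family of curves; the parameter values are illustrative and are not those hard-coded in the EJS model.

      # Steady-state amplitude of a driven, damped spring-mass system as a
      # function of driving frequency, for several damping values.
      import numpy as np

      def steady_state_amplitude(w_drive, w0=2.0, b_over_m=0.2, f0_over_m=1.0):
          return f0_over_m / np.sqrt((w0**2 - w_drive**2)**2
                                     + (b_over_m * w_drive)**2)

      w = np.linspace(0.1, 4.0, 200)
      for damping in (0.1, 0.3, 0.6):                  # light to heavy damping
          a = steady_state_amplitude(w, b_over_m=damping)
          print(f"b/m = {damping}: peak amplitude {a.max():.2f} "
                f"at w = {w[np.argmax(a)]:.2f} rad/s")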

  2. Adaptive spatial filtering of daytime sky noise in a satellite quantum key distribution downlink receiver

    NASA Astrophysics Data System (ADS)

    Gruneisen, Mark T.; Sickmiller, Brett A.; Flanagan, Michael B.; Black, James P.; Stoltenberg, Kurt E.; Duchane, Alexander W.

    2016-02-01

    Spatial filtering is an important technique for reducing sky background noise in a satellite quantum key distribution downlink receiver. Atmospheric turbulence limits the extent to which spatial filtering can reduce sky noise without introducing signal losses. Using atmospheric propagation and compensation simulations, the potential benefit of adaptive optics (AO) to secure key generation (SKG) is quantified. Simulations are performed assuming optical propagation from a low-Earth-orbit satellite to a terrestrial receiver that includes AO. Higher-order AO correction is modeled assuming a Shack-Hartmann wavefront sensor and a continuous-face-sheet deformable mirror. The effects of atmospheric turbulence, tracking, and higher-order AO on the photon capture efficiency are simulated using statistical representations of turbulence and a time-domain wave-optics hardware emulator. SKG rates are calculated for a decoy-state protocol as a function of the receiver field of view for various strengths of turbulence, sky radiances, and pointing angles. The results show that at fields of view smaller than those discussed by others, AO technologies can enhance SKG rates in daylight and enable SKG where it would otherwise be prohibited as a consequence of background optical noise and signal loss due to propagation and turbulence effects.

  3. Evolution of Mesoscale Convective System over the South Western Peninsular India: Observations from Microwave Radiometer and Simulations using WRF

    NASA Astrophysics Data System (ADS)

    Uma, K. N.; Krishna Moorthy, K.; Sijikumar, S.; Renju, R.; Tinu, K. A.; Raju, Suresh C.

    2012-07-01

    Mesoscale Convective Systems (MCS) are important in view of their large cumulus build-up, vertical extent, short horizontal extent and associated thundershowers. The Microwave Radiometer Profiler (MRP) over the equatorial coastal station Thiruvananthapuram (Trivandrum, 8.55°N, 76.9°E) has been utilized to understand the genesis of MCS that occur frequently during the pre-monsoon season. Examination of the relative humidity, temperature and cloud liquid water measurements, at zenith and at two scanning elevation angles (15°) viewing over the land and over the sea respectively, revealed that the MCS generally originate over the land during the early afternoon hours, propagate seawards over the observational site and finally dissipate over the sea, with accompanying rainfall and latent heat release. The simulations obtained using the Advanced Research Weather Research and Forecasting (WRF-ARW) model effectively reproduce the thermodynamical and microphysical properties of the MCS. The duration and quantity of rainfall obtained from the simulations also compare well with the observations. The analysis also suggests that wind shear in the upper troposphere is responsible for the growth and the shape of the convective cloud.

  4. Molecular dynamics simulations of intergranular fracture in UO2 with nine empirical interatomic potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yongfeng Zhang; Paul C Millett; Michael R Tonks

    The intergranular fracture behavior of UO2 was studied using molecular dynamics simulations with a bicrystal model. The anisotropic fracture behavior due to the different grain boundary characters was investigated with the symmetrical tilt S5 and the symmetrical tilt S3 ({1 1 1} twin) grain boundaries. Nine interatomic potentials, seven rigid-ion plus two core–shell ones, were utilized to elucidate possible potential dependence. Initiating from a notch, crack propagation along grain boundaries was observed for most potentials. The S3 boundary was found to be more prone to fracture than the S5 one, indicated by a lower energy release rate associated with the former. However, some potential dependence was identified on the existence of transient plastic deformation at crack tips, and the results were discussed regarding the relevant material properties, including the excess energies of metastable phases and the critical energy release rate for intergranular fracture. In general, local plasticity at crack tips was observed in fracture simulations with potentials that predict low excess energies for metastable phases and high critical energy release rates for intergranular fracture.

  5. Kinetic Monte Carlo Method for Rule-based Modeling of Biochemical Networks

    PubMed Central

    Yang, Jin; Monine, Michael I.; Faeder, James R.; Hlavacek, William S.

    2009-01-01

    We present a kinetic Monte Carlo method for simulating chemical transformations specified by reaction rules, which can be viewed as generators of chemical reactions, or equivalently, definitions of reaction classes. A rule identifies the molecular components involved in a transformation, how these components change, conditions that affect whether a transformation occurs, and a rate law. The computational cost of the method, unlike conventional simulation approaches, is independent of the number of possible reactions, which need not be specified in advance or explicitly generated in a simulation. To demonstrate the method, we apply it to study the kinetics of multivalent ligand-receptor interactions. We expect the method will be useful for studying cellular signaling systems and other physical systems involving aggregation phenomena. PMID:18851068
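
    For contrast with the network-free method described above, the Python sketch below runs a conventional Gillespie direct-method simulation of a fixed, explicitly enumerated reaction network (reversible ligand-receptor binding); the rule-based approach avoids building such a list in advance. Rates and species counts are illustrative.

      # Conventional Gillespie (direct method) simulation of L + R <-> LR.
      import random, math

      def gillespie(x_l=100, x_r=50, x_lr=0, kon=0.01, koff=0.1,
                    t_end=10.0, seed=3):
          random.seed(seed)
          t = 0.0
          while t < t_end:
              a1 = kon * x_l * x_r          # L + R -> LR
              a2 = koff * x_lr              # LR -> L + R
              a0 = a1 + a2
              if a0 == 0.0:
                  break
              t += -math.log(1.0 - random.random()) / a0   # waiting time
              if random.random() * a0 < a1:
                  x_l, x_r, x_lr = x_l - 1, x_r - 1, x_lr + 1
              else:
                  x_l, x_r, x_lr = x_l + 1, x_r + 1, x_lr - 1
          return x_l, x_r, x_lr

      print(gillespie())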

  6. Pore-scale discretisation limits of multiphase lattice-Boltzmann methods

    NASA Astrophysics Data System (ADS)

    Li, Z.; Middleton, J.; Varslot, T.; Sheppard, A.

    2015-12-01

    Lattice-Boltzmann (LB) modeling is a popular method for the numerical solution of the Navier-Stokes equations and several multi-component LB models are widely used to simulate immiscible two-phase fluid flow in porous media. However, there has been relatively little study of the models' ability to make optimal use of 3D imagery by considering the minimum number of grid points that are needed to represent geometric features such as pore throats. This is of critical importance since 3D images of geological samples are a compromise between resolution and field of view. In this work we explore the discretisation limits of LB models, their behavior near these limits, and the consequences of this behavior for simulations of drainage and imbibition. We quantify the performance of two commonly used multiphase LB models: Shan-Chen (SC) and Rothman-Keller (RK) models in a set of tests, including simulations of bubbles in bulk fluid, on flat surfaces, confined in flat/tilted tubes, and fluid invasion into single tubes. Simple geometries like these allow better quantification of model behavior and better understanding of breakdown mechanisms. In bulk fluid, bubble radii less than 2.5 grid units (image voxels) cause numerical instability in SC model; the RK model is stable to a radius of 2.5 units and below, but with poor agreement with the Laplace's law. When confined to a flat duct, the SC model can simulate similar radii to RK model, but with higher interface spurious currents than the RK model and some risk of instability. In tilted ducts with 'staircase' voxel-level roughness, the SC model seems to average the roughness, whereas for RK model only the 'peaks' of the surface are relevant. Overall, our results suggest that LB models can simulate fluid capillary pressure corresponding to interfacial radii of just 1.5 grid units, with the RK model exhibiting significantly better stability.
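
    The Laplace-law test mentioned above can be sketched as a linear fit of the pressure jump against inverse bubble radius, dP = 2*sigma/R for a 3D bubble. In the Python example below the "measured" pressure jumps are synthetic stand-ins for lattice-Boltzmann output.

      # Fit the interfacial tension sigma from pressure jumps across bubbles of
      # different radii; the data are synthetic placeholders.
      import numpy as np

      radii = np.array([2.5, 4.0, 6.0, 8.0, 12.0])     # bubble radii (lattice units)
      sigma_true = 0.05
      dp_measured = 2.0 * sigma_true / radii + np.random.normal(0, 5e-4, radii.size)

      slope, intercept = np.polyfit(1.0 / radii, dp_measured, 1)
      print("fitted sigma:", slope / 2.0, "intercept (ideally ~0):", intercept)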

  7. Design and Development of a Model to Simulate 0-G Treadmill Running Using the European Space Agency's Subject Loading System

    NASA Technical Reports Server (NTRS)

    Caldwell, E. C.; Cowley, M. S.; Scott-Pandorf, M. M.

    2010-01-01

    Develop a model that simulates a human running in 0 G using the European Space Agency's (ESA) Subject Loading System (SLS). The model provides ground reaction forces (GRF) based on speed and pull-down forces (PDF). DESIGN The theoretical basis for the Running Model was based on a simple spring-mass model. The dynamic properties of the spring-mass model express theoretical vertical GRF (GRFv) and shear GRF in the posterior-anterior direction (GRFsh) during running gait. ADAMS View software was used to build the model, which has a pelvis, thigh segment, shank segment, and a spring foot (see Figure 1). The model's movement simulates the joint kinematics of a human running at Earth gravity with the aim of generating GRF data. DEVELOPMENT & VERIFICATION ESA provided parabolic flight data of subjects running while using the SLS, for further characterization of the model's GRF. Peak GRF data were fit to a linear regression line dependent on PDF and speed. Interpolation and extrapolation of the regression equation provided a theoretical data matrix, which is used to drive the model's motion equations. Verification of the model was conducted by running the model at 4 different speeds, with each speed accounting for 3 different PDF. The model's GRF data fell within a 1-standard-deviation boundary derived from the empirical ESA data. CONCLUSION The Running Model aids in conducting various simulations (potential scenarios include a fatigued runner or a powerful runner generating high loads at a fast cadence) to determine limitations for the T2 vibration isolation system (VIS) aboard the International Space Station. This model can predict how running with the ESA SLS affects the T2 VIS and may be used for other exercise analyses in the future.
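
    The regression step described above can be sketched as an ordinary least-squares fit of peak GRF against speed and pull-down force; the data points below are synthetic placeholders, not the ESA parabolic-flight measurements.

      # Fit peak GRF as a linear function of treadmill speed and pull-down force
      # (PDF), then interpolate for a new condition.
      import numpy as np

      speed = np.array([2.2, 2.2, 3.1, 3.1, 3.6, 3.6])      # m/s
      pdf = np.array([450., 600., 450., 600., 450., 600.])  # pull-down force (N)
      peak_grf = np.array([980., 1150., 1080., 1260., 1150., 1340.])  # N, synthetic

      design = np.column_stack([np.ones_like(speed), speed, pdf])
      coeffs, *_ = np.linalg.lstsq(design, peak_grf, rcond=None)
      b0, b_speed, b_pdf = coeffs

      print("predicted peak GRF:", b0 + b_speed * 2.8 + b_pdf * 525.0, "N")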

  8. Anisotropy of thermal infrared remote sensing over urban areas : assessment from airborne data and modeling approach

    NASA Astrophysics Data System (ADS)

    Hénon, A.; Mestayer, P.; Lagouarde, J.-P.; Lee, J. H.

    2009-09-01

    Due to the morphological complexity of the urban canopy and to the variability in thermal properties of the building materials, the heterogeneity of the surface temperatures generates a strong directional anisotropy of the thermal infrared remote sensing signal. Thermal infrared (TIR) data obtained with an airborne FLIR camera over Toulouse (France) city centre during the CAPITOUL experiment (Feb. 2004 - Feb. 2005) show brightness temperature anisotropies ranging from 3 °C at night to more than 10 °C on sunny days. These data have been analyzed in view of developing a simple approach to correct TIR satellite remote sensing for the canopy-generated anisotropy, and to further evaluate the sensible heat fluxes. The methodology is based on the identification of 6 different classes of surfaces: roofs, walls and grounds, sunlit or shaded, respectively. The thermo-radiative model SOLENE is used to simulate, with a 1 m resolution computational grid, the surface temperatures of an 18000 m² urban district, under the same meteorological conditions as during the observations. A pixel-by-pixel comparison with both hand-held temperature measurements and airborne camera images allows assessment of the actual values of the radiative and thermal parameters of the scene elements. SOLENE is then used to simulate a generic street-canyon geometry, whose sizes average the morphological parameters of the actual streets in the district, for 18 different geographical orientations. The simulated temperatures are then integrated for different viewing positions, taking into account shadowing and masking, and directional temperatures are determined for the 6 surface classes. The class ratios in each viewing direction are derived from images of the district generated by using the POVRAY software, and used to weight the temperatures of each class and to compute the resulting directional brightness temperature at the district scale for a given sun direction (time in the day). Simulated and measured anisotropies are finally compared for several flights over Toulouse in summer and winter. An inverse method is further proposed to obtain the surface temperatures from the directional brightness temperatures, which may be extended to deduce the sensible heat fluxes separately from the buildings and from the ground.
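
    The final aggregation step can be sketched as a weighted mix of the six class temperatures for a given viewing direction; the Python example below assumes a broadband graybody (Stefan-Boltzmann) mix for simplicity, and the temperatures and fractions are illustrative.

      # Directional brightness temperature as the view-fraction-weighted mix of
      # the six surface classes (sunlit/shaded roofs, walls, grounds).
      import numpy as np

      t_class = np.array([318., 302., 312., 298., 322., 300.])    # K, six classes
      fractions = np.array([0.30, 0.10, 0.15, 0.15, 0.20, 0.10])  # per direction
      fractions = fractions / fractions.sum()

      t_directional = (fractions @ t_class**4) ** 0.25
      print("directional brightness temperature:",
            round(float(t_directional), 2), "K")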

  9. Venus - Computer Simulated Global View Centered at 0 Degrees East Longitude

    NASA Image and Video Library

    1996-03-14

    This global view of the surface of Venus is centered at 0 degrees east longitude. NASA Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping were mapped onto a computer-simulated globe to create this image. http://photojournal.jpl.nasa.gov/catalog/PIA00257

  10. The Flipper Debate: Teaching Intercultural Communication through Simulated Conflict

    ERIC Educational Resources Information Center

    Peeples, Jennifer; Hall, Bradford J.; Seiter, John S.

    2012-01-01

    Although Western cultures tend to view dolphins as friendly and benevolent, in Japanese fishing communities, "iruka" (dolphins) are often viewed as food or pests. These perspectives have led to intense conflicts between Japanese fishermen and activists from the west. This article presents an exercise that simulates intercultural conflict by asking…

  11. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification which became a necessity for OPC a decade ago, we see the same trend in MDP today. Simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  12. Retrieval of chlorophyll from remote-sensing reflectance in the china seas.

    PubMed

    He, M X; Liu, Z S; Du, K P; Li, L P; Chen, R; Carder, K L; Lee, Z P

    2000-05-20

    The East China Sea is a typical case 2 water environment, where concentrations of phytoplankton pigments, suspended matter, and chromophoric dissolved organic matter (CDOM) are all higher than those in the open oceans, because of the discharge from the Yangtze River and the Yellow River. By using a hyperspectral semianalytical model, we simulated a set of remote-sensing reflectance for a variety of chlorophyll, suspended matter, and CDOM concentrations. From this simulated data set, a new algorithm for the retrieval of chlorophyll concentration from remote-sensing reflectance is proposed. For this method, we took into account the 682-nm spectral channel in addition to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) channels. When this algorithm was applied to a field data set, the chlorophyll concentrations retrieved through the new algorithm were consistent with field measurements to within a small error of 18%, in contrast with that of 147% between the SeaWiFS ocean chlorophyll 2 algorithm and the in situ observation.
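
    Empirical ocean-colour chlorophyll algorithms of the kind discussed above typically take the form of a polynomial in the log of a band ratio; the Python sketch below shows that general form with hypothetical placeholder coefficients, not the paper's fitted values or its 682 nm extension.

      # Generic band-ratio chlorophyll retrieval: polynomial in the log of a
      # blue-to-green remote-sensing reflectance ratio. Coefficients are
      # placeholders for illustration only.
      import math

      def chl_band_ratio(rrs_490, rrs_555, coeffs=(0.3, -2.4, 1.0, -0.5)):
          r = math.log10(rrs_490 / rrs_555)
          log_chl = sum(a * r**k for k, a in enumerate(coeffs))
          return 10.0 ** log_chl          # mg m^-3

      print(chl_band_ratio(rrs_490=0.004, rrs_555=0.005))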

  13. Predicting top-of-atmosphere radiance for arbitrary viewing geometries from the visible to thermal infrared: generalization to arbitrary average scene temperatures

    NASA Astrophysics Data System (ADS)

    Florio, Christopher J.; Cota, Steve A.; Gaffney, Stephanie K.

    2010-08-01

    In a companion paper presented at this conference we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) may be used in conjunction with a limited number of runs of AFRL's MODTRAN4 radiative transfer code, to quickly predict the top-of-atmosphere (TOA) radiance received in the visible through midwave IR (MWIR) by an earth viewing sensor, for any arbitrary combination of solar and sensor elevation angles. The method is particularly useful for large-scale scene simulations where each pixel could have a unique value of reflectance/emissivity and temperature, making the run-time required for direct prediction via MODTRAN4 prohibitive. In order to be self-consistent, the method described requires an atmospheric model (defined, at a minimum, as a set of vertical temperature, pressure and water vapor profiles) that is consistent with the average scene temperature. MODTRAN4 provides only six model atmospheres, ranging from sub-arctic winter to tropical conditions - too few to cover with sufficient temperature resolution the full range of average scene temperatures that might be of interest. Model atmospheres consistent with intermediate temperature values can be difficult to come by, and in any event, their use would be too cumbersome for use in trade studies involving a large number of average scene temperatures. In this paper we describe and assess a method for predicting TOA radiance for any arbitrary average scene temperature, starting from only a limited number of model atmospheres.

  14. Data Mining Technologies Inspired from Visual Principle

    NASA Astrophysics Data System (ADS)

    Xu, Zongben

    In this talk we review the recent work done by our group on data mining (DM) technologies deduced from simulating the visual principle. By viewing a DM problem as a cognition problem and treating a data set as an image with each light point located at a datum position, we developed a series of highly efficient algorithms for clustering, classification and regression via mimicking visual principles. In pattern recognition, human eyes seem to possess a singular aptitude to group objects and find important structure in an efficient way. Thus, a DM algorithm simulating the visual system may solve some basic problems in DM research. From this point of view, we proposed a new approach for data clustering by modeling the blurring effect of lateral retinal interconnections based on scale space theory. In this approach, as the data image blurs, smaller light blobs merge into larger ones until the whole image becomes one light blob at a low enough level of resolution. By identifying each blob with a cluster, the blurring process then generates a family of clusterings along the hierarchy. The proposed approach provides unique solutions to many long-standing problems, such as the cluster validity and the sensitivity to initialization problems, in clustering. We extended this approach to classification and regression problems by employing Weber's law from physiology and facts about cell response classification. The resultant classification and regression algorithms are proven to be very efficient and solve the problems of model selection and applicability to huge data sets in DM technologies. We finally applied a similar idea to the difficult parameter setting problem in support vector machines (SVM). Viewing the parameter setting problem as a recognition problem of choosing a visual scale at which the global and local structures of a data set can be preserved, and the difference between the two structures be maximized in the feature space, we derived a direct parameter setting formula for the Gaussian SVM. The simulations and applications show that the suggested formula significantly outperforms the known model selection methods in terms of efficiency and precision.
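
    The blurring idea can be sketched in a few lines: render the data set as an image, blur it with Gaussians of increasing scale, and count the surviving local maxima as clusters at each scale. The Python example below is a toy illustration with arbitrary parameters, not the authors' algorithm.

      # Scale-space "visual" clustering sketch: data rendered as light points,
      # blurred at increasing scales; surviving local maxima are the clusters.
      import numpy as np
      from scipy.ndimage import gaussian_filter, maximum_filter

      rng = np.random.default_rng(0)
      points = np.vstack([rng.normal((20, 20), 2, (50, 2)),
                          rng.normal((45, 40), 2, (50, 2))])

      image = np.zeros((64, 64))
      for x, y in np.clip(points, 0, 63).astype(int):
          image[y, x] += 1.0              # the data set as an image of light points

      for sigma in (1.0, 2.0, 4.0, 8.0):
          blurred = gaussian_filter(image, sigma)
          peaks = ((blurred == maximum_filter(blurred, size=5))
                   & (blurred > blurred.max() * 0.2))
          print(f"sigma = {sigma}: {int(peaks.sum())} cluster(s)")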

  15. Simulating Stable Isotope Ratios in Plumes of Groundwater Pollutants with BIOSCREEN-AT-ISO.

    PubMed

    Höhener, Patrick; Li, Zhi M; Julien, Maxime; Nun, Pierrick; Robins, Richard J; Remaud, Gérald S

    2017-03-01

    BIOSCREEN is a well-known simple tool for evaluating the transport of dissolved contaminants in groundwater, ideal for rapid screening and teaching. This work extends the BIOSCREEN model for the calculation of stable isotope ratios in contaminants. A three-dimensional exact solution of the reactive transport from a patch source, accounting for fractionation by first-order decay and/or sorption, is used. The results match those from a previously published isotope model but are much simpler to obtain. Two different isotopes may be computed, and dual isotope plots can be viewed. The dual isotope assessment is a rapidly emerging new approach for identifying process mechanisms in aquifers. Furthermore, deviations of isotope ratios at specific reactive positions with respect to "bulk" ratios in the whole compound can be simulated. This model is named BIOSCREEN-AT-ISO and will be downloadable from the journal homepage. © 2016, National Ground Water Association.
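
    BIOSCREEN-AT-ISO itself rests on an exact three-dimensional analytical solution; as a much simpler stand-in, the sketch below applies the linearized Rayleigh equation to relate the contaminant fraction remaining under first-order decay to the carbon isotope shift, using an assumed enrichment factor. All numerical values are illustrative, not taken from the model.

      # Hedged sketch: Rayleigh-type isotope shift for a contaminant degraded by
      # first-order decay (delta values in permil; enrichment factor assumed).
      import math

      delta0_c13 = -27.0     # source signature, permil vs. VPDB (illustrative)
      epsilon_c = -2.5       # enrichment factor, permil (illustrative)
      k = 0.5                # first-order decay rate, 1/yr (illustrative)

      def remaining_fraction(t_years):
          return math.exp(-k * t_years)

      def delta_c13(t_years):
          """Linearized Rayleigh equation: delta ~ delta0 + epsilon * ln(f), f = C/C0."""
          f = remaining_fraction(t_years)
          return delta0_c13 + epsilon_c * math.log(f)

      for t in (0.0, 1.0, 2.0, 5.0):
          print(f"t = {t:3.1f} yr: f = {remaining_fraction(t):.3f}, "
                f"delta13C = {delta_c13(t):6.2f} permil")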

  16. Using OPC technology to support the study of advanced process control.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2015-03-01

    OPC, originally Object Linking and Embedding (OLE) for Process Control, provides broad communication interoperability between different kinds of control systems. This paper investigates the use of OPC technology for the study of distributed control systems (DCS) as a cost-effective and flexible research tool for the development and testing of advanced process control (APC) techniques in university research centers. A co-simulation environment based on MATLAB, LabVIEW and a TCP/IP network is presented. Several implementation issues and an OPC-based client/server control application are addressed for the TCP/IP network. A nonlinear boiler model is simulated as the OPC server, and an OPC client is used for closed-loop model identification and to design a model predictive controller (MPC). The MPC is able to control the NOx emissions in addition to the drum water level and steam pressure. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
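
    The paper's setup bridges classic OPC through MATLAB and LabVIEW; purely as a hedged illustration of the client/server pattern, the sketch below uses the python-opcua package (an OPC UA client, not the classic OPC used in the paper) to read a simulated drum level and write a controller output. The endpoint URL, node identifiers, and the toy control law are hypothetical.

      # Hedged sketch: OPC client reading a process variable and writing a
      # controller output. Endpoint and node ids are hypothetical placeholders.
      from opcua import Client

      client = Client("opc.tcp://localhost:4840/boiler/")   # hypothetical server
      client.connect()
      try:
          drum_level = client.get_node("ns=2;s=Boiler.DrumLevel")       # hypothetical node
          feed_valve = client.get_node("ns=2;s=Boiler.FeedwaterValve")  # hypothetical node

          level = drum_level.get_value()
          # Placeholder proportional law; an MPC would compute this from the identified model.
          setpoint = 2.5
          valve_cmd = max(0.0, min(100.0, 50.0 + 20.0 * (setpoint - level)))
          feed_valve.set_value(valve_cmd)
          print(f"level={level:.2f} m -> valve command {valve_cmd:.1f} %")
      finally:
          client.disconnect()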

  17. The Clinical Health Economics System Simulation (CHESS): a teaching tool for systems- and practice-based learning.

    PubMed

    Voss, John D; Nadkarni, Mohan M; Schectman, Joel M

    2005-02-01

    Academic medical centers face barriers to training physicians in the systems- and practice-based learning competencies needed to function in the changing health care environment. To address these problems, the authors at the University of Virginia School of Medicine developed the Clinical Health Economics System Simulation (CHESS), a computerized, team-based, quasi-competitive simulator to teach the principles and practical application of health economics. CHESS simulates treatment costs to patients and society as well as physician reimbursement. It is scenario-based, with residents grouped into three teams, each team playing CHESS under a different (fee-for-service or capitated) reimbursement model. Teams view scenarios and select from two or three treatment options that are medically justifiable yet have different potential cost implications. CHESS displays physician reimbursement and patient and societal costs for each scenario, as well as costs and income summarized across all scenarios and extrapolated to a physician's entire patient panel. The learners are asked to explain these findings and may change treatment options and other variables, such as panel size and case mix, to conduct sensitivity analyses in real time. In evaluations completed in 2003, 68 (94%) of the resident and faculty CHESS participants at 19 U.S. residency programs preferred CHESS to a traditional lecture-and-discussion format for learning about medical decision making, physician reimbursement, patient costs, and societal costs. Ninety-eight percent reported increased knowledge of health economics after viewing the simulation. CHESS demonstrates the potential of computer simulation to teach health economics and other key elements of practice- and systems-based competencies.

  18. A visualization tool to support decision making in environmental and biological planning

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Conzelmann, Craig; Suir, Kevin J.

    2014-01-01

    Large-scale ecosystem management involves consideration of many factors for informed decision making. The EverVIEW Data Viewer is a cross-platform desktop decision support tool to help decision makers compare simulation model outputs from competing plans for restoring Florida's Greater Everglades. The integration of NetCDF metadata conventions into EverVIEW allows end-users from multiple institutions within and beyond the Everglades restoration community to share information and tools. Our development process incorporates continuous interaction with targeted end-users for increased likelihood of adoption. One of EverVIEW's signature features is side-by-side map panels, which can be used to simultaneously compare species or habitat impacts from alternative restoration plans. Other features include examination of potential restoration plan impacts across multiple geographic or tabular displays, and animation through time. As a result of an iterative, standards-driven approach, EverVIEW is relevant to large-scale planning beyond Florida, and is used in multiple biological planning efforts in the United States.
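
    EverVIEW itself is a desktop tool; as a rough illustration of the kind of side-by-side comparison it supports, the sketch below uses the netCDF4 Python package to read the same gridded output variable from two hypothetical restoration-plan files and report where they differ most. The file names and the variable name are made up.

      # Hedged sketch: compare one gridded variable from two NetCDF model outputs.
      # File names and the variable name are hypothetical.
      import numpy as np
      from netCDF4 import Dataset

      with Dataset("plan_A.nc") as a, Dataset("plan_B.nc") as b:
          habitat_a = np.ma.filled(a.variables["habitat_suitability"][0], np.nan)  # first time step
          habitat_b = np.ma.filled(b.variables["habitat_suitability"][0], np.nan)

      diff = habitat_b - habitat_a
      worst = np.unravel_index(np.nanargmax(np.abs(diff)), diff.shape)
      print(f"mean change: {np.nanmean(diff):+.3f}")
      print(f"largest change {diff[worst]:+.3f} at grid cell {worst}")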

  19. An Arctic source for the Great Salinity Anomaly - A simulation of the Arctic ice-ocean system for 1955-1975

    NASA Technical Reports Server (NTRS)

    Hakkinen, Sirpa

    1993-01-01

    The paper employs a fully prognostic Arctic ice-ocean model to study the interannual variability of sea ice during the period 1955-1975 and to explain the large variability of the ice extent in the Greenland and Iceland seas during the late 1960s. The model is used to test the contention of Aagaard and Carmack (1989) that the Great Salinity Anomaly (GSA) was a consequence of the anomalously large ice export in 1968. The high-latitude ice-ocean circulation changes due to wind field changes are explored. The ice export event of 1968 was the largest in the simulation, being about twice as large as the average and corresponding to 1600 cu km of excess fresh water. The simulations suggest that, besides the above average ice export to the Greenland Sea, there was also fresh water export to support the larger than average ice cover. The model results show the origin of the GSA to be in the Arctic, and support the view that the Arctic may play an active role in climate change.

  20. Scalability of surrogate-assisted multi-objective optimization of antenna structures exploiting variable-fidelity electromagnetic simulation models

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2016-10-01

    Multi-objective optimization of antenna structures is a challenging task owing to the high computational cost of evaluating the design objectives as well as the large number of adjustable parameters. Design speed-up can be achieved by means of surrogate-based optimization techniques. In particular, a combination of variable-fidelity electromagnetic (EM) simulations, design space reduction techniques, response surface approximation models and design refinement methods permits identification of the Pareto-optimal set of designs within a reasonable timeframe. Here, a study concerning the scalability of surrogate-assisted multi-objective antenna design is carried out based on a set of benchmark problems, with the dimensionality of the design space ranging from 6 to 24 and the CPU cost of the EM antenna model ranging from 10 to 20 min per simulation. Numerical results indicate that the computational overhead of the design process increases roughly quadratically with the number of adjustable geometric parameters of the antenna structure at hand, which is a promising result from the point of view of handling even more complex problems.
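
    To make the scaling claim concrete, the sketch below fits a power law, cost ~ c * n^p, to hypothetical (dimensionality, CPU-time) pairs from such a benchmark set; an exponent p near 2 corresponds to the roughly quadratic growth reported. The benchmark numbers are invented, not the paper's data.

      # Hedged sketch: estimate the scaling exponent of optimization cost versus the
      # number of adjustable antenna parameters. Benchmark numbers are invented.
      import numpy as np

      n_params = np.array([6, 10, 14, 18, 24])                 # design-space dimensionality
      cpu_hours = np.array([11.0, 32.0, 60.0, 101.0, 180.0])   # hypothetical total cost

      # Fit log(cost) = p*log(n) + log(c)  ->  cost ~ c * n^p
      p, log_c = np.polyfit(np.log(n_params), np.log(cpu_hours), 1)
      print(f"estimated scaling exponent p = {p:.2f} (p close to 2 means quadratic growth)")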

  1. Breakdown simulations in a focused microwave beam within the simplified model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Semenov, V. E.; Rakova, E. I.; Glyavin, M. Yu.

    2016-07-15

    A simplified model is proposed to simulate numerically air breakdown in a focused microwave beam. The model is 1D from the mathematical point of view, but it takes into account the spatial non-uniformity of the microwave field amplitude along the beam axis. The simulations are completed for different frequencies and different focal lengths of microwave beams. The results demonstrate complicated regimes of the breakdown evolution, which represents a series of repeated ionization waves. These waves start at the focal point and propagate towards the incident microwave radiation. The ionization wave parameters vary during propagation. At relatively low frequencies, the propagation regime of subsequent waves can also change qualitatively. Each subsequent ionization wave is less pronounced than the previous one, and the breakdown evolution approaches a steady state with relatively small plasma density. The ionization wave parameters are sensitive to a weak source of external ionization, but the steady state is independent of such a source. As the beam focal length decreases, the stationary plasma density increases and the onset of the steady state occurs faster.

  2. Imaging skills for transthoracic echocardiography in cardiology fellows: The value of motion metrics

    PubMed Central

    Montealegre-Gallegos, Mario; Mahmood, Feroze; Kim, Han; Bergman, Remco; Mitchell, John D.; Bose, Ruma; Hawthorne, Katie M.; O’Halloran, T. David; Wong, Vanessa; Hess, Philip E.; Matyal, Robina

    2016-01-01

    Background: Proficiency in transthoracic echocardiography (TTE) requires an integration of cognitive knowledge and psychomotor skills. Whereas cognitive knowledge can be quantified, psychomotor skills are implied after repetitive task performance. We applied motion analyses to evaluate psychomotor skill acquisition during simulator-based TTE training. Methods and Results: During the first month of their fellowship training, 16 cardiology fellows underwent a multimodal TTE training program for 4 weeks (8 sessions). The program consisted of online and live didactics as well as simulator training. Kinematic metrics (path length, time, probe accelerations) were obtained at the start and end of the course for 8 standard TTE views using a simulator. At the end of the course TTE image acquisition skills were tested on human models. After completion of the training program the trainees reported improved self-perceived comfort with TTE imaging. There was also an increase of 8.7% in post-test knowledge scores. There was a reduction in the number of probe accelerations [median decrease 49.5, 95% CI = 29-73, adjusted P < 0.01], total time [median decrease 10.6 s, 95% CI = 6.6-15.5, adjusted P < 0.01] and path length [median decrease 8.8 cm, 95% CI = 2.2-17.7, adjusted P < 0.01] from the start to the end of the course. During evaluation on human models, the trainees were able to obtain all the required TTE views without instructor assistance. Conclusion: Simulator-derived motion analyses can be used to objectively quantify acquisition of psychomotor skills during TTE training. Such an approach could be used to assess readiness for clinical practice of TTE. PMID:27052064
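
    The kinematic metrics named in the abstract (path length, time, number of probe accelerations) can be computed directly from a sampled probe-position trace; the sketch below shows one plausible way of doing so, with a synthetic trace and an arbitrary acceleration threshold, and is not the simulator's actual algorithm.

      # Hedged sketch: kinematic metrics from a sampled 3-D probe position trace.
      # The acceleration threshold and the synthetic trace are illustrative only.
      import numpy as np

      dt = 0.01                                   # sampling interval, s
      t = np.arange(0, 10, dt)
      rng = np.random.default_rng(1)
      pos = np.cumsum(rng.normal(0, 0.001, size=(t.size, 3)), axis=0)  # fake trace, metres

      steps = np.diff(pos, axis=0)
      path_length_cm = np.sum(np.linalg.norm(steps, axis=1)) * 100.0
      total_time_s = t[-1] - t[0]

      vel = steps / dt
      acc = np.diff(vel, axis=0) / dt
      acc_mag = np.linalg.norm(acc, axis=1)
      threshold = np.percentile(acc_mag, 95)      # arbitrary cutoff for "an acceleration"
      n_accelerations = int(np.sum((acc_mag[1:] > threshold) & (acc_mag[:-1] <= threshold)))

      print(f"path length: {path_length_cm:.1f} cm, time: {total_time_s:.1f} s, "
            f"accelerations: {n_accelerations}")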

  3. A Global, Multi-Waveband Model for the Zodiacal Cloud

    NASA Technical Reports Server (NTRS)

    Grogan, Keith; Dermott, Stanley F.; Kehoe, Thomas J. J.

    2003-01-01

    This recently completed three-year project was undertaken by the PI at the University of Florida, NASA Goddard and JPL, and by the Co-I and Collaborator at the University of Florida. The funding was used to support a continuation of research conducted at the University of Florida over the last decade which focuses on the dynamics of dust particles in the interplanetary environment. The main objectives of this proposal were: To produce improved dynamical models of the zodiacal cloud by performing numerical simulations of the orbital evolution of asteroidal and cometary dust particles. To provide visualizations of the results using our visualization software package, SIMUL, simulating the viewing geometries of IRAS and COBE and comparing the model results with archived data. To use the results to provide a more accurate model of the brightness distribution of the zodiacal cloud than existing empirical models. In addition, our dynamical approach can provide insight into fundamental properties of the cloud, including but not limited to the total mass and surface area of dust, the size-frequency distribution of dust, and the relative contributions of asteroidal and cometary material. The model can also be used to provide constraints on trace signals from other sources, such as dust associated with the "Plutinos" , objects captured in the 2:3 resonance with Neptune.

  4. Modeling the biomechanical influence of epilaryngeal stricture on the vocal folds: a low-dimensional model of vocal-ventricular fold coupling.

    PubMed

    Moisik, Scott R; Esling, John H

    2014-04-01

    Purpose: Physiological and phonetic studies suggest that, at moderate levels of epilaryngeal stricture, the ventricular folds impinge upon the vocal folds and influence their dynamical behavior, which is thought to be responsible for constricted laryngeal sounds. In this work, the authors examine this hypothesis through biomechanical modeling. Method: The dynamical response of a low-dimensional, lumped-element model of the vocal folds under the influence of vocal-ventricular fold coupling was evaluated. The model was assessed for F0 and cover-mass phase difference. Case studies of simulations of different constricted phonation types and of glottal stop illustrate various additional aspects of model performance. Results: Simulated vocal-ventricular fold coupling lowers F0 and perturbs the mucosal wave. It also appears to reinforce irregular patterns of oscillation, and it can enhance laryngeal closure in glottal stop production. Conclusion: The effects of simulated vocal-ventricular fold coupling are consistent with sounds, such as creaky voice, harsh voice, and glottal stop, that have been observed to involve epilaryngeal stricture and apparent contact between the vocal folds and ventricular folds. This supports the view that vocal-ventricular fold coupling is important in the vibratory dynamics of such sounds and, furthermore, suggests that these sounds may intrinsically require epilaryngeal stricture.

  5. Evaluation of hydrogen bond networks in cellulose Iβ and II crystals using density functional theory and Car-Parrinello molecular dynamics.

    PubMed

    Hayakawa, Daichi; Nishiyama, Yoshiharu; Mazeau, Karim; Ueda, Kazuyoshi

    2017-09-08

    Crystal models of cellulose Iβ and II, which contain various hydrogen bonding (HB) networks, were analyzed using density functional theory and Car-Parrinello molecular dynamics (CPMD) simulations. From the CPMD trajectories, the power spectra of the velocity correlation functions of hydroxyl groups involved in hydrogen bonds were calculated. For the Iβ allomorph, HB network A, which is dominant according to the neutron diffraction data, was stable, and the power spectrum represented the essential features of the experimental IR spectra. In contrast, network B, which is a minor structure, was unstable because its hydroxymethyl groups reoriented during the CPMD simulation, yielding a crystal structure different from that determined by experiments. For the II allomorph, HB network A is proposed based on diffraction data, whereas molecular modeling identifies an alternative network B. Our simulations showed that the interaction energies of the cellulose II(B) model are slightly more favorable than those of model II(A). However, an accurate determination from the energetic point of view must await evaluation of the free energy. With regard to the IR calculation, the cellulose II(B) model reproduces the spectra better than model II(A). Copyright © 2017 Elsevier Ltd. All rights reserved.
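
    The power-spectrum step described (velocity autocorrelation of the hydroxyl hydrogens, Fourier transformed for comparison with IR bands) can be sketched as below; the "trajectory" here is a synthetic damped oscillation and the normalization is schematic, not the CPMD post-processing actually used.

      # Hedged sketch: power spectrum of a velocity autocorrelation function (VACF).
      # The signal is a synthetic damped oscillation standing in for CPMD output.
      import numpy as np

      dt_fs = 0.5                                   # time step between stored frames, fs
      c_cm_per_fs = 2.9979e-5                       # speed of light in cm/fs
      t = np.arange(2000) * dt_fs

      # Synthetic hydroxyl-stretch-like velocity signal near 3300 cm^-1, plus noise
      nu_cm1 = 3300.0
      v = (np.cos(2 * np.pi * nu_cm1 * c_cm_per_fs * t) * np.exp(-t / 500.0)
           + 0.05 * np.random.default_rng(2).normal(size=t.size))

      # VACF, then windowed Fourier transform -> vibrational power spectrum
      vacf = np.correlate(v, v, mode="full")[v.size - 1:]
      vacf /= vacf[0]
      spectrum = np.abs(np.fft.rfft(vacf * np.hanning(vacf.size)))
      wavenumber_cm1 = np.fft.rfftfreq(vacf.size, d=dt_fs) / c_cm_per_fs

      peak = wavenumber_cm1[1:][np.argmax(spectrum[1:])]
      print(f"dominant band near {peak:.0f} cm^-1 (input was {nu_cm1:.0f} cm^-1)")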

  6. Microstructure simulation of rapidly solidified ASP30 high-speed steel particles by gas atomization

    NASA Astrophysics Data System (ADS)

    Ma, Jie; Wang, Bo; Yang, Zhi-liang; Wu, Guang-xin; Zhang, Jie-yu; Zhao, Shun-li

    2016-03-01

    In this study, the microstructure evolution of rapidly solidified ASP30 high-speed steel particles was predicted using a simulation method based on the cellular automaton-finite element (CAFE) model. The dendritic growth kinetics, in view of the characteristics of ASP30 steel, were calculated and combined with macro heat transfer calculations by user-defined functions (UDFs) to simulate the microstructure of gas-atomized particles. The relationship among particle diameter, undercooling, and the convection heat transfer coefficient was also investigated to provide cooling conditions for simulations. The simulated results indicated that a columnar grain microstructure was observed in small particles, whereas an equiaxed microstructure was observed in large particles. In addition, the morphologies and microstructures of gas-atomized ASP30 steel particles were also investigated experimentally using scanning electron microscopy (SEM). The experimental results showed that four major types of microstructures were formed: dendritic, equiaxed, mixed, and multi-droplet microstructures. The simulated results and the available experimental data are in good agreement.

  7. Intraseasonal and interannual oscillations in coupled ocean-atmosphere models

    NASA Technical Reports Server (NTRS)

    Hirst, Anthony C.; Lau, K.-M.

    1990-01-01

    An investigation is presented of coupled ocean-atmosphere models' behavior in an environment where atmospheric wave speeds are substantially reduced from dry atmospheric values by such processes as condensation-moisture convergence. Modes are calculated for zonally periodic, unbounded ocean-atmosphere systems, emphasizing the importance of an inclusion of prognostic atmosphere equations in simple coupled ocean-atmosphere models with a view to simulations of intraseasonal variability and its possible interaction with interannual variability. The dynamics of low and high frequency modes are compared; both classes are sensitive to the degree to which surface wind anomalies are able to affect the evaporation rate.

  8. Visual air quality simulation techniques

    NASA Astrophysics Data System (ADS)

    Molenar, John V.; Malm, William C.; Johnson, Christopher E.

    Visual air quality is primarily a human perceptual phenomenon beginning with the transfer of image-forming information through an illuminated, scattering and absorbing atmosphere. Visibility, especially the visual appearance of industrial emissions or the degradation of a scenic view, is the principal atmospheric characteristic through which humans perceive air pollution, and is more sensitive to changing pollution levels than any other air pollution effect. Every attempt to quantify economic costs and benefits of air pollution has indicated that good visibility is a highly valued and desired environmental condition. Measurement programs can at best approximate the state of the ambient atmosphere at a few points in a scenic vista viewed by an observer. To fully understand the visual effect of various changes in the concentration and distribution of optically important atmospheric pollutants requires the use of aerosol and radiative transfer models. Communication of the output of these models to scientists, decision makers and the public is best done by applying modern image-processing systems to generate synthetic images representing the modeled air quality conditions. This combination of modeling techniques has been under development for the past 15 yr. Initially, a lack of computational power limited visual air quality simulations to simplified models depicting Gaussian plumes or uniform haze conditions. Recent explosive growth in low-cost, high-powered computer technology has allowed the development of sophisticated aerosol and radiative transfer models that incorporate realistic terrain, multiple scattering, non-uniform illumination, varying spatial distribution, concentration and optical properties of atmospheric constituents, and relative humidity effects on aerosol scattering properties. This paper discusses these improved models and image-processing techniques in detail. Results addressing uniform and non-uniform layered haze conditions in both urban and remote pristine areas will be presented.

  9. View-invariant object category learning, recognition, and search: how spatial and object attention are coordinated using surface-based attentional shrouds.

    PubMed

    Fazl, Arash; Grossberg, Stephen; Mingolla, Ennio

    2009-02-01

    How does the brain learn to recognize an object from multiple viewpoints while scanning a scene with eye movements? How does the brain avoid the problem of erroneously classifying parts of different objects together? How are attention and eye movements intelligently coordinated to facilitate object learning? A neural model provides a unified mechanistic explanation of how spatial and object attention work together to search a scene and learn what is in it. The ARTSCAN model predicts how an object's surface representation generates a form-fitting distribution of spatial attention, or "attentional shroud". All surface representations dynamically compete for spatial attention to form a shroud. The winning shroud persists during active scanning of the object. The shroud maintains sustained activity of an emerging view-invariant category representation while multiple view-specific category representations are learned and are linked through associative learning to the view-invariant object category. The shroud also helps to restrict scanning eye movements to salient features on the attended object. Object attention plays a role in controlling and stabilizing the learning of view-specific object categories. Spatial attention hereby coordinates the deployment of object attention during object category learning. Shroud collapse releases a reset signal that inhibits the active view-invariant category in the What cortical processing stream. Then a new shroud, corresponding to a different object, forms in the Where cortical processing stream, and search using attention shifts and eye movements continues to learn new objects throughout a scene. The model mechanistically clarifies basic properties of attention shifts (engage, move, disengage) and inhibition of return. It simulates human reaction time data about object-based spatial attention shifts, and learns with 98.1% accuracy and a compression of 430 on a letter database whose letters vary in size, position, and orientation. The model provides a powerful framework for unifying many data about spatial and object attention, and their interactions during perception, cognition, and action.

  10. A Computational Model of Human Table Tennis for Robot Application

    NASA Astrophysics Data System (ADS)

    Mülling, Katharina; Peters, Jan

    Table tennis is a difficult motor skill which requires all basic components of a general motor skill learning system. In order to get a step closer to such a generic approach to the automatic acquisition and refinement of table tennis, we study table tennis from a human motor control point of view. We make use of the basic models of discrete human movement phases, virtual hitting points, and the operational timing hypothesis. Using these components, we create a computational model which is aimed at reproducing human-like behavior. We verify the functionality of this model in a physically realistic simulation of a Barrett WAM.

  11. Introduction of hypermatrix and operator notation into a discrete mathematics simulation model of malignant tumour response to therapeutic schemes in vivo. Some operator properties.

    PubMed

    Stamatakos, Georgios S; Dionysiou, Dimitra D

    2009-10-21

    The tremendous rate of accumulation of experimental and clinical knowledge pertaining to cancer dictates the development of a theoretical framework for the meaningful integration of such knowledge at all levels of biocomplexity. In this context our research group has developed and partly validated a number of spatiotemporal simulation models of in vivo tumour growth and, in particular, of tumour response to several therapeutic schemes. Most of the modeling modules have been based on discrete mathematics and have therefore been formulated in terms of rather complex algorithms (e.g. in pseudocode and actual computer code). However, such lengthy algorithmic descriptions, although sufficient from the mathematical point of view, may make it difficult for an interested reader to readily identify the sequence of the very basic simulation operations that lie at the heart of the entire model. In order both to alleviate this problem and to provide a bridge to symbolic mathematics, we propose the introduction of the notion of a hypermatrix, in conjunction with that of a discrete operator, into the already developed models. Using a radiotherapy response simulation example, we demonstrate how the entire model can be considered as the sequential application of a number of discrete operators to a hypermatrix corresponding to the dynamics of the anatomic area of interest. Subsequently, we investigate the operators' commutativity and outline the "summarize and jump" strategy, which aims at efficiently and realistically addressing multilevel biological problems such as cancer. In order to clarify the actual effect of the composite discrete operator, we present further simulation results which are in agreement with the outcome of the clinical study RTOG 83-02, thus strengthening the reliability of the model developed.
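
    As a schematic of the formalism only (not the authors' actual operators), the sketch below represents the anatomic region as a hypermatrix of cell populations, applies two toy discrete operators (cell kill and proliferation) in sequence, and checks numerically whether they commute on a test state.

      # Hedged sketch: a "hypermatrix" (here a 3-D array of cell counts per voxel)
      # acted on by discrete operators applied in sequence. Operators are toy stand-ins.
      import numpy as np

      rng = np.random.default_rng(3)
      state = rng.integers(0, 1000, size=(4, 4, 4)).astype(float)   # cells per voxel

      def irradiation_operator(s, survival=0.6):
          """Toy cell-kill operator: uniform surviving fraction per dose fraction."""
          return s * survival

      def proliferation_operator(s, growth=1.1, capacity=2000.0):
          """Toy logistic re-growth operator."""
          return s * (1.0 + (growth - 1.0) * (1.0 - s / capacity))

      # Simulation = sequential application of operators to the hypermatrix
      after_rt_then_growth = proliferation_operator(irradiation_operator(state))
      after_growth_then_rt = irradiation_operator(proliferation_operator(state))

      print("operators commute on this state:",
            np.allclose(after_rt_then_growth, after_growth_then_rt))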

  12. Adaptive control using neural networks and approximate models.

    PubMed

    Narendra, K S; Mukhopadhyay, S

    1997-01-01

    The NARMA model is an exact representation of the input-output behavior of finite-dimensional nonlinear discrete-time dynamical systems in a neighborhood of the equilibrium state. However, it is not convenient for purposes of adaptive control using neural networks due to its nonlinear dependence on the control input. Hence, quite often, approximate methods are used for realizing the neural controllers to overcome computational complexity. In this paper, we introduce two classes of models which are approximations to the NARMA model, and which are linear in the control input. The latter fact substantially simplifies both the theoretical analysis as well as the practical implementation of the controller. Extensive simulation studies have shown that the neural controllers designed using the proposed approximate models perform very well, and in many cases even better than an approximate controller designed using the exact NARMA model. In view of their mathematical tractability as well as their success in simulation studies, a case is made in this paper that such approximate input-output models warrant a detailed study in their own right.
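
    The approximate models referred to (often called NARMA-L1/L2 in the literature) are affine in the current control input, y(k+d) ~ f(.) + g(.)*u(k), so the control that tracks a reference r(k+d) can be read off directly as u(k) = (r(k+d) - f(.)) / g(.). The sketch below shows that inversion with simple stand-in functions in place of the trained neural networks; the functions and limits are illustrative only.

      # Hedged sketch: control computation with a NARMA-L2-style approximate model,
      #   y(k+1) ~ f(x_k) + g(x_k) * u(k),  where x_k collects past outputs/inputs.
      # f_hat and g_hat stand in for trained neural-network approximations.
      import math

      def f_hat(y_hist, u_hist):
          # placeholder for the learned drift term
          return 0.8 * y_hist[-1] - 0.2 * y_hist[-2] + 0.1 * math.tanh(u_hist[-1])

      def g_hat(y_hist, u_hist):
          # placeholder for the learned input gain (kept away from zero)
          return 0.5 + 0.1 * math.cos(y_hist[-1])

      def narma_l2_control(reference, y_hist, u_hist, u_limit=5.0):
          """u(k) = (r(k+1) - f(x_k)) / g(x_k), clipped to actuator limits."""
          u = (reference - f_hat(y_hist, u_hist)) / g_hat(y_hist, u_hist)
          return max(-u_limit, min(u_limit, u))

      y_hist, u_hist = [0.0, 0.1], [0.0]
      print(narma_l2_control(reference=1.0, y_hist=y_hist, u_hist=u_hist))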

  13. A Multilevel AR(1) Model: Allowing for Inter-Individual Differences in Trait-Scores, Inertia, and Innovation Variance.

    PubMed

    Jongerling, Joran; Laurenceau, Jean-Philippe; Hamaker, Ellen L

    2015-01-01

    In this article we consider a multilevel first-order autoregressive [AR(1)] model with random intercepts, random autoregression, and random innovation variance (i.e., the level 1 residual variance). Including random innovation variance is an important extension of the multilevel AR(1) model for two reasons. First, between-person differences in innovation variance are important from a substantive point of view, in that they capture differences in sensitivity and/or exposure to unmeasured internal and external factors that influence the process. Second, using simulation methods we show that modeling the innovation variance as fixed across individuals, when it should be modeled as a random effect, leads to biased parameter estimates. Additionally, we use simulation methods to compare maximum likelihood estimation to Bayesian estimation of the multilevel AR(1) model and investigate the trade-off between the number of individuals and the number of time points. We provide an empirical illustration by applying the extended multilevel AR(1) model to daily positive affect ratings from 89 married women over the course of 42 consecutive days.
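
    A minimal generative sketch of the model follows, with made-up population values: each person receives a random intercept (trait score), a random autoregressive coefficient (inertia), and a random log innovation variance, and that person's series is then simulated as a first-order autoregression around the trait score.

      # Hedged sketch: simulate data from a multilevel AR(1) model with random
      # intercepts, random autoregression, and random (log) innovation variance.
      import numpy as np

      rng = np.random.default_rng(4)
      n_persons, n_days = 89, 42                      # as in the empirical illustration

      mu_trait, sd_trait = 5.0, 1.0                   # population values are made up
      mu_phi, sd_phi = 0.3, 0.1
      mu_logvar, sd_logvar = 0.0, 0.4

      data = np.empty((n_persons, n_days))
      for i in range(n_persons):
          trait = rng.normal(mu_trait, sd_trait)                     # person-specific mean
          phi = np.clip(rng.normal(mu_phi, sd_phi), -0.95, 0.95)     # person-specific inertia
          sd_innov = np.exp(0.5 * rng.normal(mu_logvar, sd_logvar))  # person-specific innovation sd
          y = trait
          for t in range(n_days):
              y = trait + phi * (y - trait) + rng.normal(0.0, sd_innov)
              data[i, t] = y

      print("simulated panel:", data.shape, "grand mean =", round(data.mean(), 2))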

  14. Effect of the cation model on the equilibrium structure of poly-L-glutamate in aqueous sodium chloride solution

    NASA Astrophysics Data System (ADS)

    Marchand, Gabriel; Soetens, Jean-Christophe; Jacquemin, Denis; Bopp, Philippe A.

    2015-12-01

    We demonstrate that different sets of Lennard-Jones parameters proposed for the Na+ ion, in conjunction with the empirical combining rules routinely used in simulation packages, can lead to essentially different equilibrium structures for a deprotonated poly-L-glutamic acid molecule (poly-L-glutamate) dissolved in a 0.3M aqueous NaCl solution. It is, however, difficult to discriminate a priori between these model potentials; when investigating the structure of the Na+-solvation shell in bulk NaCl solution, all parameter sets lead to radial distribution functions and solvation numbers in broad agreement with the available experimental data. We do not find any such dependency of the equilibrium structure on the parameters associated with the Cl- ion. This work does not aim at recommending a particular set of parameters for any particular purpose. Instead, it stresses the model dependence of simulation results for complex systems such as biomolecules in solution and thus the difficulties if simulations are to be used for unbiased predictions, or to discriminate between contradictory experiments. However, this opens the possibility of validating a model specifically in view of analyzing experimental data believed to be reliable.
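
    The "empirical combining rules routinely used in simulation packages" are typically the Lorentz-Berthelot rules, sketched below for two hypothetical Na+ parameter sets combined with a single water-oxygen site; different published (sigma, epsilon) pairs for Na+ propagate through these rules into different Na+-O interactions. The parameter values are illustrative and are not a recommendation.

      # Hedged sketch: Lorentz-Berthelot combining rules for Lennard-Jones cross terms.
      # The Na+ and water-oxygen parameters below are illustrative placeholders.
      import math

      def lorentz_berthelot(sigma_i, eps_i, sigma_j, eps_j):
          """sigma_ij = (sigma_i + sigma_j)/2 ; eps_ij = sqrt(eps_i * eps_j)."""
          return 0.5 * (sigma_i + sigma_j), math.sqrt(eps_i * eps_j)

      def lj_energy(r, sigma, eps):
          sr6 = (sigma / r) ** 6
          return 4.0 * eps * (sr6 ** 2 - sr6)

      # Two hypothetical Na+ parameter sets (nm, kJ/mol) against one water-oxygen site
      water_o = (0.3166, 0.650)
      for label, na in [("set A", (0.2350, 0.545)), ("set B", (0.2583, 0.418))]:
          s_ij, e_ij = lorentz_berthelot(*na, *water_o)
          print(f"{label}: sigma_NaO = {s_ij:.4f} nm, eps_NaO = {e_ij:.3f} kJ/mol, "
                f"U(0.28 nm) = {lj_energy(0.28, s_ij, e_ij):+.2f} kJ/mol")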

  15. Inexpensive anatomical trainer for bronchoscopy.

    PubMed

    Di Domenico, Stefano; Simonassi, Claudio; Chessa, Leonardo

    2007-08-01

    Flexible fiberoptic bronchoscopy is an indispensable tool for the optimal management of intensive care unit patients. However, acquiring sufficient training in bronchoscopy during residency is not straightforward because of technical and ethical problems, and the use of commercial simulators is limited by their high cost. In order to overcome these limitations, we built a low-cost anatomical simulator for acquiring and maintaining the basic skills needed to perform bronchoscopy in ventilated patients. We used 1.5-mm-diameter iron wire to construct the bronchial-tree scaffold, and glazier's putty was applied to create the anatomical model. The model was covered with several layers of newspaper strips previously immersed in water and vinyl glue. When the model had completely dried, it was detached from the scaffold by cutting it into six pieces, which were then reassembled, painted, and fitted with an endotracheal tube. We used very cheap materials and the final cost was EUR 16. The resulting trainer was real-scale and anatomically accurate, with appropriate correspondence between endoscopic views of the model and those of patients. All bronchial segments can be explored and easily identified by endoscopic and external vision. This cheap simulator is a valuable tool for practicing, particularly in hospitals with limited resources for medical training.

  16. Effect of attenuation correction on image quality in emission tomography

    NASA Astrophysics Data System (ADS)

    Denisova, N. V.; Ondar, M. M.

    2017-10-01

    In this paper, mathematical modeling and computer simulations of myocardial perfusion SPECT imaging are performed. The main factors affecting the quality of reconstructed images in SPECT are anatomical structures, the diastolic volume of the myocardium and the attenuation of gamma rays. The purpose of the present work is to study the effect of attenuation correction on image quality in emission tomography. A basic 2D model describing the Tc-99m distribution in a transaxial slice of the thoracic part of a patient's body was designed. This model was used to construct four phantoms simulating various anatomical shapes: 2 male and 2 female patients of normal, obese and subtle physique were included in the study. A data acquisition model which includes the effects of non-uniform attenuation, the collimator-detector response and Poisson statistics was developed. The projection data were calculated for 60 views in accordance with the standard myocardial perfusion SPECT imaging protocol. Images were reconstructed using the OSEM algorithm, which is widely used in modern SPECT systems. Two types of patient examination procedures were simulated: SPECT without attenuation correction and SPECT/CT with attenuation correction. The obtained results indicate a significant effect of attenuation correction on SPECT image quality.
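
    The OSEM algorithm mentioned is an ordered-subsets acceleration of the MLEM update; the sketch below shows the basic MLEM iteration on a toy system matrix, with attenuation correction entering simply as per-ray attenuation factors folded into that matrix. Everything here is a toy stand-in for a real SPECT geometry.

      # Hedged sketch: MLEM iteration (the un-accelerated core of OSEM) on a toy system.
      # A real SPECT model would build A from the projection geometry, the collimator
      # response and, for attenuation correction, attenuation factors from the CT map.
      import numpy as np

      rng = np.random.default_rng(5)
      n_pix, n_bins = 16, 24
      A = rng.random((n_bins, n_pix))                # toy system matrix (attenuation included)
      x_true = rng.random(n_pix) * 10.0
      counts = rng.poisson(A @ x_true)               # noisy projection data

      x = np.ones(n_pix)                             # uniform initial estimate
      sensitivity = A.sum(axis=0)                    # backprojection of ones
      for _ in range(50):
          expected = A @ x
          ratio = counts / np.maximum(expected, 1e-12)
          x *= (A.T @ ratio) / sensitivity           # multiplicative MLEM update

      print("relative error after 50 iterations:",
            round(np.linalg.norm(x - x_true) / np.linalg.norm(x_true), 3))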

  17. Recovering the Physical Properties of Molecular Gas in Galaxies from CO SLED Modeling

    NASA Astrophysics Data System (ADS)

    Kamenetzky, J.; Privon, G. C.; Narayanan, D.

    2018-05-01

    Modeling of the spectral line energy distribution (SLED) of the CO molecule can reveal the physical conditions (temperature and density) of molecular gas in Galactic clouds and other galaxies. Recently, the Herschel Space Observatory and ALMA have offered, for the first time, a comprehensive view of the rotational J = 4‑3 through J = 13‑12 lines, which arise from a complex, diverse range of physical conditions that must be simplified to one, two, or three components when modeled. Here we investigate the recoverability of physical conditions from SLEDs produced by galaxy evolution simulations containing a large dynamical range in physical properties. These simulated SLEDs were generally fit well by one component of gas whose properties largely resemble or slightly underestimate the luminosity-weighted properties of the simulations when clumping due to nonthermal velocity dispersion is taken into account. If only modeling the first three rotational lines, the median values of the marginalized parameter distributions better represent the luminosity-weighted properties of the simulations, but the uncertainties in the fitted parameters are nearly an order of magnitude, compared to approximately 0.2 dex in the “best-case” scenario of a fully sampled SLED through J = 10‑9. This study demonstrates that while common CO SLED modeling techniques cannot reveal the underlying complexities of the molecular gas, they can distinguish bulk luminosity-weighted properties that vary with star formation surface densities and galaxy evolution, if a sufficient number of lines are detected and modeled.

  18. Validation of a digital mammographic unit model for an objective and highly automated clinical image quality assessment.

    PubMed

    Perez-Ponce, Hector; Daul, Christian; Wolf, Didier; Noel, Alain

    2013-08-01

    In mammography, image quality assessment has to be directly related to breast cancer indicator (e.g. microcalcifications) detectability. Recently, we proposed an X-ray source/digital detector (XRS/DD) model leading to such an assessment. This model simulates very realistic contrast-detail phantom (CDMAM) images leading to gold disc (representing microcalcifications) detectability thresholds that are very close to those of real images taken under the simulated acquisition conditions. The detection step was performed with a mathematical observer. The aim of this contribution is to include human observers into the disc detection process in real and virtual images to validate the simulation framework based on the XRS/DD model. Mathematical criteria (contrast-detail curves, image quality factor, etc.) are used to assess and to compare, from the statistical point of view, the cancer indicator detectability in real and virtual images. The quantitative results given in this paper show that the images simulated by the XRS/DD model are useful for image quality assessment in the case of all studied exposure conditions using either human or automated scoring. Also, this paper confirms that with the XRS/DD model the image quality assessment can be automated and the whole time of the procedure can be drastically reduced. Compared to standard quality assessment methods, the number of images to be acquired is divided by a factor of eight. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.

  19. Perception Modelling of Visitors in Vargas Museum Using Agent-Based Simulation and Visibility Analysis

    NASA Astrophysics Data System (ADS)

    Carcellar, B. G., III

    2017-10-01

    Museum exhibit management is one of the usual undertakings of museum facilitators. Artworks must be strategically placed to achieve maximum viewing by visitors, and their positioning highly influences the quality of the visitors' experience. One solution to such problems is to utilize GIS and Agent-Based Modelling (ABM). In ABM, persistent interacting objects are modelled as agents, which are given attributes and behaviours that describe their properties as well as their motion. In this study, an ABM approach that incorporates GIS is utilized to perform an analytical assessment of the placement of the artworks in the Vargas Museum. GIS serves as the backbone for the spatial aspect of the simulation, such as the placement of the artwork exhibits, as well as possible obstructions to perception such as columns, walls, and panel boards. Visibility analysis is also performed on the model in GIS to assess the overall visibility of the artworks. The ABM is done using the initial GIS outputs and GAMA, an open-source ABM platform. Visitors are modelled as agents moving inside the museum following a specific decision tree. The simulation is run for three use cases: a 10 %, 20 %, and 30 % chance of a visitor arriving in the next minute. For this museum, the 10 % case is determined to be the closest to actual conditions, and the recommended minimum time to achieve maximum artwork perception is 1 hour and 40 minutes. Initial assessment of the results shows that even after 3 hours of simulation, small parts of the exhibit lack viewers because of their distance from the entrance. A more detailed decision tree for the visitor agents could be incorporated to obtain a more realistic simulation.

  20. Colonoscopy tutorial software made with a cadaver's sectioned images.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Park, Hyung Seon; Shin, Byeong-Seok; Kwon, Koojoo

    2016-11-01

    Novice doctors may watch tutorial videos when training for actual or computed tomographic (CT) colonoscopy. These conventional learning videos can be complemented by virtual colonoscopy software made with a cadaver's sectioned images (SIs). The objective of this study was to assist colonoscopy trainees with the new interactive software. Submucosal segmentation on the SIs was carried out along the whole length of the large intestine. From the SIs and segmented images, a three-dimensional model was reconstructed. Six hundred seventy-one proximal colonoscopic views (conventional views) and corresponding distal colonoscopic views (simulating retroflexion of a colonoscope) were produced. Navigation views showing the current location of the colonoscope tip and its course, as well as supplementary description views, were also elaborated. The four corresponding views were put into convenient browsing software that can be downloaded free from the homepage (anatomy.co.kr). The SI colonoscopy software, with its realistic images and supportive tools, was made available to anybody. Users could readily notice the position and direction of the virtual colonoscope tip and recognize meaningful structures in the colonoscopic views. The software is expected to be an auxiliary learning tool to improve technique and related knowledge in actual and CT colonoscopies. Hopefully, the software will be updated using raw images from the Visible Korean project. Copyright © 2016 Elsevier GmbH. All rights reserved.
