Science.gov

Sample records for environment process model

  1. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  2. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  3. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
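
    The two-parameter reduction described above can be sketched numerically. The hyperbolic functional form and the serial-composition rule below are illustrative choices consistent with the abstract (exact in the small- and large-message limits), not the paper's actual parameterization:

```python
import math

def service_time(m, tau, beta):
    """Hyperbolic service time for a message of m bytes: tends to the latency
    tau as m -> 0 and to the transfer time m/beta as m -> infinity.
    (Illustrative form; the paper's exact parameterization may differ.)"""
    return math.sqrt(tau ** 2 + (m / beta) ** 2)

def reduce_serial(cb1, cb2):
    """Reduce two store-and-forward CBs in series to one equivalent CB so
    that both asymptotic limits match: latencies add, inverse rates add."""
    (t1, b1), (t2, b2) = cb1, cb2
    return (t1 + t2, 1.0 / (1.0 / b1 + 1.0 / b2))

# two CBs: (latency in s, transfer rate in bytes/s)
tau, beta = reduce_serial((50e-6, 1e6), (150e-6, 5e5))
print(tau, beta)                    # 2.0e-4 s latency, ~3.3e5 B/s rate
print(service_time(0, tau, beta))   # small-message limit -> tau
print(service_time(1e8, tau, beta) * beta / 1e8)   # large-message limit ~1
```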

  4. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, while making simplifications for the human-in-the-loop. However, the human element has a big impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We will model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning framework that can represent incomplete and uncertain socio-cultural information. We will leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer, such as node mobility, transmission parameters, etc. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  5. Establishment probability in fluctuating environments: a branching process model.

    PubMed

    Haccou, P; Iwasa, Y

    1996-12-01

    We study the establishment probability of invaders in stochastically fluctuating environments and the related issue of extinction probability of small populations in such environments, by means of an inhomogeneous branching process model. In the model it is assumed that individuals reproduce asexually during discrete reproduction periods. Within each period, individuals have (independent) Poisson distributed numbers of offspring. The expected numbers of offspring per individual are independently identically distributed over the periods. It is shown that the establishment probability of an invader varies over the reproduction periods according to a stable distribution. We give a method for simulating the establishment probabilities and approximations for the expected establishment probability. Furthermore, we show that, due to the stochasticity of the establishment success over different periods, the expected success of sequential invasions is larger than that of simultaneous invasions, and we study the effects of environmental fluctuations on the extinction probability of small populations and metapopulations. The results can easily be generalized to offspring distributions other than the Poisson. PMID:9000490
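
    The establishment probability described here follows from a compact probability-generating-function recursion. The code below is a sketch of that standard recursion for Poisson offspring; the log-normal choice for the period means is an illustrative assumption, not taken from the paper:

```python
import math, random

def establishment_prob(means):
    """Establishment probability of a single founder in an inhomogeneous
    branching process with Poisson(m_t) offspring in period t.  Standard pgf
    recursion: P(extinct by period n) = f_{m_1}(f_{m_2}(...f_{m_n}(0))),
    with f_m(s) = exp(m*(s - 1)) for Poisson offspring."""
    q = 0.0
    for m in reversed(means):     # compose the pgfs from the inside out
        q = math.exp(m * (q - 1.0))
    return 1.0 - q                # establishment = 1 - extinction

# Constant environment, mean 2 offspring: extinction q solves q = exp(2(q-1))
print(establishment_prob([2.0] * 200))

# Fluctuating environment: i.i.d. log-normal period means with mean 2
# (an illustrative distribution; the model only requires i.i.d. means)
random.seed(1)
runs = [establishment_prob([random.lognormvariate(math.log(2.0) - 0.18, 0.6)
                            for _ in range(200)])
        for _ in range(2000)]
print(sum(runs) / len(runs))      # expected establishment probability
```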

  6. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches to model validation and the thought processes displayed during the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  7. Hot Plasma Environment Model (HPEM): An empirical model for describing time-dependent processes of the jovian energetic electron environment

    NASA Astrophysics Data System (ADS)

    Roussos, E.; Krupp, N.; Fraenz, M.; Kollmann, P.; Truscott, P.; Futaana, Y.

    2015-10-01

    HPEM is a model designed to provide time series of energetic electron differential or integral energy-flux spectra for Jupiter's magnetosphere, which can be used as input for internal charging studies of the JUICE spacecraft. The model describes the electron distribution function from 150 keV up to ~50 MeV. It is designed to be applicable between the orbit of Europa (9.5 Rj) and 30 Rj, which is near Callisto's orbit, and within 40 degrees of latitude from the planetary equatorial plane, but it can be extended to larger distances and latitudes. The model is constructed with the goal of describing the time variability that a spacecraft can encounter in Jupiter's energetic electron environment. This variability can have two components: the first comes from the motion of the spacecraft within a spatially varying jovian magnetosphere. For this purpose an average radiation belt model for the differential electron energy-flux spectra was constructed based on Galileo EPD/LEMMS observations, dependent on L, magnetospheric local time and equatorial pitch angle. The second component includes an empirical description of magnetospheric transients that result from dynamics in the magnetosphere. For this purpose, the probability for a given spectrum to deviate from the average one (at a given location) has been modeled with log-normal distributions, and such probabilities are obtained with a Monte Carlo approach. Temporal changes in the electron spectra are constrained by the L- or time gradients observed with Galileo's EPD/LEMMS detector so as to prevent extreme and unrealistic changes between sequential spectra of the model's output. The model is able to reproduce both the statistical scatter of energetic electron fluxes observed with Galileo/EPD, as well as the lifetimes/time scales and the occurrence probability of extreme flux enhancements (temporal radiation belts) that Galileo encountered. An application to the JUICE mission is also shown.
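
    A toy version of the transient-sampling idea (log-normal deviations from a mean spectrum, with the change between sequential samples constrained) might look like the following; all names and parameter values are invented for illustration and are not HPEM's calibrated ones:

```python
import random

def hpem_like_series(mean_flux, sigma, max_step, n, seed=0):
    """Toy sketch of the HPEM transient scheme: each output sample is drawn
    toward mean_flux times a log-normal deviation factor, with the ratio
    between sequential samples capped at max_step to avoid unrealistic
    jumps.  Illustrative only, not the model's calibrated behavior."""
    rng = random.Random(seed)
    flux, out = mean_flux, []
    for _ in range(n):
        target = mean_flux * rng.lognormvariate(0.0, sigma)
        ratio = max(1.0 / max_step, min(max_step, target / flux))
        flux = flux * ratio          # constrained step toward the new draw
        out.append(flux)
    return out

series = hpem_like_series(mean_flux=1e4, sigma=1.0, max_step=3.0, n=500)
print(min(series), max(series))
```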

  8. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high-level and low-level information processing. However, they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  9. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open-source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of COSTA, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007], and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood…

  10. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. The stationary simulation of this problem was done in the MHD and the electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the double-peak structure in the magnetic field signature of the Io flyby that could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs the fluid description for electrons and neutrals, whereas for ions multilevel drift-kinetic and particle approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution that cannot be captured in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.

  11. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2006-01-01

    The global dynamics of the ionized and neutral gases in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have already been done using the magnetohydrodynamics (MHD) and the electrodynamics approaches. One of the major results of recent simplified two-fluid model simulations [Saur, J., Neubauer, F.M., Strobel, D.F., Summers, M.E., 2002. J. Geophys. Res. 107 (SMP5), 1-18] was the production of the structure of the double-peak in the magnetic field signature of the Io flyby. These could not be explained before by standard MHD models. In this paper, we present a hybrid simulation for Io with kinetic ions and fluid electrons. This method employs a fluid description for electrons and neutrals, whereas for ions a particle approach is used. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. Our model may provide a much more accurate description for the ion dynamics than previous approaches and allows us to account for the realistic anisotropic ion velocity distribution that cannot be done in fluid simulations with isotropic temperatures. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper. Comparison with the Galileo Io flyby results shows that this approach provides an accurate physical basis for the interaction and can therefore naturally reproduce all the observed salient features.
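
    The kinetic-ion half of such a hybrid scheme typically advances each macro-ion with a standard particle pusher such as the Boris algorithm. The sketch below shows that generic building block (not the authors' code), exercised in uniform fields:

```python
def cross(a, b):
    """3-vector cross product."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def boris_step(v, E, B, qm, dt):
    """One Boris velocity update: half electric kick, magnetic rotation,
    half electric kick.  qm is the ion charge-to-mass ratio."""
    vm = [v[i] + 0.5 * qm * dt * E[i] for i in range(3)]
    t = [0.5 * qm * dt * B[i] for i in range(3)]
    s_fac = 2.0 / (1.0 + sum(x * x for x in t))
    s = [s_fac * x for x in t]
    vp = [vm[i] + c for i, c in enumerate(cross(vm, t))]
    vplus = [vm[i] + c for i, c in enumerate(cross(vp, s))]
    return [vplus[i] + 0.5 * qm * dt * E[i] for i in range(3)]

# Pure gyration (E = 0, B along z): the Boris rotation conserves speed
# exactly, which is why it is the workhorse of hybrid and PIC codes.
v = [1.0, 0.0, 0.0]
for _ in range(1000):
    v = boris_step(v, [0.0, 0.0, 0.0], [0.0, 0.0, 1.0], qm=1.0, dt=0.1)
print(sum(x * x for x in v))   # speed squared stays 1
```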

  12. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  13. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    PubMed Central

    Issakhov, Alibek

    2014-01-01

    This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler with different hydrometeorological conditions is considered, which is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is computed by the fractional steps method. At the second stage, the three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is assumed that the transfer is due only to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, approximated qualitatively and quantitatively for different hydrometeorological conditions. PMID:24991644
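
    The three stages can be illustrated with a minimal 2D periodic projection step (a generic Chorin-type sketch; the paper's solver is three-dimensional, stratified, and uses a Thomas-algorithm Poisson solve rather than the FFT used here for brevity):

```python
import numpy as np

n, dx, dt, nu = 64, 1.0 / 64, 1e-3, 1e-2
x = np.arange(n) * dx
u = np.sin(2 * np.pi * x)[:, None] * np.ones((1, n))   # initial u(x)
v = np.ones((n, 1)) * np.cos(2 * np.pi * x)[None, :]   # initial v(y)

def ddx(f, ax):                      # centered periodic first derivative
    return (np.roll(f, -1, ax) - np.roll(f, 1, ax)) / (2 * dx)

def lap(f):                          # 5-point periodic Laplacian
    return (np.roll(f, -1, 0) + np.roll(f, 1, 0)
            + np.roll(f, -1, 1) + np.roll(f, 1, 1) - 4 * f) / dx ** 2

# Stage 1: intermediate velocity from convection and diffusion only
us = u + dt * (-u * ddx(u, 0) - v * ddx(u, 1) + nu * lap(u))
vs = v + dt * (-u * ddx(v, 0) - v * ddx(v, 1) + nu * lap(v))

# Stage 2: pressure Poisson equation, solved spectrally using the modified
# wavenumber of the centered difference so the projection is discretely exact
k = 2 * np.pi * np.fft.fftfreq(n, dx)
km = np.sin(k * dx) / dx
k2 = km[:, None] ** 2 + km[None, :] ** 2
k2[k2 < 1e-10] = 1.0                 # guard mean/Nyquist modes
div = ddx(us, 0) + ddx(vs, 1)
p = np.real(np.fft.ifft2(np.fft.fft2(div / dt) / (-k2)))

# Stage 3: transfer due to the pressure gradient only
u_new = us - dt * ddx(p, 0)
v_new = vs - dt * ddx(p, 1)
print(np.abs(ddx(u_new, 0) + ddx(v_new, 1)).max())   # divergence ~ 0
```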

  14. Mathematical modelling of thermal process to aquatic environment with different hydrometeorological conditions.

    PubMed

    Issakhov, Alibek

    2014-01-01

    This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler with different hydrometeorological conditions is considered, which is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is computed by the fractional steps method. At the second stage, the three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is assumed that the transfer is due only to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, approximated qualitatively and quantitatively for different hydrometeorological conditions. PMID:24991644

  15. Modelling Dust Processing and Evolution in Extreme Environments as seen by Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Bocchio, Marco

    2014-09-01

    The main goal of my PhD study is to understand the dust processing that occurs during the mixing between the galactic interstellar medium and the intracluster medium. This process is of particular interest in violent phenomena such as galaxy-galaxy interactions or "ram pressure stripping" due to the infall of a galaxy towards the cluster centre. Initially, I focus my attention on the problem of dust destruction and heating processes, revisiting the available models in the literature. I focus in particular on the cases of extreme environments such as a hot coronal-type gas (e.g., IGM, ICM, HIM) and supernova-generated interstellar shocks. Under these conditions small grains are destroyed on short timescales and large grains are heated by collisions with fast electrons, making the dust spectral energy distribution very different from what is observed in the diffuse ISM. In order to test our models I apply them to the case of an interacting galaxy, NGC 4438. Herschel data of this galaxy indicate the presence of dust with a higher-than-expected temperature. With a multi-wavelength analysis on a pixel-by-pixel basis we show that this hot dust seems to be embedded in a hot ionised gas, therefore undergoing both collisional heating and small grain destruction. Furthermore, I focus on the long-standing conundrum of the dust destruction and dust formation timescales in the Milky Way. Based on the destruction efficiency in interstellar shocks, previous estimates led to a dust lifetime shorter than the typical timescale for dust formation in AGB stars. Using a recent dust model and an updated dust processing model we re-evaluate the dust lifetime in our Galaxy. Finally, I turn my attention to the phenomenon of "ram pressure stripping". The galaxy ESO 137-001 represents one of the best cases to study this effect. Its long H2 tail embedded in a hot and ionised tail raises questions about its possible stripping from the galaxy or formation downstream in the tail. Based on…

  16. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  17. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  18. OpenDA: Open Source Generic Data Assimilation Environment and its Application in Geophysical Process Models

    NASA Astrophysics Data System (ADS)

    Weerts, A.; van Velzen, N.; Verlaan, M.; Sumihar, J.; Hummel, S.; El Serafy, G.; Dhondia, J.; Gerritsen, H.; Vermeer-Ooms, S.; Loots, E.; Markus, A.; Kockx, A.

    2011-12-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their use in operational forecasting and real-time control in the fields of groundwater, surface water and soil systems. In the meteorological and atmospheric sciences, steady improvements in numerical weather forecasting and climate prediction over the last couple of decades have been enabled to a large degree by the development of community-based models and data assimilation systems. The hydrologic community should learn from the experiences of the meteorological and atmospheric communities by accelerating the transition of hydrologic DA research into operations and developing community-supported, open-source modeling and forecasting systems and data assimilation tools. In 2010, a community-based open-source initiative named OpenDA was started. The OpenDA initiative bears similarities to the well-known OpenMI initiative. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modeling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water flow, rainfall-runoff, unsaturated flow, groundwater flow, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications.
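
    The kind of generic interfacing protocol described, in which any time-stepping model exposes its state to an assimilation algorithm, can be sketched abstractly. The interface names and the toy nudging filter below are hypothetical illustrations, not OpenDA's actual API:

```python
from abc import ABC, abstractmethod

class TimeSteppingModel(ABC):
    """Hypothetical sketch of the minimal interface a generic DA environment
    needs from a process model (names are illustrative, not OpenDA's)."""
    @abstractmethod
    def get_state(self): ...
    @abstractmethod
    def set_state(self, x): ...
    @abstractmethod
    def step(self): ...

class DecayModel(TimeSteppingModel):
    """Trivial scalar model: the state decays by 10% per step."""
    def __init__(self, x0): self.x = x0
    def get_state(self): return self.x
    def set_state(self, x): self.x = x
    def step(self): self.x *= 0.9

def assimilate(model, observations, gain=0.5):
    """Toy sequential filter: step the model, then nudge the state toward
    each observation.  Works for ANY TimeSteppingModel implementation."""
    for y in observations:
        model.step()
        x = model.get_state()
        model.set_state(x + gain * (y - x))
    return model.get_state()

m = DecayModel(10.0)
print(assimilate(m, [5.0, 5.0, 5.0, 5.0]))   # pulled toward the data
```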

  19. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  20. Process migration in UNIX environments

    NASA Technical Reports Server (NTRS)

    Lu, Chin; Liu, J. W. S.

    1988-01-01

    To support process migration in UNIX environments, the main problem is how to encapsulate the location-dependent features of the system in such a way that a host-independent virtual environment is maintained by the migration handlers on behalf of each migrated process. An object-oriented approach is used to describe the interaction between a process and its environment. More specifically, environmental objects are introduced into UNIX systems to carry out the user-environment interaction. The implementation of the migration handlers is based on both the state consistency criterion and the property consistency criterion.

  1. Marine-hydrokinetic energy and the environment: Observations, modeling, and basic processes

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, Efi; Guala, Michele; Sotiropoulos, Fotis

    2012-03-01

    Research at the Interface of Marine Hydrokinetic Energy and the Environment: A Workshop; Minneapolis, Minnesota, 5-7 October 2011. Marine and hydrokinetic (MHK) energy harvesting technologies convert the kinetic energy of waves and water currents into power to generate electricity. Although these technologies are in early stages of development compared to other renewable technologies, such as solar and wind energy, they offer electricity consumers situated near coastlines or inland rivers an alternative energy technology that can help meet renewable portfolio standards. However, the potential environmental impacts of MHK energy are far from well understood, both in general principles and in site-specific cases. As pressure for new MHK energy licenses builds, accelerating research that provides the scientific understanding needed to harness the natural power of water for renewable energy, at a competitive cost and without harming the environment, becomes a priority.

  2. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  3. Model-based processing for shallow ocean environments: The broadband problem

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-01-31

    Most acoustic sources found in the ocean environment are spatially complex and broadband. When propagating in a shallow ocean, these source characteristics complicate the analysis of received acoustic data considerably. The enhancement of broadband acoustic pressure-field measurements using a vertical array is discussed. Here a model-based approach is developed for a broadband source using a normal-mode propagation model.
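
    As a concrete illustration of the forward model such a processor inverts, the sketch below sums the propagating normal modes of an idealized shallow-water waveguide (isovelocity water, pressure-release surface, rigid bottom). The geometry, frequency band, and mode shapes are textbook assumptions, not the paper's:

```python
import cmath, math

c, D = 1500.0, 100.0                     # sound speed (m/s), water depth (m)

def pressure(f, r, zs, z):
    """Unnormalized modal-sum pressure at range r and depth z for a source
    at depth zs and frequency f, summed over propagating modes of an ideal
    waveguide: modes sin(gamma_m z) with gamma_m = (m - 1/2) pi / D."""
    w = 2 * math.pi * f
    p, m = 0j, 1
    while True:
        gamma = (m - 0.5) * math.pi / D  # vertical wavenumber of mode m
        k2 = (w / c) ** 2 - gamma ** 2
        if k2 <= 0:                      # mode is cut off: stop the sum
            break
        km = math.sqrt(k2)
        p += (math.sin(gamma * zs) * math.sin(gamma * z)
              * cmath.exp(1j * km * r) / math.sqrt(km * r))
        m += 1
    return p

# broadband source: accumulate the field over a band of discrete frequencies
field = sum(pressure(f, 5000.0, 50.0, 60.0) for f in range(50, 201, 10))
print(abs(field))
```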

  4. An Efficient Simulation Environment for Modeling Large-Scale Cortical Processing

    PubMed Central

    Richert, Micah; Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L.

    2011-01-01

    We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current- or conductance-based Izhikevich neuron networks with spike-timing-dependent plasticity and short-term plasticity. It uses a standard network construction interface. The simulator allows for execution on either GPUs or CPUs. The simulator, which is written in C/C++, allows for both fine-grain and coarse-grain specificity of a host of parameters. We demonstrate the ease of use and computational efficiency of this simulator by implementing a large-scale model of cortical areas V1, V4, and area MT. The complete model, which has 138,240 neurons and approximately 30 million synapses, runs in real time on an off-the-shelf GPU. The simulator source code, as well as the source code for the cortical model examples, is publicly available. PMID:22007166
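
    The neuron update at the core of such a simulator is compact. Below is a minimal Euler-stepped Izhikevich neuron with the published regular-spiking parameter set (a generic sketch; the simulator's GPU internals will differ):

```python
def izhikevich(I, T_ms=1000, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Euler-stepped Izhikevich neuron; the defaults are the published
    regular-spiking parameters.  Returns the spike count over T_ms ms of
    constant input current I."""
    v, u, spikes = -65.0, b * -65.0, 0
    for _ in range(int(T_ms / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike: reset v, bump recovery u
            v, u, spikes = c, u + d, spikes + 1
    return spikes

print(izhikevich(10.0))   # tonic firing under constant input
print(izhikevich(0.0))    # quiescent without input
```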

  5. Arthropod model systems for studying complex biological processes in the space environment

    NASA Astrophysics Data System (ADS)

    Marco, Roberto; de Juan, Emilio; Ushakov, Ilya; Hernandorena, Arantxa; Gonzalez-Jurado, Juan; Calleja, Manuel; Manzanares, Miguel; Maroto, Miguel; Garesse, Rafael; Reitz, Günther; Miquel, Jaime

    1994-08-01

    Three arthropod systems are discussed in relation to their complementary and potential use in Space Biology. In an upcoming biosatellite flight, Drosophila melanogaster pre-adapted during several months to different g levels will be flown in an automatic device that separates parental from first and second generations. In the same flight, flies will be exposed to microgravity conditions in an automatic unit in which fly motility can be recorded. In the International Microgravity Laboratory-2, several groups of Drosophila embryos will be grown in Space and the motility of a male fly population will be video-recorded. In the Biopan, an ESA exobiology facility that can be flown attached to the exterior of a Russian biosatellite, Artemia dormant gastrulae will be exposed to the space environment on the exterior of the satellite under a normal atmosphere or in the void. Gastrulae will be separated into hit and non-hit populations. The developmental and aging response of these animals will be studied upon recovery. With these experiments we will be able to establish whether exposure to the space environment influences arthropod development and aging, and to elaborate on some of the cellular mechanisms involved, which should be tested in future experiments.

  6. Application of group process model to performance appraisal development in a CQI environment.

    PubMed

    LaPenta, C; Jacobs, G M

    1996-01-01

    Health care administrators have been faced with the challenge of developing performance appraisal instruments that balance continuous quality improvement (CQI) initiatives with the potential legal liabilities surrounding employee performance. Although the literature offers many examples of performance appraisal instruments, there is no real roadmap of how those charged with developing such an instrument can successfully go through the process. This article documents one pediatric hospital's experience in developing this needed device.

  7. Condensation Processes in Astrophysical Environments

    NASA Technical Reports Server (NTRS)

    Nuth, Joseph A., III; Rietmeijer, Frans J. M.; Hill, Hugh G. M.

    2002-01-01

    Astrophysical systems present an intriguing set of challenges for laboratory chemists. Chemistry occurs in regions considered an excellent vacuum by laboratory standards and at temperatures that would vaporize laboratory equipment. Outflows around Asymptotic Giant Branch (AGB) stars have timescales ranging from seconds to weeks, depending on the distance of the region of interest from the star and on the way in which significant changes in the state variables are defined. The atmospheres of normal stars may only change significantly on several-billion-year timescales. Most laboratory experiments carried out to understand astrophysical processes are not done at conditions that perfectly match the natural suite of state variables or timescales appropriate for natural conditions. Experimenters must make use of simple analog experiments that place limits on the behavior of natural systems, often extrapolating to lower-pressure and/or higher-temperature environments. Nevertheless, we argue that well-conceived experiments will often provide insights into astrophysical processes that are impossible to obtain through models or observations. This is especially true for complex chemical phenomena such as the formation and metamorphism of refractory grains under a range of astrophysical conditions. Data obtained in our laboratory have been surprising in numerous ways, ranging from the composition of the condensates to the thermal evolution of their spectral properties. None of this information could have been predicted from first principles and would not have been credible even if it had.

  8. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  9. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)

  10. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various terms of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in exploratory and experimental settings was registered as the shift towards more complex stages of problem representations in genetics. The effect of different types of reasoning could be observed as the divergent development of problem representations within hierarchical categories.

  11. An Integrated Vehicle Modeling Environment

    NASA Technical Reports Server (NTRS)

    Totah, Joseph J.; Kinney, David J.; Kaneshige, John T.; Agabon, Shane

    1999-01-01

    This paper describes an Integrated Vehicle Modeling Environment for estimating aircraft geometric, inertial, and aerodynamic characteristics, and for interfacing with a high fidelity, workstation based flight simulation architecture. The goals in developing this environment are to aid in the design of next generation intelligent flight control technologies, conduct research in advanced vehicle interface concepts for autonomous and semi-autonomous applications, and provide a value-added capability to the conceptual design and aircraft synthesis process. Results are presented for three aircraft by comparing estimates generated by the Integrated Vehicle Modeling Environment with known characteristics of each vehicle under consideration. The three aircraft are a modified F-15 with moveable canards attached to the airframe, a mid-sized, twin-engine commercial transport concept, and a small, single-engine, uninhabited aerial vehicle. Estimated physical properties and dynamic characteristics are correlated with those known for each aircraft over a large portion of the flight envelope of interest. These results represent the completion of a critical step toward meeting the stated goals for developing this modeling environment.

  12. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  13. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about Genetics Problems Using Virtual Chat

    ERIC Educational Resources Information Center

    Pata, Kai; Sarapuu, Tago

    2006-01-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various terms of reasoning on the learners' problem representation development. Changes in 53 students' problem representations about a genetics issue were analysed while they worked with different…

  14. Generalized Environment for Modeling Systems

    SciTech Connect

    2012-02-07

    GEMS is an integrated environment that allows technical analysts, modelers, researchers, etc. to integrate and deploy models and/or decision tools with associated data to the internet for direct use by customers. GEMS does not require that the model developer know how to code or script and therefore delivers this capability to a large group of technical specialists. Customers gain the benefit of being able to execute their own scenarios directly without need for technical support. GEMS is a process that leverages commercial software products with specialized codes that add connectivity and unique functions to support the overall capability. Users integrate pre-existing models with a commercial product and store parameters and input trajectories in a companion commercial database. The model is then exposed into a commercial web environment and a graphical user interface (GUI) is applied by the model developer. Users execute the model through the web based GUI, and GEMS manages supply of proper inputs, execution of models, routing of data to models, and display of results back to users. GEMS works in layers; the following description is from the bottom up. Modelers create models in the modeling tool of their choice such as Excel, Matlab, or Fortran. They can also use models from a library of previously wrapped legacy codes (models). Modelers integrate the models (or a single model) by wrapping and connecting the models using the Phoenix Integration tool entitled ModelCenter. Using a ModelCenter/SAS plugin (DOE copyright CW-10-08) the modeler gets data from either an SAS or SQL database and sends results back to SAS or SQL. Once the model is working properly, the ModelCenter file is saved and stored in a folder location to which a SharePoint server tool created at INL is pointed. This enables the ModelCenter model to be run from SharePoint.
The modeler then goes into Microsoft SharePoint and creates a graphical user interface (GUI) using the ModelCenter WebPart (CW-12

  16. The Ecosystem Model: Designing Campus Environments.

    ERIC Educational Resources Information Center

    Western Interstate Commission for Higher Education, Boulder, CO.

    This document stresses the increasing awareness in higher education of the impact student/environment transactions have upon the quality of educational life and details a model and design process for creating a better fit between educational environments and students. The ecosystem model uses an interdisciplinary approach for the make-up of its…

  17. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, Shao-Sheng R.; Allen, Christopher S.

    2010-01-01

    Acoustic modeling can be used to identify key noise sources, determine/analyze sub-allocated requirements, keep track of the accumulation of minor noise sources, and to predict vehicle noise levels at various stages in vehicle development, first with estimates of noise sources, later with experimental data. This paper describes the implementation of acoustic modeling for design purposes by incrementally increasing model fidelity and validating the accuracy of the model while predicting the noise of sources under various conditions. During FY 07, a simple-geometry Statistical Energy Analysis (SEA) model was developed and validated using a physical mockup and acoustic measurements. A process for modeling the effects of absorptive wall treatments and the resulting reverberation environment was developed. During FY 08, a model with more complex and representative geometry of the Orion Crew Module (CM) interior was built, and noise predictions based on input noise sources were made. A corresponding physical mockup was also built. Measurements were made inside this mockup, and comparisons were made with the model and showed excellent agreement. During FY 09, the fidelity of the mockup and corresponding model were increased incrementally by including a simple ventilation system. The airborne noise contribution of the fans was measured using a sound intensity technique, since the sound power levels were not known beforehand. This is opposed to earlier studies where Reference Sound Sources (RSS) with known sound power level were used. Comparisons of the modeling result with the measurements in the mockup showed excellent agreement. During FY 10, the fidelity of the mockup and the model were further increased by including an ECLSS (Environmental Control and Life Support System) wall, associated closeout panels, and the gap between the ECLSS wall and the mockup wall. The effects of sealing the gap and adding sound-absorptive treatment to the ECLSS wall were also modeled and validated.

  18. Understanding and modeling the physical processes that govern the melting of snow cover in a tropical mountain environment in Ecuador

    NASA Astrophysics Data System (ADS)

    Wagnon, P.; Lafaysse, M.; Lejeune, Y.; Maisincho, L.; Rojas, M.; Chazarin, J. P.

    2009-10-01

    The ISBA/CROCUS coupled ground-snow model developed for the Alps and subsequently adapted to the outer tropical conditions of Bolivia has been applied to a full set of meteorological data recorded at 4860 m above sea level on a moraine area in Ecuador (Antizana 15 glacier, 0°28'S; 78°09'W) between 16 June 2005 and 30 June 2006 to determine the physical processes involved in the melting and disappearance of transient snow cover in nonglaciated areas of the inner tropics. Although less accurate than in Bolivia, the model is still able to simulate snow behavior over nonglaciated natural surfaces, as long as the modeled turbulent fluxes over bare ground are reduced and a suitable function is included to represent the partitioning of the surface between bare soil and snow cover. The main difference between the two tropical sites is the wind velocity, which is more than 3 times higher at the Antizana site than at the Bolivian site, leading to a nonuniform spatial distribution of snow over nonglaciated areas that is hard to describe with a simple snow partitioning function. Net solar radiation dominates the surface energy balance and is responsible for the energy stored in snow-free areas (albedo = 0.05) and transferred horizontally to adjacent snow patches by conduction within the upper soil layers and by turbulent advection. These processes can prevent the snow cover from lasting more than a few hours or a few days. Sporadically, and at any time of the year, this inner tropical site, much wetter than the outer tropics, experiences heavy snowfalls, covering all the moraine area, and thus limiting horizontal transfers and inducing a significant time lag between precipitation events and runoff.
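    The dominant terms described above can be put into a toy surface energy balance. This is an illustrative sketch, not the ISBA/CROCUS scheme; the function name and all flux values are hypothetical, and fluxes are taken positive toward the surface in W/m2.

```python
LATENT_HEAT_FUSION = 3.34e5   # J/kg, for melting ice

def melt_rate_mm_per_hour(sw_down, albedo, lw_net, sensible, latent):
    """Snowmelt rate (mm water equivalent per hour) implied by a positive
    surface energy balance. Net shortwave dominates when albedo is low,
    as on the dark snow-free surfaces (albedo ~ 0.05) described above."""
    q_melt = sw_down * (1.0 - albedo) + lw_net + sensible + latent
    if q_melt <= 0.0:
        return 0.0                      # no energy surplus, no melt
    # W/m2 -> kg/m2/s of melt -> mm w.e. per hour (1 kg/m2 = 1 mm)
    return q_melt / LATENT_HEAT_FUSION * 3600.0

print(melt_rate_mm_per_hour(sw_down=800.0, albedo=0.7, lw_net=-60.0,
                            sensible=20.0, latent=-10.0))  # ~2 mm w.e./h
```

    Dropping the albedo from 0.7 (fresh snow) to 0.05 (bare moraine) more than triples the absorbed shortwave, which is the source of the horizontal energy transfer to adjacent snow patches that the abstract invokes.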

  19. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment being developed since 1996 and running at LASMEA Laboratory, the Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Throughout the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we are presenting the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities for the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is a 3D face-tracking algorithm from appearance.

  20. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes. PMID:23039255
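    The group-contribution idea behind these models is simple to sketch: a property is estimated as a sum of contributions from the molecular groups present. The group names and contribution values below are invented for illustration; the actual GC+ models use regressed parameters with reported uncertainties.

```python
def gc_estimate(group_counts, contributions, f0=0.0):
    """First-order group-contribution estimate:
    f(property) = f0 + sum_i N_i * C_i,
    where N_i is how often group i occurs and C_i is its contribution."""
    return f0 + sum(n * contributions[g] for g, n in group_counts.items())

# hypothetical contributions to some log-scale environmental property
contribs = {"CH3": 0.55, "CH2": 0.49, "OH": -1.60}

# n-propanol: CH3-CH2-CH2-OH
print(gc_estimate({"CH3": 1, "CH2": 2, "OH": 1}, contribs))  # ~ -0.07
```

    The uncertainty analysis described then propagates the covariance of the regressed contributions through this linear form to a confidence interval on each estimated property.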

  2. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  3. Slot Region Radiation Environment Models

    NASA Astrophysics Data System (ADS)

    Sandberg, Ingmar; Daglis, Ioannis; Heynderickx, Daniel; Evans, Hugh; Nieminen, Petteri

    2013-04-01

    Herein we present the main characteristics and first results of the Slot Region Radiation Environment Models (SRREMs) project. The statistical models developed in SRREMs aim to address the variability of trapped electron and proton fluxes in the region between the inner and the outer electron radiation belt. The energetic charged particle fluxes in the slot region are highly dynamic and are known to vary by several orders of magnitude on both short and long timescales. During quiet times, the particle fluxes are much lower than those found at the peak of the inner and outer belts, and the region is considered benign. During geospace magnetic storms, though, this region can fill with energetic particles as the peak of the outer belt is pushed Earthwards, and the fluxes can increase drastically. There has been a renewed interest in the potential operation of commercial satellites in orbits that are at least partially contained within the slot region. Hence, there is a need to improve the current radiation belt models, most of which do not model the extreme variability of the slot region and instead provide long-term averages between the better-known low and medium Earth orbits (LEO and MEO). The statistical models developed in the SRREMs project are based on the analysis of a large volume of available data and on the construction of a virtual database of slot region particle fluxes. The analysis that we have followed retains the long-term temporal, spatial and spectral variations in electron and proton fluxes as well as the short-term enhancement events at altitudes and inclinations relevant for satellites in the slot region. A large number of datasets have been used for the construction, evaluation and inter-calibration of the SRREMs virtual dataset. Special emphasis has been given to the use and analysis of ESA Standard Radiation Environment Monitor (SREM) data from the units on-board PROBA-1, INTEGRAL, and GIOVE-B due to the sufficient spatial and long temporal

  4. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    ERIC Educational Resources Information Center

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  5. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall

    PubMed Central

    Bridge, Jack C.; Aylott, Jonathan W.; Brightling, Christopher E.; Ghaemmaghami, Amir M.; Knox, Alan J.; Lewis, Mark P.; Rose, Felicity R.A.J.; Morris, Gavin E.

    2015-01-01

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative of those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast, or smooth muscle cell culture. Using a commercially available bioreactor system, we stably co-cultured the three cell types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments. PMID:26275100

  6. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization ...

    SciTech Connect

    Powers, Michael H.

    2003-06-01

    The Department of Energy has identified the location and characterization of subsurface contaminants and the characterization of the subsurface as a priority need. Many DOE facilities are in need of subsurface imaging in the vadose and saturated zones. This includes (1) the detection and characterization of metal and concrete structures, (2) the characterization of waste pits (for both contents and integrity) and (3) mapping the complex geological/hydrological framework of the vadose and saturated zones. The DOE has identified ground penetrating radar (GPR) as a method that can non-invasively map transportation pathways and vadose zone heterogeneity. An advanced GPR system and advanced subsurface modeling, processing, imaging, and inversion techniques can be directly applied to several DOE science needs in more than one focus area and at many sites. Needs for enhanced subsurface imaging have been identified at Hanford, INEEL, SRS, ORNL, LLNL, SNL, LANL, and many other sites. In fact, needs for better subsurface imaging probably exist at all DOE sites. However, GPR performance is often inadequate due to increased attenuation and dispersion when soil conductivities are high. Our objective is to extend the limits of performance of GPR by improvements to both hardware and numerical computation. The key features include (1) greater dynamic range through real time digitizing, receiver gain improvements, and high output pulser, (2) modified, fully characterized antennas with sensors to allow dynamic determination of the changing radiated waveform, (3) modified deconvolution and depth migration algorithms exploiting the new antenna output information, (4) development of automatic full waveform inversion made possible by the known radiated pulse shape.

  7. Modeling the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2006-01-01

    There has been a renaissance of interest in space radiation environment modeling. This has been fueled by the growing need to replace the long-time standard AP-8 and AE-8 trapped particle models, the interplanetary exploration initiative, the modern satellite instrumentation that has led to unprecedented measurement accuracy, and the pervasive use of Commercial off the Shelf (COTS) microelectronics that require more accurate predictive capabilities. The objective of this viewgraph presentation was to provide a basic understanding of the components of the space radiation environment and their variations, review traditional radiation effects application models, and present recent developments.

  8. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

    Fagg, G.E.; Moore, K.; Dongarra, J.J.; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  9. A Process and Environment Aware Sierra/SolidMechanics Cohesive Zone Modeling Capability for Polymer/Solid Interfaces

    SciTech Connect

    Reedy, E. D.; Chambers, Robert S.; Hughes, Lindsey Gloe; Kropka, Jamie Michael; Stavig, Mark E.; Stevens, Mark J.

    2015-09-01

    The performance and reliability of many mechanical and electrical components depend on the integrity of polymer-to-solid interfaces. Such interfaces are found in adhesively bonded joints, encapsulated or underfilled electronic modules, protective coatings, and laminates. The work described herein was aimed at improving Sandia's finite element-based capability to predict interfacial crack growth by 1) using a high fidelity nonlinear viscoelastic material model for the adhesive in fracture simulations, and 2) developing and implementing a novel cohesive zone fracture model that generates a mode-mixity dependent toughness as a natural consequence of its formulation (i.e., generates the observed increase in interfacial toughness with increasing crack-tip interfacial shear). Furthermore, molecular dynamics simulations were used to study fundamental material/interfacial physics so as to develop a fuller understanding of the connection between molecular structure and failure. Also reported are test results that quantify how joint strength and interfacial toughness vary with temperature.
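    The cohesive zone concept named above can be illustrated with the simplest (bilinear) traction-separation law; the Sandia model adds mode-mixity dependence on top of a law of this general shape. All parameter values here are hypothetical.

```python
def bilinear_traction(delta, t_max=50.0, delta_0=1e-3, delta_f=1e-2):
    """Traction (MPa) across the interface versus opening delta (mm):
    linear elastic loading up to (delta_0, t_max), then linear softening
    to complete failure at delta_f."""
    if delta <= 0.0:
        return 0.0
    if delta <= delta_0:                      # elastic loading branch
        return t_max * delta / delta_0
    if delta < delta_f:                       # softening (damage) branch
        return t_max * (delta_f - delta) / (delta_f - delta_0)
    return 0.0                                # fully failed, traction-free

# interfacial toughness = area under the curve = 0.5 * t_max * delta_f
print(0.5 * 50.0 * 1e-2)       # 0.25 (MPa*mm, i.e. N/mm)
print(bilinear_traction(5e-3)) # traction partway down the softening branch
```

    In a finite element setting this law is evaluated at each interface integration point; making the toughness (the area under the curve) a function of mode mixity is the formulation feature the abstract highlights.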

  10. Modeling the growth and constraints of thermophiles and biogeochemical processes in deep-sea hydrothermal environments (Invited)

    NASA Astrophysics Data System (ADS)

    Holden, J. F.; Ver Eecke, H. C.; Lin, T. J.; Butterfield, D. A.; Olson, E. J.; Jamieson, J.; Knutson, J. K.; Dyar, M. D.

    2010-12-01

    and contain an abundance of Fe(III) oxide and sulfate minerals, especially on surfaces of pore spaces. Hyperthermophilic iron reducers attach to iron oxide particles via cell wall invaginations and pili and reduce the iron through direct contact. The iron is reduced to magnetite, possibly with a maghemite intermediate. Thus iron reducers could outcompete methanogens in low-H2, mildly reducing habitats such as Endeavour. Unlike in strain JH146, respiration rates per cell were highest near the optimal growth temperature for the iron reducer Hyperthermus strain Ro04 and decreased near the temperature limits for growth. This study highlights the need to model microbe-metal interactions and improve respiration estimates from pure cultures to refine our in situ bioenergetic and habitat models.

  11. Electronic materials processing and the microgravity environment

    NASA Technical Reports Server (NTRS)

    Witt, A. F.

    1988-01-01

    The nature and origin of deficiencies in bulk electronic materials for device fabrication are analyzed. It is found that gravity-generated perturbations during their formation account largely for the introduction of critical chemical and crystalline defects and, moreover, are responsible for the still-existing gap between theory and experiment, and thus for excessive reliance on proprietary empiricism in processing technology. Exploration of the potential of the reduced-gravity environment for electronic materials processing is found to be not only desirable but mandatory.

  12. Teaching Process Writing in an Online Environment

    ERIC Educational Resources Information Center

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  13. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
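    The buffer logic the CEPR model formalizes can be illustrated with a toy calculation. All numbers below are invented for illustration, not values from the paper: a cabin CO2 buffer time after a scrubber failure, and the effect of replacing the instant-LOC assumption with a propagation-time condition.

    ```python
    # Toy sketch of the CEPR idea (all parameters hypothetical, not from the paper):
    # after an ECLSS scrubber failure, the cabin volume buys time before CO2 reaches
    # a hazardous level; LOC occurs only if the abort takes longer than that buffer.
    crew = 4
    co2_rise_per_crew = 0.5            # assumed ppCO2 rise rate per crew member, mmHg/hr
    pp_nominal, pp_hazard = 3.0, 15.0  # assumed nominal and hazardous ppCO2, mmHg

    buffer_hr = (pp_hazard - pp_nominal) / (crew * co2_rise_per_crew)

    p_eclss_fail = 0.01                # assumed probability of a crew-critical ECLSS failure
    p_abort_exceeds_buffer = 0.2       # assumed probability the abort outlasts the buffer

    p_loc_instant = p_eclss_fail                            # instant-LOC assumption
    p_loc_buffered = p_eclss_fail * p_abort_exceeds_buffer  # physics-of-failure estimate
    print(buffer_hr, p_loc_instant, p_loc_buffered)
    ```

    With these made-up numbers the buffer is 6 hours and the buffered LOC probability is five times lower than the instant-LOC figure, which is the conservatism the paper describes.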

  14. Course Material Model in A&O Learning Environment.

    ERIC Educational Resources Information Center

    Levasma, Jarkko; Nykanen, Ossi

    One of the problematic issues in the content development for learning environments is the process of importing various types of course material into the environment. This paper describes a method for importing material into the A&O open learning environment by introducing a material model for metadata recognized by the environment. The first…

  15. Image and Signal Processing LISP Environment (ISLE)

    SciTech Connect

    Azevedo, S.G.; Fitch, J.P.; Johnson, R.R.; Lager, D.L.; Searfus, R.M.

    1987-10-02

    We have developed a multidimensional signal processing software system called the Image and Signal LISP Environment (ISLE). It is a hybrid software system, in that it consists of a LISP interpreter (used as the command processor) combined with FORTRAN, C, or LISP functions (used as the processing and display routines). Learning the syntax for ISLE is relatively simple and has the additional benefit of introducing a subset of commands from the general-purpose programming language, Common LISP. Because Common LISP is a well-documented and complete language, users do not need to depend exclusively on system developers for a description of the features of the command language, nor do the developers need to generate a command parser that exhaustively satisfies all the user requirements. Perhaps the major reason for selecting the LISP environment is that user-written code can be added to the environment through a "foreign function" interface without recompiling the entire system. The ability to perform fast prototyping of new algorithms is an important feature of this environment. As currently implemented, ISLE requires a Sun color or monochrome workstation and a license to run Franz Extended Common LISP. 16 refs., 4 figs.
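    The "foreign function" idea, adding user routines at runtime without rebuilding the system, can be sketched in miniature as a command processor with an extensible function table. The names and the `scale` routine below are invented for illustration, not ISLE's actual API.

    ```python
    # A minimal analogue of ISLE's runtime-extensible command processor: user code
    # registers itself in a function table and is then invoked by command name,
    # with no recompilation of the host system. Names here are illustrative.
    registry = {}

    def register(name):
        def wrap(fn):
            registry[name] = fn   # add the user routine to the command table
            return fn
        return wrap

    @register("scale")
    def scale(signal, k):
        """A hypothetical user-added processing routine."""
        return [k * x for x in signal]

    def dispatch(cmd, *args):
        return registry[cmd](*args)

    print(dispatch("scale", [1, 2, 3], 2.0))  # → [2.0, 4.0, 6.0]
    ```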

  16. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints on a production process imposed by the lunar or Martian environment on the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.

  17. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons. PMID:21064164
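    As a minimal instance of the single-neuron models this review builds up from, here is a leaky integrate-and-fire simulation, a much simpler cousin of the Hodgkin-Huxley-type models it covers. All parameters are illustrative textbook-style values, not taken from the review.

    ```python
    # Leaky integrate-and-fire neuron, forward-Euler integration (illustrative
    # parameters): membrane voltage relaxes toward rest, is driven by an input
    # current, and emits a spike (then resets) on crossing threshold.
    def lif_spikes(i_ext, dt=0.1, t_end=100.0, tau=10.0, v_rest=-65.0,
                   v_thresh=-50.0, v_reset=-65.0, r=10.0):
        v, t, spikes = v_rest, 0.0, []
        while t < t_end:
            v += (-(v - v_rest) + r * i_ext) * dt / tau  # leak + driven charge-up
            if v >= v_thresh:
                spikes.append(t)   # record spike time, then reset
                v = v_reset
            t += dt
        return spikes

    print(len(lif_spikes(2.0)))  # a constant suprathreshold drive gives regular spiking
    ```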

  18. Microbial processes in fractured rock environments

    NASA Astrophysics Data System (ADS)

    Kinner, Nancy E.; Eighmy, T. Taylor; Mills, M.; Coulburn, J.; Tisa, L.

    Little is known about the types and activities of microbes in fractured rock environments, but recent studies in a variety of bedrock formations have documented the presence of a diverse array of prokaryotes (Eubacteria and Archaea) and some protists. The prokaryotes appear to live in both diffusion-dominated microfractures and larger, more conductive open fractures. Some of the prokaryotes are associated with the surfaces of the host rock and mineral precipitates, while other planktonic forms are floating/moving in the groundwater filling the fractures. Studies indicate that the surface-associated and planktonic communities are distinct, and their importance in microbially mediated processes occurring in the bedrock environment may vary, depending on the availability of electron donors/acceptors and nutrients needed by the cells. In general, abundances of microbes are low compared with other environments, because of the paucity of these substances that are transported into the deeper subsurface where most bedrock occurs, unless there is significant pollution with an electron donor. To obtain a complete picture of the microbes present and their metabolic activity, it is usually necessary to sample formation water from specific fractures (versus open boreholes), and fracture surfaces (i.e., cores). Transport of the microbes through the major fracture pathways can be rapid, but may be quite limited in the microfractures. Very low abundances of small (2-3 μm) flagellated protists, which appear to prey upon planktonic bacteria, have been found in a bedrock aquifer. Much more research is needed to expand the understanding of all microbial processes in fractured rock environments.

  19. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  20. Space environment and lunar surface processes, 2

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1982-01-01

    The top few millimeters of a surface exposed to space represent a physically and chemically active zone with properties different from those of a surface in the environment of a planetary atmosphere. To meet the need for a quantitative synthesis of the various processes contributing to the evolution of the surfaces of the Moon, Mercury, the asteroids, and similar bodies (exposure to solar wind, solar flare particles, galactic cosmic rays, heating from solar radiation, and meteoroid bombardment), the MESS 2 computer program was developed. This program differs from earlier work in that the surface processes are broken down as a function of size scale and treated in three dimensions with good resolution on each scale. The results obtained apply to the development of soil near the surface and are based on lunar conditions. Parameters can be adjusted to describe asteroid regoliths and other space-related bodies.

  1. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, Shao-sheng R.; Allen, Christopher S.

    2009-01-01

    carried out by acquiring octave band microphone data simultaneously at ten fixed locations throughout the mockup. SPLs (Sound Pressure Levels) predicted by our SEA model match well with measurements for our CM mockup, with a more complicated shape. Additionally in FY09, background NC noise (Noise Criterion) simulation and MRT (Modified Rhyme Test) were developed and performed in the mockup to determine the maximum noise level in CM habitable volume for fair crew voice communications. Numerous demonstrations of simulated noise environment in the mockup and associated SIL (Speech Interference Level) via MRT were performed for various communities, including members from NASA and Orion prime-/sub-contractors. Also, a new HSIR (Human-Systems Integration Requirement) for limiting pre- and post-landing SIL was proposed.

  2. Control of the aseptic processing environment.

    PubMed

    Frieben, W R

    1983-11-01

    Methods used by industry with applications to hospital pharmacy for maintaining an aseptic environment in production of sterile pharmaceutical products are discussed. A major source of product contamination is airborne microorganisms. The laminar-airflow workbench with a high-efficiency particulate air filter provides an ultraclean environment for preparation of sterile products. However, the workbench does not guarantee sterility of products and is not effective if not properly installed and maintained or if the operator uses poor aseptic technique. The laminar-airflow workbench should be tested for leaks, airflow velocity, and airflow patterns when installed, and the workbench should be checked periodically thereafter. The workbench should be placed in a cleanroom where traffic and air disturbances that might affect the laminar airflow are eliminated. A major source of airborne microbial contamination in cleanrooms is people. Personnel movement through an area and presence of personnel without lint-free, nonshedding protective garments increase the levels of microbial contaminants in an area. The transport of nonsterile products (bottles, boxes, paper products) into a cleanroom should be minimized. The cleanroom itself should be sanitized and should be immaculate. Microbial or particulate monitoring should be conducted in the cleanroom using a quantitative method, and corrective-action limits should be set. Hospital pharmacists should examine industrial sterile-processing techniques and apply them to the preparation of sterile products.

  3. Gene-Environment Interplay in Twin Models

    PubMed Central

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718

  4. Gene-Environment Interplay in Twin Models.

    PubMed

    Verhulst, Brad; Hatemi, Peter K

    2013-07-01

    In this article, we respond to Shultziner's critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism's mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718
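    The variance decomposition at the heart of the classical twin design the article defends can be stated in a few lines. This is Falconer's back-of-envelope approximation of the ACE model; the twin correlations below are invented, and the CTD extensions the article discusses (GE correlation and interaction) are omitted.

    ```python
    # Falconer's approximation for the classical twin design: MZ and DZ trait
    # correlations decompose into additive-genetic (A), common-environment (C),
    # and unique-environment (E) variance shares. Input correlations are invented.
    def ace_from_twin_correlations(r_mz, r_dz):
        a = 2 * (r_mz - r_dz)   # heritability estimate
        c = 2 * r_dz - r_mz     # shared-environment share
        e = 1 - r_mz            # unique environment plus measurement error
        return a, c, e

    a, c, e = ace_from_twin_correlations(0.6, 0.4)
    print(a, c, e)
    ```

    For example, r_MZ = 0.6 and r_DZ = 0.4 give roughly A = 0.4, C = 0.2, E = 0.4. Structural-equation extensions of the CTD estimate these shares (and GE interplay terms) by maximum likelihood rather than this closed form.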

  5. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

  6. Near-field environment/processes working group summary

    SciTech Connect

    Murphy, W.M.

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.

  7. Modeling of Flow and Water Quality Processes with Finite Volume Method due to Spreading and Dispersion of Petrochemical Pollution in the Hydro-Environments

    NASA Astrophysics Data System (ADS)

    Sarhadi Zadeh, Ehsan; Hejazi, Kourosh

    2009-11-01

    With two marine frontiers, the Persian Gulf and the Oman Sea in the south and the Caspian Sea in the north, heavy dependence on extracting and exporting oil, especially via marine fleets, and an ever-growing petrochemical industry, Iran is exposed to severe environmental damage from oil and petrochemical activities. This paper investigates how an oil spill diffuses and how its environmental pollution spreads. The movement of an oil spill, its diffusion in water, and its effects on water and the environment have been simulated by developing a depth-averaged numerical model based on the finite volume method. Existing models are not efficient enough to fulfill current modeling needs. The developed model uses the parameters governing the advection and diffusion of oil pollution in a form appropriate for predicting the transport of an oil spill. Since the Navier-Stokes equations play an important role in the advection and diffusion of oil pollution, choosing an appropriate numerical method for these terms is critical; particular attention has therefore been paid to the advection and diffusion schemes, and highly accurate algorithms, not present in similar models, have been used in the advection terms. The resulting equations have been solved using the ADI method, which solves for the unknown parameters via a penta-diagonal matrix in each time step without sacrificing the desired precision.
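    In the ADI scheme described above, each time step reduces to solving a penta-diagonal linear system. The sketch below builds a small system with that band structure and solves it with plain Gaussian elimination; matrix values are illustrative, and a production ADI solver would exploit the five-band structure for O(n) cost per sweep.

    ```python
    # Dense Gaussian elimination with partial pivoting, applied to a small
    # penta-diagonal system of the kind an ADI sweep produces. Values are invented.
    def solve(A, b):
        n = len(b)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]               # partial pivoting
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):                    # back substitution
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    n = 6
    # Penta-diagonal: nonzero only on the main diagonal and two bands either side.
    A = [[4.0 if i == j else -1.0 if abs(i - j) == 1 else 0.5 if abs(i - j) == 2 else 0.0
          for j in range(n)] for i in range(n)]
    b = [1.0] * n
    x = solve(A, b)
    print(x)
    ```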

  8. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
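    The core idea, a generated environment that closes an open system by exercising it under every admissible pattern of invocation, can be shown in a toy form. The component and the property below are invented for illustration; BEG itself works on Java and derives environments from specifications and abstracted implementations.

    ```python
    # A toy "environment": a driver that exercises a component with every call
    # sequence up to a bound, the way a generated environment closes an open
    # system so a checker can explore its behaviors. Component is illustrative.
    from itertools import product

    class File:
        def __init__(self):
            self.state = "closed"
        def open(self):
            self.state = "open"
        def write(self):
            assert self.state == "open", "property violation: write on closed file"
        def close(self):
            self.state = "closed"

    def violations(depth=3):
        bad = []
        for seq in product(["open", "write", "close"], repeat=depth):
            f = File()
            try:
                for call in seq:
                    getattr(f, call)()
            except AssertionError:
                bad.append(seq)      # this environment behavior exposes a violation
        return bad

    print(len(violations()))
    ```

    Environment assumptions (e.g., "the caller never writes before opening") would prune these sequences, which is exactly how formally specified assumptions keep the generated environment precise yet tractable.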

  9. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  10. Processing Conditions, Rice Properties, Health and Environment

    PubMed Central

    Roy, Poritosh; Orikasa, Takahiro; Okadome, Hiroshi; Nakamura, Nobutaka; Shiina, Takeo

    2011-01-01

    Rice is the staple food for nearly two-thirds of the world's population. The food components and environmental load of rice depend on the rice form, which results from different processing conditions. Brown rice (BR), germinated brown rice (GBR), and partially-milled rice (PMR) contain more health-beneficial food components than well-milled rice (WMR). Although the arsenic concentration in cooked rice depends on the cooking methods, parboiled rice (PBR) seems to be relatively prone to arsenic contamination compared to untreated rice, if contaminated water is used for parboiling and cooking. A change in consumption patterns from PBR to untreated (non-parboiled) rice, and from WMR to PMR or BR, may conserve about 43-54 million tons of rice and reduce the risk from arsenic contamination in arsenic-prone areas. This study also reveals that a change in rice consumption patterns not only supplies more food components but also reduces environmental loads. A switch in production and consumption patterns would improve food security where food grains are scarce, provide more health-beneficial food components, prevent some diseases, and ease the burden on the Earth. However, motivation, awareness of the environment and health, and even a nominal incentive may be required for such a switch, which may help in building a sustainable society. PMID:21776212

  11. Model for a Healthy Work Environment.

    PubMed

    Blevins, Jamie

    2016-01-01

    The Healthy Work Environment (HWE) Model, considered a model of standards of professional behaviors, was created to help foster an environment that is happy, healthy, realistic, and feasible. The model focuses on areas of PEOPLE and PRACTICE, where each letter of these words identifies core, professional qualities and behaviors to foster an environment amenable and conducive to accountability for one's behavior and action. Each of these characteristics is supported from a Christian, biblical perspective. The HWE Model provides a mental and physical checklist of what is important in creating and sustaining a healthy work environment in education and practice.

  12. Model for a Healthy Work Environment.

    PubMed

    Blevins, Jamie

    2016-01-01

    The Healthy Work Environment (HWE) Model, considered a model of standards of professional behaviors, was created to help foster an environment that is happy, healthy, realistic, and feasible. The model focuses on areas of PEOPLE and PRACTICE, where each letter of these words identifies core, professional qualities and behaviors to foster an environment amenable and conducive to accountability for one's behavior and action. Each of these characteristics is supported from a Christian, biblical perspective. The HWE Model provides a mental and physical checklist of what is important in creating and sustaining a healthy work environment in education and practice. PMID:27610916

  13. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
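    The driver of the model is the empirical density law: as density drops with time and temperature, a fixed mass must occupy more volume, and that volume change moves the free surface. A minimal sketch, assuming a first-order density decay with an Arrhenius-style temperature dependence; the functional form and every constant below are invented, not the report's calibrated model.

    ```python
    # Assumed time- and temperature-dependent density: exponential relaxation from
    # precursor density rho0 to fully-expanded density rho_inf, with an
    # Arrhenius-style rate. All constants are illustrative.
    import math

    def density(t, temp_k, rho0=1000.0, rho_inf=100.0, a=1e5, ea_over_r=3000.0):
        k = a * math.exp(-ea_over_r / temp_k)      # assumed expansion rate, 1/s
        return rho_inf + (rho0 - rho_inf) * math.exp(-k * t)

    mass = 0.5                                     # kg of foam precursor
    v0 = mass / density(0.0, 300.0)                # initial volume
    v_late = mass / density(60.0, 300.0)           # volume after self-expansion
    print(v0, v_late)                              # the volume grows as density falls
    ```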

  14. Students' mental models of the environment

    NASA Astrophysics Data System (ADS)

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-02-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively analyzed in order to identify students' mental models of the environment. The second phase of analysis involved the statistical testing of the identified mental models. From this analysis four mental models emerged: Model 1, the environment as a place where animals/plants live - a natural place; Model 2, the environment as a place that supports life; Model 3, the environment as a place impacted or modified by human activity; and Model 4, the environment as a place where animals, plants, and humans live. The dominant mental model was Mental Model 1. Yet, a greater frequency of urban students than suburban and rural students held Mental Model 3. The implications to environmental science education are explored.

  15. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2⁻, e⁻aq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment. The most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. As compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10⁻⁵ and to preserve most of the predictions for major species. This allows a systematic approach to model simplification and offers guidance in designing experiments for validation.
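    The reduction result, that dropping numerically negligible reactions leaves [H2O2] essentially unchanged, can be illustrated with a toy production/decay model. The rate constants below are invented for illustration and are not the report's reaction set.

    ```python
    # Toy model-reduction check: integrate H2O2 with a radiolytic source term and
    # first-order losses, with ("full") and without ("reduced") a tiny side
    # reaction, then compare the predictions. All rates are invented.
    def h2o2(t_end=100.0, dt=0.01, g=1e-7, k1=1e-3, k2=0.0):
        c = 0.0
        for _ in range(int(t_end / dt)):
            c += (g - k1 * c - k2 * c) * dt   # production minus first-order losses
        return c

    full = h2o2(k2=1e-9)      # "full" network keeps a numerically negligible loss
    reduced = h2o2(k2=0.0)    # "reduced" network drops it
    rel = abs(full - reduced) / reduced
    print(rel)                # relative change in [H2O2] from the dropped reaction
    ```

    Ranking reactions by this kind of sensitivity is one systematic way to decide which of them a reduced model must keep.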

  16. Students' Mental Models of the Environment

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-01-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively…

  17. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange and access to EHR data. We propose an adapted method and the tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe a XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, of local network clients, of workstations running user’s interfaces and of data exchange and synchronization tools. PMID:16779049
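    The synchronization step can be sketched as a version-stamped merge between two sites. This is a minimal last-write-wins illustration with invented record names; the paper's XML-based model additionally filters what is exchanged by user rights and minimizes the data sent over the network, which this sketch omits.

    ```python
    # Minimal two-site record synchronization: each record is (version, payload),
    # keyed by an id, and a sync pass propagates the newer version both ways.
    # Record keys and payloads are invented for illustration.
    def sync(site_a, site_b):
        for key in set(site_a) | set(site_b):
            va = site_a.get(key, (0, None))        # version 0 = "absent"
            vb = site_b.get(key, (0, None))
            newest = max(va, vb)                    # higher version wins
            site_a[key] = site_b[key] = newest

    a = {"obs1": (2, "hb 13.1"), "obs2": (1, "bp 120/80")}
    b = {"obs1": (1, "hb 12.9"), "obs3": (1, "temp 37.2")}
    sync(a, b)
    print(a == b)  # both sites now hold the same, newest records
    ```

    A real EHR sync also needs conflict handling when both sites edit the same version concurrently; a simple counter cannot distinguish that case from an ordinary update.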

  18. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, S. Reynold; Allen, Chris

    2009-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.
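    Statistical energy analysis, the method behind the SEA model mentioned above, balances power in each frequency band: input power equals dissipation plus net coupling flow between subsystems. A two-subsystem sketch follows, solved by hand as a 2x2 system; all loss factors and the input power are invented, not values from the project.

    ```python
    # Two-subsystem SEA power balance (illustrative loss factors):
    #   omega * [(eta1 + eta12) * E1 - eta21 * E2] = P_in   (driven subsystem)
    #   omega * [-eta12 * E1 + (eta2 + eta21) * E2] = 0     (receiving subsystem)
    omega = 2 * 3.141592653589793 * 1000.0  # band center frequency, rad/s
    eta1, eta2 = 0.01, 0.02                 # damping loss factors (assumed)
    eta12, eta21 = 0.003, 0.004             # coupling loss factors (assumed)
    p_in = 1.0                              # watts into subsystem 1 only

    a11, a12 = omega * (eta1 + eta12), -omega * eta21
    a21, a22 = -omega * eta12, omega * (eta2 + eta21)
    det = a11 * a22 - a12 * a21
    e1 = p_in * a22 / det                   # Cramer's rule
    e2 = -a21 * p_in / det
    print(e1 > e2 > 0)                      # the driven subsystem carries more energy
    ```

    Band-averaged energies like these are what an SEA code converts into the predicted SPLs that were compared against the mockup microphone measurements.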

  19. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  1. Improvement of pre- and post-processing environments of the dynamic two-dimensional reservoir model CE-QUAL-W2 based on GIS.

    PubMed

    Ha, S R; Bae, G J; Park, D H; Cho, J H

    2003-01-01

    An Environmental Information System (EIS) coupled with a Geographic Information System (GIS) and water quality models was developed to improve the pre- and post-processing functions of CE-QUAL-W2. Because the accuracy of the geometric data describing a diverse water body strongly affects water quality variables such as velocity, kinetic reactions, and horizontal and vertical momentum, preparing the bathymetry information has been considered a difficult task for modellers who intend to use the model. To identify Cross Section and Profile Information (CSPI), which precisely captures the hydraulic features and geographical configuration of a waterway, an automated CSPI extraction program was developed using the Avenue language of the PC ArcView package. The program consists of three major steps: (1) obtaining a digital depth map of the waterway using GIS techniques; (2) creating a CSPI data set of segments in each branch for CE-QUAL-W2 bathymetry input; (3) selecting the optimal set of bathymetry input for which the calculated water volume matches the observed volume of the water body. These approaches make clear that the model's simulation results for water quality, as well as reservoir hydraulics, depend on the accuracy of the bathymetry information. PMID:15137156
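Step (3) of the workflow above, selecting the bathymetry input whose computed volume best matches the observed volume, can be sketched as follows (the cell representation and all numbers are invented for illustration; real CSPI sets carry full segment and layer geometry):

```python
def total_volume(cells):
    """Volume of one candidate bathymetry set, where each cell is a
    (width_m, depth_m, length_m) tuple, a simplified stand-in for the
    segment/layer geometry of a CE-QUAL-W2 CSPI set."""
    return sum(w * d, l) if False else sum(w * d * l for w, d, l in cells)

def select_bathymetry(candidates, observed_volume):
    """Return the candidate whose computed volume is closest to the
    observed volume of the water body."""
    return min(candidates, key=lambda c: abs(total_volume(c) - observed_volume))

# Two hypothetical candidate sets for a 10-segment reach:
coarse = [(100.0, 5.0, 200.0)] * 10   # 1.00e6 m^3 total
fine = [(100.0, 4.6, 200.0)] * 10     # 9.20e5 m^3 total
best = select_bathymetry([coarse, fine], observed_volume=9.3e5)  # -> fine
```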

  2. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces), which represents a considerable increase in sophistication and scope over previous soil and rock surface models. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  3. Listeria monocytogenes in Irish Farmhouse cheese processing environments.

    PubMed

    Fox, Edward; Hunt, Karen; O'Brien, Martina; Jordan, Kieran

    2011-03-01

    Sixteen cheesemaking facilities were sampled during the production season at monthly intervals over a two-year period. Thirteen facilities were found to have samples positive for Listeria monocytogenes. Samples were divided into four categories: cheese, raw milk, the processing environment, and areas external to the processing environment (samples from the farm such as silage, bedding, and pooled water). To identify the source, persistence, and putative transfer routes of L. monocytogenes contamination, the isolates were differentiated using PFGE and serotyping. Among the 250 isolates there were 52 different pulsotypes; no pulsotype was found at more than one facility. Two facilities had persistent pulsotypes that were isolated on sampling occasions at least six months apart. Of the samples tested, 6.3% of raw milk samples, 13.1% of processing environment samples, and 12.3% of samples external to the processing environment were positive for L. monocytogenes. Pulsotypes found in raw milk were also found in the processing environment; however, only one of the pulsotypes from raw milk was found in cheese, on a single occasion. One of the pulsotypes isolated from the environment external to the processing facility was found on the surface of cheese, and a number of them were found in the processing environment. The results suggest that the farm environment external to the processing environment may in some cases be the source of processing environment contamination with L. monocytogenes.

  4. Electron environment specification models for Galileo

    NASA Astrophysics Data System (ADS)

    Lazaro, Didier; Bourdarie, Sebastien; Hands, Alex; Ryden, Keith; Nieminen, Petteri

    The MEO radiation hazard is becoming an increasingly important consideration, with an ever-rising number of satellite missions spending most of their time in this environment. This region lies in the heart of the highly dynamic electron radiation belt, where very large radiation doses can be encountered unless proper shielding of critical systems and components is applied. Significant internal charging hazards also arise in the MEO regime. For electron environment specification at Galileo altitude, new models have been developed and implemented: a long-term effects model for dose evaluation, a statistical model for internal charging analysis, and a latitudinal model for ELDRS analysis. Model outputs, tools, and validation against observations (Giove-A data) and existing models (such as FLUMIC) are presented. Work performed under the contract "Energetic Electron Environment Models for MEO" (No. 21403/08/NL/JD) in consortium with ONERA, QinetiQ, SSTL and CNES.

  5. Combustion Processes in the Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Huggett, Clayton

    1969-01-01

    The aerospace environment introduces new and enhanced fire hazards because the special atmosphere employed may increase the frequency and intensity of fires, because the confinement associated with aerospace systems adversely affects the dynamics of fire development and control, and because the hostile external environments limit fire control and rescue operations. Oxygen enriched atmospheres contribute to the fire hazard in aerospace systems by extending the list of combustible fuels, increasing the probability of ignition, and increasing the rates of fire spread and energy release. A system for classifying atmospheres according to the degree of fire hazard, based on the heat capacity of the atmosphere per mole of oxygen, is suggested. A brief exploration of the dynamics of chamber fires shows that such fires will exhibit an exponential growth rate and may grow to dangerous size in a very short time. Relatively small quantities of fuel and oxygen can produce a catastrophic fire in a closed chamber.
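The suggested classification index, the heat capacity of the atmosphere per mole of oxygen, can be illustrated with a toy calculation. The molar heat capacities below are approximate room-temperature values, and the reading that a lower index means a more hazardous atmosphere follows the abstract's logic rather than a published scale:

```python
CP = {"O2": 29.4, "N2": 29.1, "He": 20.8}  # J/(mol K), approximate values

def hazard_index(composition):
    """Heat capacity of one mole of atmosphere divided by its O2 mole
    fraction: less thermal ballast per mole of oxidizer, so a lower
    index indicates a more fire-hazardous atmosphere."""
    cp_mix = sum(frac * CP[gas] for gas, frac in composition.items())
    return cp_mix / composition["O2"]

air = hazard_index({"O2": 0.21, "N2": 0.79})   # ~139 J per (mol O2) per K
pure_o2 = hazard_index({"O2": 1.0})            # ~29, far more hazardous
```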

  6. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  7. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  8. Guide to Modeling Earth's Trapped Radiation Environment

    NASA Technical Reports Server (NTRS)

    Garrett, H.

    1999-01-01

    The report will close with a detailed discussion of the current status of modeling of the radiation environment and recommend a long range plan for enhancing capabilities in this important environmental area.

  9. Time reversal processing for source location in an urban environment

    NASA Astrophysics Data System (ADS)

    Albert, Donald G.; Liu, Lanbo; Moran, Mark L.

    2005-08-01

    A simulation study is conducted to demonstrate in principle that time reversal processing can be used to locate sound sources in an outdoor urban area with many buildings. Acoustic pulse propagation in this environment is simulated using a two-dimensional finite difference time domain (FDTD) computation. Using the simulated time traces from only a few sensors and back propagating them with the FDTD model, the sound energy refocuses in the vicinity of the true source location. This time reversal numerical experiment confirms that using information acquired only at non-line-of-sight locations is sufficient to obtain accurate source locations in a complex urban terrain.
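The refocusing mechanism can be reproduced in a one-dimensional toy model: a pulse reaching one non-line-of-sight sensor over several echo paths, then time-reversed and re-propagated through the same channel, sums coherently at a single instant. The delays and amplitudes below are invented, and convolution with a fixed Green's function stands in for the FDTD propagation:

```python
import numpy as np

# Hypothetical multipath channel: each echo path contributes a delayed,
# attenuated copy of whatever is propagated through it.
delays = [40, 95, 130]    # samples
amps = [1.0, 0.6, 0.4]
green = np.zeros(512)
for d, a in zip(delays, amps):
    green[d] = a

# Short Gaussian source pulse, centered at sample 32.
t = np.arange(64)
pulse = np.exp(-0.5 * ((t - 32) / 5.0) ** 2)

# Forward propagation: the sensor records three well-separated arrivals.
received = np.convolve(pulse, green)

# Time-reversal processing: flip the recording and send it back through
# the same channel (same Green's function).
refocused = np.convolve(received[::-1], green)

# Every path pair with equal outbound and return delay lands at the same
# lag, so the arrivals add coherently: peak ~ sum(a_i^2) = 1.52 at lag 542,
# well above the strongest single arrival in the raw recording.
```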

  10. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  11. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
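The three components can be combined into a back-of-envelope flux and equilibrium-temperature estimate for a sunlit flat plate. The solar constant, albedo, and OLR values are nominal; the view factor and surface properties are invented for illustration, and real design work would use the STEM design points:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def absorbed_flux(solar=1367.0, albedo=0.30, olr=237.0,
                  view_factor=0.3, absorptivity=0.9, emissivity=0.85):
    """Sum the three thermal-environment components for a sunlit plate:
    direct solar, Earth-reflected shortwave, and Earth-emitted OLR."""
    q_solar = absorptivity * solar
    q_albedo = absorptivity * albedo * solar * view_factor
    q_olr = emissivity * olr * view_factor
    return q_solar + q_albedo + q_olr

def equilibrium_temperature(q, emissivity=0.85):
    """Radiative balance q = emissivity * sigma * T^4, solved for T."""
    return (q / (emissivity * SIGMA)) ** 0.25

q = absorbed_flux()              # ~1401 W/m^2 for this toy configuration
T = equilibrium_temperature(q)   # ~413 K
```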

  12. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning, or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or, in the case of telerobots, as interfaces between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with changes in the environment and to allow exploring entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine, update, or generate a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results shows great promise for dealing with noisy input data. Performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
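A minimal two-dimensional sketch of the Void/Full/Unknown update for a single axis-aligned range ray follows; the algorithm described above handles arbitrary rays, 3-D voxels, and noisy, uncertain range data:

```python
import numpy as np

UNKNOWN, VOID, FULL = 0, 1, 2

def integrate_ray(grid, sensor, direction, measured_range, cell_size=1.0):
    """Cells the ray traverses before the hit become Void, the cell at
    the measured range becomes Full, and everything beyond the hit
    stays as it was (Unknown if never observed)."""
    steps = int(measured_range / cell_size)
    x, y = sensor
    dx, dy = direction
    for i in range(steps):
        grid[x + dx * i, y + dy * i] = VOID
    grid[x + dx * steps, y + dy * steps] = FULL
    return grid

grid = np.full((8, 8), UNKNOWN)
integrate_ray(grid, sensor=(0, 3), direction=(1, 0), measured_range=5)
```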

  13. Mountains and man. A study of process and environment

    SciTech Connect

    Price, L.W.

    1986-01-01

    This book explores the processes and features of mountain environments: glaciers, snow and avalanches, landforms, weather and climate, vegetation, soils, and wildlife. The effects of latitudinal position on these processes and features are analyzed.

  14. Sanitation in the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the past, most of the regulations regarding egg processing were concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...

  15. Sanitation in the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazard analysis and critical control programs (HACCP) will eventually be required for commercial shell egg processing plants. Sanitation is an essential prerequisite program for HACCP and is based upon current Good Manufacturing Practices (cGMPs) as listed in the Code of Federal Regulations. Good ...

  16. MATLAB/Simulink analytic radar modeling environment

    NASA Astrophysics Data System (ADS)

    Esken, Bruce L.; Clayton, Brian L.

    2001-09-01

    Analytic radar models are simulations based on abstract representations of the radar, the RF environment through which radar signals propagate, and the reflections produced by targets, clutter, and multipath. These models have traditionally been developed in FORTRAN and have evolved over the last 20 years into efficient and well-accepted codes. However, current models are limited in two primary areas. First, by the nature of algorithm-based analytical models, they can be difficult for non-programmers to understand and equally difficult to modify or extend. Second, there is strong interest in re-using these models to support higher-level weapon system and mission-level simulations. To address these issues, a model development approach has been demonstrated which utilizes the MATLAB/Simulink graphical development environment. Because the MATLAB/Simulink environment graphically represents model algorithms - thus providing visibility into the model - algorithms can be easily analyzed and modified by engineers and analysts with limited software skills. In addition, software tools have been created that provide for the automatic code generation of C++ objects. These objects are created with well-defined interfaces enabling them to be used by modeling architectures external to the MATLAB/Simulink environment. The approach utilized is generic and can be extended to other engineering fields.

  17. Rock fracture processes in chemically reactive environments

    NASA Astrophysics Data System (ADS)

    Eichhubl, P.

    2015-12-01

    Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in a wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems, with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing the mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interaction associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip, allowing fracture propagation under subcritical stress loading conditions. We are evaluating the effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or the subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the host rock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone, we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed
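The subcritical regime probed by such double torsion tests is commonly summarized with a Charles-type power law, v = v_c (K_I/K_Ic)^n, where n is the subcritical index; the sketch below uses placeholder parameter values, not measured ones:

```python
def subcritical_velocity(K_I, K_Ic=1.0, v_c=1e-3, n=50.0):
    """Crack velocity under subcritical loading (K_I < K_Ic). A lower
    subcritical index n, as reactive aqueous fluids can produce, means
    appreciable growth at stress intensities well below K_Ic."""
    return v_c * (K_I / K_Ic) ** n

# At 80% of K_Ic, dropping n from 50 to 30 speeds growth nearly 90-fold.
v_inert = subcritical_velocity(0.8, n=50.0)
v_reactive = subcritical_velocity(0.8, n=30.0)
```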

  18. Float-zone processing in a weightless environment

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.

    1976-01-01

    Results are reported of investigations to: (1) test the validity of analyses which set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near-weightless environment; (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow visualization techniques applied to a dimensionally scaled model of the Si melt; (3) revise the estimates of the economic impact of space-produced Si crystal by the float-zone method on the U.S. electronics industry; and (4) devise a rational plan for future work related to crystal growth phenomena wherein low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.

  19. Processing Motion Signals in Complex Environments

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti

    2000-01-01

    Motion information is critical for human locomotion and scene segmentation. Currently we have excellent neurophysiological models that are able to predict human detection and discrimination of local signals. Local motion signals are insufficient by themselves to guide human locomotion and to provide information about depth, object boundaries, and surface structure. My research is aimed at understanding the mechanisms underlying the combination of motion signals across space and time. A target moving on an extended trajectory amidst noise dots in Brownian motion is much more detectable than the sum of signals generated by independent motion energy units responding to the trajectory segments. This result suggests that facilitation occurs between motion units tuned to similar directions, lying along the trajectory path. We investigated whether the interaction between local motion units along the motion direction is mediated by contrast. One possibility is that contrast-driven signals from motion units early in the trajectory sequence are added to signals in subsequent units. If this were the case, then units later in the sequence would have a larger signal than those earlier in the sequence. To test this possibility, we compared contrast discrimination thresholds for the first and third patches of a triplet of sequentially presented Gabor patches, aligned along the motion direction. According to this simple additive model, contrast increment thresholds for the third patch should be higher than thresholds for the first patch. The lack of a measurable effect on contrast thresholds for these various manipulations suggests that the pooling of signals along a trajectory is not mediated by contrast-driven signals. Instead, these results are consistent with models that propose that the facilitation of trajectory signals is achieved by a second-level network that chooses the strongest local motion signals and combines them if they occur in a spatio-temporal sequence consistent

  20. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine-sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in machine-sensible form. The purpose of this report is to complete the documentation, finally, for AE-8 so that users can understand its construction and see the comparison of the model with the new data used, as well as with the AE-4 model.

  1. Understanding the Impact of Virtual World Environments on Social and Cognitive Processes in Learning

    ERIC Educational Resources Information Center

    Zhang, Chi

    2009-01-01

    Researchers in information systems and technology-mediated learning have begun to examine how virtual world environments can be used in learning and how they enable learning processes and enhance learning outcomes. This research examined learning processes in a virtual world learning environment (VWLE). A research model of VWLE effects on learning…

  2. Broadband acoustic source processing in a noisy shallow ocean environment

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-07-18

    Acoustic sources found in the ocean environment are spatially complex and broadband, complicating the analysis of received acoustic data considerably. A model-based approach is developed for a broadband source in a shallow ocean environment characterized by a normal-mode propagation model. Here we develop the optimal Bayesian solution to the broadband pressure-field enhancement and modal function extraction problem.
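For intuition, the modal sum for an ideal isovelocity waveguide (pressure-release surface, rigid bottom; a textbook idealization rather than the propagation model developed in the paper) can be evaluated directly:

```python
import numpy as np

def modal_pressure(r, z, z_s, freq=100.0, c=1500.0, depth=100.0):
    """Pressure at range r and depth z from a source at depth z_s,
    summed over the propagating modes sin(kz * z) of an ideal
    waveguide, with kz = (m - 1/2) * pi / depth."""
    k = 2.0 * np.pi * freq / c
    p, m = 0j, 1
    while (m - 0.5) * np.pi / depth < k:      # propagating modes only
        kz = (m - 0.5) * np.pi / depth
        kr = np.sqrt(k ** 2 - kz ** 2)
        p += (np.sin(kz * z_s) * np.sin(kz * z)
              * np.exp(1j * kr * r) / np.sqrt(kr * r))
        m += 1
    return p, m - 1   # complex field and number of propagating modes

p, n_modes = modal_pressure(r=5000.0, z=50.0, z_s=30.0)  # 13 modes at 100 Hz
```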

  3. Space Station Freedom natural environment design models

    NASA Technical Reports Server (NTRS)

    Suggs, Robert M.

    1993-01-01

    The Space Station Freedom program has established a series of natural environment models and databases for utilization in design and operations planning activities. The suite of models and databases that have either been selected from among internationally recognized standards or developed specifically for spacecraft design applications are presented. The models have been integrated with an orbit propagator and employed to compute environmental conditions for planned operations altitudes of Space Station Freedom.

  4. The Educational Process in the Emerging Information Society: Conditions for the Reversal of the Linear Model of Education and the Development of an Open Type Hybrid Learning Environment.

    ERIC Educational Resources Information Center

    Anastasiades, Panagiotes S.; Retalis, Simos

    The introduction of communications and information technologies in the area of education tends to create a totally different environment, which is marked by a change of the teacher's role and a transformation of the basic components that make up the meaning and content of the learning procedure as a whole. It could be said that, despite any…

  5. Liberty High School Transition Project: Model Process for Assimilating School, Community, Business, Government and Service Groups of the Least Restrictive Environment for Nondisabled and Disabled.

    ERIC Educational Resources Information Center

    Grimes, Michael K.

    The panel presentation traces the development of and describes the operation of a Brentwood (California) project to prepare approximately 75 severely disabled individuals, ages 12-22, to function in the least restrictive recreation/leisure, vocational, and general community environments. Transition Steering Committee developed such project…

  6. r-process nucleosynthesis in dynamic helium-burning environments

    NASA Technical Reports Server (NTRS)

    Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.

    1985-01-01

    The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be 10^20-10^21 neutrons per cubic centimeter for times of 0.01-0.1 s and neutron number densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high - larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.

  7. The national operational environment model (NOEM)

    NASA Astrophysics Data System (ADS)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

    The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying possible pressure (leverage) points in support of the Commander that resolve forecasted instabilities, and by ranking sensitivities in a list for each leverage point and response. The NOEM can be used to assist Decision Makers, Analysts and Researchers with understanding the inner workings of a region or nation state, the consequences of implementing specific policies, and the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of modules (e.g. economic, security and social well-being pieces such as critical infrastructure) completed, along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment, primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in the pursuit of eventual integration and balance of populace needs/demands within their respective operational environment and the capacity to meet those demands. In this paper we will provide an overview of the NOEM, the need for and a description of its main components

  8. The dynamic radiation environment assimilation model (DREAM)

    SciTech Connect

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L; Chen, Yue; Henderson, Michael G; Friedel, Reiner H

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
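
    The filtering idea in DREAM can be illustrated with a minimal scalar sketch (an illustrative assumption, not the DREAM code itself): a random-walk forecast of a log-flux anomaly is blended with noisy satellite observations, with all noise parameters invented for the example.

```python
import numpy as np

# Minimal scalar Kalman filter sketch (illustrative only, not the DREAM code).
# State x: log electron-flux anomaly; P: its error variance.
# Q, R: assumed process and observation noise variances.

def kalman_step(x, P, z, Q=0.01, R=0.04):
    """One predict/update cycle for a scalar state."""
    # Predict: random-walk forecast model
    x_pred, P_pred = x, P + Q
    # Update: blend the forecast with observation z via the Kalman gain
    K = P_pred / (P_pred + R)
    return x_pred + K * (z - x_pred), (1.0 - K) * P_pred

rng = np.random.default_rng(0)
truth = 1.0                            # constant "true" anomaly
x, P = 0.0, 1.0                        # vague initial guess
for _ in range(50):
    z = truth + rng.normal(scale=0.2)  # noisy observation
    x, P = kalman_step(x, P, z)
# x now tracks the truth and P has shrunk far below its initial value
```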

  9. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general-purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct-execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  10. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factor studies and design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE™ virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  11. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  12. Fuzzy control of the production environment process parameters

    NASA Astrophysics Data System (ADS)

    Izvekov, V. N.

    2015-04-01

    Fuzzy control for maintaining specified microclimatic process parameters of a production environment, when one of the values regulating the process regime is lost, is demonstrated. Structural schematic solutions, with an algorithm of functioning oriented toward existing apparatus (means of realization), are presented.
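
    A minimal sketch of the approach, under invented membership functions and rules (not the author's scheme), fuzzifies a temperature error and defuzzifies a heater command by the centroid method:

```python
# Fuzzy-control sketch for one microclimate parameter. The triangular
# membership functions, rule table, and output levels are all assumptions
# made for illustration.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heater_command(error):
    """Map temperature error (setpoint - measured, deg C) to heater power 0..1."""
    mu_cold = tri(error, 0.0, 2.0, 4.0)    # too cold      -> full power (1.0)
    mu_ok   = tri(error, -2.0, 0.0, 2.0)   # near setpoint -> hold (0.5)
    mu_hot  = tri(error, -4.0, -2.0, 0.0)  # too hot       -> off (0.0)
    num = mu_cold * 1.0 + mu_ok * 0.5 + mu_hot * 0.0
    den = mu_cold + mu_ok + mu_hot
    return num / den if den > 0 else 0.5   # hold when outside all sets

# heater_command(0.0) -> 0.5 (hold); heater_command(2.0) -> 1.0 (full power)
```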

  13. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through quantitative description of the complicated pollution processes of a whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on pollutant loading models (GWLF and PLOAD), water quality models for receiving water bodies (QUAL2E and WASP), and integrated watershed models combining pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analyses of single-model and integrated-model applications, the development trend and application prospects of watershed water environment pollution models were discussed. PMID:24483100
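
    The simplest of the loading models reduce to an export-coefficient sum over land uses; a sketch of that screening calculation, with illustrative (uncalibrated) coefficients, is:

```python
# Export-coefficient estimate of annual pollutant loading, the kind of
# screening calculation that models such as PLOAD perform. Coefficients
# below are illustrative assumptions, not calibrated values.

EXPORT_KG_PER_HA = {"cropland": 8.0, "urban": 5.5, "forest": 1.8}  # kg/ha/yr

def annual_load(areas_ha):
    """Total annual load (kg/yr) = sum over land uses of coefficient * area."""
    return sum(EXPORT_KG_PER_HA[lu] * a for lu, a in areas_ha.items())

watershed = {"cropland": 1200.0, "urban": 300.0, "forest": 2500.0}  # ha
load = annual_load(watershed)   # 8*1200 + 5.5*300 + 1.8*2500 = 15750 kg/yr
```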

  15. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    NASA Astrophysics Data System (ADS)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. The model is based on the continuous-improvement Plan-Do-Check-Act cycle and intends to integrate environmental, risk prevention, and ethical aspects, as well as research, development, and innovation project management, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfil the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21, and 166002.

  16. Metal Catalyzed Fusion: Nuclear Active Environment vs. Process

    NASA Astrophysics Data System (ADS)

    Chubb, Talbot

    2009-03-01

    To achieve radiationless dd fusion and/or other LENR reactions via chemistry: some focus on the environment of the interior or altered near-surface volume of bulk metal; some on the environment inside metal nanocrystals or on their surfaces; some on the interface between nanometal crystals and ionic crystals; some on a momentum shock-stimulated reaction process. Experiment says there is also a spontaneous reaction process.

  17. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  18. THE IMPORTANCE OF CONCURRENT MONITORING AND MODELING FOR UNDERSTANDING MERCURY EXPOSURE IN THE ENVIRONMENT

    EPA Science Inventory

    Understanding the cycling processes governing mercury exposure in the environment requires sufficient process-based modeling and monitoring data. Monitoring provides ambient concentration data for specific sample times and locations. Modeling provides a tool for investigating the...

  19. FAME--a flexible appearance modeling environment.

    PubMed

    Stegmann, Mikkel B; Ersbøll, Bjarne K; Larsen, Rasmus

    2003-10-01

    Combined modeling of pixel intensities and shape has proven to be a very robust and widely applicable approach to interpret images. As such, the active appearance model (AAM) framework has been applied to a wide variety of problems within medical image analysis. This paper summarizes AAM applications within medicine and describes a public domain implementation, namely the flexible appearance modeling environment (FAME). We give guidelines for the use of this research platform, and show that the optimization techniques used render it applicable to interactive medical applications. To increase performance and make models generalize better, we apply parallel analysis to obtain automatic and objective model truncation. Further, two different AAM training methods are compared along with a reference case study carried out on cross-sectional short-axis cardiac magnetic resonance images and face images. Source code and annotated data sets needed to reproduce the results are put in the public domain for further investigation.

  20. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.
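
    One common model-based detection statistic is a windowed sum of squared innovations: when the model matches the environment, the residuals are zero-mean white noise, and a jump in the statistic flags a change. A sketch under that assumption (not the paper's normal-mode processor; window and threshold are invented):

```python
import numpy as np

# Innovations-based change detection sketch. When the ocean-acoustic model
# matches the data, residuals are ~N(0,1); a bias appears when the
# environment changes. Window length and threshold are illustrative.

def wssr(innovations, window):
    """Windowed sum of squared residuals over the last `window` samples."""
    w = np.asarray(innovations)[-window:]
    return float(np.sum(w * w))

rng = np.random.default_rng(1)
normal  = rng.normal(0.0, 1.0, 200)   # model matches environment
changed = rng.normal(2.0, 1.0, 200)   # environment has changed (bias of 2)
window = 50
threshold = 2.0 * window              # ~2x the expected WSSR under "normal"
stat_normal  = wssr(normal, window)   # near 50: below threshold
stat_changed = wssr(changed, window)  # near 250: change detected
```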

  1. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in work practices studied by the design team played a significant role in how work actually got done, i.e., actual lived work. Multi-tasking, informal assistance, and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm.
In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  2. A model environment for outer zone electrons

    NASA Technical Reports Server (NTRS)

    Singley, G. W.; Vette, J. I.

    1972-01-01

    A brief morphology of outer zone electrons is given to illustrate the nature of the phenomena that we are attempting to model. This is followed by a discussion of the data processing that was done with the various data received from the experimenters before incorporating it into the data base from which this model was ultimately derived. The details of the derivation are given, and several comparisons of the final model with the various experimental measurements are presented.

  3. Variable effort fishing models in random environments.

    PubMed

    Braumann, C A

    1999-03-01

    We study the growth of populations in a random environment subjected to variable effort fishing policies. The models used are stochastic differential equations and the environmental fluctuations may either affect an intrinsic growth parameter or be of the additive noise type. Density-dependent natural growth and fishing policies are of very general form so that our results will be model independent. We obtain conditions on the fishing policies for non-extinction and for non-fixation at the carrying capacity that are very similar to the conditions obtained for the corresponding deterministic model. We also obtain conditions for the existence of stationary distributions (as well as expressions for such distributions) very similar to conditions for the existence of an equilibrium in the corresponding deterministic model. The results obtained provide minimal requirements for the choice of a wise density-dependent fishing policy.
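
    A concrete special case of such models is logistic growth with proportional-harvest fishing effort and multiplicative environmental noise; an Euler-Maruyama simulation sketch follows (all parameter values are assumptions for illustration, not from the paper):

```python
import numpy as np

# Euler-Maruyama simulation of dN = [r N (1 - N/K) - q E N] dt + sigma N dW,
# an illustrative special case of density-dependent growth with variable-
# effort fishing in a random environment. All parameters are assumed.

def simulate(r=1.0, K=100.0, q=0.5, E=0.6, sigma=0.1, N0=50.0,
             dt=0.01, steps=5000, seed=2):
    rng = np.random.default_rng(seed)
    N = N0
    sqdt = np.sqrt(dt)
    for _ in range(steps):
        drift = r * N * (1.0 - N / K) - q * E * N   # growth minus harvest
        N = max(N + drift * dt + sigma * N * rng.normal() * sqdt, 0.0)
    return N

# The deterministic equilibrium is N* = K (1 - qE/r) = 70; the simulated
# population fluctuates around it rather than settling exactly.
N_final = simulate()
```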

  4. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  5. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; Neergaard Parker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free-field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose-rate-dependent conductivity processes required to evaluate radiation dose and internal (bulk) charging processes in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft, in a solar near-polar orbit, provide particle data over a range of heliospheric latitudes used to derive radiation and charging environments for both the high-inclination 0.5 AU Solar Polar Imager mission and 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  6. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-01

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand. PMID:26927661

  8. Chemical Process Modeling and Control.

    ERIC Educational Resources Information Center

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  9. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. The effort expended to solve these equations using analytical or numerical solutions consumes time and distracts attention from the objectives of the modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
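
    The point-kinetics example can equally be integrated in a few lines outside Simulink; here is a one-delayed-group sketch with illustrative reactor parameters (not taken from the paper):

```python
# Point kinetics with one delayed-neutron group, explicit Euler integration.
# beta: delayed fraction, Lam: neutron generation time (s), lam: precursor
# decay constant (1/s). All values are illustrative assumptions.

beta, Lam, lam = 0.0065, 1e-4, 0.08

def step(n, C, rho, dt):
    """One Euler step of dn/dt = ((rho - beta)/Lam) n + lam C,
    dC/dt = (beta/Lam) n - lam C."""
    dn = ((rho - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    return n + dn * dt, C + dC * dt

n, C = 1.0, beta / (Lam * lam)   # critical steady state (dn/dt = dC/dt = 0)
rho = 0.1 * beta                 # small positive reactivity insertion
for _ in range(20000):           # 0.2 s of simulated time
    n, C = step(n, C, rho, dt=1e-5)
# n has undergone the "prompt jump" to roughly beta/(beta - rho) ~ 1.11
```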

  11. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare. PMID:22925789

  12. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  14. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed. PMID:21166326

  16. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  17. Deterministic polishing process for aspheric lenses in a production environment

    NASA Astrophysics Data System (ADS)

    Stach, G.; Schwalb, F.

    2013-09-01

    Satisloh offers grinding, polishing, and finishing with ultra-precise form correction from one supplier, providing machines, peripheral equipment, training, service, consumables, tools, and process support. All the equipment is made for an industrial environment. Together with exclusive, experienced partners, aspheres can be manufactured more efficiently.

  18. Models of the Reading Process

    PubMed Central

    Rayner, Keith; Reichle, Erik D.

    2010-01-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a “model of reading” when talking about only one aspect of the reading process (for example, models of word identification are often referred to as “models of reading”). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers’ eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized. PMID:21170142

  20. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  1. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of combustion products. Results of mathematical modelling and optimization of stationary CWS combustion regimes are provided. The modelling solves the problem of determining the possible equilibrium composition of products that can be obtained from CWS combustion at different temperatures.

  2. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  3. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research best suited to petroleum and environmental biotechnologies.
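    The Monod-type models mentioned in this abstract couple a substrate-dependent specific growth rate to biomass and substrate dynamics. A minimal sketch of that idea is shown below; the parameter values, the yield-coefficient formulation, and the forward-Euler time stepping are illustrative assumptions, not the formulation of the cited article.

```python
# Minimal Monod-kinetics sketch (illustrative parameters, not from the article).

def monod_rate(s, mu_max, k_s):
    """Specific growth rate mu(S) = mu_max * S / (K_s + S)."""
    return mu_max * s / (k_s + s)

def simulate(s0, x0, mu_max=0.5, k_s=2.0, yield_coeff=0.4, dt=0.01, steps=1000):
    """Forward-Euler integration of the coupled ODEs:
       dX/dt = mu(S) * X,   dS/dt = -mu(S) * X / Y."""
    s, x = s0, x0
    for _ in range(steps):
        mu = monod_rate(s, mu_max, k_s)
        s = max(s - mu * x / yield_coeff * dt, 0.0)  # substrate consumed
        x = x + mu * x * dt                          # biomass grows
    return s, x

s_final, x_final = simulate(s0=10.0, x0=0.1)
```

Constraint-based approaches replace the fixed rate law above with an optimization over feasible metabolic fluxes, which is why the two model families behave differently near substrate limitation.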

  4. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many respects due to the unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and for automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is quite high. Most such standards relate to the production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology for applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate methods (or a group of methods) for computer modeling of various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, TS includes the description of possible

  5. Combustion modeling for experimentation in a space environment

    NASA Technical Reports Server (NTRS)

    Berlad, A. L.

    1974-01-01

    The merits of combustion experimentation in a space environment are assessed, and the impact of such experimentation on current theoretical models is considered. It is noted that combustion theory and experimentation for less than normal gravitational conditions are incomplete, inadequate, or nonexistent. Extensive and systematic experimentation in a space environment is viewed as essential for more adequate and complete theoretical models of such processes as premixed flame propagation and extinction limits, premixed flame propagation in droplet and particle clouds, ignition and autoignition in premixed combustible media, and gas jet combustion of unpremixed reactants. Current theories and models in these areas are described, and some combustion studies that can be undertaken in the Space Shuttle Program are proposed, including crossed molecular beam, turbulence, and upper pressure limit (of gases) studies.

  6. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  7. Integrated numeric and symbolic signal processing using a heterogeneous design environment

    NASA Astrophysics Data System (ADS)

    Mani, Ramamurthy; Nawab, S. Hamid; Winograd, Joseph M.; Evans, Brian L.

    1996-10-01

    We present a solution to a complex multi-tone transient detection problem to illustrate the integrated use of symbolic and numeric processing techniques which are supported by well-established underlying models. Examples of such models include synchronous dataflow for numeric processing and the blackboard paradigm for symbolic heuristic search. Our transient detection solution serves to emphasize the importance of developing system design methods and tools which can support the integrated use of well-established symbolic and numerical models of computation. Recently, we incorporated a blackboard-based model of computation underlying the Integrated Processing and Understanding of Signals (IPUS) paradigm into a system-level design environment for numeric processing called Ptolemy. Using the IPUS/Ptolemy environment, we are implementing our solution to the multi-tone transient detection problem.

  8. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    SciTech Connect

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the "life cycle" of a regulatory model, with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than are used for non-regulatory models.

  9. A network-oriented business modeling environment

    NASA Astrophysics Data System (ADS)

    Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia

    The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.

  10. Open environment for image processing and software development

    NASA Astrophysics Data System (ADS)

    Rasure, John R.; Young, Mark

    1992-04-01

    The main goal of the Khoros software project is to create and provide an integrated software development environment for information processing and data visualization. The Khoros software system is now being used as a foundation to improve productivity and promote software reuse in a wide variety of application domains. A powerful feature of the Khoros system is the high-level, abstract visual language that can be employed to significantly boost the productivity of the researcher. Central to the Khoros system is the need for a consistent yet flexible user interface development system that provides cohesiveness to the vast number of programs that make up the Khoros system. Automated tools assist in maintenance as well as development of programs. The software structure that embodies this system provides for extensibility and portability, and allows for easy tailoring to target specific application domains and processing environments. First, an overview of the Khoros software environment is given. The paper then presents the abstract applications programmer interface (API), the data services provided in Khoros to support it, and the Khoros visualization and image file format. The authors contend that Khoros is an excellent environment for the exploration and implementation of imaging standards.

  11. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on interpersonal communication processes and describes them with examples. After a brief description of the basic theoretical concepts of the digital factory, the communicative features of the digital factory are illustrated. Practical interrelations of interpersonal communication were analyzed from a human-oriented perspective in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed as part of the process analysis. This method makes it possible to visualize interpersonal communication and its human-oriented attributes within a technically focused workflow. Based on the results of a survey on communication analysis and on the process models of the modeling method, it was possible to structure the processes in a way suitable for humans and to achieve a positive effect on the communication processes.

  12. Critical processes affecting Cryptosporidium oocyst survival in the environment.

    PubMed

    King, B J; Monis, P T

    2007-03-01

    Cryptosporidium are parasitic protozoans that cause gastrointestinal disease and represent a significant risk to public health. Cryptosporidium oocysts are prevalent in surface waters as a result of human, livestock and native animal faecal contamination. The resistance of oocysts to the concentrations of chlorine and monochloramine used to disinfect potable water increases the risk of waterborne transmission via drinking water. In addition to being resistant to commonly used disinfectants, it is thought that oocysts can persist in the environment and be readily mobilized by precipitation events. This paper will review the critical processes involved in the inactivation or removal of oocysts in the terrestrial and aquatic environments and consider how these processes will respond in the context of climate change. PMID:17096874

  13. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  14. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level, together with methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure, which is itself an output of the model.
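    The seven processes enumerated above amount to bookkeeping on carbohydrate pools: photosynthesis adds carbon to the leaf pool, translocation moves it among organs, and respiration and growth drain it. The toy time step below sketches that structure only; all coefficients, the 50/50 partitioning, and the function names are assumptions for illustration, not the published model.

```python
# Toy carbohydrate-pool time step (all coefficients are illustrative assumptions).

def step(pools, par, temp_factor, dt=1.0):
    """One step: photosynthesis feeds the leaf pool, carbon is translocated
    to stem/root pools, and each pool pays respiration and converts a
    fraction to structural dry matter (the model's output)."""
    photo_eff = 0.05    # assumed radiation-use efficiency
    transfer = 0.1      # assumed translocation rate out of leaves
    respiration = 0.02  # assumed maintenance respiration fraction
    growth = 0.05       # assumed growth-conversion fraction

    leaf, stem, root = pools["leaf"], pools["stem"], pools["root"]
    leaf += photo_eff * par * temp_factor * dt   # (1)-(3) photosynthesis
    flux = transfer * leaf * dt                  # (4) translocation
    leaf -= flux
    stem += 0.5 * flux
    root += 0.5 * flux

    new, dry_matter = {}, 0.0
    for name, c in (("leaf", leaf), ("stem", stem), ("root", root)):
        c -= respiration * c * dt                # (5) respiration
        g = growth * c * dt                      # (6) growth conversion
        dry_matter += g
        new[name] = c - g
    return new, dry_matter

pools = {"leaf": 1.0, "stem": 0.5, "root": 0.5}
pools, dm = step(pools, par=100.0, temp_factor=1.0)
```

Driving the step with time series of radiation and temperature, and accumulating `dm` per organ, mirrors the model's stated output of dry matter partitioned among leaves, stems, and roots.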

  15. NG6: Integrated next generation sequencing storage and processing environment

    PubMed Central

    2012-01-01

    Background Next generation sequencing platforms are now well implanted in sequencing centres and some laboratories. Upcoming smaller scale machines such as the 454 junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In such a context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. Results We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies,…) and, on the other hand, a secured web site giving access to the results. The connected user will be able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended and new ones can be added. Ergatis is used as a workflow building, running and monitoring system. The analyses can be run locally or in a cluster environment using Sun Grid Engine. Conclusions NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data. PMID:22958229

  16. Model-based description of environment interaction for mobile robots

    NASA Astrophysics Data System (ADS)

    Borghi, Giuseppe; Ferrari, Carlo; Pagello, Enrico; Vianello, Marco

    1999-01-01

    We consider a mobile robot that attempts to accomplish a task by reaching a given goal, and interacts with its environment through a finite set of actions and observations. The interaction between robot and environment is modeled by Partially Observable Markov Decision Processes (POMDPs). The robot takes its decisions in the presence of uncertainty about the current state by maximizing the reward gained during interactions with the environment. It is able to localize itself in the environment by collecting action and perception histories during navigation. To make the state estimation more reliable, we introduce additional information into the model without adding new states and without discretizing the considered measures. Thus, we also associate with the state transition probabilities a continuous metric given through the mean and the variance of some significant sensor measurements suitable for being kept in continuous form, such as odometric measurements, showing that even such unreliable data can supply a great deal of information to the robot. The overall control system of the robot is structured as a two-level layered architecture, where the low level implements several collision avoidance algorithms, while the upper level takes care of the navigation problem. In this paper, we concentrate on how to use POMDP models at the upper level.

  17. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by the numerous usable process steps, materials and effects available to fabricate the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g. simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral 2-dim shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data. A broader interface between process configuration on the one side and application design on the other side is needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment that is able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE, which meets the requirements of this new approach, is introduced. Based on the concurrent design methodology presented at the beginning of this paper, a system is presented that supports application-specific process design. The paper highlights the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example demonstrates the usage of PRINCE.

  18. Development of the Delta Shell as an integrated modeling environment

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Jagers, Bert

    2010-05-01

    Many engineering problems require the use of multiple numerical models from multiple disciplines, for example a river flow model coupled with a groundwater model and a rainfall-runoff model. These models need to be set up, coupled and run; results need to be visualized; and input and output data need to be stored. For some of these steps software or standards already exist, but there is a need for an environment that supports all of them. The goal of the present work is to create a modeling environment where models from different domains can perform all of these steps: setup, couple, run, visualize, store. This presentation deals with the different problems which arise when setting up such a modelling framework, including terminology and numerical aspects as well as software development issues. To solve these issues we use Domain Driven Design methods, available open standards and open source components. While creating an integrated modeling environment we have identified that a separation of the following domains is essential: a framework for linking and exchanging data between models; a framework for integrating the different components of the environment; a graphical user interface; GIS; a hybrid relational and multi-dimensional data store; discipline-specific libraries (river hydrology, morphology, water quality, statistics); and model-specific components. The Delta Shell environment is the basis for several products such as HABITAT, SOBEK and the future Delft3D interface. It implements and integrates components covering the above-mentioned domains by making use of open standards and open source components, and different components have been developed to fill in gaps. For exchanging data with the GUI, an object-oriented scientific framework in .NET, somewhat similar to JSR-275, was developed within Delta Shell. For the GIS domain several OGC standards were used, such as SFS, WCS and WFS. For storage the CF standard together with

  19. Hybrid Models for Trajectory Error Modelling in Urban Environments

    NASA Astrophysics Data System (ADS)

    Angelatsa, E.; Parés, M. E.; Colomina, I.

    2016-06-01

    This paper tackles the first step of any strategy aiming to improve the trajectory of terrestrial mobile mapping systems in urban environments. We present an approach to model the error of terrestrial mobile mapping trajectories, combining deterministic and stochastic models. Due to the specifics of the urban environment, the deterministic component is modelled with non-continuous functions composed of linear shifts, drifts or polynomial functions. In addition, we introduce a stochastic error component to model the residual noise of the trajectory error function. The first step of error modelling requires knowing the actual trajectory error values for several representative environments. In order to determine the trajectory errors as accurately as possible, (almost) error-free trajectories should be estimated using non-semantic features extracted from a sequence of images collected with the terrestrial mobile mapping system and from a full set of ground control points. Once the references are estimated, they are used to determine the actual errors in the terrestrial mobile mapping trajectory. A rigorous analysis of these data sets will allow us to characterize the errors of a terrestrial mobile mapping system for a wide range of environments. This information will be of great use in future campaigns to improve the results of 3D point cloud generation. The proposed approach has been evaluated using real data from a mobile mapping campaign over an urban, controlled area of Dortmund (Germany) with harmful GNSS conditions. The mobile mapping system, which includes two laser scanners and two cameras, was mounted on a van and driven over the controlled area for around three hours. The results show the suitability of decomposing the trajectory error into non-continuous deterministic and stochastic components.
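    The hybrid decomposition described above can be sketched as a piecewise deterministic function (a shift plus a drift per segment, discontinuous at segment boundaries, as happens when entering an urban canyon) with a stochastic residual added on top. The segment boundaries, parameter values, and noise level below are illustrative assumptions, not values from the paper.

```python
# Hybrid trajectory-error sketch: non-continuous deterministic part + noise.
import numpy as np

rng = np.random.default_rng(0)

def deterministic_error(t, segments):
    """Piecewise error: each segment is (t_start, shift, drift), and the
    last segment whose t_start <= t applies (discontinuous at boundaries)."""
    for t_start, shift, drift in reversed(segments):
        if t >= t_start:
            return shift + drift * (t - t_start)
    return 0.0

segments = [(0.0, 0.00, 0.001),    # assumed open-sky stretch: small drift
            (60.0, 0.30, 0.005)]   # assumed urban canyon: jump + faster drift

t = np.linspace(0.0, 120.0, 241)
det = np.array([deterministic_error(ti, segments) for ti in t])
noise = rng.normal(0.0, 0.02, size=t.size)     # stochastic residual component
error = det + noise
```

Fitting such a model would mean estimating the shift/drift (or polynomial) coefficients per segment against the reference trajectory, then characterizing the residual `noise` statistically.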

  20. Water related environment modelling on Mars.

    PubMed

    Kereszturi, Akos

    2004-01-01

    During human Mars exploration, because of the lack of time, astronauts need fast methods for the interpretation of unexpected observations, methods which give them flexibility and new, important targets. With in-situ modelling it is possible to get information on various past and present processes at the same location on a far wider spectrum than would otherwise be realized even during a long mission. This work summarizes the potential technical requirements and benefits of such modelling. Based on a simple estimation, with a 300 kg package and 1-10% of the working time of 1-2 astronauts at the same location, they can obtain a wealth of new and important information about the past and present of Mars as a whole. With the proposed five test groups astronauts will be able to make better and newer kinds of interpretations of observations, and find better targets and methods during the same mission.

  1. Spacecraft Charging Specification Using Model Environments

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Cooke, D. L.

    2003-12-01

    The specification and prediction of spacecraft charging at geosynchronous orbit represents an important goal of space weather research. While significant correlations exist between geomagnetic indices and the occurrence of satellite frame charging, for example with sunlit frame charging of the DSCS III satellite [Krause et al., IEEE Trans. Nucl. Sci., 47(6), 2000], the relationships are inadequate for useful predictions of charging at specific locations. Charged particles drift across the geosynchronous orbital path, and not along it, so spacecraft separated by less than an hour in local time experience completely different charging conditions. To account for these differences, a simple geosynchronous spacecraft surface charging application is driven using particle environments from the Magnetospheric Specification Model (MSM). Preliminary analysis using the NASCAP spacecraft-plasma interaction code indicated that spacecraft geometry and materials are responsible for the partial suppression of photoelectrons leading to frequent daylight charging of the DSCS III B-7 spacecraft. Analysis of the minimal spacecraft approximation we employ, i.e., a sunlit kapton sphere, also indicates that this so-called bootstrap charging phenomenon is active. Surface charging is therefore identified by the net electron current to the kapton spacecraft, determined by integrating electron, proton, and oxygen fluxes from the MSM along with secondary and backscatter yields specified as a function of energy. Spacecraft frame charging measurements from the Charge Control System on board the DSCS III satellite are compared with results obtained from the MSM-driven charging model. MSM/charging algorithm simulation output will be characterized at all local times in an effort to evaluate the model's potential effectiveness as a practical spacecraft charging specification tool.
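    The net-current criterion described above amounts to finding the surface potential at which the collected electron, ion, and emitted photoelectron currents balance. The toy sketch below solves such a balance by bisection; every current expression, constant, and function name is an illustrative assumption, far simpler than the flux integrations over MSM environments the abstract describes.

```python
# Toy surface-charging current balance (all expressions are assumed stand-ins).
import math

def net_current(phi, j_e=1.0, j_i=0.05, j_ph=0.4, t_e=1000.0, t_i=1000.0):
    """Net current (arbitrary units) to a sunlit surface at potential phi (V):
    electrons repelled when phi < 0, ions attracted, photoemission escapes."""
    electron = -j_e * math.exp(phi / t_e) if phi < 0 else -j_e * (1 + phi / t_e)
    ion = j_i * (1 - phi / t_i) if phi < 0 else j_i * math.exp(-phi / t_i)
    photo = j_ph if phi < 0 else j_ph * math.exp(-phi / 2.0)
    return electron + ion + photo

def floating_potential(lo=-20000.0, hi=0.0, tol=1e-3):
    """Bisection: net current is positive at lo, negative at hi, so the
    equilibrium (zero-crossing) lies between them."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_current(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

phi_f = floating_potential()   # equilibrium potential, negative here
```

In the real application each current term is an integral of the model particle fluxes weighted by secondary and backscatter yields as a function of energy, but the equilibrium condition has this same zero-net-current form.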

  2. An Overview of NASA's Orbital Debris Environment Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    Using updated measurement data, analysis tools, and modeling techniques, the NASA Orbital Debris Program Office has created a new Orbital Debris Environment Model. This model extends the coverage of orbital debris flux throughout the Earth orbit environment and includes information on the mass density of the debris as well as the uncertainties in the model environment. This paper gives an overview of the model and its implications for spacecraft risk analysis.

  3. Proactive Process-Level Live Migration in HPC Environments

    SciTech Connect

    Wang, Chao; Mueller, Frank; Engelmann, Christian; Scott, Stephen L

    2008-01-01

    As the number of nodes in high-performance computing environments keeps increasing, faults are becoming commonplace. Reactive fault tolerance (FT) often does not scale due to massive I/O requirements and relies on manual job resubmission. This work complements reactive FT with proactive FT at the process level. Through health monitoring, a subset of node failures can be anticipated as a node's health deteriorates. A novel process-level live migration mechanism supports continued execution of applications during much of the process migration. This scheme is integrated into an MPI execution environment to transparently sustain health-inflicted node failures, which eliminates the need to restart and requeue MPI jobs. Experiments indicate that 1-6.5 seconds of prior warning are required to successfully trigger live process migration, while similar operating system virtualization mechanisms require 13-24 seconds. This self-healing approach complements reactive FT by nearly cutting the number of checkpoints in half when 70% of the faults are handled proactively.

  4. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given of the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulation is found to reduce the costs and time associated with technological development when incorporated judiciously.

  5. The formation of adipocere in model aquatic environments.

    PubMed

    Stuart, B H; Notter, S J; Dent, B; Selvalatchmanan, J; Fu, S

    2016-01-01

    An examination of the chemistry of adipocere formation in aquatic systems provides insight into how environmental factors affect the decomposition processes of human remains. Gas chromatography–mass spectrometry (GC-MS) and inductively coupled plasma–mass spectrometry (ICPMS) have been employed to monitor the changes to the chemistry of adipocere formed in aquatic environments used to model seawater, river and chlorinated water systems. Seawater was shown to inhibit adipocere formation, and a distinctively different elemental composition was produced in this environment due to the high concentrations of salts. By comparison, river water has been shown to accelerate the formation of adipocere. Chlorinated water appears to significantly enhance adipocere formation, based on a comparison with established fatty acid concentration values. However, a competing reaction to form chlorohydrins in chlorinated water is believed to be responsible for the unusual findings in this environment. The application of the chemical characterization of adipocere to an understanding of how this particular decomposition product forms in different water environments has been demonstrated, and there is potential to utilise this approach to identify the environment in which a body has been immersed. PMID:26493693

  6. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    NASA Astrophysics Data System (ADS)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

    The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high speed modeling environments. High speed modeling environments offer great potential in this regard as they allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special purpose accelerators which are capable of high speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment is designed to use an inverse method to conduct the parameter tuning. Using an inverse method reduces the amount of parameter space that must be searched to produce a set of parameters describing the system with good fit. Systems of good fit are determined with a comparison to reference storm responses. To obtain reference storm responses we have collected data from a series of data-loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled response to those of the reference responses the model parameters can be tuned to quantitatively characterize geometry, and thus, the response of the karst system.
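
    The inverse tuning loop described in this abstract can be sketched in miniature. The single linear-reservoir recession below is a hypothetical stand-in for the full GPGPU finite-difference model, and all names and parameter values are illustrative, not taken from the paper:

    ```python
    import math

    def simulated_response(tau, times, q0=1.0):
        """Hypothetical storm recession from one linear reservoir:
        Q(t) = Q0 * exp(-t / tau). A real karst model would couple many
        such elements through a finite-difference flow solver."""
        return [q0 * math.exp(-t / tau) for t in times]

    def misfit(modeled, reference):
        """Sum of squared differences between modeled and reference response."""
        return sum((m - r) ** 2 for m, r in zip(modeled, reference))

    def tune_parameter(reference, times, candidates):
        """Inverse method by exhaustive search: return the candidate tau
        whose simulated response best fits the reference storm response."""
        return min(candidates,
                   key=lambda tau: misfit(simulated_response(tau, times), reference))

    times = [float(t) for t in range(48)]        # hours after the storm
    reference = simulated_response(6.0, times)   # stands in for data-logger records
    best = tune_parameter(reference, times, [2.0, 4.0, 6.0, 8.0])
    print(best)  # recession constant recovered from the "observed" response
    ```

    A GPGPU implementation would evaluate many candidate parameter sets in parallel rather than in this serial loop, but the fit-to-reference logic is the same.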

  7. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze these data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  8. ISLE (Image and Signal Processing LISP Environment) reference manual

    SciTech Connect

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  9. Physical processes affecting the sedimentary environments of Long Island Sound

    USGS Publications Warehouse

    Signell, R.P.; Knebel, H. J.; List, J.H.; Farris, A.S.; ,

    1997-01-01

    A modeling study was undertaken to simulate the bottom tidal-, wave-, and wind-driven currents in Long Island Sound in order to provide a general physical oceanographic framework for understanding the characteristics and distribution of seafloor sedimentary environments. Tidal currents are important in the funnel-shaped eastern part of the Sound, where a strong gradient of tidal-current speed was found. This current gradient parallels the general westward progression of sedimentary environments from erosion or non-deposition, through bedload transport and sediment sorting, to fine-grained deposition. Wave-driven currents, meanwhile, appear to be important along the shallow margins of the basin, explaining the occurrence of relatively coarse sediments in regions where tidal currents alone are not strong enough to move sediment. Finally, westerly wind events are shown to locally enhance bottom currents along the axial depression of the Sound, providing a possible explanation for the relatively coarse sediments found in the depression despite tide- and wave-induced currents below the threshold of sediment movement. The strong correlation between the near-bottom current intensity based on the model results and the sediment response as indicated by the distribution of sedimentary environments provides a framework for predicting the long-term effects of anthropogenic activities.

  10. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise; that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  11. A model of the energetic ion environment of Mars

    SciTech Connect

    Luhmann, J.G. ); Schwingenschuh, K. )

    1990-02-01

    Because Mars has a weak intrinsic magnetic field and a substantial atmosphere, instruments on orbiting spacecraft should detect a result from comet-like ion pickup in the solar wind and magnetosheath convection electric fields, in addition to those that might result from processes internal to a Martian magnetosphere. Although this ion exosphere has been previously discussed in the literature, detailed predictions that might be directly applied to the interpretation of data are not available. Here a test particle model is used to construct a global picture of Martian pickup ions in the Mars environment. The model makes use of the recent Nagy and Cravens (1988) model of the Martian exosphere and Spreiter and Stahara's (1980) gas dynamic model of the magnetosheath. The pickup of ions originating at Phobos is also considered. Notable properties of the resulting ion distributions include their near-monoenergetic spectra, pancake pitch angle distributions, and large gyroradii compared to the planetary scale.
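
    The "large gyroradii compared to the planetary scale" can be checked with the Larmor-radius formula r = mv/(qB). The parameter values below are representative of an O+ ion picked up at solar-wind speed in a few-nT draped field, not numbers taken from the paper:

    ```python
    # Gyroradius of a picked-up O+ ion, r = m*v / (q*B).
    M_PROTON = 1.6726e-27   # proton mass, kg
    Q_E = 1.602e-19         # elementary charge, C
    MARS_RADIUS = 3.3895e6  # mean radius of Mars, m

    def gyroradius(mass_kg, speed_ms, field_t, charge_c=Q_E):
        """Larmor radius of a particle moving at speed_ms perpendicular
        to a magnetic field of strength field_t."""
        return mass_kg * speed_ms / (charge_c * field_t)

    # O+ (16 proton masses) at ~400 km/s in a ~3 nT magnetosheath field.
    r = gyroradius(16 * M_PROTON, 4.0e5, 3.0e-9)
    print(r / MARS_RADIUS)  # several Mars radii, i.e. much larger than the planet
    ```

    A gyroradius several times the planetary radius is why test-particle tracing through prescribed fields, rather than a fluid treatment, is a natural choice for these ions.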

  12. Mathematical modeling of biomass fuels formation process.

    PubMed

    Gaska, Krzysztof; Wandrasz, Andrzej J

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels causes significant savings resulting from partial replacement of fossil fuels, and reduction of environmental pollution resulting directly from the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process and conversion of this data in algorithms based on a problem of linear programming. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a datum point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuels components, with assumed constraints and decision variables of the task.
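
    The linear-programming formulation can be illustrated with a two-component blend. The grid search below is a deliberately simple stand-in for the paper's modified simplex algorithm, and the calorific values and costs are hypothetical:

    ```python
    def blend_cost(frac_a, cost_a, cost_b):
        """Cost of a blend with fraction frac_a of component A."""
        return frac_a * cost_a + (1 - frac_a) * cost_b

    def blend_cv(frac_a, cv_a, cv_b):
        """Calorific value of the same blend."""
        return frac_a * cv_a + (1 - frac_a) * cv_b

    def optimise_blend(cv_a, cv_b, cost_a, cost_b, cv_min, steps=1000):
        """Grid-search stand-in for a simplex solve: cheapest fraction of
        component A whose blend still meets the minimum calorific value.
        A small tolerance guards against floating-point boundary effects."""
        best = None
        for i in range(steps + 1):
            x = i / steps
            if blend_cv(x, cv_a, cv_b) + 1e-9 >= cv_min:
                if best is None or blend_cost(x, cost_a, cost_b) < blend_cost(best, cost_a, cost_b):
                    best = x
        return best

    # Waste-derived fraction A: 18 MJ/kg at 10 cost units/kg;
    # conventional fuel B: 28 MJ/kg at 40 cost units/kg; require >= 22 MJ/kg.
    x = optimise_blend(cv_a=18.0, cv_b=28.0, cost_a=10.0, cost_b=40.0, cv_min=22.0)
    print(x)  # cheapest feasible share of the waste-derived component
    ```

    Here the cost falls as the waste share rises, so the optimum sits exactly on the calorific-value constraint, which is the kind of vertex solution a simplex method finds directly.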

  13. The Engagement Model of Person-Environment Interaction

    ERIC Educational Resources Information Center

    Neufeld, Jason E.; Rasmussen, Heather N.; Lopez, Shane J.; Ryder, Jamie A.; Magyar-Moe, Jeana L.; Ford, Alicia Ito; Edwards, Lisa M.; Bouwkamp, Jennifer C.

    2006-01-01

    This article focuses on growth-promoting aspects in the environment, and the authors propose a strength-based, dynamic model of person-environment interaction. The authors begin by briefly discussing the typical recognition of contextual variables in models that rely on the concept of person-environment fit. This is followed by a review of recent…

  14. An Instructional Method for the AutoCAD Modeling Environment.

    ERIC Educational Resources Information Center

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  15. Group Modeling in Social Learning Environments

    ERIC Educational Resources Information Center

    Stankov, Slavomir; Glavinic, Vlado; Krpan, Divna

    2012-01-01

    Students' collaboration while learning could provide better learning environments. Collaboration assumes social interactions which occur in student groups. Social theories emphasize positive influence of such interactions on learning. In order to create an appropriate learning environment that enables social interactions, it is important to…

  16. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects of the processing parameters pressure, temperature, and time on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented that predict the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self diffusion) is identified as the mechanism by which the plies bond to themselves. Theoretical predictions from the Reptation Theory relating autohesive strength and contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.

  17. Post-intercept debris environment modeling and visualization

    NASA Astrophysics Data System (ADS)

    Kinley, Todd W.; Homsley, Tom; Gebhart, Welman

    1997-07-01

    The user-friendly platform for ground-based radar analysis of debris environments (UPGRADE) workstation consists of a simulation architecture that has been developed to provide a flexible framework for modeling post-intercept debris and the resultant return signal produced by a radar situated in the vicinity of the debris impact point and illuminating the cloud of debris fragments. Characterization of the debris and radar signal is a complex process requiring models which can be brought together in an integrated visualization package. The UPGRADE architecture consists of a graphical user interface (GUI) which controls a group of MATLAB components used to generate inputs and graphical output products and C language modules which perform the analytic and algorithmic procedures required to generate and process the debris data.

  18. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health. PMID:24792566

  20. Reflectance characteristics and surface processes in stabilized dune environments

    NASA Technical Reports Server (NTRS)

    Jacobberger, P. A.

    1989-01-01

    Analysis of multitemporal TM data for three environmentally related field areas yields information on the response characteristics of stabilized dunes and desert-fringe environments. The three field sites studied include dune fields in Egypt, Mali, and Botswana, ranging in climate from hyperarid to semiarid, and may be classed as an environmental series relating surface processes under Saharan, Sahelian, and Savanna conditions. Sites were field mapped and monitored with TM data for lengths of time up to a year. The complexity of spectral response characteristics is greatest where vegetation is dense and diverse, but study of the three environments together places constraints on the importance of vegetation to spectral response as well as to mechanisms of sand transport. In both Mali and Botswana, the Sahelian and Savanna environments, contrast reversals occur on dune crests and reflectance patterns change through the dry season to resemble the response curves of the hyperarid study site in Egypt. In these analyses, overall surface brightness is controlled by sand composition, while spectral features are controlled by vegetation dynamics.

  1. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies. It is considered the Best Available Control Technology (BACT) for NOx reduction. The solution of the NOx emissions problem is either through modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  2. Numerical simulation of overbank processes in topographically complex floodplain environments

    NASA Astrophysics Data System (ADS)

    Nicholas, A. P.; Mitchell, C. A.

    2003-03-01

    This article presents results from an investigation of the hydraulic characteristics of overbank flows on topographically complex natural river floodplains. A two-dimensional hydraulic model that solves the depth-averaged shallow water form of the Navier-Stokes equations is used to simulate an overbank flow event within a multiple channel reach of the River Culm, Devon, UK. Parameterization of channel and floodplain roughness by the model is evaluated using monitored records of main channel water level and point measurements of floodplain flow depth and unit discharge. Modelled inundation extents and sequences are assessed using maps of actual inundation patterns obtained using a Global Positioning System, observational evidence and ground photographs. Simulation results suggest a two-phase model of flooding at the site, which seems likely to be representative of natural floodplains in general. Comparison of these results with previous research demonstrates the complexity of overbank flows on natural river floodplains and highlights the limitations of laboratory flumes as an analogue for these environments. Despite this complexity, frequency distributions of simulated depth, velocity and unit discharge data closely follow a simple gamma distribution model, and are described by a shape parameter that exhibits clear systematic trends with changing discharge and floodplain roughness. Such statistical approaches have the potential to provide the basis for computationally efficient flood routing and overbank sedimentation models.
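
    The gamma shape parameter mentioned above can be estimated from simulated flow data by the method of moments, k = mean²/variance. This is a generic estimator, not necessarily the fitting procedure used in the paper, and the sample depths below are hypothetical:

    ```python
    def gamma_shape_moments(values):
        """Method-of-moments estimate of the gamma shape parameter:
        k = mean^2 / variance (population variance)."""
        n = len(values)
        mean = sum(values) / n
        var = sum((v - mean) ** 2 for v in values) / n
        return mean * mean / var

    # Hypothetical simulated floodplain flow depths (m) at one discharge.
    depths = [0.2, 0.4, 0.5, 0.7, 1.1, 1.6]
    k = gamma_shape_moments(depths)
    print(round(k, 2))  # 2.53 for this sample
    ```

    Tracking how k shifts as discharge or roughness changes is what lets a single fitted distribution stand in for a full two-dimensional flow field in reduced-complexity routing models.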

  3. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  4. A Conceptual Model of Training Transfer that Includes the Physical Environment

    ERIC Educational Resources Information Center

    Hillsman, Terron L.; Kupritz, Virginia W.

    2007-01-01

    The study presents the physical environment as an emerging factor impacting training transfer and proposes to position this variable in the Baldwin and Ford (1988) model of the training transfer process. The amended model positions workplace design, one element of the physical environment, as a part of organizational context in the work…

  5. Modeling Low-temperature Geochemical Processes

    NASA Astrophysics Data System (ADS)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm=1.01325 bar=101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction (redox) transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, for a large range of scales from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives.Recognition of the strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for
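
    The mineral dissolution/precipitation decision at the heart of such models reduces to a saturation index, SI = log10(IAP/Ksp). The sketch below uses gypsum with a log Ksp of about -4.58 (a typical 25 °C database value) and hypothetical ion activities; a full code such as PHREEQC computes the activities themselves from analytical data:

    ```python
    def saturation_index(log_iap, log_ksp):
        """SI = log10(IAP / Ksp). SI > 0 -> supersaturated (precipitation
        favoured); SI < 0 -> undersaturated (dissolution favoured)."""
        return log_iap - log_ksp

    # Gypsum, CaSO4.2H2O: log Ksp ~ -4.58 at 25 degrees C.
    # Hypothetical water: a(Ca2+) = 10^-2.2, a(SO4 2-) = 10^-2.3, a(H2O) ~ 1,
    # so log IAP = -2.2 + -2.3 = -4.5.
    si = saturation_index(log_iap=-2.2 + -2.3, log_ksp=-4.58)
    print(round(si, 2))  # slightly positive: gypsum near equilibrium
    ```

    Running this balance for every mineral in a database, at every step of a reaction path, is essentially what distinguishes a geochemical model from a simple speciation calculation.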

  6. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of the machine. The reliable result produced by OEE can then be used to propose a suitable corrective action. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors. However, the how factor has not yet been revealed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who wish to start measuring their machine performance and later improve it.
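
    The OEE calculation itself is the standard product of three factors, Availability × Performance × Quality. The shift figures below are hypothetical, chosen only to show the arithmetic:

    ```python
    def availability(run_time, planned_time):
        """Fraction of planned production time the machine actually ran."""
        return run_time / planned_time

    def performance(ideal_cycle_time, total_count, run_time):
        """Actual output relative to the ideal output for the run time."""
        return ideal_cycle_time * total_count / run_time

    def quality(good_count, total_count):
        """Fraction of produced parts that are defect-free."""
        return good_count / total_count

    def oee(a, p, q):
        """Overall Equipment Effectiveness: the product of the three factors."""
        return a * p * q

    # Hypothetical shift: 420 min planned, 378 min running, ideal cycle
    # 0.5 min/part, 700 parts produced of which 693 are good.
    a = availability(378, 420)
    p = performance(0.5, 700, 378)
    q = quality(693, 700)
    print(round(oee(a, p, q), 3))  # 0.825
    ```

    Because the three factors isolate downtime, speed loss, and defects respectively, a low OEE immediately points at which category of loss the corrective action should target.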

  7. Reaching new levels of realism in modeling biological macromolecules in cellular environments.

    PubMed

    Feig, Michael; Sugita, Yuji

    2013-09-01

    An increasing number of studies are aimed at modeling cellular environments in a comprehensive and realistic fashion. A major challenge in these efforts is how to bridge spatial and temporal scales over many orders of magnitude. Furthermore, there are additional challenges in integrating different aspects ranging from questions about biomolecular stability in crowded environments to the description of reactive processes on cellular scales. In this review, recent studies with models of biomolecules in cellular environments at different levels of detail are discussed in terms of their strengths and weaknesses. In particular, atomistic models, implicit representations of cellular environments, coarse-grained and spheroidal models of biomolecules, as well as the inclusion of reactive processes via reaction-diffusion models are described. Furthermore, strategies for integrating the different models into a comprehensive description of cellular environments are discussed.

  8. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter

  9. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.

  10. Theoretical Models of Astrochemical Processes

    NASA Technical Reports Server (NTRS)

    Charnley, Steven

    2009-01-01

    Interstellar chemistry provides a natural laboratory for studying exotic species and processes at densities, temperatures, and reaction rates that are difficult or impractical to address in the laboratory. Thus, many chemical reactions considered too slow by the standards of terrestrial chemistry can be observed and modeled. Curious proposals concerning the nature and chemistry of complex interstellar organic molecules will be described. Catalytic reactions on grain surfaces can, in principle, lead to a large variety of species and this has motivated many laboratory and theoretical studies. Gas phase processes may also build large species in molecular clouds. Future laboratory data and computational tools needed to construct accurate chemical models of various astronomical sources to be observed by Herschel and ALMA will be outlined.

  11. Drought processes, modeling, and mitigation

    NASA Astrophysics Data System (ADS)

    Mishra, Ashok K.; Sivakumar, Bellie; Singh, Vijay P.

    2015-07-01

    Accurate assessment of droughts is crucial for proper planning and management of our water resources, environment, and ecosystems. The combined influence of increasing water demands and the anticipated impacts of global climate change has already raised serious concerns about worsening drought conditions in the future and their social, economic, and environmental impacts. As a result, studies on droughts are currently a major focal point for a broad range of research communities, including civil engineers, hydrologists, environmentalists, ecologists, meteorologists, geologists, agricultural scientists, economists, policy makers, and water managers. There is, therefore, an urgent need for enhancing our understanding of droughts (e.g. occurrence, modeling), making more reliable assessments of their impacts on various sectors of our society (e.g. domestic, agricultural, industrial), and undertaking appropriate adaptation and mitigation measures, especially in the face of global climate change.

  12. Workflows for microarray data processing in the Kepler environment

    PubMed Central

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  13. Mesoscopic Modeling of Reactive Transport Processes

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Chen, L.; Deng, H.

    2012-12-01

    Reactive transport processes involving precipitation and/or dissolution are pervasive in geochemical, biological and engineered systems. Typical examples include self-assembled patterns such as Liesegang rings or bands, cones of stalactites in limestone caves, biofilm growth in aqueous environments, formation of mineral deposits in boilers and heat exchangers, uptake of toxic metal ions from polluted water by calcium carbonate, and mineral trapping of CO2. Compared to experimental studies, a numerical approach enables a systematic study of the reaction kinetics, mass transport, and mechanisms of nucleation and crystal growth, and hence provides a detailed description of reactive transport processes. In this study, we enhance a previously developed lattice Boltzmann pore-scale model by taking into account the nucleation process, and develop a mesoscopic approach to simulate reactive transport processes involving precipitation and/or dissolution of solid phases. The model is then used to simulate the formation of Liesegang precipitation patterns and investigate the effects of gel on the morphology of the precipitates. It is shown that this model can capture the porous structures of the precipitates and can account for the effects of the gel concentration and material. A wide range of precipitation patterns is predicted under different gel concentrations, including regular bands, treelike patterns, and for the first time with numerical models, transition patterns from regular bands to treelike patterns. The model is also applied to study the effect of secondary precipitate on the dissolution of primary mineral. Several types of dissolution and precipitation processes are identified based on the morphology and structures of the precipitates and on the extent to which the precipitates affect the dissolution of the primary mineral. Finally, the model is applied to study the formation of pseudomorphs. It is demonstrated for the first time by numerical simulation that a
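    For readers unfamiliar with the method, the transport core of a lattice Boltzmann model like the one extended here can be sketched in a few lines; this minimal D1Q2 diffusion example omits the reaction, nucleation, and precipitation physics that are the paper's actual contribution.

```python
import numpy as np

def lb_diffusion_1d(conc, tau=1.0, steps=100):
    """Minimal D1Q2 lattice Boltzmann diffusion sketch on a periodic 1D
    lattice. Two particle populations (right- and left-moving) relax
    toward equilibrium, then stream; diffusivity scales with (tau - 1/2)
    in lattice units."""
    n = conc.size
    f = np.empty((2, n))
    f[0] = f[1] = conc / 2.0           # initialize at equilibrium
    for _ in range(steps):
        c = f[0] + f[1]                # macroscopic concentration
        feq = c / 2.0                  # equilibrium distribution
        f += (feq - f) / tau           # BGK collision step
        f[0] = np.roll(f[0], 1)        # stream right-movers
        f[1] = np.roll(f[1], -1)       # stream left-movers
    return f[0] + f[1]

c0 = np.zeros(64)
c0[32] = 1.0                           # initial concentration spike
c = lb_diffusion_1d(c0)
```

    A pore-scale reactive model adds reaction source terms at the collision step and solid-node bookkeeping for precipitation and dissolution.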

  14. Propagation modeling in a manufacturing environment

    SciTech Connect

    Birdwell, J.D.; Horn, R.D.; Rader, M.S.; Shourbaji, A.A.

    1995-12-31

    Wireless sensors which utilize low power spread spectrum data transmission have significant potential in industrial environments due to low cabling and installation costs. In addition, this technology imposes fewer constraints upon placement due to cable routing, allowing sensors to be installed in areas with poor access. Limitations are imposed on sensor and receiver placement by electromagnetic propagation effects in the industrial environment, including multipath and the presence of absorbing media. This paper explores the electromagnetic analysis of potential wireless sensor applications using commercially available finite element software. In addition, since the applications environment is often at least partially specified in electronic form using computer-aided drafting software, the importation of information from this software is discussed. Both three-dimensional and two-dimensional examples are presented which demonstrate the utility and limitations of the method.
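    As a rough complement to the finite-element analysis the paper performs, coverage for such wireless sensor deployments is often first estimated with an empirical log-distance path-loss model; the reference loss and exponent below are illustrative assumptions for a cluttered industrial space, not values from the paper.

```python
import math

def log_distance_path_loss(d, d0=1.0, pl0_db=40.0, n=2.8):
    """Empirical log-distance path-loss model (dB). pl0_db is the loss at
    the reference distance d0 (meters); the exponent n > 2 reflects a
    cluttered, multipath-rich industrial environment. Illustrative values."""
    return pl0_db + 10.0 * n * math.log10(d / d0)

loss_at_20m = log_distance_path_loss(20.0)
```

    Such closed-form estimates set link budgets quickly, but they cannot capture the site-specific multipath and absorption effects that motivate the full electromagnetic finite-element treatment described above.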

  15. A Process for Technology Prioritization in a Competitive Environment

    NASA Technical Reports Server (NTRS)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment. The In-Space Propulsion Technology (ISPT) project is used to exemplify the process. The ISPT project focuses on the mid-level Technology Readiness Levels (TRLs) for development, TRLs 4 through 6 (i.e., Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and to create ISPT technology development schedules based on these dates. There is a minimum of 4 years between flight and pacing mission. The ISPT Project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded total implementation of the plan.

  16. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built, and while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  17. Construction material processed using lunar simulant in various environments

    NASA Technical Reports Server (NTRS)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high-temperature furnace. The crucible will then be cooled by radiative and forced-convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 °C are attainable using this heating method.

  18. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Meroni, A.; Bahr, T.

    2013-05-01

    Having access to SAR data can be highly important and critical, especially for disaster mapping. Updating a GIS with contemporary information from SAR data makes it possible to deliver a reliable set of geospatial information to advance civilian operations, e.g., search and rescue missions. We therefore present in this paper the operational processing of SAR data within a GIS environment for rapid disaster mapping, exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.

  19. The Virtual Cell Modeling and Simulation Software Environment

    PubMed Central

    Moraru, Ion I.; Schaff, James C.; Slepchenko, Boris M.; Blinov, Michael; Morgan, Frank; Lakshminarayana, Anuradha; Gao, Fei; Li, Ye; Loew, Leslie M.

    2009-01-01

    The Virtual Cell (VCell; http://vcell.org/) is a problem solving environment, built on a central database, for analysis, modeling and simulation of cell biological processes. VCell integrates a growing range of molecular mechanisms, including reaction kinetics, diffusion, flow, membrane transport, lateral membrane diffusion and electrophysiology, and can associate these with geometries derived from experimental microscope images. It has been developed and deployed as a web-based, distributed, client-server system, with more than a thousand worldwide users. VCell provides a separation of layers (core technologies and abstractions) representing biological models, physical mechanisms, geometry, mathematical models and numerical methods. This separation clarifies the impact of modeling decisions, assumptions, and approximations. The result is a physically consistent, mathematically rigorous, spatial modeling and simulation framework. Users create biological models and VCell will automatically (i) generate the appropriate mathematical encoding for running a simulation, and (ii) generate and compile the appropriate computer code. Both deterministic and stochastic algorithms are supported for describing and running non-spatial simulations; a full partial differential equation solver using the finite volume numerical algorithm is available for reaction-diffusion-advection simulations in complex cell geometries including 3D geometries derived from microscope images. Using the VCell database, models and model components can be reused and updated, as well as privately shared among collaborating groups, or published. Exchange of models with other tools is possible via import/export of SBML, CellML, and MATLAB formats. Furthermore, curation of models is facilitated by external database binding mechanisms for unique identification of components and by standardized annotations compliant with the MIRIAM standard.
VCell is now open source, with its native model encoding language
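    The finite volume scheme mentioned for reaction-diffusion-advection problems can be illustrated in one dimension; this is a generic sketch of that method class, not VCell code.

```python
import numpy as np

def fv_diffusion_step(u, D=1.0, dx=1.0, dt=0.2):
    """One explicit finite-volume diffusion update on a uniform 1D grid
    with zero-flux boundaries. Each interior face carries a flux
    -D * du/dx; what leaves one cell enters its neighbor, so mass is
    conserved exactly. Stable for dt <= dx**2 / (2 * D)."""
    flux = -D * np.diff(u) / dx      # flux through each interior face
    u_new = u.copy()
    u_new[:-1] -= flux * dt / dx     # face flux leaves the left cell...
    u_new[1:] += flux * dt / dx      # ...and enters the right cell
    return u_new

u = np.zeros(32)
u[16] = 1.0                          # initial concentration spike
for _ in range(50):
    u = fv_diffusion_step(u)
```

    The same flux-balance bookkeeping generalizes to irregular 3D control volumes, which is what allows image-derived cell geometries to be handled.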

  20. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current-generation high-burnup metallic fuel elements. These are sodium-bonded, stainless-steel-clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  1. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-01-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current-generation high-burnup metallic fuel elements. These are sodium-bonded, stainless-steel-clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  2. A Process Study of the Development of Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.

    2014-05-01

    In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research, and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and analysis of project documents and online resources. These sources are hand tagged to identify events related to the thematic tracks, to yield a narrative of the project. Results demonstrate the event series of an organization through traditional methods augmented by virtual sources.

  3. Model test optimization using the virtual environment for test optimization

    SciTech Connect

    Klenke, S.E.; Reese, G.M.; Schoof, L.A.; Shierling, C.

    1995-11-01

    We present a software environment integrating analysis and test-based models to support optimal modal test design through a Virtual Environment for Test Optimization (VETO). The VETO assists analysis and test engineers to maximize the value of each modal test. It is particularly advantageous for structural dynamics model reconciliation applications. The VETO enables an engineer to interact with a finite element model of a test object to optimally place sensors and exciters and to investigate the selection of data acquisition parameters needed to conduct a complete modal survey. Additionally, the user can evaluate the use of different types of instrumentation such as filters, amplifiers and transducers for which models are available in the VETO. The dynamic response of most of the virtual instruments (including the device under test) is modeled in the state space domain. Design of modal excitation levels and appropriate test instrumentation are facilitated by the VETO's ability to simulate such features as unmeasured external inputs, A/D quantization effects, and electronic noise. Measures of the quality of the experimental design, including the Modal Assurance Criterion, and the Normal Mode Indicator Function are available. The VETO also integrates tools such as Effective Independence and minamac to assist in selection of optimal sensor locations. The software is designed about three distinct modules: (1) a main controller and GUI written in C++, (2) a visualization model, taken from FEAVR, running under AVS, and (3) a state space model and time integration module built in SIMULINK. These modules are designed to run as separate processes on interconnected machines.
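    The state-space instrument modeling the abstract describes can be illustrated with a minimal forward-Euler simulation of x' = Ax + Bu, y = Cx; the single-pole low-pass filter below is a hypothetical stand-in instrument, not part of the VETO software.

```python
import numpy as np

def simulate_lti(A, B, C, x0, inputs, dt):
    """Forward-Euler simulation of a continuous-time state-space model
    x' = Ax + Bu, y = Cx. Chaining such blocks (exciter, device under
    test, transducer, filter, A/D) is the pattern the abstract describes."""
    x = np.asarray(x0, dtype=float)
    outputs = []
    for u in inputs:
        x = x + dt * (A @ x + B @ u)   # explicit Euler state update
        outputs.append(C @ x)          # measured output
    return np.array(outputs)

# Single-pole low-pass filter (cutoff 1 rad/s) driven by a unit step.
A = np.array([[-1.0]])
B = np.array([[1.0]])
C = np.array([[1.0]])
y = simulate_lti(A, B, C, [0.0], [np.array([1.0])] * 1000, dt=0.01)
```

    After ten time constants the step response settles near 1, as expected for a first-order system.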

  4. A Collaborative Model for Ubiquitous Learning Environments

    ERIC Educational Resources Information Center

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  5. Gravity Modeling for Variable Fidelity Environments

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2006-01-01

    Aerospace simulations can model worlds, such as the Earth, with differing levels of fidelity. The simulation may represent the world as a plane, a sphere, an ellipsoid, or a high-order closed surface. The world may or may not rotate. The user may select lower-fidelity models based on computational limits, a need for simplified analysis, or comparison to other data. However, the user will also wish to retain a close semblance of behavior to the real world. The effects of gravity on objects are an important component of modeling real-world behavior. Engineers generally equate the term gravity with the observed free-fall acceleration. However, free-fall acceleration is not the same for all observers. To observers on the surface of a rotating world, free-fall acceleration is the sum of gravitational attraction and the centrifugal acceleration due to the world's rotation. On the other hand, free-fall acceleration equals gravitational attraction for an observer in inertial space. Surface-observed simulations (e.g., aircraft), which use non-rotating world models, may choose to model observed free-fall acceleration as the gravity term; such a model actually combines gravitational attraction with centrifugal acceleration due to the Earth's rotation. However, this modeling choice invites confusion as one evolves the simulation to higher-fidelity world models or adds inertial observers. Care must be taken to model gravity in concert with the world model to avoid denigrating the fidelity of modeling observed free fall. The paper goes into greater depth on gravity modeling and the physical disparities and synergies that arise when coupling specific gravity models with world models.
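    The decomposition of observed free fall into gravitational attraction and centrifugal acceleration can be made concrete with a quick calculation at the equator; a spherical, point-mass Earth is assumed, and oblateness (J2) terms are ignored.

```python
# Observed free-fall acceleration at the equator = gravitational
# attraction minus the centrifugal acceleration from Earth's rotation.
# Spherical point-mass Earth assumed; oblateness terms ignored.
GM = 3.986004418e14        # m^3/s^2, Earth's gravitational parameter
R_EQ = 6.378137e6          # m, equatorial radius
OMEGA = 7.2921159e-5       # rad/s, Earth's rotation rate

g_attraction = GM / R_EQ**2          # ~9.80 m/s^2, inertial-frame gravity
a_centrifugal = OMEGA**2 * R_EQ      # ~0.034 m/s^2, rotation effect
g_observed = g_attraction - a_centrifugal
```

    The ~0.3% difference is exactly the term that is silently folded into "gravity" when a non-rotating world model uses observed free fall, which is why the pairing of gravity model and world model matters.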

  6. Influence of global climatic processes on environment The Arctic seas

    NASA Astrophysics Data System (ADS)

    Kholmyansky, Mikhael; Anokhin, Vladimir; Kartashov, Alexandr

    2016-04-01

    One of the most pressing problems of the present day is the change in the Arctic environment under the influence of global climatic processes. The authors, as a result of work carried out in different areas of the Russian Arctic, have obtained materials characterizing the intensity of these processes. Complex investigations have been carried out in the water areas and coastal zones of the White, Barents, Kara, and East Siberian seas, and on lake water areas of the subarctic region, from 1972 to the present. The investigations include hydrophysical and cryological observations, direct temperature measurements, analysis of drilling data, electrometric determination of the parameters of the frozen zone, lithodynamic and geochemical determinations, geophysical borehole logging, and the study of glaciers based on visual observations and analysis of photographs. The data obtained allow estimation of the temperature change of the water layer, the deposits, and the near-bottom layer of the atmosphere over the last 25 years: on average 0.38°C for sea waters, 0.23°C for unconsolidated deposits, and 0.72°C for the atmosphere. Under the influence of temperature changes in the hydrosphere and lithosphere of the shelf, the cryolithic zone changes its characteristics; an increase in the depth of the roof of the cryolithic zone can be noted over most of the studied water area. The recent rapid rise in temperature of the ice-rich rocks composing the coast has led to avalanche-like thermo-denudation and to delivery of material to the sea at three times the 1978 level. The rise in temperature also entails an appreciable retreat of the boundaries of the Arctic glacial covers. Monitoring measurements show an increase in the oxygen content in the near-bottom zone, connected with a reduction in the overall salinity of the waters due to fresh water arriving from ice thawing; this, in turn, leads to changes in the biogenic part of the ecosystem. The executed

  7. Cosmic ray environment model for Earth orbit

    NASA Technical Reports Server (NTRS)

    Edmonds, L.

    1985-01-01

    A set of computer codes, which include the effects of the Earth's magnetic field, used to predict the cosmic ray environment (atomic numbers 1 through 28) for a spacecraft in a near-Earth orbit is described. A simple transport analysis is used to approximate the environment at the center of a spherical shield of arbitrary thickness. The final output is in a form (a Heinrich curve) which has immediate applications for single event upset rate predictions. The codes will calculate the time-average environment for an arbitrary number (fractional or whole) of circular orbits. The computer codes were run for some selected orbits and the results, which can be useful for quick estimates of single event upset rates, are given. The codes are listed in the language HPL, which is appropriate for a Hewlett Packard 9825B desktop computer. Extensive documentation of the codes is available from COSMIC, except where explanations have been deferred to references where extensive documentation can be found. Some qualitative aspects of the effects of mass and magnetic shielding are also discussed.

  8. Development, validation and application of numerical space environment models

    NASA Astrophysics Data System (ADS)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  9. Challenging the Expanding Environment Model of Teaching Elementary Social Studies.

    ERIC Educational Resources Information Center

    Palmer, Jesse

    1989-01-01

    Looks at criticism of the Expanding Environments Model in the elementary school social studies curriculum. Cites recent reports that recommend a history-centered elementary curriculum. States that teaching methods may be the cause of historical, civic, and geographic illiteracy rather than the Expanding Environments Model. (LS)

  10. Indoor environment modeling for interactive robot security application

    NASA Astrophysics Data System (ADS)

    Jo, Sangwoo; Shahab, Qonita M.; Kwon, Yong-Moo; Ahn, Sang Chul

    2006-10-01

    This paper presents a simple and easy-to-use method for obtaining a 3D textured model. To express reality, the 3D models and real scenes must be integrated. Most 3D modeling methods of this kind rely on two data-acquisition devices: one for capturing the 3D model (typically a 2D laser range-finder) and another for obtaining realistic textures (typically a common camera). Our algorithm consists of building a measurement-based 2D metric map acquired by a laser range-finder, texture acquisition and stitching, and texture-mapping onto the corresponding 3D model. The algorithm is implemented with a laser sensor for obtaining the 2D/3D metric map and two cameras for gathering texture. Our geometric 3D model consists of planes that model the floor and walls, with the geometry of the planes extracted from the 2D metric map data. Textures for the floor and walls are generated from images captured by two 1394 cameras with wide field-of-view angles. An image stitching and cutting process is used to generate textured images corresponding to the 3D model. The algorithm is applied to two cases: a corridor, and a four-walled space such as a room of a building. The generated 3D map model of the indoor environment is stored in VRML format and can be viewed in a web browser with a VRML plug-in. The proposed algorithm can be applied to a 3D model-based remote surveillance system through the WWW.

  11. Shuttle measured contaminant environment and modeling for payloads. Preliminary assessment of the space telescope environment in the shuttle bay

    NASA Technical Reports Server (NTRS)

    Scialdone, J. J.

    1983-01-01

    A baseline gaseous and particulate environment of the Shuttle bay was developed based on the various measurements made during the first four flights of the Shuttle. The environment is described by the time-dependent pressure, density, scattered molecular fluxes, and column densities, including the transient effects of water dumps, engine firings, and opening and closing of the bay doors. The particulate conditions in the ambient and on surfaces were predicted as a function of mission time based on the available data. This basic Shuttle environment, when combined with the outgassing and particulate contributions of the payloads, can provide a description of the environment of a payload in the Shuttle bay. As an example of this application, the environment of the Space Telescope in the bay, which may be representative of the environment of several payloads, was derived. Among the many findings obtained in the process of modeling the environment, one is that the payload environment in the bay is not substantially different or more objectionable than the self-generated environment of a large payload or spacecraft. It is, however, more severe during ground facility operations, during the first 15 to 20 hours of the flight, and during and for a short period after water is dumped overboard and while the reaction control engines are being fired.

  12. Periglacial process and Pleistocene environment in northern China

    SciTech Connect

    Guo Xudong; Liu Dongsheng; Yan Fuhua

    1991-03-01

    At the present time, five kinds of periglacial phenomena have been defined: ice wedges, periglacial involutions, congelifolds, congeliturbations, and loess dunes. From the stratigraphical and geochronological data, the periglacial process is divided into six stages. (1) Guanting periglacial stage, characterized by the congeliturbative deposits that have developed in the early Pleistocene Guanting loess-like formation. Paleomagnetic dating gives 2.43 Ma B.P. (2) Yanchi periglacial stage, characterized by the congelifold that has developed in the middle Pleistocene Yanchi Lishi loess formation. Paleomagnetic dating gives 0.50 Ma B.P. (3) Zhaitang periglacial stage (II), characterized by the periglacial involutions that have developed in the lower-middle Pleistocene Lishi loess formation. Paleomagnetic dating gives 0.30 Ma B.P. (4) Zhaitang periglacial stage (I), characterized by the ice (soil) wedge that has developed in the upper-middle Pleistocene Lishi loess formation. Paleomagnetic dating gives 0.20 Ma B.P. (5) Qiansangyu periglacial stage (II), characterized by the ice (sand) wedges that have developed in the late Pleistocene Malan loess formation. Paleomagnetic dating gives 0.13 Ma B.P. (6) Qiansangyu periglacial stage (I), characterized by the ice (soil) wedge that has developed in the late Pleistocene Malan loess-like formation. Thermoluminescent dating gives 0.018 Ma B.P. Spore-pollen composition analysis shows that a savannah steppe environment prevailed in northern China during Pleistocene periglacial periods. These fossilized periglacial phenomena indicate a rather arid and windy periglacial environment with a mean annual temperature estimated some 12-15°C colder than that of the present.

  13. Interoperation Modeling for Intelligent Domotic Environments

    NASA Astrophysics Data System (ADS)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that explicitly represents device capabilities, states, and commands, and supports abstract modeling of device inter-operation.

  14. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  15. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  16. Remotely Triggered Seismicity in Volcanic and Hydrothermal Environments - Which Processes?

    NASA Astrophysics Data System (ADS)

    Hill, D. P.

    2001-12-01

    The abrupt increase in seismicity rates at sites throughout much of the western United States immediately following the 1992 M=7.4 Landers earthquake provided compelling evidence that large regional earthquakes can trigger local earthquake activity at sites many source dimensions removed from the mainshock epicenter. Remotely triggered seismicity has since been documented for a number of other large earthquakes, including the 1999 M=7.1 Hector Mine earthquake. Well-documented instances of remotely triggered seismicity are largely confined to transtensional or extensional tectonic regimes, with a large fraction of these closely associated with hydrothermal and/or young volcanic systems. In the case of Long Valley caldera in eastern California, which responded to both the Landers and Hector Mine earthquakes, the triggered seismicity appears to be secondary to a larger, aseismic deformation transient. Because this is the only remotely triggered site with continuous deformation monitoring, however, it remains unclear whether a deformation transient is a fundamental component of the remote triggering process. Fluids, either aqueous or magmatic, play a central role in most models for the triggering process in hydrothermal and/or young magmatic systems. These models span a range of intriguing (and sometimes counter-intuitive) physical processes, including pressure increases associated with advective overpressure and rectified diffusion in bubbly fluids, hydraulic surges resulting from rupturing of compartments of super-hydrostatic fluids in the brittle-plastic transition zone, hydraulic pumping of near-surface pore fluids by surface waves, disruption of fine sediment accumulations (dams) in confined aquifers, and local stress changes in the brittle crust associated with relaxation or mobilization of a partially crystallized magma body. All appeal to the dynamic stresses from the mainshock triggering a non-linear response in a crustal volume that is in some sense in a near

  17. THE RHIC/AGS ONLINE MODEL ENVIRONMENT: DESIGN AND OVERVIEW.

    SciTech Connect

    SATOGATA, T.; BROWN, K.; PILAT, F.; TAFTI, A.A.; TEPIKIAN, S.; VAN ZEIJTS, J.

    1999-03-29

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV [1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters [2] around core computational modeling engines such as MAD and UAL/Teapot++ [3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC.

  18. Designing Effective Learning Environments: Cognitive Apprenticeship Models.

    ERIC Educational Resources Information Center

    Berryman, Sue E.

    1991-01-01

    Using cognitive science as the knowledge base for the discussion, this paper reviews why many school learning situations are ineffective and introduces cognitive apprenticeship models that suggest what effective learning situations might look like. Five wrong assumptions about learning are examined: (1) people transfer learning from one situation…

  19. A new security model for collaborative environments

    SciTech Connect

    Agarwal, Deborah; Lorch, Markus; Thompson, Mary; Perry, Marcia

    2003-06-06

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans build trust naturally, the model can support a wide variety of usage scenarios. Its particular strength lies in the support for ad-hoc and dynamic collaborations and the ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.
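    The incremental, endorsement-driven trust building described above can be sketched in a few lines. The update rule, names, and constants below are illustrative assumptions for exposition, not the paper's actual formulas: a newcomer starts with minimal trust, and each endorsement from an established collaborator moves the newcomer's trust toward the endorser's level, scaled by the strength of the newcomer's authentication.

```python
# Hypothetical sketch of endorsement-based trust accumulation.
# `gain`, `auth_strength`, and the linear update rule are assumptions,
# not taken from the Agarwal et al. model.

def updated_trust(current, endorser_trust, auth_strength, gain=0.3):
    """Move trust toward the endorser's level, scaled by auth strength."""
    step = gain * auth_strength * max(endorser_trust - current, 0.0)
    return min(current + step, 1.0)

trust = 0.05                       # newcomer joins with very little trust
for endorser in (0.9, 0.8, 0.95):  # trust levels of established collaborators
    trust = updated_trust(trust, endorser, auth_strength=0.7)
print(round(trust, 3))
```

    Note how trust rises with each endorsement but saturates below the endorsers' own levels, reflecting that endorsed trust is weaker than directly earned trust.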

  20. Psychosocial environment: a health promotion model.

    PubMed

    Kar, S B

    1983-01-01

    This article presents a multidimensional model of psychosocial determinants of health behavior for health promotion research and policy analysis. Frequently, health promotion focuses almost exclusively on intrapsychic determinants and on individual level behavior. Based upon Field Theory and attitude theories, this proposed model holds that in populations with comparable sociodemographic and biological status (exogenous variables) a health behavior is a function of direct and interaction effects of five key intrapsychic and external variables. These are: behavioral intentions, social support, accessibility of means for action, personal autonomy, and action situation. Empirical tests with cross-cultural studies in Venezuela, Kenya, and the Philippines provide substantial support for the model. The findings suggest that while health promotion strategies should deal with intrapsychic determinants of behavior, key extrapsychic factors (such as social support, quality and accessibility of health care measures, and situational factors) all have direct and independent effects on health behavior as well. Health promotion research and interventions which aim exclusively at intrapsychic determinants would thus have rather limited overall value. The article discusses key research and policy implications of the model presented.

  1. Climate Model Evaluation in Distributed Environments.

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.

    2014-12-01

    As the volume of climate-model-generated and observational data increases, it has become infeasible to perform large-scale comparisons of model output against observations by moving the data to a central location. Data reduction techniques, such as gridding or subsetting, can reduce data volume, but also sacrifice information about spatial and temporal variability that may be important for the comparison. Alternatively, it is generally recognized that "moving the computation to the data" is more efficient for leveraging large data sets. In the spirit of the latter approach, we describe a new methodology for comparing time series structure in model-generated and observational time series when those data are stored on different computers. The method involves simulating the sampling distribution of the difference between a statistic computed from the model output and the same statistic computed from the observed data. This is accomplished with separate wavelet decompositions of the two time series on their respective local machines, and the transmission of only a very small set of information computed from the wavelet coefficients. The smaller that set is, the cheaper it is to transmit, but also the less accurate will be the result. From the standpoint of the analysis of distributed data, the main question concerns the nature of that trade-off. In this talk, we describe the comparison methodology and the results of some preliminary studies on the cost-accuracy trade-off.
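    The "move the computation to the data" idea above can be sketched as follows. Each site decomposes its own time series locally and transmits only a small summary computed from the wavelet coefficients; the comparison statistic is then formed from the two summaries. The Haar decomposition, the choice of per-scale coefficient variances as the summary, and all names here are illustrative assumptions, not the actual QuakeSim/JPL implementation.

```python
# Hypothetical sketch: local Haar wavelet summaries, remote comparison.

def haar_decompose(x):
    """Return per-scale lists of Haar detail coefficients (len(x) = 2**k)."""
    scales = []
    approx = list(x)
    while len(approx) > 1:
        details = [(approx[i] - approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        approx = [(approx[i] + approx[i + 1]) / 2 for i in range(0, len(approx), 2)]
        scales.append(details)
    return scales

def local_summary(x):
    """The small message each site transmits: detail variance per scale."""
    summary = []
    for details in haar_decompose(x):
        mean = sum(details) / len(details)
        summary.append(sum((d - mean) ** 2 for d in details) / len(details))
    return summary

def compare(summary_a, summary_b):
    """Difference statistic computed only from the transmitted summaries."""
    return sum(abs(a - b) for a, b in zip(summary_a, summary_b))

model_series = [float(i % 8) for i in range(64)]      # stand-in for model output
obs_series = [float(i % 8) + 0.5 for i in range(64)]  # stand-in for observations
stat = compare(local_summary(model_series), local_summary(obs_series))
print(stat)
```

    Because the summary captures variability per scale rather than mean level, the constant offset between the two toy series does not register: the statistic compares time-series structure, which is the point of retaining wavelet information instead of gridded averages.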

  2. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    SciTech Connect

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next-Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study, individually and in combination, creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide-dispersion-strengthened (ODS) steel system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  3. A model of the energetic ion environment of Mars

    NASA Technical Reports Server (NTRS)

    Luhmann, J. G.; Schwingenschuh, K.

    1990-01-01

    Because Mars has a weak intrinsic magnetic field and a substantial atmosphere, instruments on orbiting spacecraft should detect a population of energetic heavy planetary ions which result from comet-like ion pickup in the solar wind and magnetosheath convection electric fields, in addition to those that might result from processes internal to a Martian 'magnetosphere.' Although this ion exosphere has been previously discussed in the literature, detailed predictions that might be directly applied to the interpretation of data are not available. Here a test particle model is used to construct a global picture of Martian pickup ions in the Mars environment. The model makes use of the recent Nagy and Cravens (1988) model of the Martian exosphere and Spreiter and Stahara's (1980) gas dynamic model of the magnetosheath. The pickup of ions originating at Phobos is also considered. Notable properties of the resulting ion distributions include their near-monoenergetic spectra, pancake pitch angle distributions, and large gyroradii compared to the planetary scale.

  4. Molecular and dust scattering processes in astrophysical environments

    NASA Astrophysics Data System (ADS)

    Lupu, Roxana-Elena

    2009-06-01

    Understanding the formation and evolution of structure in the universe requires knowledge of the stellar energy output and its processing by gas and dust, evaluating the abundances of atomic and molecular species, and constraining thermodynamic parameters. Molecules, with molecular hydrogen and carbon monoxide being the most abundant, are a major component of the interstellar medium, and play an essential role in structure formation, by participating in gas cooling. Molecular fluorescence studies aim to provide a better interpretation of far-ultraviolet observations, constraining the molecular abundances and their interaction with the radiation field. The fluorescent emission lines offer a set of diagnostics for molecules complementary to absorption line spectroscopy and to observations at infrared and radio wavelengths, but are often poorly reproduced by models. In this work, I have developed and expanded fluorescence models for molecular hydrogen and carbon monoxide, and employed them in determining the spatial distribution of CO in cometary comae, in characterizing the effects of partial frequency redistribution for emission line scattering in planetary atmospheres and reflection nebulae, and in abundance determinations from Bowen fluorescence lines of H 2 in planetary nebulae. Follow-up optical and infrared observations were used in addition to UV data to diagnose molecular excitation, temperature, and spatial distribution in planetary nebula M27. Knowledge of the spectral energy distribution of the exciting stars in the far- ultraviolet is essential in constraining both the fluorescence models and understanding the scattering properties of nebular gas and dust. Sounding rocket observations of the Trifid and Orion nebulae, performed as part of this work, provided the necessary dynamic range and spatial resolution to measure simultaneously the nebular scattered light and the spectral energy distribution of the illuminating stars. 
These low extinction sight lines

  5. Process modeling and control in foundry operations

    NASA Astrophysics Data System (ADS)

    Piwonka, T. S.

    1989-02-01

    Initial uses of process modeling were limited to phenomenological descriptions of the physical processes in foundry operations, with the aim of decreasing scrap and rework. It is now clear that process modeling can be used to select, design and optimize foundry processes so that on-line process control can be achieved. Computational, analogue and empirical process models have been developed for sand casting operations, and they are being applied in the foundry with beneficial effects.

  6. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources play an important role, yet in practice their activities are rarely considered by Internet resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, which are the goals of their informational search, more quickly. This paper presents a model that analyzes the behavior of the user audience in order to determine their goals within a particular hypertext segment, and finds optimal routes for reaching those goals in terms of route length and informational value. A potential application of the proposed model is in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.
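    The route-finding step described above, trading off route length against informational value, can be sketched as a shortest-path search over the hypertext graph. The cost function (one hop minus a weighted value term) and all names here are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: Dijkstra over a hypertext graph where each hop
# costs 1, discounted by the informational value of the target page.
import heapq

def best_route(links, value, start, goal, value_weight=0.5):
    """Return the cheapest route from start to goal, or None."""
    best = {start: 0.0}
    queue = [(0.0, start, [start])]
    while queue:
        cost, page, route = heapq.heappop(queue)
        if page == goal:
            return route
        for nxt in links.get(page, []):
            step = 1.0 - value_weight * value.get(nxt, 0.0)
            new_cost = cost + max(step, 0.0)  # keep edge costs non-negative
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(queue, (new_cost, nxt, route + [nxt]))
    return None

links = {"home": ["news", "docs"], "news": ["docs"], "docs": ["api"]}
value = {"docs": 0.8, "api": 1.0}
print(best_route(links, value, "home", "api"))
```

    With the toy graph above the search prefers the high-value `docs` page over `news`, illustrating how informational value steers users toward shorter, more useful routes.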

  7. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  8. An invariance principle for reversible Markov processes. Applications to random motions in random environments

    SciTech Connect

    De Masi, A.; Ferrari, P.A.; Goldstein, S.; Wick, W.D. )

    1989-05-01

    The authors present an invariance principle for antisymmetric functions of a reversible Markov process which immediately implies convergence to Brownian motion for a wide class of random motions in random environments. They apply it to establish convergence to Brownian motion (i) for a walker moving in the infinite cluster of the two-dimensional bond percolation model, (ii) for a d-dimensional walker moving in a symmetric random environment under very mild assumptions on the distribution of the environment, (iii) for a tagged particle in a d-dimensional symmetric lattice gas which allows interchanges, (iv) for a tagged particle in a d-dimensional system of interacting Brownian particles. Their formulation also leads naturally to bounds on the diffusion constant.
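    Case (ii) above, a walker in a symmetric random environment, can be illustrated with a toy simulation: a nearest-neighbour walk on Z among random bond conductances (a reversible Markov chain), whose mean-squared displacement grows linearly in time, consistent with convergence to Brownian motion. This diffusive-scaling check is a numerical sketch only, not the paper's proof or construction, and all parameters are assumptions.

```python
# Hypothetical sketch: random conductance model on Z; check that the
# mean-squared displacement scales linearly in the number of steps.
import random

random.seed(1)
conductance = {}  # lazily generated symmetric bond conductances

def cond(i):
    """Conductance of the bond between sites i and i+1."""
    if i not in conductance:
        conductance[i] = random.uniform(0.5, 1.5)
    return conductance[i]

def walk(steps):
    """One trajectory: jump left/right with probability ~ bond conductance."""
    x = 0
    for _ in range(steps):
        left, right = cond(x - 1), cond(x)
        x += 1 if random.random() < right / (left + right) else -1
    return x

n_walkers, steps = 2000, 400
msd = sum(walk(steps) ** 2 for _ in range(n_walkers)) / n_walkers
# Diffusive behavior: msd is of order `steps`, not steps**2.
print(steps / 4 < msd < steps * 4)
```

    A ballistic walk would give a mean-squared displacement of order `steps**2` (160000 here); the observed value of order `steps` is the diffusive signature the invariance principle guarantees.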

  9. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time. The program has been combined with ''neural network'' programs to enable very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the ''Cupola Handbook'', Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  10. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop, and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities, and communication between researchers. Using this innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis, and 3D visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information, for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  11. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  12. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  13. Exascale Co-design for Modeling Materials in Extreme Environments

    SciTech Connect

    Germann, Timothy C.

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  14. Trust Model to Enhance Security and Interoperability of Cloud Environment

    NASA Astrophysics Data System (ADS)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose different providers' services and resources in heterogeneous domains can cooperate. The model is domain-based: it groups one cloud provider's resource nodes into the same domain and sets a trust agent. It distinguishes two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.

  15. Analytical Model For Fluid Dynamics In A Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Naumann, Robert J.

    1995-01-01

    Report presents an analytical approximation methodology for solving coupled fluid-flow, heat-, and mass-transfer equations in a microgravity environment. Engineering estimates accurate to within a factor of 2 can be made quickly and easily, eliminating the need for time-consuming and costly numerical modeling. Any proposed experiment can be reviewed to see how it would perform in a microgravity environment. The model has been applied in a commercial setting for preliminary design of low-Grashof/Rayleigh-number experiments.

  16. Analog modelling of obduction processes

    NASA Astrophysics Data System (ADS)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. 
Displacements, together with along-strike and across-strike internal deformation in all

  17. LIGHT-INDUCED PROCESSES AFFECTING ENTEROCOCCI IN AQUATIC ENVIRONMENTS

    EPA Science Inventory

    Fecal indicator bacteria such as enterococci have been used to assess contamination of freshwater and marine environments by pathogenic microorganisms. Various past studies have shown that sunlight plays an important role in reducing concentrations of culturable enterococci and ...

  18. Sensitivity of UO2 Stability in a Reducing Environment on Radiolysis Model Parameters

    SciTech Connect

    Wittman, Richard S.; Buck, Edgar C.

    2012-09-01

    Results for a radiolysis model sensitivity study of radiolytically produced H2O2 are presented as they relate to Spent (or Used) Light Water Reactor uranium oxide (UO2) nuclear fuel (UNF) oxidation in a low oxygen environment. The model builds on previous reaction kinetic studies to represent the radiolytic processes occurring at the nuclear fuel surface. Hydrogen peroxide (H2O2) is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment.
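    The kind of kinetic balance such a radiolysis model solves can be sketched as a single rate equation: H2O2 is produced at a rate proportional to the dose rate and consumed by a first-order reaction, so the concentration relaxes toward the steady state g·D/k. The rate constants below are illustrative placeholders, not values from the report, and the one-species balance is a drastic simplification of the full reaction network.

```python
# Hypothetical one-species sketch of a radiolytic H2O2 balance,
# integrated by forward Euler. All constants are assumed, not the report's.

g_yield = 1.0e-7    # mol/(L*s) production per unit dose rate (assumed)
dose_rate = 1.0     # relative dose rate (assumed)
k_consume = 1.0e-3  # 1/s first-order consumption (assumed)

def h2o2_concentration(t_end, dt=1.0):
    """Integrate dc/dt = g*D - k*c from c(0) = 0 to time t_end."""
    c, t = 0.0, 0.0
    while t < t_end:
        c += (g_yield * dose_rate - k_consume * c) * dt
        t += dt
    return c

steady = g_yield * dose_rate / k_consume  # analytic steady state g*D/k
print(h2o2_concentration(20000.0), steady)
```

    After about twenty relaxation times (1/k = 1000 s each) the numerical solution is indistinguishable from the analytic steady state, which is why sensitivity studies like the one above can focus on how the steady-state oxidant level responds to the rate parameters.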

  19. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings, and potential switching frequencies compared with its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state-resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  20. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
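As a rough illustration of the pair-wise "random sampling in time" idea (not the LEGEND algorithm itself, whose collision rates come from orbit propagation and cross-section estimates), each object pair can be assigned a per-step collision probability decided by a Bernoulli draw; the pair names and rates below are hypothetical:

```python
import random

def sample_collisions(pair_rates, dt, seed=None):
    """One Monte Carlo time step: each pair's collision rate [1/yr] times
    dt [yr] gives a Bernoulli collision probability for that step."""
    rng = random.Random(seed)
    events = []
    for pair_id, rate in pair_rates.items():
        p = min(rate * dt, 1.0)          # clamp to a valid probability
        if rng.random() < p:
            events.append(pair_id)
    return events

# Hypothetical per-pair collision rates (1/yr) for three tracked objects.
rates = {("sat_a", "rb_1"): 1e-4,
         ("sat_a", "frag_7"): 5e-5,
         ("rb_1", "frag_7"): 2e-6}
hits = sample_collisions(rates, dt=1.0, seed=1)
```

Sampling pair by pair at each step is what lets such a model track the rapidly changing population rather than assuming fixed long-term collision rates.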

  1. Integrated approaches to the application of advanced modeling technology in process development and optimization

    SciTech Connect

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  2. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
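A minimal sketch of the kind of correlation computation such an analysis rests on, using hypothetical assessment ratings rather than the paper's empirical data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length rating vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 1-5 assessment ratings for two practices across six projects;
# a high correlation suggests the practices should be improved together.
practice_a = [2, 3, 3, 4, 4, 5]
practice_b = [2, 2, 3, 4, 5, 5]
r = pearson(practice_a, practice_b)
```

Ratings that rise and fall together across projects yield a correlation near 1, flagging process elements whose improvement plans should be coordinated.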

  3. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  4. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  5. Towards a new model of the interplanetary meteoroid environment

    NASA Astrophysics Data System (ADS)

    Dikarev, Valeri; Jehn, Rüdiger; Grün, Eberhard

    Improved models of the interplanetary meteoroid environment enjoy the interest of both spacecraft engineers and dust researchers. The engineers need them for risk assessments for their spacecraft instruments. Modelling the dynamical and collisional evolution of interplanetary dust should lead to a match with observations, and an empirical model can be a good mediator between physical models and sparse observational data. Our current effort is directed towards the construction of a new model of the interplanetary meteoroid environment based on a number of observational data sets, including in-situ dust flux measurements onboard spacecraft, radar meteor surveys and thermal emission of zodiacal dust. In contrast to earlier models, we use long-term particle dynamics to define populations for the new model. Based on these populations, we have constructed a prototype model which reasonably fits in-situ impact counts by the Galileo and Ulysses dust experiments.

  6. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2008-01-01

    This report presents a pilot study integrating particle swarm algorithms, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups to achieve efficient collective searching behavior in the adaptive environment.
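The particle swarm metaphor invoked here follows the standard PSO update rule; below is a minimal one-dimensional sketch with conventional (not paper-specific) coefficient values:

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One particle-swarm update (1-D): inertia plus random pulls toward
    each particle's personal best and the group's global best."""
    rng = rng or random.Random()
    for i in range(len(positions)):
        r1, r2 = rng.random(), rng.random()
        velocities[i] = (w * velocities[i]
                         + c1 * r1 * (pbest[i] - positions[i])
                         + c2 * r2 * (gbest - positions[i]))
        positions[i] += velocities[i]
    return positions, velocities

# Three searchers pulled toward a shared target at 0.0.
xs, vs = pso_step([2.0, -1.0, 0.5], [0.0, 0.0, 0.0],
                  pbest=[1.0, -0.5, 0.5], gbest=0.0, rng=random.Random(3))
```

The shared `gbest` term is the communication channel between group members; the study's finding concerns what happens when that sharing across groups is restricted.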

  7. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2007-01-01

    This report presents a pilot study integrating particle swarm algorithms, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups to achieve efficient collective searching behavior in the adaptive environment.

  8. Sensitivity analysis of the orbital debris environment using the evolve 4.0 model

    NASA Astrophysics Data System (ADS)

    Reynolds, R.; Eichler, P.; Bade, A.; Krisko, P.; Johnson, N.

    1999-01-01

    A number of models to describe the current and future orbital debris environments have been developed at the National Aeronautics and Space Administration Lyndon B. Johnson Space Center. One of these models, EVOLVE, is a complex simulation model that uses future space traffic, fragmentations, and nonfragmentation processes to predict future environments for debris 1 mm in diameter and larger. New breakup models incorporating new data on orbiting fragmentation debris, as well as new data from laboratory tests, are being developed for use by EVOLVE. These models will have different size, area-to-mass, and velocity distributions than in the current baselines. With the inclusion of the new breakup models, EVOLVE will be upgraded to version 4.0. Because there is limited data on debris sources and uncertainty in the importance of these sources in future debris environment evolution, it is important to understand the sensitivity of environment projections to these uncertainties. To calculate the sensitivity of the environment to characteristics of the debris sources, alternative environment projections will be obtained by making a series of modifications to the nominal source characteristics in EVOLVE. These modifications (e.g., to the traffic model and postmission disposal model) and the planned sensitivity study framework are described in this paper. Metrics for determining change in the environment are also defined in the paper and used to discuss sensitivities.

  9. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  10. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. 
The model development, input data, sensitivity and validation studies described in

  11. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal

    2001-04-16

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. 
The model development, input data, sensitivity and validation studies described in this AMR are required

  12. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    ERIC Educational Resources Information Center

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  13. A seismic modelling environment as a research and teaching tool for 3-D subsurface modelling

    NASA Astrophysics Data System (ADS)

    Burford, Dennis J.; Ger, Larry; Blake, Edwin H.; de Wit, Maarten J.; Doucouré, C. Moctar; Hart, Roger J.

    Early geological modelling and visualisation techniques were limited to manual cross-sections or isometric perspectives. Computer modelling has automated this task to a certain degree, but traditional approaches do not allow iterative validation during the modelling process. When the structure is complex and data sparse, as is often the case in geology, interactive 3-D modelling techniques should be employed that can interrogate new and existing data, guided by the geological experience of the modeller. Using the Vredefort dome in South Africa as a case study, we describe a Seismic Modelling Environment (SME) to demonstrate the potential of this type of computer-based modelling and geological visualisation. SME offers a novel approach to interactive 3-D modelling of complex geological structures using an extension of sweep representations and user-controlled forward modelling with seismic analysis for validation. Incorporation of validation techniques allows early confirmation or rejection of models. Tested by a group of third-year geology students, SME's iterative construction and exploration of a 3-D model clearly provided users with a superior understanding through visualisation. SME has, therefore, potential both as an educational as well as a research tool.

  14. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  15. NoteCards: A Multimedia Idea Processing Environment.

    ERIC Educational Resources Information Center

    Halasz, Frank G.

    1986-01-01

    Notecards is a computer environment designed to help people work with ideas by providing a set of tools for a variety of specific activities, which can range from sketching on the back of an envelope to formally representing knowledge. The basic framework of this hypermedia system is a semantic network of electronic notecards connected by…

  16. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both manned and unmanned space flight payload simulation and training. Significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model in accepting future modification. Results of this effort have suggested that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  17. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  18. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  19. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  20. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
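A first-order (memoryless) birth/death chain of this kind can be sketched in a few lines; the probabilities below are illustrative, not parameters of the original program:

```python
import random

def simulate_birth_death(n0, p_birth, p_death, steps, seed=0):
    """Discrete-time birth/death chain: each step holds at most one birth
    (prob p_birth) or one death (prob p_death, only while n > 0)."""
    rng = random.Random(seed)
    n, history = n0, [n0]
    for _ in range(steps):
        u = rng.random()
        if u < p_birth:
            n += 1
        elif u < p_birth + p_death and n > 0:
            n -= 1
        history.append(n)
    return history

traj = simulate_birth_death(n0=10, p_birth=0.3, p_death=0.3, steps=100)
```

Because the next population depends only on the current one, the chain is first-order Markov, which is what makes such models compact enough for interactive use.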

  1. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  2. Supporting Inquiry Processes with an Interactive Learning Environment: Inquiry Island

    NASA Astrophysics Data System (ADS)

    Eslinger, Eric; White, Barbara; Frederiksen, John; Brobst, Joseph

    2008-12-01

    This research addresses the effectiveness of an interactive learning environment, Inquiry Island, as a general-purpose framework for the design of inquiry-based science curricula. We introduce the software as a scaffold designed to support the creation and assessment of inquiry projects, and describe its use in a middle-school genetics unit. Students in the intervention showed significant gains in inquiry skills. We also illustrate the power of the software to gather and analyze qualitative data about student learning.

  3. Inquiry, play, and problem solving in a process learning environment

    NASA Astrophysics Data System (ADS)

    Thwaits, Anne Y.

    What is the nature of art/science collaborations in museums? How do art objects and activities contribute to the successes of science centers? Based on the premise that art exhibitions and art-based activities engage museum visitors in different ways than do strictly factual, information-based displays, I address these questions in a case study that examines the roles of visual art and artists in the Exploratorium, a museum that has influenced exhibit design and professional practice in many of the hands-on science centers in the United States and around the world. The marriage of art and science in education is not a new idea---Leonardo da Vinci and other early polymaths surely understood how their various endeavors informed one another, and some 20th century educators understood the value of the arts and creativity in the learning and practice of other disciplines. When, in 2010, the National Science Teachers Association added an A to the federal government's ubiquitous STEM initiative and turned it into STEAM, art educators nationwide took notice. With a heightened interest in the integration of and collaboration between disciplines comes an increased need for models of best practice for educators and educational institutions. With the intention to understand the nature of such collaborations and the potential they hold, I undertook this study. I made three site visits to the Exploratorium, where I took photos, recorded notes in a journal, interacted with exhibits, and observed museum visitors. I collected other data by examining the institution's website, press releases, annual reports, and fact sheets; and by reading popular and scholarly articles written by museum staff members and by independent journalists. 
I quickly realized that the Exploratorium was not created in the way that most museums are, and the history of its founding and the ideals of its founder illuminate what was then and continues now to be different about this museum from most others in the

  4. A Comparative Analysis of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    Many business process modelling techniques are available today. This article examines the differences between these techniques, explaining the definition and structure of each. It presents a comparative analysis of several popular business process modelling techniques, using a framework based on two criteria: notation, and how each technique works when implemented for Somerleyton Animal Park. The discussion of each technique closes with its advantages and disadvantages. The conclusion recommends business process modelling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  5. Charged Particle Environment Definition for NGST: Model Development

    NASA Technical Reports Server (NTRS)

    Blackwell, William C.; Minow, Joseph I.; Evans, Steven W.; Hardage, Donna M.; Suggs, Robert M.

    2000-01-01

    NGST will operate in a halo orbit about the L2 point, 1.5 million km from the Earth, where the spacecraft will periodically travel through the magnetotail region. There are a number of tools available to calculate the high-energy, ionizing radiation particle environment from galactic cosmic rays and from solar disturbances. However, space environment tools are not generally available to provide assessments of the charged particle environment and its variations in the solar wind, magnetosheath, and magnetotail at L2 distances. An engineering-level phenomenology code (LRAD) was therefore developed to facilitate the definition of charged particle environments in the vicinity of the L2 point in support of the NGST program. LRAD contains models tied to satellite measurement data of the solar wind and magnetotail regions. The model provides particle flux and fluence calculations necessary to predict spacecraft charging conditions and the degradation of materials used in the construction of NGST. This paper describes the LRAD environment models for the deep magnetotail (XGSE < -100 Re) and solar wind, and presents predictions of the charged particle environment for NGST.

  6. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2013-10-17

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:24144977
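The gamma-Poisson construction at the heart of the NB process can be illustrated directly: marginalizing a gamma-distributed Poisson rate yields negative binomial counts. A minimal sketch of this standard result (not the paper's full inference machinery):

```python
import math
import random

def poisson_knuth(lam, rng):
    """Poisson draw via Knuth's multiplication method (fine for small lam)."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

def nb_sample(r, p, rng):
    """NB(r, p) count by marginalizing a gamma-distributed Poisson rate:
    lam ~ Gamma(shape=r, scale=p/(1-p)), then count ~ Poisson(lam)."""
    lam = rng.gammavariate(r, p / (1.0 - p))
    return poisson_knuth(lam, rng)

rng = random.Random(7)
draws = [nb_sample(r=2.0, p=0.5, rng=rng) for _ in range(5000)]
mean = sum(draws) / len(draws)   # NB mean is r * p / (1 - p) = 2.0 here
```

The dispersion parameter r and probability parameter p jointly set the mean and variance, which is why the abstract stresses inferring both.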

  7. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:26353243

  8. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  9. Knowledge-based environment for hierarchical modeling and simulation

    SciTech Connect

    Kim, Taggon.

    1988-01-01

    This dissertation develops a knowledge-based environment for hierarchical modeling and simulation of discrete-event systems as the major part of a larger, ongoing research project in artificial intelligence and distributed simulation. In developing the environment, a knowledge representation framework for modeling and simulation, which unifies structural and behavioral knowledge of simulation models, is proposed by incorporating knowledge-representation schemes from artificial intelligence within simulation models. The knowledge base created using the framework is composed of a structural knowledge base called the entity structure base and a behavioral knowledge base called the model base. DEVS-Scheme, a realization of the DEVS (Discrete Event System Specification) formalism in a LISP-based, object-oriented environment, is extended to facilitate the specification of behavioral knowledge of models, especially for kernel models suited to modeling massively parallel computer architectures. ESP-Scheme, a realization of the entity structure formalism in a frame-theoretic representation, is extended to represent structural knowledge of models and to manage it in the structural knowledge base.

  10. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…

  11. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  12. Microstructure in the extreme environment: understanding and predicting dynamic damage processes

    SciTech Connect

    Dennis-koller, Darcie L; Cerreta, Ellen K; Bronkhorst, Curt A; Escobedo-diaz, Juan P

    2010-12-21

    The future of materials science, with its strategic focus on functionally controlled materials properties, is driven by the need to control material performance in extreme environments. To this end, this study separates the effects of kinetics (in the form of dynamic loading rate and shock wave shape) from those of length scale (in the form of microstructural defect distributions). Recently available mesoscale modeling techniques are being used to capture a physical link between kinetic and length-scale influences on dynamic loading. This work contributes innovative tools in the form of shock-wave shaping techniques in dynamic experimentation, materials characterization that lends insight into 3D damage field analysis at micron resolution, and the physics necessary to provide predictive capabilities for dynamic damage evolution. Experimental results tailored for the discrete understanding of length-scale and kinetic effects during dynamic loading are obtained to provide the basis for the development of process-aware material performance models. The understanding of length-scale and kinetic effects in extreme environments of dynamic loading advances the understanding of current emerging issues relevant to phenomena such as inclusion-related failure in metals, grain-size dependence of ejecta, and the benefits of interfaces in mitigating defect development, specifically driven by the need to tailor material response. Finally, the coupling of experimental techniques with theory and simulation is aimed at advancing process-aware damage modeling as well as transitioning materials science from observation to property control.

  13. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  14. Omega: An Object-Oriented Image/Symbol Processing Environment

    NASA Astrophysics Data System (ADS)

    Carlotto, Mark J.; Fong, Jennifer B.

    1989-01-01

    A Common Lisp software system to support integrated image and symbolic processing applications is described. The system, termed Omega, is implemented on a Symbolics Lisp Machine and is organized into modules to facilitate the development of user applications and to support software transportability. An object-oriented programming language similar to Symbolics Zetalisp/Flavors is implemented in Common Lisp and is used for creating symbolic objects known as tokens. Tokens are used to represent images, significant areas in images, and regions that define the spatial extent of the significant areas. The extent of point, line, and areal features is represented by polygons, label maps, boundary points, row- and column-oriented run-length encoded rasters, and bounding rectangles. Macros provide a common means for image processing functions and spatial operators to access spatial representations. The implementation of image processing, segmentation, and symbolic processing functions within Omega is described.
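    The row-oriented run-length-encoded raster representation mentioned above is easy to illustrate. The sketch below (a generic illustration, not Omega's Lisp implementation) encodes one binary raster row as (start_column, run_length) pairs:

```python
def rle_row(row):
    # Row-oriented run-length encoding of a binary raster row:
    # emit (start_column, length) for each contiguous run of 1s.
    runs, start = [], None
    for i, v in enumerate(row):
        if v and start is None:
            start = i                      # a run of 1s begins here
        elif not v and start is not None:
            runs.append((start, i - start))
            start = None
    if start is not None:                  # run extends to the end of the row
        runs.append((start, len(row) - start))
    return runs

print(rle_row([0, 1, 1, 0, 1]))  # [(1, 2), (4, 1)]
```

    Storing only the runs, rather than every pixel, is what makes this representation compact for the large, mostly empty label maps such systems manipulate.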

  15. Large urban fire environment. Trends and model city predictions

    SciTech Connect

    Larson, D.A.; Small, R.D.

    1982-01-01

    The urban fire environment that would result from a megaton-yield nuclear weapon burst is considered. The dependence of temperatures and velocities on fire size, burning intensity, turbulence, and radiation is explored, and specific calculations for three model urban areas are presented. In all cases, high velocity fire winds are predicted. The model-city results show the influence of building density and urban sprawl on the fire environment. Additional calculations consider large-area fires with the burning intensity reduced in a blast-damaged urban center.

  16. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  17. Implementation of CCNUGrid-based Computational Environment for Molecular Modeling

    NASA Astrophysics Data System (ADS)

    Liu, Kai; Luo, Changhua; Ren, Yanliang; Wan, Jian; Xu, Xin

    2007-12-01

    Grid computing technology has been regarded as one of the most promising solutions for the tremendous requirement of computing resources in the field of molecular modeling to date. In contrast to building an ever more powerful super-computer with novel hardware in a local network, grid technology enables us, in principle, to integrate various previous and present computing resources located in different locations into a computing platform as a whole. As a case demonstration, we report herein that a campus grid entitled CCNUGrid was implemented with grid middleware, consisting of four local computing networks distributed in the College of Chemistry, the College of Physics, the Center for Network, and the Center for Education Information Technology and Engineering, respectively, at Central China Normal University. Visualization functions for monitoring the computing machines in each local network, the job processing flow, and the computational results were realized in this campus grid-based computational environment, in addition to the conventional components of grid architecture: universal portal, task management, computing node and security. In the last section of this paper, a molecular docking-based virtual screening study performed on the CCNUGrid is described as one example of CCNUGrid applications.

  18. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed. PMID:27280903

  19. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with perceived risk in the UK appearing to have increased in recent years as surface water flood events have become more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation are limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data upon which numerical models are based are often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment in which to collect data, creating a controlled, closed system where variables can be altered independently to investigate cause-and-effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. 
Scaled laboratory experiments using a 9m2, two-tiered 1:100 physical model consisting of: (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and; (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled
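    The rainfall-runoff response such a scaled model reproduces can be caricatured with the simplest lumped conceptual model, a single linear reservoir in which storage drains in proportion to its volume (the storage coefficient k below is illustrative, not calibrated to the experiment):

```python
def linear_reservoir(rainfall, k, dt=1.0):
    # Minimal lumped rainfall-runoff sketch: storage S fills with rainfall
    # and drains as discharge Q = S / k each time step.
    S, runoff = 0.0, []
    for r in rainfall:
        S += r * dt          # rainfall input adds to catchment storage
        q = S / k            # outflow proportional to current storage
        S -= q * dt
        runoff.append(q)
    return runoff

# a unit pulse of rain produces an exponentially recessing hydrograph
print(linear_reservoir([1, 0, 0, 0], k=2.0))  # [0.5, 0.25, 0.125, 0.0625]
```

    Even this toy shows the qualitative behaviour (a lagged, attenuated hydrograph) that the physical model measures directly under controlled rainfall.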

  20. Using process groups to implement failure detection in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1991-01-01

    Agreement on the membership of a group of processes in a distributed system is a basic problem that arises in a wide range of applications. Such groups occur when a set of processes cooperate to perform some task, share memory, monitor one another, subdivide a computation, and so forth. The group membership problem is discussed as it relates to failure detection in asynchronous, distributed systems. A rigorous, formal specification for group membership is presented under this interpretation. A solution is then presented for this problem.
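    A common building block for such failure detection is a timeout-based heartbeat monitor. The sketch below is a generic illustration (the class and method names are invented, not from the paper); note that in a truly asynchronous system such a detector can only suspect failures, which is why membership ultimately requires an agreement protocol:

```python
class HeartbeatDetector:
    """Suspect processes whose last heartbeat is older than `timeout`."""

    def __init__(self, timeout):
        self.timeout = timeout
        self.last_seen = {}

    def heartbeat(self, pid, now):
        # record that process `pid` was observed alive at time `now`
        self.last_seen[pid] = now

    def suspects(self, now):
        # processes silent longer than the timeout are merely *suspected*:
        # in an asynchronous system a slow process is indistinguishable
        # from a crashed one
        return {p for p, t in self.last_seen.items() if now - t > self.timeout}

d = HeartbeatDetector(timeout=1.0)
d.heartbeat("a", 0.0)
d.heartbeat("b", 0.5)
print(d.suspects(1.2))  # {'a'}
```

    Agreement among the surviving members on which suspicions to act upon is what turns these local suspicions into a consistent group membership view.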

  1. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments.

    PubMed

    Vogt, Frank; White, Lauren

    2015-03-31

    Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute, for instance, to counter-balancing anthropogenic releases of the greenhouse gas CO2. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on nutrient availability. Therefore, it is expected that algae's sequestration capabilities and productivity are also determined by the cells' chemical environments. To investigate this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments, followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights into which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass adaptations. Future investigations will utilize these instrumental and chemometric methodologies for quantitative investigations of the relation between chemical environments and microalgal sequestration capabilities. PMID:25813024

  3. Commercial applications in biomedical processing in the microgravity environment

    NASA Astrophysics Data System (ADS)

    Johnson, Terry C.; Taub, Floyd

    1995-01-01

    A series of studies have shown that a purified cell regulatory sialoglycopeptide (CeReS) that arrests cell division and induces cellular differentiation is fully capable of functionally interacting with target insect and mammalian cells in the microgravity environment. Data from several shuttle missions suggest that the signal transduction events that are known to be associated with CeReS action function as well in microgravity as in ground-based experiments. The molecular events known to be associated with CeReS include an ability to interfere with Ca2+ metabolism, the subsequent alkalinization of cell cytosol, and the inhibition of the phosphorylation of the nuclear protein product encoded by the retinoblastoma (RB) gene. The ability of CeReS to function in microgravity opens a wide variety of applications in space life sciences.

  4. Modeling the Reading Process: Promise and Problems.

    ERIC Educational Resources Information Center

    Geyer, John J.

    The problems of modeling a process as complex as reading are discussed, including such factors as the lack of agreement surrounding definitions of modeling, varying levels of rigor within and between models, the disjunctive categories within which models fall, and the difficulty of synthesis across fields which employ very different technical…

  5. Social Skills of and Social Environments Produced by Different Holland Types: A Social Perspective on Person-Environment Fit Models.

    ERIC Educational Resources Information Center

    Wampold, Bruce E.; And Others

    1995-01-01

    Describes qualitative study of chemistry laboratory groups of undergraduates to explore notion that a critical aspect of the environment in person-environment models is the nature and density of the social interactions of the people in the environment. Holland's hexagonal model of personality types was the framework used to study related…

  6. Modeling aggregation of dust monomers in low gravity environments

    NASA Astrophysics Data System (ADS)

    Doyon, Julien; Rioux, Claude

    The modeling of aggregation phenomena in microgravity is of paramount relevance to the understanding of the formation of planets. Relevant experiments have been carried out at a ground-based laboratory and on aircraft providing low gravity during parabolic flight.1 Other possible environments are rockets, shuttles and the International Space Station. Numerical simulation of aggregation provides a tool to understand the formal and theoretical background of the phenomena, and the comparison between low-gravity experiments and modeling predictions may confirm a theory. Also, experiments that are hard to perform can be simulated on computers, allowing a vast choice of physical properties. Simulations to date have been constrained to ensembles of 100 to 1000 monomers.2 We have been able to extend such numbers to 10 000 monomers, and the final goal is about 100 000 monomers, where gravitational effects become relevant, yielding spheroidal systems of particles (planetesimals and planetoids). The simulations made are assumed to be diffusion processes in which colliding particles stick together with a certain probability. Future work shall include other interactions such as electrostatic or magnetic forces. Recent results are to be shown at the meeting. I acknowledge the support of the ELIPS program (jointly between the Canadian and European space agencies). The guidance of Prof. Slobodrian is warmly acknowledged. References: 1. R.J. Slobodrian, C. Rioux and J.-C. Leclerc, Microgravity Research and Applications in Physical Sciences and Biotechnology, Proceedings of the First International Symposium, Sorrento, Italy (2000), ESA SP-454, pp. 779-786, and refs. therein. 2. P. Deladurantaye, C. Rioux and R.J. Slobodrian, Chaos, Solitons & Fractals (1997), pp. 1693-1708. Carl Robert and Eric Litvak, software "Fractal", private communication.

  7. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
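    The idea of seeding stubs with values observed at runtime rather than defaults can be sketched in a few lines (a language-neutral toy, not JPF-Android's actual instrumentation; all names here are invented):

```python
import functools

recorded = {}  # method name -> list of (args, return value) seen at runtime

def record(fn):
    # instrumentation: log every call's arguments and return value
    @functools.wraps(fn)
    def wrapper(*args):
        result = fn(*args)
        recorded.setdefault(fn.__name__, []).append((args, result))
        return result
    return wrapper

def make_stub(name, default=None):
    # environment stub: replay logged return values in order, falling
    # back to a default once the log is exhausted
    calls = iter(recorded.get(name, []))
    def stub(*args):
        try:
            return next(calls)[1]
        except StopIteration:
            return default
    return stub

@record
def sensor_reading(sensor_id):   # stands in for a native library call
    return sensor_id * 2

sensor_reading(3)                # observed during an instrumented run
stub = make_stub("sensor_reading")
print(stub(3))  # 6  (replayed from the log, not recomputed)
```

    A stub built this way returns realistic values on the paths the instrumented run exercised, which is precisely how such an approach reaches code that empty default-returning stubs cause to crash or skip.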

  8. Modelling between Epistemological Beliefs and Constructivist Learning Environment

    ERIC Educational Resources Information Center

    Çetin-Dindar, Ayla; Kirbulut, Zübeyde Demet; Boz, Yezdan

    2014-01-01

    The purpose of this study was to model the relationship between pre-service chemistry teachers' epistemological beliefs and their preference to use constructivist-learning environment in their future class. The sample was 125 pre-service chemistry teachers from five universities in Turkey. Two instruments were used in this study. One of the…

  9. Compound Cue Processing in Linearly and Nonlinearly Separable Environments

    ERIC Educational Resources Information Center

    Hoffrage, Ulrich; Garcia-Retamero, Rocio; Czienskowski, Uwe

    2008-01-01

    Take-the-best (TTB) is a fast and frugal heuristic for paired comparison that has been proposed as a model of bounded rationality. This heuristic has been criticized for not taking compound cues into account to predict a criterion, although such an approach is sometimes required to make accurate predictions. By means of computer simulations, it is…
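    The take-the-best heuristic itself is simple to state in code: inspect cues in descending order of validity and let the first discriminating cue decide (a minimal sketch of the basic heuristic only; the compound-cue variants examined in the study are not shown):

```python
def take_the_best(a_cues, b_cues):
    # Cue values (1 = present, 0 = absent) for objects A and B,
    # already ordered from most valid cue to least valid.
    for ca, cb in zip(a_cues, b_cues):
        if ca != cb:                  # first discriminating cue decides
            return "A" if ca > cb else "B"
    return "guess"                    # no cue discriminates

print(take_the_best([1, 0, 1], [1, 1, 0]))  # "B": the second cue discriminates
```

    Because only one cue ever decides, TTB cannot exploit interactions between cues, which is exactly the limitation at issue in nonlinearly separable environments.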

  10. High performance medical image processing in client/server-environments.

    PubMed

    Mayer, A; Meinzer, H P

    1999-03-01

    As 3D scanning devices like computed tomography (CT) or magnetic resonance imaging (MRI) become more widespread, there is also an increasing need for powerful computers that can handle the enormous amounts of data with acceptable response times. We describe an approach to parallelize some of the more frequently used image processing operators on distributed memory architectures. It is desirable to make such specialized machines accessible on a network, in order to save costs by sharing resources. We present a client/server approach that is specifically tailored to the interactive work with volume data. Our image processing server implements a volume visualization method that allows the user to assess the segmentation of anatomical structures. We can enhance the presentation by combining the volume visualizations on a viewing station with additional graphical elements, which can be manipulated in real-time. The methods presented were verified on two applications for different domains. PMID:10094225

  11. Containerless processing of single crystals in low-G environment

    NASA Technical Reports Server (NTRS)

    Walter, H. U.

    1974-01-01

    Experiments on containerless crystal growth from the melt were conducted during Skylab missions SL3 and SL4 (Skylab Experiment M-560). Six samples of InSb were processed, one of them heavily doped with selenium. The concept of the experiment is discussed and related to general crystal growth methods and their merits as techniques for containerless processing in space. The morphology of the crystals obtained is explained in terms of volume changes associated with solidification and wetting conditions during solidification. All samples exhibit extremely well developed growth facets. Analysis by X-ray topographical methods and chemical etching shows that the crystals are of high structural perfection. Average dislocation density as revealed by etching is of the order of 100 per sq cm; no dislocation clusters could be observed in the space-grown samples. A sequence of striations that is observed in the first half of the selenium-doped sample is explained as being caused by periodic surface breakdown.

  12. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms that may run on different types of hardware and be written in languages other than that of VHP. The VHP system is of particular interest for systems in which the visualization of remote processes is required, such as robotics for telescience applications.

  13. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  14. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-08-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  15. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-11-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 μm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 Field Campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  16. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed to treat the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established in which the pavement surface properties significantly influence the occurrence of accidents. Results clearly showed that road environments where braking maneuvers are common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. PMID:23376544
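    The segment-cluster-fit pipeline described above can be sketched as follows. This is a minimal illustration on synthetic data: the feature values, the two-environment split, and the log-linear least-squares fit (a crude stand-in for the paper's generalized linear models) are all hypothetical.

```python
import numpy as np

# Hypothetical per-segment pavement features: skid resistance (SFC, -) and
# texture depth (mm). Values are illustrative, not taken from the paper.
rng = np.random.default_rng(0)
segments = np.vstack([
    rng.normal([0.35, 0.4], 0.03, (20, 2)),   # low-grip road environment
    rng.normal([0.55, 0.9], 0.03, (20, 2)),   # high-grip road environment
])

def kmeans(X, k=2, iters=20):
    """Minimal k-means: group segments into homogeneous road environments."""
    # seed one centre at X[0] and one at the point farthest from it
    far = np.argmax(((X - X[0]) ** 2).sum(axis=1))
    centers = X[[0, far]].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(segments)

# Within each environment, relate accident counts to skid resistance with a
# crude log-linear fit (a stand-in for a proper GLM with a log link).
accidents = rng.poisson(np.exp(2.0 - 3.0 * segments[:, 0]))
slopes = {}
for j in range(2):
    x = segments[labels == j, 0]
    A = np.vstack([np.ones_like(x), x]).T
    b = np.linalg.lstsq(A, np.log(accidents[labels == j] + 1.0), rcond=None)[0]
    slopes[j] = b[1]
    print(f"environment {j}: log-accident slope vs skid resistance = {b[1]:.2f}")
```

    Fitting each environment separately is what lets the pavement-surface effect differ between, say, braking-heavy and small-radius environments.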

  17. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid paradigms can be distinguished, e.g. advanced and adaptive case management. This article focuses on the less-explored declarative approach to process modelling. An outline of declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.
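    A minimal sketch of the declarative idea: rather than prescribing an activity sequence, the model states constraints (here two patterns from the Declare language, response and precedence), and any trace satisfying all of them is allowed. The event names are hypothetical.

```python
def response(a, b):
    """Every occurrence of a must eventually be followed by b."""
    def check(trace):
        pending = False
        for e in trace:
            if e == a:
                pending = True
            elif e == b:
                pending = False
        return not pending
    return check

def precedence(a, b):
    """b may occur only after a has occurred at least once."""
    def check(trace):
        seen_a = False
        for e in trace:
            if e == a:
                seen_a = True
            elif e == b and not seen_a:
                return False
        return True
    return check

# A declarative "model" is just a set of constraints, not a flow chart.
model = [response("register claim", "decide claim"),
         precedence("register claim", "pay claim")]

def compliant(trace):
    return all(c(trace) for c in model)

print(compliant(["register claim", "pay claim", "decide claim"]))  # True
print(compliant(["pay claim"]))                                    # False
```

    Everything not explicitly forbidden by a constraint is permitted, which is the flexibility argument made for the declarative paradigm.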

  18. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, SPC (statistical process control) based on CD (critical dimension) control in the lithography process becomes more difficult. The increasing requirements of a shrinking process window call for more accurate determination of the process window center. However, in practical fabrication we found that systematic error introduced by metrology and/or the resist process can significantly impact the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus exposure matrix (FEM), the model will fit these systematic errors rather than filter them out. This will definitely impact the process window analysis and the determination of the best process condition. In this paper, we propose to use a calibrated first-principles model for process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable window analysis result.
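    To make the critique concrete, here is a minimal sketch of the simple polynomial approach: fit a parabola to synthetic Bossung-style data (CD versus focus at fixed dose) and take the vertex as best focus. All coefficients and noise levels are illustrative, not from the paper.

```python
import numpy as np

# Synthetic focus-exposure-matrix (FEM) slice: CD (nm) versus focus (um) at
# fixed dose, following the parabolic behaviour a polynomial fit assumes.
rng = np.random.default_rng(1)
focus = np.linspace(-0.3, 0.3, 13)
true_center = 0.05
cd = 130.0 - 400.0 * (focus - true_center) ** 2 + rng.normal(0.0, 0.3, focus.size)

# The naive approach: fit a degree-2 polynomial and take its vertex.
c2, c1, c0 = np.polyfit(focus, cd, 2)
best_focus = -c1 / (2.0 * c2)
print(f"estimated best focus: {best_focus:+.3f} um (true: {true_center:+.3f} um)")
```

    Random noise largely averages out of such a fit; the paper's point is that a *systematic* metrology or resist signature would instead bias the fitted vertex, which is what motivates replacing the polynomial with a calibrated first-principles model.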

  19. Fusion Process Model Implementation Case Studies

    NASA Astrophysics Data System (ADS)

    Kaur, Rupinder; Sengupta, Jyotsna

    2012-07-01

    In this paper we discuss three case studies. The first was carried out at Web Shrub Solutions, a software development organization; the second at a web-based job portal (stepintojob.com) for a leading Indian firm; and the third covered web design and development for SCL Limited, to observe the results of the Fusion Process Model. The Fusion Process Model follows a component-driven approach; it applies the 3C Model to generalize the process of solving the problem in each phase, which provides firm control over the software development process.

  20. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
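    One common way to sketch a "molecular basis" distillation model under vacuum is the Langmuir (free-molecular) evaporation flux; whether this matches the authors' exact formulation is an assumption, and the vapour pressures and temperature below are illustrative placeholders, not measured values.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def langmuir_flux(p_sat, molar_mass, T):
    """Free-molecular (Langmuir) evaporation flux in mol m^-2 s^-1.

    Applicable when the vapour is removed fast enough that recondensation
    is negligible, i.e. the moderate-vacuum regime the abstract mentions.
    """
    return p_sat / math.sqrt(2.0 * math.pi * molar_mass * R * T)

# Illustrative inputs only: assumed saturation pressures (Pa) and molar
# masses (kg/mol) at a hypothetical 1100 K processing temperature.
species = {
    "LiCl": (200.0, 0.0424),
    "KCl":  (120.0, 0.0746),
    "U":    (1e-7,  0.2380),
}

for name, (p, M) in species.items():
    print(f"{name}: {langmuir_flux(p, M, 1100.0):.3e} mol m^-2 s^-1")
```

    The many-orders-of-magnitude gap between the chloride fluxes and the uranium flux is what makes the batch separation of salt from uranium feasible in principle.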

  1. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.

  2. Microstructure in the Extreme Environment: Understanding and Predicting Dynamic Damage Processes

    NASA Astrophysics Data System (ADS)

    Dennis-Koller, Darcie; Cerreta, Ellen; Bronkhorst, Curt; Escobedo-Diaz, Pablo; Lebensohn, Ricardo

    2013-03-01

    The need to control material performance in extreme environments underscores the future of materials science: the strategic application of functionally controlled materials properties. This study separates the effects of kinetics (in the form of dynamic loading rate and shock wave shape) from those of length scale (in the form of microstructural defect distributions). Recently available mesoscale modeling techniques are being used to capture a physical link between kinetic and length-scale influences on dynamic loading. This work contributes new tools in the form of shock-wave shaping techniques in dynamic experimentation and materials characterization, lending insight into 3D damage field analysis at micron resolution, together with the physics necessary to provide predictive capabilities for dynamic damage evolution. Experimental data are obtained to provide the basis for the development of process-aware material performance models.

  3. Mindseye: a visual programming and modeling environment for imaging science

    NASA Astrophysics Data System (ADS)

    Carney, Thom

    1998-07-01

    Basic vision science research has reached the point that many investigators are now designing quantitative models of human visual function in areas such as pattern discrimination, motion detection, optical flow, color discrimination, adaptation and stereopsis. These models have practical significance in their application to image compression technologies and as tools for evaluating image quality. We have been working on a vision modeling environment, called Mindseye, which is designed to simplify the implementation and testing of general-purpose spatio-temporal models of human vision. Mindseye is an evolving general-purpose vision-modeling environment that embodies the general structures of the visual system and provides a set of modular tools within a flexible platform tailored to the needs of researchers. The environment employs a user-friendly graphics interface with on-line documentation that describes the functionality of the individual modules. Mindseye, while functional, is still research in progress. We are seeking input from the image compression and evaluation community as well as from the vision science community as to the potential utility of Mindseye, and how it might be enhanced to meet future needs.

  5. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.

  6. Modeling and control for closed environment plant production systems.

    PubMed

    Fleisher, David H; Ting, K C

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal. PMID:12882224
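    The controller's cost minimisation can be sketched in closed form for a single linearised input. The plant gain, weights, and set points below are hypothetical; the actual controller adjusts three set points (light, temperature, CO2) against dynamic crop models.

```python
# Quadratic MPC-style cost for one linearised input (light intensity):
#   J(u) = q*(a + b*u - ref)**2 + w*(u - u_prev)**2
# i.e. squared tracking error plus weighted control effort, minimised
# analytically by setting dJ/du = 0. All numbers are illustrative.

def mpc_step(a, b, ref, u_prev, q=1.0, w=0.1):
    """Return the input u minimising the error/effort trade-off."""
    return (q * b * (ref - a) + w * u_prev) / (q * b * b + w)

a, b = 2.0, 0.05          # linearised growth response: rate = a + b * light
ref = 3.0                 # desired rate after an environmental disturbance
u_prev = 15.0             # current light set point
u = mpc_step(a, b, ref, u_prev)
print(f"adjusted set point: {u:.2f} (perfect tracking would need {(ref - a) / b:.0f})")
```

    Because the effort term penalises large moves away from the current set point, the controller deliberately stops short of the perfect-tracking input, which is the compensating-but-economical behaviour described above.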

  7. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of the current wearable devices. Furthermore, the significance of considering user's preferences, context conditions and device's capabilities help smart environments to personalize services and resources for them. Being aware of different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. Besides, we remark different ongoing standardization works in this area. We also discuss the used techniques, modeled characteristics and the advantages and drawbacks of each approach to finally draw several conclusions about the reviewed works. PMID:24643006

  8. Hospital survival in a competitive environment: the competitive constituency model.

    PubMed

    Ehreth, J

    1993-01-01

    Organizational theory is extended to develop a method for administrators to assess hospital effectiveness in a competitive environment. First, the literature pertaining to organizational effectiveness and survival is synthesized to show the lack of consideration for the effects of competition. Second, the article integrates the effects of competition on organizational effectiveness through a competitive constituency model. A step-by-step procedure is proposed to apply the theory in an organizational setting. The model explicitly addresses differences in power relations between hospitals, their competition, and their stakeholders. The relative nature of effectiveness is explored by comparing the hospital to its competition using criteria developed through specific goals of stakeholders. The distinction between managerial and organizational effectiveness constructs is clarified. Finally, the practical application of this model is demonstrated by assessing the effectiveness of a hospital in the competitive environment of Seattle, Washington, where two hospitals have recently closed.

  9. Strengthening the weak link: Built Environment modelling for loss analysis

    NASA Astrophysics Data System (ADS)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures.
Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution

  10. Prevalence and concentration of Salmonella and Campylobacter in the processing environment of small-scale pastured broiler farms

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A growing niche in the locally grown food movement is the small scale production of broiler chickens using the pasture-raised poultry production model. Little research exists that focuses on Salmonella and Campylobacter contamination in the environment associated with on-farm processing of pasture-r...

  11. Combining Wireless Sensor Networks and Groundwater Transport Models: Protocol and Model Development in a Simulative Environment

    NASA Astrophysics Data System (ADS)

    Barnhart, K.; Urteaga, I.; Han, Q.; Porta, L.; Jayasumana, A.; Illangasekare, T.

    2007-12-01

    Groundwater transport modeling is intended to aid in remediation processes by providing prediction of plume location and by helping to bridge data gaps in the typically undersampled subsurface environment. Increased availability of computer resources has made computer-based transport models almost ubiquitous in calculating health risks, determining cleanup strategies, guiding environmental regulatory policy, and in determining culpable parties in lawsuits. Despite their broad use, very few studies exist which verify model correctness or even usefulness, and those that do exist show significant discrepancies between predicted and actual results. Better predictions can only be gained from additional and higher quality data, but this is an expensive proposition using current sampling techniques. A promising technology is the use of wireless sensor networks (WSNs), which consist of wireless nodes (motes) coupled to in-situ sensors that are capable of measuring hydrological parameters. As the motes are typically battery powered, power consumption is a major concern in routing algorithms. Supplied with predictions of the contaminant's direction and arrival time, the application-driven routing protocol can become more efficient. A symbiotic relationship then exists between the WSN, which is supplying the data to calibrate the transport model, and the model, which may be supplying predictive information to the WSN for optimum monitoring performance. Many challenges exist before the above can be realized: WSN protocols must mature, as must sensor technology, and inverse models and tools must be developed for integration into the system. As current model calibration, even automatic calibration, still often requires manual tweaking of calibration parameters, implementing this in a real-time closed-loop process may require significant work. 
Based on insights from a previous proof-of-concept intermediate-scale tank experiment, we are developing the models, tools

  12. Interdependencies of Arctic land surface processes: A uniquely sensitive environment

    NASA Astrophysics Data System (ADS)

    Bowling, L. C.

    2007-12-01

    The circumpolar arctic drainage basin is composed of several distinct ecoregions including steppe grassland and cropland, boreal forest and tundra. Land surface hydrology throughout this diverse region shares several unique features such as dramatic seasonal runoff differences controlled by snowmelt and ice break-up; the storage of significant portions of annual precipitation as snow and in lakes and wetlands; and the effects of ephemeral and permanently frozen soils. These arctic land processes are delicately balanced with the climate and are therefore important indicators of change. The litany of recently-detected changes in the Arctic includes changes in snow precipitation, trends and seasonal shifts in river discharge, increases and decreases in the extent of surface water, and warming soil temperatures. Although not unique to the arctic, increasing anthropogenic pressures represent an additional element of change in the form of resource extraction, fire threat and reservoir construction. The interdependence of the physical, biological and social systems means that changes in primary indicators have large implications for land cover, animal populations and the regional carbon balance, all of which have the potential to feed back and induce further change. In fact, the complex relationships between the hydrological processes that make the Arctic unique also render observed historical change difficult to interpret and predict, leading to conflicting explanations. For example, a decrease in snow accumulation may provide less insulation to the underlying soil resulting in greater frost development and increased spring runoff. Similarly, melting permafrost and ground ice may lead to ground subsidence and increased surface saturation and methane production, while more complete thaw may enhance drainage and result in drier soil conditions. The threshold nature of phase change around the freezing point makes the system especially sensitive to change. 
In addition, spatial

  13. INTEGRATED FISCHER TROPSCH MODULAR PROCESS MODEL

    SciTech Connect

    Donna Post Guillen; Richard Boardman; Anastasia M. Gribik; Rick A. Wood; Robert A. Carrington

    2007-12-01

    With declining petroleum reserves, increased world demand, and unstable politics in some of the world’s richest oil producing regions, the capability for the U.S. to produce synthetic liquid fuels from domestic resources is critical to national security and economic stability. Coal, biomass and other carbonaceous materials can be converted to liquid fuels using several conversion processes. The leading candidate for large-scale conversion of coal to liquid fuels is the Fischer Tropsch (FT) process. Process configuration, component selection, and performance are interrelated and dependent on feed characteristics. This paper outlines a flexible modular approach to model an integrated FT process that utilizes a library of key component models, supporting kinetic data and materials and transport properties allowing rapid development of custom integrated plant models. The modular construction will permit rapid assessment of alternative designs and feed stocks. The modeling approach consists of three thrust areas, or “strands” – model/module development, integration of the model elements into an end to end integrated system model, and utilization of the model for plant design. Strand 1, model/module development, entails identifying, developing, and assembling a library of codes, user blocks, and data for FT process unit operations for a custom feedstock and plant description. Strand 2, integration development, provides the framework for linking these component and subsystem models to form an integrated FT plant simulation. Strand 3, plant design, includes testing and validation of the comprehensive model and performing design evaluation analyses.

  14. Snow process monitoring in mountain forest environments with a digital camera network

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu; Menzel, Lucas

    2016-04-01

    Snow processes are important components of the hydrologic cycle in mountainous areas and at high latitudes. Sparse observations in remote regions, in combination with complex topography, local climate specifics and the impact of heterogeneous vegetation cover, complicate a detailed investigation of snow-related processes. In this study, a camera network is applied to monitor the complex snow processes with high temporal resolution in montane forest environments (800-1200 m a.s.l.) in southwestern Germany. A typical feature of this region is the high temporal variability of weather conditions, with frequent snow accumulation and ablation processes and recurrent snow interception on conifers. We developed a semi-automatic procedure to interpret snow depths from the digital images, which shows high consistency with manual readings and station-based measurements. To extract the snow canopy interception dynamics from the pictures, six binary classification methods are compared. The MaxEntropy classifier clearly outperforms the others under various illumination conditions and is therefore selected for quantifying snow interception. The snow accumulation and ablation processes on the ground, as well as the snow loading and unloading in forest canopies, are investigated based on the snow parameters derived from the time-lapse photography monitoring. In addition, the influences of meteorological conditions, forest cover and elevation on snow processes are considered. Further, our investigations serve to improve the snow and interception modules of a hydrological model. Time-lapse photography thus proves to be an effective and low-cost approach to collecting useful snow-related information, supporting our understanding of snow processes and the further development of hydrological models. We will present selected results from our investigations over two consecutive winters.

  15. Open Data Processing Environment for Future Space Missions

    NASA Astrophysics Data System (ADS)

    Koerver, W.; Schmitz, G.; Sommer, C.; Willnecker, R.

    2002-01-01

    The globalization and decentralisation of future space mission execution require new concepts for payload and experiment operation. The technological evolution in the area of data systems and networks will allow for almost unlimited remote operations. Software and hardware technologies bundled with modern networking will permit the distribution of work both task- and location-wise. Whereas in the past telemetry and telecommanding were implemented as special software products tailored to specific mission and individual instrument requirements, future systems will be based on generic solutions and open data systems. Common interface applications in software and hardware will allow the user to access any data products and other mission- or experiment-related information from a remote site. The user will become part of the mission control centre in a virtual manner. DLR, together with small and medium enterprises, has initiated the development of an open data processing system, DAVIS, which faces these new challenges. The modular concept of the generic system allows the easy, customised implementation of payload- and experiment-specific data services. Telescience, which means the interactive remote operation of science in space, can be simply realised by scaleable real-time telemetry and telecommand modules. DAVIS covers the entire application chain - telemetry services and archiving, data processing, visualisation and on-line data analysis, as well as telecommanding and tracking. It further offers various interfaces to other systems, databases or analysis tools via dedicated application programming interfaces (APIs) and supports the development of multi-platform applications. DAVIS was applied with great success in past Spacelab missions and is presently used in the preparation of the project Rosetta-Lander, part of the next cornerstone mission of ESA in 2003. In addition, it is now under further development for future utilisation in ISS payload operations.

  16. The Icelandic volcanic aeolian environment: Processes and impacts - A review

    NASA Astrophysics Data System (ADS)

    Arnalds, Olafur; Dagsson-Waldhauserova, Pavla; Olafsson, Haraldur

    2016-03-01

    Iceland has the largest area of volcaniclastic sandy desert on Earth, about 22,000 km2. The sand has been mostly produced by glacio-fluvial processes, leaving behind fine-grained unstable sediments which are later re-distributed by repeated aeolian events. Volcanic eruptions add to this pool of unstable sediments, often from subglacial eruptions. Icelandic desert surfaces are divided into sand fields, sandy lavas and sandy lag gravel, each with separate aeolian surface characteristics such as threshold velocities. Storms are frequent due to Iceland's location on the North Atlantic storm track. Dry winds occur on the leeward sides of mountains and glaciers, in spite of the high moisture content of the Atlantic cyclones. Surface winds often move hundreds to more than 1000 kg m-1 per annum, and more than 10,000 kg m-1 have been measured in a single storm. Desertification occurs as aeolian processes push sand fronts forward; such fronts have destroyed many previously fully vegetated ecosystems since the time of the settlement of Iceland in the late ninth century. There are about 135 dust events per annum, ranging from minor storms to >300,000 t of dust emitted in single storms. Dust production is on the order of 30-40 million tons annually, some traveling over 1000 km and deposited on land and sea. Dust deposited on deserts tends to be re-suspended during subsequent storms. High PM10 concentrations occur during major dust storms. Dust storms are more frequent in the wake of volcanic eruptions, such as after the Eyjafjallajökull 2010 eruption. Airborne dust affects human health, with negative effects enhanced by the tubular morphology of the grains and the basaltic composition with its high metal content. Dust deposition on snow and glaciers intensifies melting. Moreover, the dust production probably also influences atmospheric conditions and parameters that affect climate change.

  17. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environment for Modeling, Simulation and Integration (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems which interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS System Integration Lab and the STIF architecture are reviewed. The functional components of ARTEMIS are outlined, and an overview of the models and a block diagram are presented.

  18. Thermal modeling of carbon-epoxy laminates in fire environments.

    SciTech Connect

    McGurn, Matthew T. (Buffalo, NY); DesJardin, Paul Edward (Buffalo, NY); Dodd, Amanda B.

    2010-10-01

    A thermal model is developed for the response of carbon-epoxy composite laminates in fire environments. The model is based on a porous-media description that includes the effects of gas transport within the laminate along with swelling. Model comparisons are conducted against the data from Quintere et al. Simulations are conducted for both coupon-level and intermediate-scale one-sided heating tests. Comparisons of the heat release rate (HRR) as well as of the final products (mass fractions, volume percentages, porosity, etc.) are conducted. Overall, the agreement between the available data and the model is excellent considering the simplified approximations used to account for flame heat flux. A sensitivity study using a newly developed swelling model shows the importance of accounting for laminate expansion when predicting burnout. Excellent agreement is observed between the model and the data on final product composition, including porosity, mass fractions and volume expansion ratio.

  19. From a hybrid model to a fully kinetic model: On the modeling of planetary plasma environments by a fully kinetic electromagnetic global model HYB-em

    NASA Astrophysics Data System (ADS)

    Pohjola, Valter; Kallio, Esa; Jarvinen, Riku

    We have developed a fully kinetic electromagnetic model to study instabilities and waves in planetary plasma environments. In the particle-in-cell (PIC) model both ions and electrons are modeled as particles. An important feature of the developed global kinetic model, called HYB-em, compared to other electromagnetic codes is that it is built on an earlier quasi-neutral hybrid simulation platform called HYB and that it can be used in conjunction with earlier hybrid models. The HYB models have been used during the past ten years to study globally the flowing plasma interaction with various Solar System objects: Mercury, Venus, the Moon, Mars, the Saturnian moon Titan and asteroids. The new model enables us to (1) study the stability of various planetary plasma regions in three-dimensional space and (2) analyze the propagation of waves in a plasma environment derived from the other global HYB models. All particle processes in a multi-ion plasma which are implemented on the HYB platform (e.g. ion-neutral collisions, chemical processes, particle loss and production processes) are also automatically included in the HYB-em model. In this presentation we study the developed approach by analyzing the propagation of high-frequency electromagnetic waves in non-magnetized plasma in two cases: we study (1) expansion of a spherical wave generated from a point source and (2) propagation of a plane wave in plasma. We demonstrate that the HYB-em model is capable of describing these space plasma situations successfully. The analysis suggests the potential of the developed model to study both high-density, high-magnetic-field plasma environments, such as Mercury, and low-density, low-magnetic-field plasma environments, such as Venus and Mars.

  20. From a hybrid model to a fully kinetic model: On the modeling of planetary plasma environments by a fully kinetic electromagnetic global model HYB-em

    NASA Astrophysics Data System (ADS)

    Pohjola, Valter; Kallio, Esa

    2010-05-01

    We have developed a fully kinetic electromagnetic model to study instabilities and waves in planetary plasma environments. In the particle-in-cell (PIC) model both ions and electrons are modeled as particles. An important feature of the developed global kinetic model, called HYB-em, compared to other electromagnetic codes is that it is built on an earlier quasi-neutral hybrid simulation platform called HYB and that it can be used in conjunction with earlier hybrid models. The HYB models have been used during the past ten years to study globally the flowing plasma interaction with various Solar System objects: Mercury, Venus, the Moon, Mars, the Saturnian moon Titan and asteroids. The new model enables us to (1) study the stability of various planetary plasma regions in three-dimensional space and (2) analyze the propagation of waves in a plasma environment derived from the other global HYB models. All particle processes in a multi-ion plasma which are implemented on the HYB platform (e.g. ion-neutral collisions, chemical processes, particle loss and production processes) are also automatically included in the HYB-em model. In this presentation we study the developed approach by analyzing the propagation of high-frequency electromagnetic waves in non-magnetized plasma in two cases: we study (1) expansion of a spherical wave generated from a point source and (2) propagation of a plane wave in plasma. We demonstrate that the HYB-em model is capable of describing these space plasma situations successfully. The analysis suggests the potential of the developed model to study both high-density, high-magnetic-field plasma environments, such as Mercury, and low-density, low-magnetic-field plasma environments, such as Venus and Mars.

  1. Communicative processes: a model of communication

    SciTech Connect

    Kimura, T.D.; Gillett, W.D.

    1982-01-01

    The authors introduce a conceptual model of communicative organization as a part of the formal semantic study of distributed computation. The model includes, as communication primitives, three independent modes of communication: mailing, posting and broadcasting. Mailing models thin-wire communication, and posting models shared memory communication. While broadcasting is not prominent in today's parallel programming languages, it has an important role to play in distributed computation. Other fundamental notions in the model are process, symbol, site, process class, symbol class and site class. 8 references.
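    The three primitives can be sketched as operations on a small message fabric. The sketch below is illustrative only (the class and method names are hypothetical, not from the paper): mailing queues a message for exactly one process, posting writes a shared cell that every process can read, and broadcasting fans a message out to all members of a group.

```python
from collections import defaultdict, deque

class Fabric:
    """Hypothetical message fabric illustrating the three modes."""
    def __init__(self):
        self.mailboxes = defaultdict(deque)   # mailing: thin-wire, point-to-point
        self.board = {}                       # posting: shared-memory cells
        self.subscribers = defaultdict(set)   # broadcasting: fan-out groups

    def mail(self, recipient, msg):
        self.mailboxes[recipient].append(msg)  # queued for exactly one process

    def receive(self, recipient):
        return self.mailboxes[recipient].popleft()

    def post(self, key, value):
        self.board[key] = value                # visible to every reader

    def read(self, key):
        return self.board.get(key)

    def subscribe(self, group, process):
        self.subscribers[group].add(process)

    def broadcast(self, group, msg):
        for p in self.subscribers[group]:      # delivered to all group members
            self.mailboxes[p].append(msg)

f = Fabric()
f.mail("p1", "hello")
f.post("status", "ready")
f.subscribe("g", "p1")
f.subscribe("g", "p2")
f.broadcast("g", "tick")
```

    The point of the sketch is that the three modes are independent: none of them is conveniently expressible in terms of the other two.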

  2. Models of Problem Solving Processes and Abilities.

    ERIC Educational Resources Information Center

    Feldhusen, John F.; Guthrie, Virginia A.

    1979-01-01

    This paper reviews current models of problem solving to identify results relevant to teachers or instructional developers. Four areas are covered: information processing models, approaches stressing human abilities and factors, creative problem solving models, and other aspects of problem solving. Part of a theme issue on intelligence. (Author/SJL)

  3. Kinetic Modeling of the Lunar Dust-Plasma Environment

    NASA Astrophysics Data System (ADS)

    Kallio, Esa; Alho, Markku; Alvarez, Francisco; Barabash, Stas; Dyadechkin, Sergey; Fernandes, Vera; Futaana, Yoshifumi; Harri, Ari-Matti; Haunia, Touko; Heilimo, Jyri; Holmström, Mats; Jarvinen, Riku; Lue, Charles; Makela, Jakke; Porjo, Niko; Schmidt, Walter; Shahab, Fatemi; Siili, Tero; Wurz, Peter

    2014-05-01

    Modeling of the lunar dust and plasma environment is a challenging task because a self-consistent model should include ions, electrons and dust particles and numerous other factors. However, most of the parameters are not well established or constrained by measurements in the lunar environment. More precisely, a comprehensive model should contain electrons originating from 1) the solar wind, 2) the lunar material (photoelectrons, secondary electrons) and 3) the lunar dust. Ions originate from the solar wind, the lunar material, the lunar exosphere and the dust. Modeling the role of the dust in the lunar plasma environment is a highly complex task since the properties of the dust particles in the exosphere are poorly known (e.g. mass, size, shape, conductivity) or not known (e.g. charge and photoelectron emission) and are probably time dependent. Models should also include the effects of interactions between the surface and the solar wind, energetic particles and micrometeorites. Largely different temporal and spatial scales are also a challenge for the numerical models. In addition, modeling a region on the Moon - for example on the South Pole - at a given time also requires knowledge of the solar illumination conditions at that time, the mineralogical and electric properties of the local lunar surface, lunar magnetic anomalies, the solar UV flux and the properties of the solar wind. Harmful effects of lunar dust on technical devices and on human health, as well as modeling of the properties of the lunar plasma and dust environment, have been topics of two ESA-funded projects, L-DEPP and DPEM. In the presentation we will summarize some basic results and characteristics of plasma and fields near and around the Moon as studied and discovered in these projects. In particular, we analyse three different space and time scales by kinetic models: [1] the "microscale" region near the surface with an electrostatic PIC (ions and electrons are particles) model, [2] the "mesoscale

  4. A new Mars radiation environment model with visualization.

    PubMed

    De Angelis, G; Clowdsley, M S; Singleterry, R C; Wilson, J W

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar-modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model, version 2001 (Mars-GRAM 2001). The altitude used to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center.

  5. A new Mars radiation environment model with visualization.

    PubMed

    De Angelis, G; Clowdsley, M S; Singleterry, R C; Wilson, J W

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar-modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model, version 2001 (Mars-GRAM 2001). The altitude used to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. PMID:15880920

  6. Processing of atmospheric nitrogen by clouds above a forest environment

    NASA Astrophysics Data System (ADS)

    Hill, Kimberly A.; Shepson, Paul B.; Galbavy, Edward S.; Anastasio, Cort; Kourtev, Peter S.; Konopka, Allan; Stirm, Brian H.

    2007-06-01

    Dissolved inorganic ions (NH4+, Ca2+, Mg2+, K+, H+, NO3-, and SO42-) and dissolved organic nitrogen (DON) were measured in cloud water samples collected over the northern lower peninsula of Michigan. Within a given cloud field, several altitudes were sampled to examine changes in concentration and speciation with altitude. Several samples were analyzed for bacterial content and activity. Convective cumulus (cumulus congestus) were more concentrated than fair-weather cumulus (cumulus humilis) for all major ions and DON: cloudy-air DON concentrations in convective cumulus were twice as large as in fair-weather cumulus, and for all other ions the droplets were 4-6 times more concentrated. The molar average distribution of nitrogen in the cloud water was 43 (±10, 1σ)% ammonium, 39 (±7)% nitrate and 18 (±11)% DON. High concentrations of bacteria were observed in the clouds, with an average concentration of 2.9 × 105 (±1.0 × 105, 1σ) bacteria m-3 of cloudy air, but these contributed less than 1% of the nitrogen in the cloud water. In addition, nitrifying bacteria were identified, indicating that bacterial processing of nitrogen in the cloud water may occur. Air mass origin and altitude influence observed cloud water concentrations, with the exception of DON. The correlations of ammonium with sulfate, and of calcium with nitrate, suggest that ammonium sulfate and calcium nitrate aerosol may be important sources of these ions.

  7. Eclipse: ESO C Library for an Image Processing Software Environment

    NASA Astrophysics Data System (ADS)

    Devillard, Nicolas

    2011-12-01

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.
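    The "fast median computation" the library advertises is conventionally done with a selection algorithm rather than a full sort. A minimal quickselect sketch of that idea follows (illustrative Python, not eclipse's actual C implementation):

```python
import random

def quickselect_median(values):
    """Median via quickselect, average O(n) per selection: partition around a
    random pivot and recurse only into the side containing the target rank."""
    def select(data, k):
        if len(data) == 1:
            return data[0]
        pivot = random.choice(data)
        lows = [x for x in data if x < pivot]
        highs = [x for x in data if x > pivot]
        pivots = [x for x in data if x == pivot]
        if k < len(lows):
            return select(lows, k)
        if k < len(lows) + len(pivots):
            return pivot
        return select(highs, k - len(lows) - len(pivots))

    n = len(values)
    if n % 2 == 1:
        return select(list(values), n // 2)
    # even length: average the two middle order statistics
    return 0.5 * (select(list(values), n // 2 - 1) + select(list(values), n // 2))
```

    Unlike sorting, this touches each element only a constant number of times on average, which matters for the large image cubes the library targets.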

  8. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomena of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.

  9. On Choosing Between Two Probabilistic Choice Sub-models in a Dynamic Multitask Environment

    NASA Technical Reports Server (NTRS)

    Soulsby, E. P.

    1984-01-01

    An independent random utility model based on Thurstone's Theory of Comparative Judgment and a constant utility model based on Luce's Choice Axiom are reviewed in detail. Predictions from the two models are shown to be equivalent under certain restrictions on the distribution of the underlying random process. Each model is applied as a stochastic choice sub-model in a dynamic, multitask environment. The resulting choice probabilities are nearly identical, indicating that, despite their conceptual differences, neither model may be preferred over the other based solely on its predictive capability.
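    The classic restriction under which a random utility model coincides with Luce's ratio-of-strengths probabilities is independent Gumbel-distributed noise on the utilities, in which case choice-by-maximum reproduces the softmax form exactly. A small illustrative simulation (not the paper's task environment) makes the equivalence concrete:

```python
import math
import random

def luce_probs(utilities):
    """Luce / ratio-of-strengths choice probabilities (softmax)."""
    exps = [math.exp(u) for u in utilities]
    z = sum(exps)
    return [e / z for e in exps]

def random_utility_choice(utilities, rng):
    """Random utility draw: add i.i.d. Gumbel noise, pick the maximum."""
    noisy = [u - math.log(-math.log(rng.random())) for u in utilities]
    return max(range(len(noisy)), key=noisy.__getitem__)

rng = random.Random(0)
u = [1.0, 0.5, 0.0]
n = 200_000
counts = [0] * len(u)
for _ in range(n):
    counts[random_utility_choice(u, rng)] += 1
empirical = [c / n for c in counts]   # converges to luce_probs(u)
```

    With Gaussian noise, as in Thurstone's original Case V, the two sets of probabilities are close but not identical, which is why the models are distinguishable only "under certain restrictions."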

  10. On the modeling of planetary plasma environments by a fully kinetic electromagnetic global model HYB-em

    NASA Astrophysics Data System (ADS)

    Pohjola, V.; Kallio, E.

    2010-03-01

    We have developed a fully kinetic electromagnetic model to study instabilities and waves in planetary plasma environments. In the particle-in-cell (PIC) model both ions and electrons are modeled as particles. An important feature of the developed global kinetic model, called HYB-em, compared to other electromagnetic codes is that it is built on an earlier quasi-neutral hybrid simulation platform called HYB and that it can be used in conjunction with earlier hybrid models. The HYB models have been used during the past ten years to study globally the flowing plasma interaction with various Solar System objects: Mercury, Venus, the Moon, Mars, the Saturnian moon Titan and asteroids. The new stand-alone fully kinetic model enables us to (1) study the stability of various planetary plasma regions in three-dimensional space and (2) analyze the propagation of waves in a plasma environment derived from the other global HYB models. All particle processes in a multi-ion plasma which are implemented on the HYB platform (e.g. ion-neutral collisions, chemical processes, particle loss and production processes) are also automatically included in the HYB-em model. In this brief report we study the developed approach by analyzing the propagation of high-frequency electromagnetic waves in non-magnetized plasma in two cases: we study (1) expansion of a spherical wave generated from a point source and (2) propagation of a plane wave in plasma. The analysis shows that the HYB-em model is capable of describing these space plasma situations successfully. The analysis also suggests the potential of the developed model to study both high-density, high-magnetic-field plasma environments, such as Mercury, and low-density, low-magnetic-field plasma environments, such as Venus and Mars.
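    A PIC code repeatedly deposits particle charge onto a grid, solves for the fields, and pushes the particles in those fields. The following 1-D electrostatic sketch of the deposit and field-solve steps is illustrative only (HYB-em itself is a 3-D electromagnetic code; the function names and parameters here are hypothetical):

```python
def deposit_cic(positions, q, n_cells, length):
    """Cloud-in-cell charge deposition onto a periodic 1-D grid: each
    particle's charge is shared between its two nearest grid cells."""
    dx = length / n_cells
    rho = [0.0] * n_cells
    for x in positions:
        s = (x % length) / dx
        j = int(s)
        f = s - j                              # fractional distance to next cell
        rho[j % n_cells] += q * (1.0 - f) / dx
        rho[(j + 1) % n_cells] += q * f / dx
    return rho

def solve_efield(rho, length):
    """Integrate dE/dx = rho (units with eps0 = 1) on a periodic grid after
    subtracting the mean density (a uniform neutralizing background)."""
    n = len(rho)
    dx = length / n
    mean = sum(rho) / n
    e = [0.0] * n
    for j in range(1, n):
        e[j] = e[j - 1] + (rho[j - 1] - mean) * dx
    e_mean = sum(e) / n
    return [ej - e_mean for ej in e]           # zero-mean field on a periodic domain

# sanity case: uniformly spaced particles give uniform density and zero field
n_p, n_g, L, q = 160, 16, 1.0, 1.0
xs = [(i + 0.5) * L / n_p for i in range(n_p)]
rho = deposit_cic(xs, q, n_g, L)
E = solve_efield(rho, L)
```

    A full PIC step would follow this with a leapfrog push of the particle velocities and positions in the computed field.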

  11. Modelling the near-Earth space environment using LDEF data

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares the results of actual impact crater measurement data and the Space Environment (SPENV) Program developed in-house at POD with theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm in diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is

  12. A neural network-based classification of environment dynamics models for compliant control of manipulation robots.

    PubMed

    Katic, D; Vukobratovic, M

    1998-01-01

    In this paper, a new method for selecting the appropriate compliance control parameters for robot machining tasks, based on connectionist classification of unknown dynamic environments, is proposed. The method classifies the type of environment by using a multilayer perceptron and then determines the control parameters for compliance control using the estimated characteristics. An important feature is that the process of pattern association can work in an on-line mode as part of the selected compliance control algorithm. The convergence process is improved by using an evolutionary approach (genetic algorithms) to choose the optimal topology of the proposed multilayer perceptron. Compliant-motion simulation experiments with a robotic arm placed in contact with a dynamic environment, described by the stiffness model and by the general impedance model, have been performed in order to verify the proposed approach.
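    The evolutionary topology search can be sketched as a small genetic algorithm with elitism over candidate hidden-layer sizes. In the paper the fitness would be the trained network's classification error; the toy fitness below is a stand-in, and all names and parameters are hypothetical:

```python
import random

def evolve_topology(fitness, layer_choices, pop_size=20, generations=30, seed=1):
    """Tiny genetic algorithm over two hidden-layer sizes. `fitness` is any
    cost to minimize; elitism keeps the best individuals each generation."""
    rng = random.Random(seed)

    def random_topology():
        return tuple(rng.choice(layer_choices) for _ in range(2))

    def mutate(t):
        i = rng.randrange(len(t))              # resample one layer size
        lst = list(t)
        lst[i] = rng.choice(layer_choices)
        return tuple(lst)

    pop = [random_topology() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]           # selection: keep the best quarter
        children = [mutate(rng.choice(elite)) for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

# toy surrogate fitness: prefer topologies whose sizes sum to 12
best = evolve_topology(lambda t: abs(sum(t) - 12), layer_choices=[2, 4, 6, 8, 10])
```

    Because the elites are carried over unchanged, the best fitness in the population is non-increasing across generations.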

  13. Martian Radiation Environment: Model Calculations and Recent Measurements with "MARIE"

    NASA Technical Reports Server (NTRS)

    Saganti, P. B.; Cucinotta, F. A.; zeitlin, C. J.; Cleghorn, T. F.

    2004-01-01

    The Galactic Cosmic Ray spectra in Mars orbit were generated with the recently expanded HZETRN (High Z and Energy Transport) and QMSFRG (Quantum Multiple-Scattering theory of nuclear Fragmentation) model calculations. These model calculations are compared with the first eighteen months of measured data from the MARIE (Martian Radiation Environment Experiment) instrument onboard the 2001 Mars Odyssey spacecraft that is currently in Martian orbit. The dose rates observed by the MARIE instrument are within 10% of the model calculated predictions. Model calculations are compared with the MARIE measurements of dose, dose-equivalent values, along with the available particle flux distribution. Model calculated particle flux includes GCR elemental composition of atomic number, Z = 1-28 and mass number, A = 1-58. Particle flux calculations specific for the current MARIE mapping period are reviewed and presented.

  14. Modeling Cellular Processes in 3-D

    PubMed Central

    Mogilner, Alex; Odde, David

    2011-01-01

    Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated, we must address the issue of modeling cellular processes in 3-D. Here, we highlight recent advances related to 3-D modeling in cell biology. While some processes require full 3-D analysis, we suggest that others are more naturally described in 2-D or 1-D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3-D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling. PMID:22036197

  15. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  16. Quantum jump model for a system with a finite-size environment

    NASA Astrophysics Data System (ADS)

    Suomela, S.; Kutvonen, A.; Ala-Nissila, T.

    2016-06-01

    Measuring the thermodynamic properties of open quantum systems poses a major challenge. A calorimetric detection has been proposed as a feasible experimental scheme to measure work and fluctuation relations in open quantum systems. However, the detection requires a finite size for the environment, which influences the system dynamics. This process cannot be modeled with the standard stochastic approaches. We develop a quantum jump model suitable for systems coupled to a finite-size environment. We use the method to study the common fluctuation relations and prove that they are satisfied.

  17. Quantum jump model for a system with a finite-size environment.

    PubMed

    Suomela, S; Kutvonen, A; Ala-Nissila, T

    2016-06-01

    Measuring the thermodynamic properties of open quantum systems poses a major challenge. A calorimetric detection has been proposed as a feasible experimental scheme to measure work and fluctuation relations in open quantum systems. However, the detection requires a finite size for the environment, which influences the system dynamics. This process cannot be modeled with the standard stochastic approaches. We develop a quantum jump model suitable for systems coupled to a finite-size environment. We use the method to study the common fluctuation relations and prove that they are satisfied. PMID:27415207

  18. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
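    A minimal example of a probabilistic model defined over symbolic structures is a maximum-likelihood bigram model, in which comprehension amounts to scoring word sequences by conditional probabilities estimated from a corpus. The sketch below is illustrative and not drawn from the review itself:

```python
from collections import Counter, defaultdict

def bigram_model(corpus):
    """Maximum-likelihood bigram probabilities P(w2 | w1) from a toy corpus,
    with sentence-boundary markers <s> and </s>."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for w1, w2 in zip(tokens, tokens[1:]):
            counts[w1][w2] += 1
    # normalize each conditional distribution
    return {w1: {w2: c / sum(nxt.values()) for w2, c in nxt.items()}
            for w1, nxt in counts.items()}

model = bigram_model(["the dog barks", "the cat sleeps", "the dog sleeps"])
# e.g. P(dog | the) = 2/3, and each conditional distribution sums to 1
```

    Real models in this literature are defined over richer symbolic structures (e.g. probabilistic grammars), but the estimation-then-inference pattern is the same.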

  19. Fuel Conditioning Facility Electrorefiner Process Model

    SciTech Connect

    DeeEarl Vaden

    2005-10-01

    The Fuel Conditioning Facility at the Idaho National Laboratory processes spent nuclear fuel from the Experimental Breeder Reactor II using electro-metallurgical treatment. To process fuel without waiting for periodic sample analyses to assess process conditions, an electrorefiner process model predicts the composition of the electrorefiner inventory and effluent streams. For the chemical equilibrium portion of the model, the two common methods for solving chemical equilibrium problems, stoichiometric and non-stoichiometric, were investigated. In conclusion, the stoichiometric method produced equilibrium compositions close to the measured results, whereas the non-stoichiometric method did not.
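    The stoichiometric method reduces the equilibrium computation to solving for extents of reaction, one unknown per independent reaction. A minimal single-reaction sketch (illustrative chemistry with an assumed reaction A + B <-> C, not the actual electrorefiner system):

```python
def equilibrium_extent(K, a0, b0, c0=0.0, tol=1e-12):
    """Stoichiometric method for A + B <-> C: express all mole numbers through
    a single extent of reaction xi, then solve K = c / (a * b) by bisection."""
    lo, hi = -c0, min(a0, b0)          # xi bounds that keep every mole number >= 0

    def residual(xi):
        a, b, c = a0 - xi, b0 - xi, c0 + xi
        return c - K * a * b           # increasing in xi; zero at equilibrium

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

    The non-stoichiometric alternative would instead minimize total Gibbs energy over all species mole numbers subject to element balances, with no reaction list at all.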

  20. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    NASA Technical Reports Server (NTRS)

    Klein, S. A.

    1994-01-01

    The Distributed Processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network or expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic trade-offs, as well as interactions, within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories and a number of intermediate cost and technical calculation results, as well as graphical presentations of cost vs. percent distribution for various parameters. The model was developed in 1986 and has been implemented on an IBM PC using the Lotus 1-2-3 spreadsheet environment. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.
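    The macroscopic trade-off can be sketched as a cost curve over the fraction of processing that is distributed: communication cost falls while distributed-computing cost rises, and a worksheet-style sweep over "percent distribution" picks the minimum. The quadratic cost shapes and the parameter values below are hypothetical, purely for illustration:

```python
def total_cost(p, comm_base=1.0, compute_base=3.0):
    """Toy trade-off: communication cost shrinks and distributed-computing
    cost grows as the distributed fraction p moves from 0 to 1."""
    return comm_base * (1.0 - p) ** 2 + compute_base * p ** 2

# sweep percent distribution in 1% steps, as a spreadsheet column would
grid = [i / 100 for i in range(101)]
best = min(grid, key=total_cost)   # analytic optimum: comm_base / (comm_base + compute_base)
```

    The real model's inputs (geography, workload, staffing, technology costs) would replace the two toy coefficients, but the sweep-and-compare structure is the same.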

  1. Evaluation of nearshore wave models in steep reef environments

    NASA Astrophysics Data System (ADS)

    Buckley, Mark; Lowe, Ryan; Hansen, Jeff

    2014-06-01

    To provide coastal engineers and scientists with a quantitative evaluation of nearshore numerical wave models in reef environments, we review and compare three commonly used models with detailed laboratory observations. These models are: (1) SWASH (Simulating WAves till SHore) (Zijlema et al. 2011), a phase-resolving nonlinear shallow-water wave model with added nonhydrostatic terms; (2) SWAN (Simulating WAves Nearshore) (Booij et al. 1999), a phase-averaged spectral wave model; and (3) XBeach (Roelvink et al. 2009), a coupled phase-averaged spectral wave model (applied to modeling sea-swell waves) and a nonlinear shallow-water model (applied to modeling infragravity waves). A quantitative assessment was made of each model's ability to predict sea-swell (SS) wave height, infragravity (IG) wave height, wave spectra, and wave setup at five locations across the laboratory fringing reef profile of Demirbilek et al. (2007). Simulations were performed with the "recommended" empirical coefficients as documented for each model, and then the key wave-breaking parameter for each model (α in SWASH and γ in both SWAN and XBeach) was optimized to most accurately reproduce the observations. SWASH, SWAN, and XBeach were found to be capable of predicting SS wave height variations across the steep fringing reef profile with reasonable accuracy using the default coefficients. Nevertheless, tuning of the key wave-breaking parameter improved the accuracy of each model's predictions. SWASH and XBeach were also able to predict IG wave height and spectral transformation. Although SWAN was capable of modeling the SS wave height, in its current form it was not capable of modeling the spectral transformation into lower frequencies, as evident in the underprediction of the low-frequency waves.
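    The tuning step amounts to a one-parameter search minimizing the misfit between modeled and observed wave heights. A sketch of that calibration loop follows; the surrogate model and station values are hypothetical stand-ins for a full SWASH/SWAN/XBeach run:

```python
def rmse(pred, obs):
    """Root-mean-square error between predicted and observed wave heights."""
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

def tune_breaking_parameter(run_model, observations, grid):
    """Pick the wave-breaking coefficient (gamma for SWAN/XBeach, alpha for
    SWASH) that minimizes RMSE against the observed wave heights."""
    return min(grid, key=lambda g: rmse(run_model(g), observations))

# toy surrogate: modeled wave height decays shoreward, scaled by the coefficient
stations = [0.0, 0.25, 0.5, 0.75, 1.0]
surrogate = lambda gamma: [gamma * (1.0 - 0.6 * x) for x in stations]
obs = surrogate(0.7)                         # synthetic "observations"
gamma_best = tune_breaking_parameter(surrogate, obs,
                                     [g / 100 for g in range(40, 91)])
```

    In practice each grid point costs one full model run, so the sweep is coarse and the misfit is evaluated at the instrumented stations only.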

  2. Model-supported virtual environment for prostate cancer pattern analysis

    NASA Astrophysics Data System (ADS)

    Yu, Ping; McClain, Maxine A.; Xuan, Jianhua; Wang, Yue J.; Sesterhenn, Isabell A.; Moul, Judd W.; Zhang, Wei; Mun, Seong K.

    1999-05-01

    As a step toward understanding the complex spatial distribution patterns of prostate cancers, a 3D master model of the prostate, showing major anatomical structures and probability maps of tumor locations, has been developed in a pilot effort. A virtual environment supported by the 3D master model and in vivo imaging features will be used to evaluate, simulate, and optimize image-guided needle biopsy and radiation therapy, thus potentially improving the efficacy of prostate cancer diagnosis, staging, and treatment. A deformable graphics algorithm has been developed to reconstruct graphics models from 200 serially sectioned whole-mount radical prostatectomy specimens and to support computerized needle biopsy simulations. For the construction of a generic model, a principal-axes 3D registration technique has been developed. Simulated evaluations and real-data experiments have shown satisfactory performance of the method in constructing an initial generic model with localized prostate cancer placement. For the construction of the statistical model, a blended model registration technique is advanced to perform a non-linear warping of the individual model to the generic model so that the prostate cancer probability distribution maps can be accurately positioned. The method uses a spline-surface model and a linear elastic model to dynamically deform both the surface and the volume where object re-slicing is required. For interactive visualization of the 3D master model, four modes of data display are developed: (1) transparent rendering of the generic model, (2) overlaid rendering of cancer distributions, (3) stereo rendering, and (4) true volumetric display; a model-to-image registration technique using synthetic image phantoms is under investigation. Preliminary results have shown that use of this master model allows correct understanding of prostate cancer distribution patterns and rational optimization of prostate biopsy and radiation therapy strategies.
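    Principal-axes registration, named above as the basis for the generic model construction, can be sketched as a PCA-style alignment of point clouds: translate each cloud to its centroid and rotate one cloud's principal (eigen)axes onto the other's. The code below is a minimal 2D illustration on synthetic data; production methods also resolve eigenvector sign and ordering ambiguities, which this sketch omits.

```python
import numpy as np

def principal_axes(points):
    """Return centroid and eigenvector matrix (axes as columns) of a point cloud."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    _, vecs = np.linalg.eigh(cov)   # eigh: eigenvalues ascending, orthonormal vectors
    return centroid, vecs

def register(moving, fixed):
    """Map the moving cloud onto the fixed cloud's principal frame."""
    c_m, ax_m = principal_axes(moving)
    c_f, ax_f = principal_axes(fixed)
    rotation = ax_f @ ax_m.T        # rotates the moving axes onto the fixed axes
    return (moving - c_m) @ rotation.T + c_f

# Synthetic demo: 'moving' is 'fixed' rotated 30 degrees and shifted.
rng = np.random.default_rng(0)
fixed = rng.normal(size=(200, 2)) * [3.0, 1.0]   # elongated cloud
theta = np.deg2rad(30)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
moving = fixed @ R.T + [5.0, -2.0]
aligned = register(moving, fixed)
```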

  3. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    NASA Astrophysics Data System (ADS)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China, and it will meet increasing demands for fundamental research and technical applications both domestically and overseas. A new distributed data processing and analysis environment has been developed, providing generic functionality for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the C/S paradigm, and data analysis and visualization software providing 2D/3D display of experimental data. This environment will be widely applied at CSNS for live data processing.

  4. A model for dispersion of contaminants in the subway environment

    SciTech Connect

    Coke, L. R.; Sanchez, J. G.; Policastro, A. J.

    2000-05-03

    Although subway ventilation has been studied extensively, very little has been published on dispersion of contaminants in the subway environment. This paper presents a model that predicts dispersion of contaminants in a complex subway system. It accounts for the combined transient effects of train motion, station airflows, train car air exchange rates, and source release properties. Results are presented for a range of typical subway scenarios. The effects of train piston action and train car air exchange are discussed. The model could also be applied to analyze the environmental impact of hazardous materials releases such as chemical and biological agents.

  5. Explicitly Representing Soil Microbial Processes In Earth System Models

    SciTech Connect

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.; Georgiou, Katrina; Hararuk, Oleksandra; He, Yujie; Hopkins, Francesca; Luo, Yiqi; Smith, Mathew J.; Sulman, Benjamin; Todd-Brown, Katherine EO; Wang, Ying-Ping; Xia, Jianyang; Xu, Xiaofeng

    2015-10-26

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  6. Explicitly representing soil microbial processes in Earth system models

    NASA Astrophysics Data System (ADS)

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.; Georgiou, Katerina; Hararuk, Oleksandra; He, Yujie; Hopkins, Francesca; Luo, Yiqi; Smith, Matthew J.; Sulman, Benjamin; Todd-Brown, Katherine; Wang, Ying-Ping; Xia, Jianyang; Xu, Xiaofeng

    2015-10-01

    Microbes influence soil organic matter decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) will make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models, we suggest the following: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  7. Periglacial process research for improved understanding of climate change in periglacial environments

    NASA Astrophysics Data System (ADS)

    Hvidtfeldt Christiansen, Hanne

    2010-05-01

    Periglacial landscapes extend widely outside glaciated areas, across areas underlain by permafrost and areas with seasonal frost. Yet recent cryosphere research related to periglacial geomorphology has given significant attention to a direct climate-permafrost relationship, focusing on the permafrost thermal state, including the thickness of the active layer, and often simplifying how these two key conditions are directly climatically controlled. There has been less focus on understanding and quantifying the different periglacial processes, which largely control the consequences of changing climatic conditions on permafrost and seasonal frost throughout periglacial environments. It is the complex relationship between climate, micro-climate and local geomorphological, geological and ecological conditions that controls periglacial processes. In several cases local erosion or deposition will affect rates of landform change significantly more than any climate change. Detailed periglacial process studies will thus refine predictions of how periglacial landscapes can be expected to respond to climatic changes, and can be built into Earth System Modelling. In particular, combining direct field observations and measurements with remote sensing and geochronological studies of periglacial landforms enables a significantly improved understanding of periglacial process rates. An overview of the state of research in key periglacial processes is given, focusing on ice wedges, solifluction landforms, and seasonal ground thermal dynamics, all with examples from the high Arctic in Svalbard. Thermal contraction cracking and its seasonal meteorological control are presented, and potential thermal erosion of ice wedges leading to development of thermokarst is discussed. Local and meteorological controls on solifluction rates are presented and their climatic control indicated. Seasonal ground thermal processes and their dependence on local

  8. Space Environment Effects: Low-Altitude Trapped Radiation Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Pfitzer, K. A.

    1998-01-01

    Accurate models of the Earth's trapped energetic proton environment are required for both piloted and robotic space missions. For piloted missions, the concern is mainly total dose to the astronauts, particularly during long-duration missions and extravehicular activity (EVA). As astronomical and remote-sensing detectors become more sensitive, the proton flux can also induce unwanted backgrounds in these instruments. These needs motivated the development of a new model of the low-altitude trapped proton environment. The model is based on nearly 20 years of data from the TIROS/NOAA weather satellites. The model, which has been designated NOAAPRO (for NOAA protons), predicts the integral omnidirectional proton flux in three energy ranges: >16, >36, and >80 MeV. It contains a true solar cycle variation and accounts for the secular variation in the Earth's magnetic field. It also extends to lower values of the magnetic L parameter than does AP8. Thus, the model addresses the major shortcomings of AP8.

  9. Forest Canopy Processes in a Regional Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Makar, Paul; Staebler, Ralf; Akingunola, Ayodeji; Zhang, Junhua; McLinden, Chris; Kharol, Shailesh; Moran, Michael; Robichaud, Alain; Zhang, Leiming; Stroud, Craig; Pabla, Balbir; Cheung, Philip

    2016-04-01

    Forest canopies have typically been absent or highly parameterized in regional chemical transport models. Some forest-related processes are often considered - for example, biogenic emissions from the forests are included as a flux lower boundary condition on vertical diffusion, as is deposition to vegetation. However, real forest canopies comprise a much more complicated set of processes, at scales below the "transport model-resolved scale" of vertical levels usually employed in regional transport models. Advective and diffusive transport within the forest canopy typically scale with the height of the canopy, and the former process tends to dominate over the latter. Emissions of biogenic hydrocarbons arise from the foliage, which may be located tens of metres above the surface, while emissions of biogenic nitric oxide from decaying plant matter are located at the surface - in contrast to the surface flux boundary condition usually employed in chemical transport models. Deposition, similarly, is usually parameterized as a flux boundary condition, but may be differentiated between fluxes to vegetation and fluxes to the surface when the canopy scale is considered. The chemical environment also changes within forest canopies: shading and changes in temperature and relative humidity with height within the canopy may influence chemical reaction rates. These processes have been observed in a host of measurement studies, and have been simulated using site-specific one-dimensional forest canopy models. Their influence on regional scale chemistry has been unknown, until now. In this work, we describe the results of the first attempt to include complex canopy processes within a regional chemical transport model (GEM-MACH). The original model core was subdivided into "canopy" and "non-canopy" subdomains. In the former, three additional near-surface layers based on spatially and seasonally varying satellite-derived canopy height and leaf area index were added to the original model

  10. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.
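    For decision-free marked graphs of the kind described here, a standard lower bound on the steady-state iteration period is the maximum, over all directed circuits, of total compute time in the circuit divided by the number of tokens circulating in it. The tiny graph below is invented for illustration and is not taken from the ATAMM paper; it only shows the circuit-ratio calculation.

```python
# Node compute times (invented units).
node_time = {"A": 3.0, "B": 5.0, "C": 2.0}

# Each circuit: (nodes on the directed circuit, total tokens on its edges).
circuits = [
    (["A", "B"], 1),        # A -> B -> A carrying one token
    (["A", "B", "C"], 2),   # A -> B -> C -> A carrying two tokens
]

def tbo_lower_bound(circuits, node_time):
    """Max circuit ratio: total compute time / tokens, over all circuits."""
    return max(sum(node_time[n] for n in nodes) / tokens
               for nodes, tokens in circuits)

bound = tbo_lower_bound(circuits, node_time)   # max(8/1, 10/2) = 8.0
```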

  11. Modelling foraging ants in a dynamic and confined environment.

    PubMed

    Bandeira de Melo, Elton B; Araújo, Aluízio F R

    2011-04-01

    In social insects, the superposition of simple individual behavioral rules leads to the emergence of complex collective patterns and helps solve difficult problems inherent in surviving in hostile habitats. Modelling ant colony foraging reveals strategies arising from the insects' self-organization and helps develop new computational strategies for solving complex problems. This paper presents advances in modelling ants' behavior when foraging in a confined and dynamic environment, based on experiments with the Argentine ant Linepithema humile in a relatively complex artificial network. We propose a model which overcomes the problem of stagnation observed in earlier models by taking into account additional biological aspects, by using non-linear functions for the deposit, perception and evaporation of pheromone, and by introducing new mechanisms to represent randomness and the exploratory behavior of the ants. PMID:21236313
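    The ingredients the model adds, nonlinear deposit, nonlinear evaporation, and an explicit exploratory component, can be illustrated with a toy trail-reinforcement loop. The functions and constants below are invented for illustration; they are not the authors' actual formulations.

```python
import math
import random

def deposit(traffic):
    # Saturating (nonlinear) deposit: heavy traffic adds less pheromone per ant.
    return 1.0 - math.exp(-traffic)

def evaporate(level, rate=0.1):
    # Nonlinear evaporation: strong trails decay proportionally faster.
    return level * (1.0 - rate * level / (1.0 + level))

def choose_edge(levels, explore=0.1, rng=random.Random(42)):
    # With probability `explore`, pick a random edge (exploratory behavior);
    # otherwise pick an edge proportionally to its pheromone level.
    if rng.random() < explore:
        return rng.randrange(len(levels))
    r = rng.uniform(0, sum(levels))
    for i, lv in enumerate(levels):
        r -= lv
        if r <= 0:
            return i
    return len(levels) - 1

levels = [0.5, 0.5, 0.5]          # three competing trails, equal start
for _ in range(200):              # 200 ants reinforce chosen trails in turn
    i = choose_edge(levels)
    levels = [evaporate(lv) for lv in levels]
    levels[i] += deposit(1.0)
```

    The exploratory term is what keeps such a model from the stagnation the paper describes: even a dominant trail is occasionally abandoned, so the colony can re-adapt when the environment changes.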

  12. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  13. Evolution of quantum-like modeling in decision making processes

    SciTech Connect

    Khrennikova, Polina

    2012-12-18

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model decision making processes in a macroscopic setting, capturing the particular 'context' in which decisions are taken. Several empirical findings have shown that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe the decision making process more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
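    The master-equation idea can be made concrete with a small numerical sketch: a two-state 'mental state' density matrix evolving under a Hamiltonian that drives oscillation between two options, plus a Lindblad-form dephasing coupling to the environmental 'bath'. All parameters below are illustrative, not taken from the paper, and the integration is plain Euler stepping.

```python
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

H = 0.5 * sx                 # drives oscillation between the two options
L = np.sqrt(0.2) * sz        # environmental dephasing ('bath') operator
rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start decided on option 1

def lindblad_rhs(rho):
    # Lindblad master equation: unitary part plus dephasing dissipator.
    comm = -1j * (H @ rho - rho @ H)
    diss = (L @ rho @ L.conj().T
            - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    return comm + diss

dt, steps = 0.01, 2000
for _ in range(steps):       # Euler integration to t = 20
    rho = rho + dt * lindblad_rhs(rho)

p1 = rho[0, 0].real          # probability of option 1 after decoherence
```

    Decoherence washes out the initial certainty: by t = 20 the state is close to maximally mixed, so p1 ends near 0.5 regardless of the starting decision.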

  14. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model decision making processes in a macroscopic setting, capturing the particular 'context' in which decisions are taken. Several empirical findings have shown that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe the decision making process more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  15. Modeling approach for business process reengineering

    NASA Astrophysics Data System (ADS)

    Tseng, Mitchell M.; Chen, Yuliu

    1995-08-01

    The purpose of this paper is to introduce a modeling approach to define, simulate, animate, and control business processes. The intent is to introduce the underlying methodology for building tools to design and manage business processes. Similar to computer-aided design (CAD) for mechanical parts, CAD tools are needed for designing business processes. The approach emphasizes the dynamic behavior of business processes. The proposed modeling technique consists of a definition of each individual activity, the network of activities, a control mechanism that describes coordination of these activities, and events that flow through these activities. Based on the formalism introduced in this modeling technique, users are able to define a business process with minimum ambiguity, take snapshots of particular events in the process, describe the accountability of participants, and view a replay of event streams in the process flow. This modeling approach, mapped on top of a commercial software package, has been tested using examples from real-life business processes. The examples and testing helped us identify some of the strengths and weaknesses of the proposed approach.

  16. A Delineation of the Cognitive Processes Manifested in a Social Annotation Environment

    ERIC Educational Resources Information Center

    Li, S. C.; Pow, J. W. C.; Cheung, W. C.

    2015-01-01

    This study aims to examine how students' learning trajectories progress in an online social annotation environment, and how their cognitive processes and levels of interaction correlate with their learning outcomes. Three different types of activities (cognitive, metacognitive and social) were identified in the online environment. The time…

  17. Gene-Environment Processes Linking Aggression, Peer Victimization, and the Teacher-Child Relationship

    ERIC Educational Resources Information Center

    Brendgen, Mara; Boivin, Michel; Dionne, Ginette; Barker, Edward D.; Vitaro, Frank; Girard, Alain; Tremblay, Richard; Perusse, Daniel

    2011-01-01

    Aggressive behavior in middle childhood is at least partly explained by genetic factors. Nevertheless, estimations of simple effects ignore possible gene-environment interactions (G x E) or gene-environment correlations (rGE) in the etiology of aggression. The present study aimed to simultaneously test for G x E and rGE processes between…

  18. Predicting Material Performance in the Space Environment from Laboratory Test Data, Static Design Environments, and Space Weather Models

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Edwards, David L.

    2008-01-01

    Qualifying materials for use in the space environment is typically accomplished with laboratory exposures to simulated UV/EUV, atomic oxygen, and charged particle radiation environments, with in-situ or subsequent measurements of the material properties of interest to the particular application. Choices of environment exposure levels are derived from static design environments intended to represent either mean or extreme conditions that are anticipated to be encountered during a mission. The real space environment, however, is quite variable. Predictions of the on-orbit performance of a material qualified to laboratory environments can be made using information on 'space weather' variations in the real environment. This presentation will first review the variability of space environments of concern for material degradation and then demonstrate techniques for using test data to predict material performance in a variety of space environments, from low Earth orbit to interplanetary space, using historical measurements and space weather models.

  19. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
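    The cost contrast drawn between the two processes can be caricatured in a few lines. The toy model below is not PATT or the IEEE 12207 model; durations, overheads, and the rework penalty are invented solely to illustrate why a spiral process can absorb a requirements change more cheaply while costing more when requirements are stable.

```python
import random

def waterfall(req_size, change_at=None, rng=random.Random(1)):
    # One pass; a late requirements change forces expensive rework.
    cost = req_size * rng.uniform(0.9, 1.1)
    if change_at is not None:
        cost += req_size * 0.5          # large rework penalty
    return cost

def spiral(req_size, iterations=4, change_at=None, rng=random.Random(1)):
    # Repeated smaller iterations, each with risk-assessment overhead.
    per_iter = req_size / iterations
    cost = 0.0
    for i in range(iterations):
        cost += per_iter * rng.uniform(0.9, 1.1)
        cost += 0.05 * req_size         # per-iteration overhead (risk review)
        if change_at == i:
            cost += per_iter * 0.5      # change absorbed within one iteration
    return cost

stable_w, stable_s = waterfall(100), spiral(100)
changed_w, changed_s = waterfall(100, change_at=2), spiral(100, change_at=2)
```

    With these made-up numbers, the waterfall run is cheaper when requirements are stable, while the spiral run is cheaper when a mid-project change occurs, mirroring the trade-off described above.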

  20. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  1. Modeling aerosol processes at the local scale

    SciTech Connect

    Lazaridis, M.; Isukapalli, S.S.; Georgopoulos, P.G.

    1998-12-31

    This work presents an approach for modeling photochemical gaseous and aerosol phase processes in subgrid plumes from major localized (e.g. point) sources (plume-in-grid modeling), thus improving the ability to quantify the relationship between emission source activity and ambient air quality. This approach employs the Reactive Plume Model (RPM-AERO) which extends the regulatory model RPM-IV by incorporating aerosol processes and heterogeneous chemistry. The physics and chemistry of elemental carbon, organic carbon, sulfate, sodium, chloride and crustal material of aerosols are treated and attributed to the PM size distribution. A modified version of the Carbon Bond IV chemical mechanism is included to model the formation of organic aerosol, and the inorganic multicomponent atmospheric aerosol equilibrium model, SEQUILIB is used for calculating the amounts of inorganic species in particulate matter. Aerosol dynamics modeled include mechanisms of nucleation, condensation and gas/particle partitioning of organic matter. An integrated trajectory-in-grid modeling system, UAM/RPM-AERO, is under continuing development for extracting boundary and initial conditions from the mesoscale photochemical/aerosol model UAM-AERO. The RPM-AERO is applied here to case studies involving emissions from point sources to study sulfate particle formation in plumes. Model calculations show that homogeneous nucleation is an efficient process for new particle formation in plumes, in agreement with previous field studies and theoretical predictions.

  2. The effects of physical environments in medical wards on medication communication processes affecting patient safety.

    PubMed

    Liu, Wei; Manias, Elizabeth; Gerdtz, Marie

    2014-03-01

    Physical environments of clinical settings play an important role in health communication processes. Effective medication management requires seamless communication among health professionals of different disciplines. This paper explores how physical environments affect communication processes for managing medications and patient safety in acute care hospital settings. Findings highlighted the impact of environmental interruptions on communication processes about medications. In response to frequent interruptions and limited space within working environments, nurses, doctors and pharmacists developed adaptive practices in the local clinical context. Communication difficulties were associated with the ward physical layout, the controlled drug key and the medication retrieving device. Health professionals should be provided with opportunities to discuss the effects of ward environments on medication communication processes and how this impacts medication safety. Hospital administrators and architects need to consider health professionals' views and experiences when designing hospital spaces. PMID:24486620

  3. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for environment requires consideration of several indexes of environmental impact including ozone depletion and global warming potentials, human and aquatic toxicity, and photochemical oxidation, and acid rain potentials. Current methodologies like t...

  4. The effects of physical environments in medical wards on medication communication processes affecting patient safety.

    PubMed

    Liu, Wei; Manias, Elizabeth; Gerdtz, Marie

    2014-03-01

    Physical environments of clinical settings play an important role in health communication processes. Effective medication management requires seamless communication among health professionals of different disciplines. This paper explores how physical environments affect communication processes for managing medications and patient safety in acute care hospital settings. Findings highlighted the impact of environmental interruptions on communication processes about medications. In response to frequent interruptions and limited space within working environments, nurses, doctors and pharmacists developed adaptive practices in the local clinical context. Communication difficulties were associated with the ward physical layout, the controlled drug key and the medication retrieving device. Health professionals should be provided with opportunities to discuss the effects of ward environments on medication communication processes and how this impacts medication safety. Hospital administrators and architects need to consider health professionals' views and experiences when designing hospital spaces.

  5. Estimation, modeling, and simulation of patterned growth in extreme environments.

    PubMed

    Strader, B; Schubert, K E; Quintana, M; Gomez, E; Curnutt, J; Boston, P

    2011-01-01

    In the search for life on Mars and other extraterrestrial bodies or in our attempts to identify biological traces in the most ancient rock record of Earth, one of the biggest problems facing us is how to recognize life or the remains of ancient life in a context very different from our planet's modern biological examples. Specific chemistries or biological properties may well be inapplicable to extraterrestrial conditions or ancient Earth environments. Thus, we need to develop an arsenal of techniques that are of broader applicability. The notion of patterning created in some fashion by biological processes and properties may provide such a generalized property of biological systems no matter what the incidentals of chemistry or environmental conditions. One approach to recognizing these kinds of patterns is to look at apparently organized arrangements created and left by life in extreme environments here on Earth, especially at various spatial scales, different geologies, and biogeochemical circumstances.

  6. The Epidemic Process and The Contagion Model

    ERIC Educational Resources Information Center

    Worthen, Dennis B.

    1973-01-01

    Goffman's epidemic theory is presented and compared to the contagion theory developed by Menzel. An attempt is made to compare the two models presented and examine their similarities and differences. The conclusion drawn is that the two models are very similar in their approach to understanding communication processes. (14 references) (Author/SJ)

  7. Dynamic process modeling with recurrent neural networks

    SciTech Connect

You, Yong; Nikolaou, M. (Dept. of Chemical Engineering)

    1993-10-01

    Mathematical models play an important role in control system synthesis. However, due to the inherent nonlinearity, complexity and uncertainty of chemical processes, it is usually difficult to obtain an accurate model for a chemical engineering system. A method of nonlinear static and dynamic process modeling via recurrent neural networks (RNNs) is studied. An RNN model is a set of coupled nonlinear ordinary differential equations in continuous time domain with nonlinear dynamic node characteristics as well as both feed forward and feedback connections. For such networks, each physical input to a system corresponds to exactly one input to the network. The system's dynamics are captured by the internal structure of the network. The structure of RNN models may be more natural and attractive than that of feed forward neural network models, but computation time for training is longer. Simulation results show that RNNs can learn both steady-state relationships and process dynamics of continuous and batch, single-input/single-output and multi-input/multi-output systems in a simple and direct manner. Training of RNNs shows only small degradation in the presence of noise in the training data. Thus, RNNs constitute a feasible alternative to layered feed forward back propagation neural networks in steady-state and dynamic process modeling and model-based control.
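As a rough illustration of the idea (a generic sketch, not the authors' implementation; the weights, network size and forward-Euler discretization are all assumptions), an RNN of the kind described can be simulated as coupled nonlinear ODEs with both feed forward and feedback connections:

```python
import numpy as np

# Generic sketch, not the authors' code: a forward-Euler discretization of a
# continuous-time RNN  dx/dt = -x + W_r*tanh(x) + W_in*u,  read out as
# y = W_out*tanh(x). Weights are fixed random values for illustration;
# training would adjust them to fit measured process data.
rng = np.random.default_rng(0)
n_hidden = 8
W_r = 0.1 * rng.standard_normal((n_hidden, n_hidden))   # feedback connections
W_in = rng.standard_normal((n_hidden, 1))               # one physical input -> one network input
W_out = rng.standard_normal((1, n_hidden))              # readout

def simulate(u_seq, dt=0.05):
    """Integrate the network state and return the output sequence."""
    x = np.zeros((n_hidden, 1))
    ys = []
    for u in u_seq:
        dx = -x + W_r @ np.tanh(x) + W_in * u
        x = x + dt * dx
        ys.append(float(W_out @ np.tanh(x)))
    return ys

ys = simulate([1.0] * 400)   # step input: the output settles to a steady value
```

Note how each physical input maps to exactly one network input, as the abstract describes; the internal state carries the system dynamics.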

  8. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.

  9. Theoretical Models and Processes of Reading.

    ERIC Educational Resources Information Center

    Singer, Harry, Ed.; Ruddell, Robert B., Ed.

    The first section of this two-part collection of articles contains six papers and their discussions read at a symposium on Theoretical Models and Processes of Reading. The papers cover the linguistic, perceptual, and cognitive components involved in reading. The models attempt to integrate the variables that influence the perception, recognition,…

  10. Modeling a healthy and a person with heart failure conditions using the object-oriented modeling environment Dymola.

    PubMed

    Heinke, Stefanie; Pereira, Carina; Leonhardt, Steffen; Walter, Marian

    2015-10-01

Several mathematical models of different physiological systems are spread throughout the literature. They serve as tools which improve the understanding of (patho-)physiological processes, may support clinical decisions and can even enhance medical therapies. These models are typically implemented in a signal-flow-oriented simulation environment and focus on the behavior of one specific subsystem. Neglecting other physiological subsystems and using a technical description of the physiology hinders the exchange with and acceptance by clinicians. By contrast, this paper presents a new model implemented in a physical, object-oriented modeling environment which includes the cardiovascular, respiratory and thermoregulatory systems. Simulation results for a healthy subject at rest and at the onset of exercise are given, showing the validity of the model. Finally, simulation results showing the interaction of the cardiovascular system with a ventricular assist device in case of heart failure are presented, demonstrating the flexibility and power of the model and the simulation environment. Thus, we present a new model including three important physiological systems and one medical device, implemented in an innovative simulation environment.

  11. Modeling of thermal processes in spherical area

    NASA Astrophysics Data System (ADS)

    Demyanchenko, O.; Lyashenko, V.

    2016-10-01

In this paper a mathematical model of the temperature field in a spherical area with complex conditions of heat exchange with the environment is considered. The solution of the nonlinear initial boundary value problem is reduced to the solution of a nonlinear integral equation that is of Fredholm type with respect to the spatial coordinates and of Volterra type in the time coordinate, with the kernel given by the Green's function.
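A plausible sketch of this kind of problem, under assumed symbols (thermal diffusivity $a$, conductivity $\lambda$, heat-transfer coefficient $\alpha$, emissivity $\varepsilon$; not necessarily the authors' exact formulation), is radially symmetric heat conduction in a ball of radius $R$ with a nonlinear surface exchange condition:

```latex
% Radially symmetric heat conduction in a ball of radius R:
\frac{\partial T}{\partial t}
  = \frac{a}{r^{2}}\,\frac{\partial}{\partial r}\!\left(r^{2}\,\frac{\partial T}{\partial r}\right),
  \qquad 0 < r < R,\ t > 0,
% with combined convective and radiative exchange with the environment at the surface,
% which is what makes the problem nonlinear:
-\lambda \left.\frac{\partial T}{\partial r}\right|_{r=R}
  = \alpha\bigl(T(R,t) - T_{\mathrm{env}}\bigr)
  + \varepsilon\sigma\bigl(T^{4}(R,t) - T_{\mathrm{env}}^{4}\bigr).
```

The $T^{4}$ radiation term is the usual source of nonlinearity in such surface exchange conditions.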

  12. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems

  13. [Biological processes of the human environment regeneration within the Martian crew life support systems].

    PubMed

    Sychev, V N; Levinskikh, M A; Shepelev, E Ia; Podol'skiĭ, I G

    2003-01-01

    Five ground-based experiments at RF SRC-IBMP had the purpose to make a thorough investigation of a model of the human-unicellular algae-mineralization life support system. The system measured 15 m3 and contained 45 liters of alga suspension; the dry alga density was 10 to 12 g/l and water volume (including the alga suspension) amounted to 59 l. More sophisticated LSS models where algae were substituted by higher plants (crop area in the greenhouse equaled 15 m2) were investigated in three experiments from 1.5 mos. to 2 mos. in duration. It was found that the alga containing LSS was able to fulfill not only the macrofunction (air and water regeneration) but also several additional functions (air purification, establishment of microbial cenosis etc.) providing an adequate human environment. This polyfunctionality of the biological regenerative processes is a weighty argument for their integration into space LSSs. Another important aspect is that the unicellular algae containing systems are highly reliable owing to a huge number of species-cells which will be quickly recovered in case of the death of a part of the population and, consequently, functionality of the LSS autotrophic component will be restored before long. For an extended period of time the Martian crew will have no communication with the Earth's biosphere which implies that LSS should be absolutely reliable and redundant. Redundancy can be achieved through installation aboard the vehicle of two systems constructed on different principles of regeneration, i.e. physical-chemical and biological. Each of the LSSs should have the power to satisfy all needs of the crew. The best option is when two systems are functioning in parallel sharing the responsibility for the human environment. Redundancy in this case will mean that in the event of failure or a drastic decrease in performance of one system the other one will make up for the loss by increasing its share in the overall regeneration process. 
PMID:14730737

  14. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    SciTech Connect

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer; Ronald L. Boring; Alan R. Mecham

    2007-08-01

3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing understanding of human actions, movements and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where manikins have the potential to support more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels, and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load and increased complexity), calls for further research. Baseline assessments of novel control room equipment and configurations need to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper discusses recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool to support human factors and ergonomic assessment of next generation control rooms. Though this discussion primarily focuses on control room design, such a merger between 3D visualization and cognitive modeling can be extended to other areas of focus such as training and scenario planning.

  15. Spiked Dirichlet Process Priors for Gaussian Process Models

    PubMed Central

    Savitsky, Terrance; Vannucci, Marina

    2013-01-01

    We expand a framework for Bayesian variable selection for Gaussian process (GP) models by employing spiked Dirichlet process (DP) prior constructions over set partitions containing covariates. Our approach results in a nonparametric treatment of the distribution of the covariance parameters of the GP covariance matrix that in turn induces a clustering of the covariates. We evaluate two prior constructions: the first one employs a mixture of a point-mass and a continuous distribution as the centering distribution for the DP prior, therefore, clustering all covariates. The second one employs a mixture of a spike and a DP prior with a continuous distribution as the centering distribution, which induces clustering of the selected covariates only. DP models borrow information across covariates through model-based clustering. Our simulation results, in particular, show a reduction in posterior sampling variability and, in turn, enhanced prediction performances. In our model formulations, we accomplish posterior inference by employing novel combinations and extensions of existing algorithms for inference with DP prior models and compare performances under the two prior constructions. PMID:23950763
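The spiked prior construction can be given a minimal, hedged sketch: each covariate's covariance (relevance) parameter is either exactly zero (point mass, covariate excluded) or drawn from a continuous slab. The function name, hyperparameters and half-normal slab below are illustrative assumptions, not the authors' construction:

```python
import numpy as np

# Illustrative sketch only (names and hyperparameters are assumptions, not the
# authors' construction): draw per-covariate relevance parameters from a
# "spiked" prior -- a mixture of a point mass at zero (covariate excluded)
# and a continuous half-normal slab (covariate included).
rng = np.random.default_rng(1)

def sample_spiked_prior(p, w_spike=0.5, slab_scale=1.0):
    """Return p relevance parameters; each is exactly 0 with prob. w_spike."""
    spike = rng.random(p) < w_spike
    slab = np.abs(rng.normal(0.0, slab_scale, size=p))
    return np.where(spike, 0.0, slab)

theta = sample_spiked_prior(10)
selected = theta > 0   # covariates with nonzero relevance enter the GP covariance
```

In the paper's full construction the slab is replaced by a Dirichlet process, so nonzero parameters additionally cluster across covariates.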

  16. Filament winding cylinders. I - Process model

    NASA Technical Reports Server (NTRS)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.

  17. Using Perspective to Model Complex Processes

    SciTech Connect

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historic record of the entire process, not just the end result of the process. A historic record should facilitate backtracking and re-start of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historic record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.

  18. Processing of Soot in an Urban Environment: Case Study from the Mexico City Metropolitan Area

    SciTech Connect

    Johnson, Kirsten S.; Zuberi, Bilal M.; Molina, Luisa; Molina, Mario J.; Iedema, Martin J.; Cowin, James P.; Gaspar, Daniel J.; Wang, Chong M.; Laskin, Alexander

    2005-11-14

Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2–2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy and secondary ionization mass spectrometry shows that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet removal.

  19. Hemispherical reflectance model for passive images in an outdoor environment.

    PubMed

    Kim, Charles C; Thai, Bea; Yamaoka, Neil; Aboutalib, Omar

    2015-05-01

We present a hemispherical reflectance model for simulating passive images in an outdoor environment where illumination is provided by natural sources such as the sun and the clouds. While the bidirectional reflectance distribution function (BRDF) accurately produces the radiance from any object under such illumination, using the BRDF to calculate radiance requires double integration. Replacing the BRDF by a hemispherical reflectance under the natural sources transforms the double integration into a multiplication. This reduces both storage space and computation time. We present the formalism for the radiance of the scene using hemispherical reflectance instead of the BRDF. This enables us to generate passive images in an outdoor environment while taking advantage of the computational and storage efficiencies. We show some examples for illustration.
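The saving can be illustrated with a toy case (assumptions: Lambertian surface, uniform sky radiance; this is not the paper's code). The BRDF route requires integrating over the illumination hemisphere, while the hemispherical-reflectance route reduces to a multiplication, and the two agree:

```python
import numpy as np

# Toy comparison (assumptions: Lambertian surface, uniform sky radiance; not
# the paper's code). Route 1 integrates BRDF * L_in * cos(theta) over the
# illumination hemisphere; route 2 multiplies a precomputed hemispherical
# reflectance by the irradiance. They agree, but route 2 avoids the integral.
rho = 0.3                 # surface albedo (assumed)
L_sky = 100.0             # uniform sky radiance (assumed value)
f_brdf = rho / np.pi      # Lambertian BRDF

# Route 1: midpoint-rule double integral over (theta, phi).
n_th, n_ph = 400, 400
th = (np.arange(n_th) + 0.5) * (np.pi / 2) / n_th
dth, dph = (np.pi / 2) / n_th, 2 * np.pi / n_ph
integrand = f_brdf * L_sky * np.cos(th) * np.sin(th)   # independent of phi
L_integrated = float(np.sum(integrand) * dth * n_ph * dph)

# Route 2: hemispherical reflectance times irradiance over pi.
E = np.pi * L_sky                                      # irradiance of a uniform sky
L_multiplied = rho * E / np.pi

# Both are approximately rho * L_sky = 30.
```

For non-Lambertian surfaces the hemispherical reflectance must be precomputed per material and illumination geometry, which is where the storage saving the abstract mentions comes in.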

  20. Modeling the VARTM Composite Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

A comprehensive simulation model of the Vacuum Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during the infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.
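The core physics of the resin-flow submodel can be sketched with a one-dimensional Darcy's-law estimate of infiltration time (all property values below are assumed for illustration and are not from the paper):

```python
# Sketch, not the paper's model: a one-dimensional Darcy's-law estimate of the
# time for resin to infiltrate a preform of length L. All property values are
# assumed for illustration.
phi = 0.5      # preform porosity
mu = 0.3       # resin viscosity (Pa*s)
K = 1e-10      # preform permeability (m^2)
dP = 1.0e5     # vacuum-driven pressure difference (Pa)
L = 0.4        # flow length (m)

# Integrating Darcy's law for a moving flow front gives
#   t_fill = phi * mu * L^2 / (2 * K * dP)
t_fill = phi * mu * L**2 / (2 * K * dP)
print(f"estimated fill time: {t_fill:.0f} s")   # 1200 s for these values
```

The quadratic dependence on flow length is why the distribution medium, which shortens the effective through-thickness flow path, dominates the predicted infiltration times.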

  1. From Business Value Model to Coordination Process Model

    NASA Astrophysics Data System (ADS)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  2. Mathematical modeling of the coating process.

    PubMed

    Toschkoff, Gregor; Khinast, Johannes G

    2013-12-01

Coating of tablets is a common unit operation in the pharmaceutical industry. In most cases, the final product must meet strict quality requirements; to meet them, a detailed understanding of the coating process is required. To this end, numerous experimental studies have been performed. However, to acquire a mechanistic understanding, experimental data must be interpreted in the light of mathematical models. In recent years, a combination of analytical modeling and computational simulations has enabled deeper insights into the nature of the coating process. This paper presents an overview of modeling and simulation approaches for the coating process, covering various relevant aspects from scale-up considerations to coating mass uniformity investigations and models for drop atomization. The most important analytical and computational concepts are presented and the findings are compared.

  3. Defect modelling in an interactive 3-D CAD environment

    NASA Astrophysics Data System (ADS)

    Reilly, D.; Potts, A.; McNab, A.; Toft, M.; Chapman, R. K.

    2000-05-01

This paper describes enhancement of the NDT Workbench, as presented at QNDE '98, to include theoretical models for the ultrasonic inspection of smooth planar defects, developed by British Energy and BNFL-Magnox Generation. The Workbench is a PC-based software package for the reconstruction, visualization and analysis of 3-D ultrasonic NDT data in an interactive CAD environment. This extension of the Workbench now provides the user with a well established modelling approach, coupled with a graphical user interface for: a) configuring the model for flaw size, shape, orientation and location; b) flexible specification of probe parameters; c) selection of scanning surface and scan pattern on the CAD component model; d) presentation of the output as a simulated ultrasound image within the component, or as graphical or tabular displays. The defect modelling facilities of the Workbench can be used for inspection procedure assessment and confirmation of data interpretation, by comparison of overlay images generated from real and simulated data. The modelling technique currently implemented is based on the Geometrical Theory of Diffraction, for simulation of strip-like, circular or elliptical crack responses in the time harmonic or time dependent cases. Eventually, the Workbench will also allow modelling using elastodynamic Kirchhoff theory.

  4. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and super storm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea ice free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets, storyboards, and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  5. Recent Developments in the Radiation Belt Environment Model

    NASA Technical Reports Server (NTRS)

    Fok, M.-C.; Glocer, A.; Zheng, Q.; Horne, R. B.; Meredith, N. P.; Albert, J. M.; Nagai, T.

    2010-01-01

The fluxes of energetic particles in the radiation belts are found to be strongly controlled by the solar wind conditions. In order to understand and predict the radiation particle intensities, we have developed a physics-based Radiation Belt Environment (RBE) model that considers the influences from the solar wind, ring current and plasmasphere. Recently, an improved calculation of wave-particle interactions has been incorporated. In particular, the model now includes cross diffusion in energy and pitch-angle. We find that the exclusion of cross diffusion could cause significant overestimation of electron flux enhancement during storm recovery. The RBE model is also connected to MHD fields so that the response of the radiation belts to fast variations in the global magnetosphere can be studied. We are able to reproduce the rapid flux increase during a substorm dipolarization on 4 September 2008. The timing is much shorter than the time scale of wave-associated acceleration.

  6. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  7. Incorporating process variability into stormwater quality modelling.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2015-11-15

Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes. This acts as a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study undertaken developed three theoretical scenarios based on research findings that variations in particle size fractions <150 μm and >150 μm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of variability characteristics of pollutant build-up and wash-off processes in stormwater quality models. The research study outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
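The build-up and wash-off processes referred to are commonly represented by exponential relations; a hedged sketch (parameter values assumed; the paper's variability scenarios would additionally perturb these processes by particle-size fraction) is:

```python
import math

# Hedged sketch of the classic exponential build-up / wash-off pair commonly
# used in stormwater quality models. Parameter values are assumptions, not the
# paper's calibration.
def buildup(t_days, B_max=10.0, k_b=0.4):
    """Pollutant load (g/m^2) after t_days of dry weather, saturating at B_max."""
    return B_max * (1.0 - math.exp(-k_b * t_days))

def washoff(B0, intensity_mm_h, duration_h, k_w=0.02):
    """Mass washed off (g/m^2) from initial load B0 by a storm."""
    return B0 * (1.0 - math.exp(-k_w * intensity_mm_h * duration_h))

B = buildup(7)                                   # load after a week of dry days
W = washoff(B, intensity_mm_h=20, duration_h=2)  # storm removes part of the load
```

Treating parameters such as `k_b` and `k_w` as random rather than fixed quantities is one way such models can carry process variability into their uncertainty estimates.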

  8. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

Detailed acoustic engineering models explore the noise propagation mechanisms associated with the noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  9. Towards Modelling and Simulation of Crowded Environments in Cell Biology

    NASA Astrophysics Data System (ADS)

    Bittig, Arne T.; Jeschke, Matthias; Uhrmacher, Adelinde M.

    2010-09-01

In modelling and simulation of cell biological processes, spatial homogeneity in the distribution of components is a common but not always valid assumption. Spatial simulation methods differ in computational effort and accuracy, and usually rely on tool-specific input formats for model specification. A clear separation between modelling and simulation allows a declarative model specification, thereby facilitating reuse of models and exploiting different simulators. We outline a modelling formalism covering both stochastic spatial simulation at the population level and simulation of individual entities moving in continuous space, as well as the combination thereof. A multi-level spatial simulator is presented that combines populations of small particles simulated according to the Next Subvolume Method with individually represented large particles following Brownian motion. This approach entails several challenges that need to be overcome, but nicely balances calculation effort against the required level of detail.
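The individually represented half of such a hybrid scheme can be sketched as follows (an illustrative Brownian-motion stepper under assumed parameters, not the tool's code); each step has per-coordinate standard deviation sqrt(2*D*dt), giving a mean squared displacement of 6*D*t in 3-D:

```python
import numpy as np

# Illustrative stepper for the individually represented large particles in
# such a hybrid scheme (not the tool's code; parameters assumed). Each
# Brownian step has standard deviation sqrt(2*D*dt) per coordinate.
rng = np.random.default_rng(42)

def brownian_final_positions(n_particles, n_steps, D=1.0, dt=0.01):
    """Return final 3-D positions of particles started at the origin."""
    steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_steps, n_particles, 3))
    return steps.sum(axis=0)

pos = brownian_final_positions(n_particles=2000, n_steps=100)
msd = float(np.mean(np.sum(pos**2, axis=1)))   # mean squared displacement
# Theory: MSD = 6*D*t = 6 for t = n_steps*dt = 1.0 (the sample estimate is close)
```

In the hybrid scheme the small-particle populations would meanwhile be advanced stochastically per subvolume, with the two representations coupled whenever large and small particles interact.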

  10. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    SciTech Connect

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining four describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: the Water Distribution and Removal Model, the Physical and Chemical Environment Model, the Radionuclide Transport Model, and the Multiscale Thermohydrologic Model. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  11. The deterministic SIS epidemic model in a Markovian random environment.

    PubMed

    Economou, Antonis; Lopez-Herrero, Maria Jesus

    2016-07-01

    We consider the classical deterministic susceptible-infective-susceptible epidemic model, where the infection and recovery rates depend on a background environmental process that is modeled by a continuous time Markov chain. This framework is able to capture several important characteristics that appear in the evolution of real epidemics in large populations, such as seasonality effects and environmental influences. We propose computational approaches for the determination of various distributions that quantify the evolution of the number of infectives in the population. PMID:26515172
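    A minimal sketch of the kind of model the abstract describes: deterministic SIS dynamics dI/dt = β·I·(1 − I/N) − γ·I, with the pair (β, γ) switching between two environment states according to a continuous-time Markov chain. All rates, parameter names and values below are illustrative assumptions, not values from the paper.

```python
import random

def simulate_sis(n=1000, i0=10, t_end=50.0, dt=0.01, seed=1):
    """Forward-Euler integration of SIS dynamics whose infection rate beta and
    recovery rate gamma are modulated by a two-state CTMC environment.
    Parameters are hypothetical illustrations."""
    rng = random.Random(seed)
    params = [(0.5, 0.2), (0.1, 0.3)]    # (beta, gamma) for environment 0, 1
    q = [0.2, 0.2]                       # switching rate out of each state
    env, i, t = 0, float(i0), 0.0
    next_switch = rng.expovariate(q[env])
    while t < t_end:
        if t >= next_switch:             # environment jumps to the other state
            env = 1 - env
            next_switch = t + rng.expovariate(q[env])
        beta, gamma = params[env]
        i += dt * (beta * i * (1 - i / n) - gamma * i)   # Euler step for I(t)
        i = max(i, 0.0)
        t += dt
    return i
```

    Because the environment path is random, the number of infectives at a fixed time is itself random, which is why the paper studies its distribution rather than a single trajectory.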

  12. Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances

    NASA Astrophysics Data System (ADS)

    Erhard, D.; den Hollander, F.; Maillard, G.

    2016-06-01

    The parabolic Anderson model is defined as the partial differential equation ∂u(x,t)/∂t = κΔu(x,t) + ξ(x,t)u(x,t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x,0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (−ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0,t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0,t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ^𝓚, where 𝓚 = {𝓚(x,y) : x, y ∈ ℤ^d, x ∼ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚
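    The lattice equation in the abstract can be discretised in time with a naive explicit Euler scheme. The toy sketch below uses a one-dimensional torus and a frozen environment ξ, which shows the structure of the equation only; it is not the authors' method, and all step sizes are illustrative.

```python
def pam_step(u, xi, kappa, dt):
    """One explicit Euler step of du/dt = kappa*Lap(u) + xi*u on a 1-d torus,
    with Lap the discrete Laplacian u(x-1) + u(x+1) - 2u(x).
    Toy discretisation for illustration only."""
    n = len(u)
    return [u[x] + dt * (kappa * (u[(x - 1) % n] + u[(x + 1) % n] - 2 * u[x])
                         + xi[x] * u[x])
            for x in range(n)]
```

    For small enough dt (so that 1 − 2κ·dt + ξ(x)·dt stays positive) the scheme preserves the positivity of u, mirroring the interpretation of u as a particle density.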

  13. Quantum mechanical Hamiltonian models of discrete processes

    SciTech Connect

    Benioff, P.

    1981-03-01

    Here the results of other work on quantum mechanical Hamiltonian models of Turing machines are extended to include any discrete process T on a countably infinite set A. The models are constructed here by use of scattering phase shifts from successive scatterers to turn on successive step interactions. Also a locality requirement is imposed. The construction is done by first associating with each process T a model quantum system M with associated Hilbert space H_M and step operator U_T. Since U_T is not unitary in general, M, H_M, and U_T are extended into a (continuous time) Hamiltonian model on a larger space which satisfies the locality requirement. The construction is compared with the minimal unitary dilation of U_T. It is seen that the model constructed here is larger than the minimal one. However, the minimal one does not satisfy the locality requirement.

  14. Reservoir and contaminated sediments impacts in high-Andean environments: Morphodynamic interactions with biogeochemical processes

    NASA Astrophysics Data System (ADS)

    Escauriaza, C. R.; Contreras, M. T.; Müllendorff, D. A.; Pasten, P.; Pizarro, G. E.

    2014-12-01

    Rapid changes due to anthropogenic interventions in high-altitude environments, such as the Altiplano region in South America, require new approaches to understand the connections between physical and biogeochemical processes. Alterations of the water quality linked to the river morphology can affect the ecosystems and human development in the long term. The future construction of a reservoir in the Lluta river, located in northern Chile, will change the spatial distribution of arsenic-rich sediments, which can have significant effects on the lower parts of the watershed. In this investigation we develop a coupled numerical model to predict and evaluate the interactions between morphodynamic changes in the Lluta reservoir and conditions that can potentially desorb arsenic from the sediments. Assuming that contaminants are mobilized under anaerobic conditions, we calculate the oxygen concentration within the sediments to study the interactions of the delta progradation with the potential arsenic release. This work provides a framework for future studies aimed at analyzing the complex connections between morphodynamics and water quality when contaminant-rich sediments accumulate in a reservoir. The tool can also help to design effective risk management and remediation strategies in these extreme environments. Research has been supported by Fondecyt grant 1130940 and CONICYT/FONDAP Grant 15110017.

  15. A network-based training environment: a medical image processing paradigm.

    PubMed

    Costaridou, L; Panayiotakis, G; Sakellaropoulos, P; Cavouras, D; Dimopoulos, J

    1998-01-01

    The capability of interactive multimedia and Internet technologies is investigated with respect to the implementation of a distance learning environment. The system is built according to a client-server architecture, based on the Internet infrastructure, composed of server nodes conceptually modelled as WWW sites. Sites are implemented by customization of available components. The environment integrates network-delivered interactive multimedia courses, network-based tutoring, SIG support, information databases of professional interest, as well as course and tutoring management. This capability has been demonstrated by means of an implemented system, validated with digital image processing content, specifically image enhancement. Image enhancement methods are theoretically described and applied to mammograms. Emphasis is given to the interactive presentation of the effects of algorithm parameters on images. The system end-user access depends on available bandwidth, so high-speed access can be achieved via LAN or local ISDN connections. Network based training offers new means of improved access and sharing of learning resources and expertise, as promising supplements in training. PMID:9922949

  16. Method for modelling sea surface clutter in complicated propagation environments

    NASA Astrophysics Data System (ADS)

    Dockery, G. D.

    1990-04-01

    An approach for predicting clutter levels in complicated propagation conditions using an advanced propagation model and one of several empirical clutter cross-section models is described. Incident power and grazing angle information is obtained using a parabolic equation/Fourier split-step technique to predict the distribution of energy in complicated, range-varying environments. Such environments also require the use of an algorithm that establishes a physically reasonable range-interpolation scheme for the measured refractivity profiles. The reflectivity of the sea surface is represented using a clutter cross-section model that was developed originally by the Georgia Institute of Technology and subsequently modified to include the effects of arbitrary refractive conditions. Predicted clutter power levels generated by the new procedure are compared with clutter measured at 2.9 GHz during propagation experiments conducted at the NASA Wallops Flight Facility on Virginia's Eastern Shore. During these experiments, high-resolution refractivity data were collected in both range and altitude by an instrumented helicopter.

  17. GCR environmental models II: Uncertainty propagation methods for GCR environments

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-04-01

    In order to assess the astronaut exposure received within vehicles or habitats, accurate models of the ambient galactic cosmic ray (GCR) environment are required. Many models have been developed and compared to measurements, with uncertainty estimates often stated to be within 15%. However, intercode comparisons can lead to differences in effective dose exceeding 50%. This is the second of three papers focused on resolving this discrepancy. The first paper showed that GCR heavy ions with boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. Yet, due to limitations on available data, model development and validation are heavily influenced by comparisons to measurements taken below 500 MeV/n. In the current work, the focus is on developing an efficient method for propagating uncertainties in the ambient GCR environment to effective dose values behind shielding. A simple approach utilizing sensitivity results from the first paper is described and shown to be equivalent to a computationally expensive Monte Carlo uncertainty propagation. The simple approach allows a full uncertainty propagation to be performed once GCR uncertainty distributions are established. This rapid analysis capability may be integrated into broader probabilistic radiation shielding analysis and also allows error bars (representing boundary condition uncertainty) to be placed around point estimates of effective dose.
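    The contrast the abstract draws between a sensitivity-based propagation and a full Monte Carlo propagation can be sketched for a linear dose model E = Σ_j S_j·φ_j with independent Gaussian perturbations of the boundary fluxes φ_j. The sensitivities and uncertainties below are made-up numbers, not values from the paper.

```python
import math
import random

def dose_sd_linear(sens, sigma):
    """First-order propagation: if E = sum_j S_j*phi_j and the phi_j carry
    independent perturbations with s.d. sigma_j, then
    sd(E) = sqrt(sum_j (S_j*sigma_j)^2). Inputs are illustrative."""
    return math.sqrt(sum((s * sg) ** 2 for s, sg in zip(sens, sigma)))

def dose_sd_monte_carlo(sens, sigma, n=20000, seed=0):
    """Brute-force alternative: sample the perturbations, accumulate the dose
    response, and take the sample standard deviation."""
    rng = random.Random(seed)
    samples = [sum(s * rng.gauss(0.0, sg) for s, sg in zip(sens, sigma))
               for _ in range(n)]
    mean = sum(samples) / n
    return math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
```

    For a linear response the two agree, which is the sense in which the simple sensitivity-based approach can replace the computationally expensive Monte Carlo propagation.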

  18. Geometrical model for malaria parasite migration in structured environments

    NASA Astrophysics Data System (ADS)

    Battista, Anna; Frischknecht, Friedrich; Schwarz, Ulrich S.

    2014-10-01

    Malaria is transmitted to vertebrates via a mosquito bite, during which rodlike and crescent-shaped parasites, called sporozoites, are injected into the skin of the host. Searching for a blood capillary to penetrate, sporozoites move quickly in locally helical trajectories that are frequently perturbed by interactions with the extracellular environment. Here we present a theoretical analysis of the active motility of sporozoites in a structured environment. The sporozoite is modelled as a self-propelled rod with spontaneous curvature and bending rigidity. It interacts with hard obstacles through collision rules inferred from experimental observation of two-dimensional sporozoite movement in pillar arrays. Our model shows that complex motion patterns arise from the geometrical shape of the parasite and that its mechanical flexibility is crucial for stable migration patterns. Extending the model to three dimensions reveals that a bent and twisted rod can associate with cylindrical obstacles in a manner reminiscent of the association of sporozoites to blood capillaries, supporting the notion of a prominent role of cell shape during malaria transmission.
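    The core ingredient of the model, self-propulsion with spontaneous curvature, can be sketched in two dimensions: a particle moving at constant speed v whose heading rotates at rate v times the curvature traces a circle of radius 1/curvature. Obstacles, noise and bending elasticity are omitted here, and the parameter values are illustrative.

```python
import math

def curved_swimmer(v=1.0, curvature=0.5, dt=0.001, t_end=None):
    """Euler integration of a 2-d self-propelled point with spontaneous
    curvature: x' = v*cos(theta), y' = v*sin(theta), theta' = v*curvature.
    By default it runs for one full turn, 2*pi/(v*curvature)."""
    if t_end is None:
        t_end = 2 * math.pi / (v * curvature)
    x = y = theta = 0.0
    t = 0.0
    while t < t_end:
        x += dt * v * math.cos(theta)
        y += dt * v * math.sin(theta)
        theta += dt * v * curvature
        t += dt
    return x, y
```

    In the full model, collision rules with pillars perturb this circular motion, producing the complex migration patterns the abstract describes.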

  19. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR) and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under NAVOCEANO funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted Colorado University's numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its validity. This report documents the model validation results and provides a brief description of the Graphical User Interface (GUI).

  20. Utilizing Vector Space Models for User Modeling within e-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, E.; Kilbride, J.

    2008-01-01

    User modeling has been found to enhance the effectiveness and/or usability of software systems through the representation of certain properties of a particular user. This paper presents the research and the results of the development of a user modeling system for the implementation of student models within e-learning environments, utilizing vector…
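    A generic building block of vector-space user modeling is the cosine similarity between sparse term-frequency vectors, sketched below. This is a standard illustration of the technique named in the title, not the specific representation used in the paper.

```python
import math
from collections import Counter

def tf_vector(text):
    """Bag-of-words term-frequency vector as a Counter (toy tokeniser)."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors: the dot
    product over shared terms divided by the product of vector norms."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0
```

    A student model built this way can, for example, rank learning resources by their similarity to a vector summarising the material a student has already mastered.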

  1. Comparing Two Types of Model Progression in an Inquiry Learning Environment with Modelling Facilities

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton

    2011-01-01

    The educational advantages of inquiry learning environments that incorporate modelling facilities are often challenged by students' poor inquiry skills. This study examined two types of model progression as means to compensate for these skill deficiencies. Model order progression (MOP), the predicted optimal variant, gradually increases the…

  2. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  3. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  4. Dynamic occupancy models for explicit colonization processes.

    PubMed

    Broms, Kristin M; Hooten, Mevin B; Johnson, Devin S; Altwegg, Res; Conquest, Loveday L

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations. PMID:27008788
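    The small-scale rules described above can be sketched as a toy simulation on a ring of sites: an empty site is colonised with a probability that grows with the number of occupied neighbours, and an occupied site goes extinct with a constant probability. The rates below are illustrative assumptions, not fitted values from the paper, and detectability is ignored.

```python
import random

def step_occupancy(occ, gamma0=0.05, delta=0.2, eps=0.3, rng=None):
    """One season of a toy neighbourhood colonisation/extinction process on a
    ring: empty site i is colonised with probability
    gamma0 + delta * (occupied neighbours of i), capped at 1; an occupied
    site goes extinct with probability eps."""
    rng = rng or random.Random(0)
    n = len(occ)
    new = list(occ)
    for i in range(n):
        nbrs = occ[(i - 1) % n] + occ[(i + 1) % n]
        if occ[i]:
            new[i] = 0 if rng.random() < eps else 1
        else:
            new[i] = 1 if rng.random() < min(1.0, gamma0 + delta * nbrs) else 0
    return new
```

    The baseline probability gamma0 plays the role of long-distance dispersal: a site with no occupied neighbours can still be colonised, which is the mechanism the paper contrasts with short-distance spread.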

  5. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations.

  6. Virtual building environments (VBE) - Applying information modeling to buildings

    SciTech Connect

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a ''place'' where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software applications operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It also describes the VBE Initiative and the benefits observed in a couple of early VBE projects.

  7. Causally nonseparable processes admitting a causal model

    NASA Astrophysics Data System (ADS)

    Feix, Adrien; Araújo, Mateus; Brukner, Časlav

    2016-08-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties.

  8. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience of using this notation to model Pathology processes, in Spain or elsewhere, is known to us. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen-section studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, in which management and improvements are more easily implemented by health professionals. PMID:18673511

  9. Stochastic differential equation model to Prendiville processes

    SciTech Connect

    Granita; Bahar, Arifah

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
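    The diffusion approximation described can be sketched with an Euler-Maruyama scheme for a Prendiville-type chain: with a birth rate λ(x) = α(m − x) that decreases linearly and a death rate μ(x) = βx, the approximating SDE has drift λ(x) − μ(x) and diffusion coefficient λ(x) + μ(x). All parameter names and values below are illustrative assumptions, not the paper's derivation.

```python
import math
import random

def prendiville_em(x0=10.0, alpha=1.0, beta=0.5, m=50.0,
                   dt=0.001, t_end=5.0, seed=2):
    """Euler-Maruyama for dX = (alpha*(m-X) - beta*X) dt
                              + sqrt(alpha*(m-X) + beta*X) dW,
    a diffusion approximation of a Prendiville-type birth-death chain on
    [0, m]. The state is clamped to [0, m] after each step."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while t < t_end:
        lam = max(alpha * (m - x), 0.0)   # birth rate, linearly decreasing
        mu = max(beta * x, 0.0)           # death rate
        x += (lam - mu) * dt + math.sqrt((lam + mu) * dt) * rng.gauss(0.0, 1.0)
        x = min(max(x, 0.0), m)
        t += dt
    return x
```

    The drift vanishes at x* = αm/(α + β), so trajectories fluctuate around that level once the transient has decayed, mirroring the mean function one would obtain from the explicit solution.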

  10. Predicting plants - modeling traits as a function of environment

    NASA Astrophysics Data System (ADS)

    Franklin, Oskar

    2016-04-01

    A central problem in understanding and modeling vegetation dynamics is how to represent the variation in plant properties and function across different environments. Addressing this problem, there is a strong trend towards trait-based approaches, where vegetation properties are functions of the distributions of functional traits rather than of species. Recently there has been enormous progress in quantifying trait variability and its drivers and effects (Van Bodegom et al. 2012; Adler et al. 2014; Kunstler et al. 2015) based on wide-ranging datasets on a small number of easily measured traits, such as specific leaf area (SLA), wood density and maximum plant height. However, plant function depends on many other traits, and while the commonly measured trait data are valuable, they are not sufficient for driving predictive and mechanistic models of vegetation dynamics, especially under novel climate or management conditions. For this purpose we need a model to predict functional traits, including those not easily measured, and how they depend on the plants' environment. Here I present such a mechanistic model based on fitness concepts and focused on traits related to water and light limitation of trees, including wood density, drought response, allocation to defense, and leaf traits. The model is able to predict observed patterns of variability in these traits in relation to growth and mortality, and their responses to a gradient of water limitation. The results demonstrate that it is possible to mechanistically predict plant traits as a function of the environment based on an eco-physiological model of plant fitness. References Adler, P.B., Salguero-Gómez, R., Compagnoni, A., Hsu, J.S., Ray-Mukherjee, J., Mbeau-Ache, C. et al. (2014). Functional traits explain variation in plant life-history strategies. Proc. Natl. Acad. Sci. U. S. A., 111, 740-745. Kunstler, G., Falster, D., Coomes, D.A., Hui, F., Kooyman, R.M., Laughlin, D.C. et al. (2015). Plant functional traits

  11. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  12. Models of Solar Wind Structures and Their Interaction with the Earth's Space Environment

    NASA Astrophysics Data System (ADS)

    Watermann, J.; Wintoft, P.; Sanahuja, B.; Saiz, E.; Poedts, S.; Palmroth, M.; Milillo, A.; Metallinou, F.-A.; Jacobs, C.; Ganushkina, N. Y.; Daglis, I. A.; Cid, C.; Cerrato, Y.; Balasis, G.; Aylward, A. D.; Aran, A.

    2009-11-01

    The discipline of “Space Weather” is built on the scientific foundation of solar-terrestrial physics but with a strong orientation toward applied research. Models describing the solar-terrestrial environment are therefore at the heart of this discipline, for both physical understanding of the processes involved and establishing predictive capabilities of the consequences of these processes. Depending on the requirements, purely physical models, semi-empirical or empirical models are considered to be the most appropriate. This review focuses on the interaction of solar wind disturbances with geospace. We cover interplanetary space, the Earth’s magnetosphere (with the exception of radiation belt physics), the ionosphere (with the exception of radio science), the neutral atmosphere and the ground (via electromagnetic induction fields). Space weather relevant state-of-the-art physical and semi-empirical models of the various regions are reviewed. They include models for interplanetary space, its quiet state and the evolution of recurrent and transient solar perturbations (corotating interaction regions, coronal mass ejections, their interplanetary remnants, and solar energetic particle fluxes). Models of coupled large-scale solar wind-magnetosphere-ionosphere processes (global magnetohydrodynamic descriptions) and of inner magnetosphere processes (ring current dynamics) are discussed. Achievements in modeling the coupling between magnetospheric processes and the neutral and ionized upper and middle atmospheres are described. Finally we mention efforts to compile comprehensive and flexible models from selections of existing modules applicable to particular regions and conditions in interplanetary space and geospace.

  13. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th, 2010. The underlying algorithms are provided by three different sources: geocoding and radiometric calibration (2) is a standard functionality of the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted in IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These three processing steps are combined in the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS based on SAR data. Moreover, this model can be decoupled from its desktop environment and published to users across the ArcGIS Server enterprise. Thus disaster zones, e.g. after severe flooding, can be automatically identified and mapped to support local task forces, using an operational workflow for SAR image analysis that can be executed by the responsible operators without SAR expert knowledge.
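    The change-detection step (3) is typically a log-ratio comparison of calibrated pre- and post-event backscatter. As a rough, self-contained illustration of that idea only (plain NumPy, with a crude mean filter standing in for the SARscape despeckling and geocoding chain; all function names and thresholds here are invented for this sketch):

```python
import numpy as np

def despeckle(img, size=3):
    """Crude speckle suppression: local mean over a size x size window
    (a stand-in for a proper SAR despeckling filter)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def flood_change_map(pre, post, threshold=0.5):
    """Log-ratio change detection between two calibrated backscatter
    images: flags pixels that darkened sharply (open water is dark in SAR)."""
    ratio = np.log((despeckle(pre) + 1e-6) / (despeckle(post) + 1e-6))
    return ratio > threshold

# Tiny synthetic example: a dry scene and the same scene with a flooded patch.
pre = np.full((8, 8), 100.0)
post = pre.copy()
post[2:6, 2:6] = 5.0          # flooded pixels have low backscatter
mask = flood_change_map(pre, post)
print(mask.sum())             # 16 pixels flagged: the flooded patch
```

    In the operational workflow this logic would run on geocoded, calibrated scenes; the sketch only shows why a darkening threshold isolates the inundated area.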

  14. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed-loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high-quality material. Aside from demonstrating the usefulness of the QCM concept, one of the main foci of the present research program is to compare processes for making continuous-fiber-reinforced metal matrix composites (MMCs). Two processes, low-pressure plasma spray deposition and tape casting, are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  15. Building intuition of iron evolution during solar cell processing through analysis of different process models

    NASA Astrophysics Data System (ADS)

    Morishige, Ashley E.; Laine, Hannu S.; Schön, Jonas; Haarahiltunen, Antti; Hofstetter, Jasmin; del Cañizo, Carlos; Schubert, Martin C.; Savin, Hele; Buonassisi, Tonio

    2015-09-01

    An important aspect of Process Simulators for photovoltaics is prediction of defect evolution during device fabrication. Over the last twenty years, these tools have accelerated process optimization, and several Process Simulators for iron, a ubiquitous and deleterious impurity in silicon, have been developed. The diversity of these tools can make it difficult to build intuition about the physics governing iron behavior during processing. Thus, in one unified software environment and using self-consistent terminology, we combine and describe three of these Simulators. We vary structural defect distribution and iron precipitation equations to create eight distinct Models, which we then use to simulate different stages of processing. We find that the structural defect distribution influences the final interstitial iron concentration ([Fe_i]) more strongly than the iron precipitation equations. We identify two regimes of iron behavior: (1) diffusivity-limited, in which iron evolution is kinetically limited and bulk [Fe_i] predictions can vary by an order of magnitude or more, and (2) solubility-limited, in which iron evolution is near thermodynamic equilibrium and the Models yield similar results. This rigorous analysis provides new intuition that can inform Process Simulation, material, and process development, and it enables scientists and engineers to choose an appropriate level of Model complexity based on wafer type and quality, processing conditions, and available computation time.
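    The two regimes can be reproduced with a toy kinetic model: a Ham's-law-style relaxation of interstitial iron toward its temperature-dependent solubility. All parameter values below are illustrative stand-ins, not fitted values from any of the Simulators compared in the paper:

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius(prefactor, e_a, temp):
    """Generic Arrhenius temperature dependence."""
    return prefactor * np.exp(-e_a / (K_B * temp))

def anneal(fe0, temp, t_total, n_ppt=1e10, radius=1e-7, dt=0.1):
    """Relax interstitial iron [Fe_i] (cm^-3) toward its solubility at a
    fixed temperature via a Ham's-law-style rate (forward Euler).
    n_ppt: precipitation-site density (cm^-3); radius: precipitate
    radius (cm). All numbers are illustrative."""
    diff = arrhenius(1e-3, 0.67, temp)          # Fe_i diffusivity, cm^2/s
    sol = arrhenius(1e26, 2.94, temp)           # solid solubility, cm^-3
    rate = 4.0 * np.pi * radius * n_ppt * diff  # relaxation rate, 1/s
    fe = fe0
    for _ in range(int(t_total / dt)):
        fe -= rate * (fe - sol) * dt
    return fe

# Hot anneal: [Fe_i] reaches the solubility (solubility-limited regime).
# Cool anneal: the same [Fe_i] barely moves (diffusivity-limited regime).
```

    Even this caricature shows why low-temperature predictions are sensitive to kinetic details while high-temperature predictions converge toward equilibrium.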

  16. A copula model for marked point processes.

    PubMed

    Diao, Liqun; Cook, Richard J; Lee, Ker-Ai

    2013-10-01

    Many chronic diseases feature recurring clinically important events. In addition, however, there often exists a random variable which is realized upon the occurrence of each event reflecting the severity of the event, a cost associated with it, or possibly a short term response indicating the effect of a therapeutic intervention. We describe a novel model for a marked point process which incorporates a dependence between continuous marks and the event process through the use of a copula function. The copula formulation ensures that event times can be modeled by any intensity function for point processes, and any multivariate model can be specified for the continuous marks. The relative efficiency of joint versus separate analyses of the event times and the marks is examined through simulation under random censoring. An application to data from a recent trial in transfusion medicine is given for illustration. PMID:23660874
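    The construction described, arbitrary marginal models for event times and marks tied together by a copula, can be sketched by simulation. The snippet below uses a Gaussian copula with exponential waiting times and standard normal marks; it illustrates the modeling idea only, not the paper's estimation procedure:

```python
import math
import numpy as np

def phi(x):
    """Standard normal CDF, elementwise."""
    return np.vectorize(lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))))(x)

def simulate_marked_process(n, rho, rate=1.0, seed=0):
    """Simulate n events whose exponential(rate) waiting times and standard
    normal marks are coupled through a Gaussian copula with parameter rho.
    The copula sets the dependence; both marginal models stay intact."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = phi(z[:, 0])                  # correlated uniform for the gap time
    gaps = -np.log(1.0 - u) / rate    # inverse-CDF: exponential gap times
    marks = z[:, 1]                   # standard normal marks
    return np.cumsum(gaps), marks
```

    With rho = 0 the event process and the marks are independent, matching the "separate analyses" baseline the paper compares against.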

  17. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence.

  18. Relativistic diffusion processes and random walk models

    SciTech Connect

    Dunkel, Joern; Talkner, Peter; Haenggi, Peter

    2007-02-15

    The nonrelativistic standard model for a continuous, one-parameter diffusion process in position space is the Wiener process. As is well known, the Gaussian transition probability density function (PDF) of this process is in conflict with special relativity, as it permits particles to propagate faster than the speed of light. A frequently considered alternative is provided by the telegraph equation, whose solutions avoid superluminal propagation speeds but suffer from singular (noncontinuous) diffusion fronts on the light cone, which are unlikely to exist for massive particles. It is therefore advisable to explore other alternatives as well. In this paper, a generalized Wiener process is proposed that is continuous, avoids superluminal propagation, and reduces to the standard Wiener process in the nonrelativistic limit. The corresponding relativistic diffusion propagator is obtained directly from the nonrelativistic Wiener propagator, by rewriting the latter in terms of an integral over actions. The resulting relativistic process is non-Markovian, in accordance with the known fact that nontrivial continuous, relativistic Markov processes in position space cannot exist. Hence, the proposed process defines a consistent relativistic diffusion model for massive particles and provides a viable alternative to the solutions of the telegraph equation.
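    The conflict mentioned at the outset is easy to demonstrate numerically: under the Gaussian Wiener propagator, a nonzero fraction of particles is displaced beyond the light cone at any finite time. A minimal Monte Carlo check (units chosen so that c = 1; parameters arbitrary):

```python
import numpy as np

def superluminal_fraction(n_paths, t, diffusion, c=1.0, seed=1):
    """Fraction of free Wiener paths displaced beyond the light cone
    |x| > c*t after time t. The Gaussian propagator gives
    X(t) ~ N(0, 2*D*t), so this fraction is positive at every finite t."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(2.0 * diffusion * t), size=n_paths)
    return float(np.mean(np.abs(x) > c * t))
```

    Because the standard deviation grows like sqrt(t) while the light-cone bound grows like t, the superluminal fraction is large at short times and decays at long times, but never vanishes; this is exactly the defect the generalized process in the paper is built to remove.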

  19. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Software environments for this field are usually developed without accounting for the interactions and extensibility that reservoir engineers require. In this paper, we present a research effort characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate that this approach was successfully applied to the design of a distributed software architecture. In addition, all components of the proposal gave reservoir engineers greater visibility into the organization and its processes.

  20. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
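    As a flavor of the core set of equations covered, the canonical linear hillslope diffusion model, dz/dt = D d2z/dx2, can be integrated with a simple explicit finite-difference scheme. This generic sketch is not taken from the book's online code archive:

```python
import numpy as np

def diffuse_hillslope(z, diffusivity, dx, dt, n_steps):
    """Explicit (FTCS) integration of linear hillslope diffusion,
    dz/dt = D * d2z/dx2, with fixed-elevation end points.
    Stable only when D*dt/dx**2 <= 0.5."""
    z = np.asarray(z, dtype=float).copy()
    r = diffusivity * dt / dx ** 2
    assert r <= 0.5, "explicit scheme is unstable for this step size"
    for _ in range(n_steps):
        z[1:-1] += r * (z[2:] - 2.0 * z[1:-1] + z[:-2])
    return z

# Example: an initially sharp fault scarp relaxes toward a smooth ramp.
scarp = np.where(np.arange(101) < 50, 0.0, 10.0)
relaxed = diffuse_hillslope(scarp, 1.0, 1.0, 0.4, 2000)
```

    The stability bound r <= 0.5 is the standard restriction on forward-time, centered-space diffusion schemes and is the kind of practical constraint the book's exercises work through.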

  1. A neurolinguistic model of grammatical construction processing.

    PubMed

    Dominey, Peter Ford; Hoen, Michel; Inui, Toshio

    2006-12-01

    One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

  2. A process algebra model of QED

    NASA Astrophysics Data System (ADS)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  3. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    SciTech Connect

    Currier, R.P.

    1994-10-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.

  4. Task Results Processing for the Needs of Task-Oriented Design Environments

    ERIC Educational Resources Information Center

    Zheliazkova, Irina; Kolev, R.

    2008-01-01

    This paper presents learners' task results gathered by means of an example task-oriented environment for knowledge testing and processed with EXCEL. The processing is domain- and task-independent and includes automatic calculation of several important task and session parameters, drawing specific graphics, generating tables, and analyzing the…

  5. Examining Student Research Choices and Processes in a Disintermediated Searching Environment

    ERIC Educational Resources Information Center

    Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie

    2013-01-01

    Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…

  6. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.

  7. Nonequilibrium chemistry in confined environments: a lattice Brusselator model.

    PubMed

    Bullara, D; De Decker, Y; Lefever, R

    2013-06-01

    In this work, we study the effect of molecular crowding on a typical example of a chemical oscillator: the Brusselator model. We adopt to this end a nonequilibrium thermodynamic description, in which the size of particles is introduced via a lattice gas model. The impenetrability and finite volume of the species are shown to affect both the reaction rates and the diffusion terms in the evolution equations for the concentrations. The corrected scheme shows a more complex dynamical behavior than its ideal counterpart, including bistability and excitability. These results help to shed light on recent experimental and computational studies in biochemistry and surface chemistry, in which it was shown that confined environments may greatly affect chemical dynamics. PMID:23848764
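    For reference, the ideal (dilute, well-mixed) counterpart mentioned above is the classical Brusselator rate-equation system, which sustains oscillations for b > 1 + a^2. A minimal forward-Euler integration of that reference case (the paper's lattice-gas corrections to these rates are not reproduced here):

```python
import numpy as np

def brusselator(a, b, t_end, dt=1e-3, x0=1.0, y0=1.0):
    """Forward-Euler integration of the ideal Brusselator:
        dx/dt = a - (b + 1) x + x^2 y
        dy/dt = b x - x^2 y
    Sustained oscillations appear for b > 1 + a^2; otherwise the
    fixed point (x, y) = (a, b/a) is stable. Returns the x trajectory."""
    n = int(t_end / dt)
    xs = np.empty(n)
    x, y = x0, y0
    for i in range(n):
        dx = a - (b + 1.0) * x + x * x * y
        dy = b * x - x * x * y
        x += dx * dt
        y += dy * dt
        xs[i] = x
    return xs
```

    The crowding corrections studied in the paper deform this bifurcation structure, which is how bistability and excitability can appear where the ideal model has none.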

  9. Semiempirical Model Would Control Czochralski Process

    NASA Technical Reports Server (NTRS)

    Dudukovic, M. P.; Ramachandran, P. A.; Srivastava, R. K.

    1989-01-01

    Semiempirical mathematical model proposed for control of growth of single crystals of silicon by Czochralski process. Expresses dependence of pulling rate and shape of liquid/solid interface upon important process variables; radius of growing crystal, temperature of crucible, level of melt, and height of exposed portion of crucible wall. Necessary to control shape of interface in manner consistent with other variables, to maintain radially uniform concentration of dopant, and reduce thermally induced stresses in vicinity of interface. Used to simulate complete growth cycles without requiring excessive computer time consumed by rigorous finite-element modeling.

  10. Understanding Immersivity: Image Generation and Transformation Processes in 3D Immersive Environments

    PubMed Central

    Kozhevnikov, Maria; Dhond, Rupali P.

    2012-01-01

    Most research on three-dimensional (3D) visual-spatial processing has been conducted using traditional non-immersive 2D displays. Here we investigated how individuals generate and transform mental images within 3D immersive (3DI) virtual environments, in which the viewers perceive themselves as being surrounded by a 3D world. In Experiment 1, we compared participants’ performance on the Shepard and Metzler (1971) mental rotation (MR) task across the following three types of visual presentation environments; traditional 2D non-immersive (2DNI), 3D non-immersive (3DNI – anaglyphic glasses), and 3DI (head mounted display with position and head orientation tracking). In Experiment 2, we examined how the use of different backgrounds affected MR processes within the 3DI environment. In Experiment 3, we compared electroencephalogram data recorded while participants were mentally rotating visual-spatial images presented in 3DI vs. 2DNI environments. Overall, the findings of the three experiments suggest that visual-spatial processing is different in immersive and non-immersive environments, and that immersive environments may require different image encoding and transformation strategies than the two other non-immersive environments. Specifically, in a non-immersive environment, participants may utilize a scene-based frame of reference and allocentric encoding whereas immersive environments may encourage the use of a viewer-centered frame of reference and egocentric encoding. These findings also suggest that MR performed in laboratory conditions using a traditional 2D computer screen may not reflect spatial processing as it would occur in the real world. PMID:22908003

  11. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  12. Retort process modelling for Indian traditional foods.

    PubMed

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack foods are typically heterogeneous recipes that incorporate a variety of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer-pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold point temperature. Initial process conditions, retort temperature and percent solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot-based sweet product) and Upama (wheat-based snack product). The predicted and experimental temperature profiles matched within ±10% error, which is a good match considering that the food is a multi-component system. The model will thus be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods. PMID:26396305
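    A lumped-parameter treatment of the cold point typically reduces to a first-order lag toward the retort temperature. The sketch below shows only that generic form with an illustrative time constant; it is not the paper's fitted unified model, and the symbol tau here is a hypothetical lumped heat-transfer parameter:

```python
import numpy as np

def cold_point_temperature(t, t_retort, t_initial, tau):
    """First-order lumped-parameter heating of the slowest-heating (cold)
    point: dT/dt = (T_retort - T) / tau, whose closed-form solution is
    T(t) = T_retort + (T_initial - T_retort) * exp(-t / tau).
    tau lumps the conduction resistance of the pouch contents."""
    return t_retort + (t_initial - t_retort) * np.exp(-t / tau)

# Illustrative use: a pouch starting at 30 C in a 121 C retort, tau = 15 min.
temps = cold_point_temperature(np.arange(0.0, 61.0, 5.0), 121.0, 30.0, 15.0)
```

    In practice tau would be regressed from time-temperature data for each recipe, which is exactly the role of the solid-content term in the paper's model.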

  13. Lisp and portability: The Process Modeling System

    SciTech Connect

    Egdorf, H.W.

    1992-09-01

    A primary mission of the Technology Modeling and Assessment group (A-7) of the Analysis Division of Los Alamos National Laboratory is to support the Department of Energy in performing analysis of both existing and future facilities that comprise the Nuclear Weapons Complex. Many of the questions to be addressed in relation to this mission involve an examination of the flow of material through a processing facility and the transformations of the material as it moves through the facility, by the use of a discrete-event simulation tool. In support of these analysis tasks, a simulation tool kit has been developed that allows examination of issues related to the movement and transformation of material as it moves through a processing facility. This tool kit, the Process Modeling System, is currently the primary modeling tool used for examination of current and future DOE facilities. The flexibility of the system has led to its use in performing similar analyses on a number of non-DOE facilities under Technology Transfer initiatives. The Process Modeling System is written in Common Lisp. The purpose of this paper is to describe the structure of the modeling tool kit and discuss the advantages of Common Lisp as its implementation language.

  15. Attrition and abrasion models for oil shale process modeling

    SciTech Connect

    Aldis, D.F.

    1991-10-25

    As oil shale is processed, fine particles much smaller than the original shale are created. This process is called attrition or, more accurately, abrasion. In this paper, models of abrasion are presented for oil shale being processed in several unit operations. Two of these unit operations, a fluidized bed and a lift pipe, are used in the Lawrence Livermore National Laboratory Hot-Recycle-Solid (HRS) process being developed for the above-ground processing of oil shale. Two earlier reports describe studies of the attrition of oil shale in unit operations used in the HRS process. Carley reported results for attrition in a lift pipe of oil shale that had been pre-processed either by retorting or by retorting and then burning. The second report, by Taylor and Beavers, covered fluidized-bed processing of oil shale; they studied raw shale, retorted shale, and shale that had been retorted and then burned. In this paper, empirical models of the abrasion occurring in the HRS process are derived from these experimental studies. The derived models are presented along with comparisons to the experimental results.

  16. Multi-layer VEB modeling: capturing interlayer etch process effects for multi-patterning process

    NASA Astrophysics Data System (ADS)

    Hu, Lin; Jung, Sunwook; Li, Jianliang; Kim, Young; Bar, Yuval; Lobb, Granger; Liang, Jim; Ogino, Atsushi; Sturtevant, John; Bailey, Todd

    2016-03-01

    The Self-Aligned Via (SAV) process is commonly used in back end of line (BEOL) patterning. As the technology node advances, tightening CD and overlay specs require continuous improvement in model accuracy for the SAV process. The traditional single-layer Variable Etch Bias (VEB) model is capable of describing the micro-loading and aperture effects associated with the reactive ion etch (RIE), but it does not include effects from under-layers. For the SAV etch, a multi-layer VEB model is needed to account for the etch restriction from metal trenches. In this study, we characterize via post-etch dimensions through pitch and through metal trench widths, and show that VEB model prediction accuracy for SAV CDs after SAV formation can be significantly improved by applying a multi-layer scheme. Using a multi-layer VEB, it is demonstrated that the output via size changes with varying trench dimensions, which matches the silicon results. The model also reports post-etch via shape as a function of trench environment, where elliptical vias are correctly produced. The multi-layer VEB model can be applied to both multi-layer correction and verification in the full-chip flow. This paper also suggests that the multi-layer VEB model can be used for other FEOL layers with interlayer etch process effects, such as gate cut, to support the robustness of the new model.

  17. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes it inappropriate to attempt rigid standardisation, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at local level by a specialized rehabilitation centre. The model describes the organisation of rehabilitation delivery and facilitates the monitoring of recovery during the process. Indeed, a software system was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating as the process evolves.

  18. Modeling of a thermoplastic pultrusion process

    SciTech Connect

    Astroem, B.T.; Pipes, R.B.

    1991-07-01

    To obtain a fundamental understanding of the effects of processing parameters and die geometry in a pultrusion process, a mathematical model is essential in order to minimize the number of trial-and-error experiments. Previous investigators have suggested a variety of more or less complete models for thermoset pultrusion, while little effort seems to have been spent modeling its less well-understood thermoplastic equivalent. Hence, a set of intricately related models to describe the temperature and pressure distributions, as well as the matrix flow, in a thermoplastic composite as it travels through the pultrusion die is presented. An approach to calculate the accumulated pulling force is also explored, and the individual mechanisms contributing to the pulling force are discussed. The pressure model incorporates a matrix viscosity that varies with shear rate, temperature, and pressure. Comparisons are made between shear-rate-dependent and Newtonian viscosity representations, indicating the necessity of including non-Newtonian fluid behavior when modeling thermoplastic pultrusion. The governing equations of the models are stated in general terms, and simplifications are implemented in order to obtain solutions without extensive numerical efforts. Pressure, temperature, cooling rate, and pulling force distributions are presented for carbon-fiber-reinforced polyetheretherketone. Pulling force predictions are compared to data obtained from preliminary experiments conducted with a model pultrusion line that was built solely for the pultrusion of thermoplastic matrix composites, and the correlation is found to be qualitatively satisfactory.

  19. Therapeutic Process During Exposure: Habituation Model

    PubMed Central

    Benito, Kristen G.; Walther, Michael

    2015-01-01

    The current paper outlines the habituation model of exposure process, which is a behavioral model emphasizing use of individually tailored functional analysis during exposures. This is a model of therapeutic process rather than one meant to explain the mechanism of change underlying exposure-based treatments. Habituation, or a natural decrease in anxiety level in the absence of anxiety-reducing behavior, might be best understood as an intermediate treatment outcome that informs therapeutic process, rather than as a mechanism of change. The habituation model posits that three conditions are necessary for optimal benefit from exposures: 1) fear activation, 2) minimization of anxiety-reducing behaviors, and 3) habituation. We describe prescribed therapist and client behaviors as those that increase or maintain anxiety level during an exposure (and therefore facilitate habituation), and proscribed therapist and client behaviors as those that decrease anxiety during an exposure (and therefore impede habituation). We illustrate model-consistent behaviors in the case of Monica, as well as outline the existing research support and call for additional research to further test the tenets of the habituation model as described in this paper. PMID:26258012

  20. Representation of planetary magnetospheric environment with the paraboloid model

    NASA Astrophysics Data System (ADS)

    Kalegaev, V. V.; Alexeev, I. I.; Belenkaya, E. S.; Mukhametdinova, L. R.; Khodachenko, M. L.; Génot, V.; Kallio, E. J.; Al-Ubaidi, T.; Modolo, R.

    2013-09-01

    The paraboloid model of the Earth's magnetosphere has been developed at Moscow State University to represent correctly the electrodynamic processes in near-Earth space [1]. The model calculates the magnetic field generated by a variety of current systems located on and within the boundaries of the Earth's magnetosphere under a wide range of environmental conditions, both quiet and disturbed. Disturbed conditions arise from solar-terrestrial interactions driven by solar activity, such as solar flares and related phenomena, which induce terrestrial magnetic disturbances such as magnetic storms. The model depends on a small set of physical input parameters that characterize the intensity and location of the large-scale magnetospheric current systems; among these parameters are the geomagnetic dipole tilt angle and the distance to the subsolar point of the magnetosphere. The input parameters are derived from real- or quasi-real-time empirical data, including solar wind and IMF measurements as well as geomagnetic indices. A generalized paraboloid model has been implemented to represent the magnetospheres of other magnetized planets, e.g. Saturn [2], Jupiter [3], and Mercury [4]. Interactive models of the Earth's, Kronian, and Mercury's magnetospheres, which take into account specific features of the modeled objects, have been realized at the Space Monitoring Data Center of SINP MSU [5]. The real-time model of the Earth's magnetosphere currently runs at the SINP MSU Space Weather Web site [6]. Data from different sources (satellite measurements, simulation databases, and online services) are accumulated in a digital framework developed within the FP7 project IMPEx, of which the paraboloid model of the magnetospheres (PMM) is part. A set of Web services has been created to provide access to PMM calculations and to enable post-processing of modeling data under the SOAP protocol; these will be used for easy data exchange within the IMPEx infrastructure.
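    As a rough illustration of the geometry and the input-parameter idea, the sketch below evaluates a paraboloid-of-revolution magnetopause whose subsolar standoff distance follows from solar wind pressure balance. Both the surface formula and the standoff scaling are textbook simplifications, not the PMM implementation.

```python
# Paraboloid-of-revolution magnetopause: x = R_ss - (y^2 + z^2) / (2 R_ss),
# with subsolar standoff R_ss (Earth radii). The standoff estimate below
# uses the usual dipole/dynamic-pressure balance scaling, R_ss ~ Pdyn^(-1/6),
# with an order-of-magnitude calibration constant (~10 R_E nominal).

def magnetopause_x(y, z, r_ss):
    """Sunward boundary coordinate x (R_E) at transverse position (y, z) in R_E."""
    return r_ss - (y * y + z * z) / (2.0 * r_ss)

def subsolar_standoff(n_sw_cm3, v_sw_kms):
    """Standoff distance (R_E) from solar wind density (cm^-3) and speed (km/s)."""
    p_dyn = 1.6726e-6 * n_sw_cm3 * v_sw_kms ** 2  # dynamic pressure, nPa
    return 10.22 * p_dyn ** (-1.0 / 6.0)

r_ss = subsolar_standoff(5.0, 400.0)       # quiet solar wind: ~10 R_E
r_storm = subsolar_standoff(20.0, 700.0)   # storm-time compression: closer in
print(r_ss, r_storm, magnetopause_x(0.0, 0.0, r_ss))
```

The dependence of the boundary location on the solar wind input is exactly the kind of variability that makes scheduling against this environment difficult.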

  1. Processing and Modeling of Porous Copper Using Sintering Dissolution Process

    NASA Astrophysics Data System (ADS)

    Salih, Mustafa Abualgasim Abdalhakam

    The growth of porous metals has produced materials with improved properties compared to solid metals and non-metals. Porous metals can be classified as either open cell or closed cell. An open-cell structure allows a fluid medium to pass through it; a closed-cell structure is made up of adjacent sealed pores with shared cell walls. Metal foams offer higher strength-to-weight ratios, increased impact-energy absorption, and a greater tolerance of high temperatures and adverse environmental conditions when compared to bulk materials. Copper and its alloys are examples, well known for high strength and good mechanical, thermal, and electrical properties. In the present study, porous Cu was made by a powder metallurgy process using three different space holders: sodium chloride, sodium carbonate, and potassium carbonate. Several samples were produced using different volume-fraction ratios. The densities of the porous metals were measured and compared to the theoretical density calculated using an equation developed for these foams. The porous structure was obtained by removing the spacer material during the sintering process; the sintering schedule for each spacer depends on its melting point. Processing, characterization, and mechanical testing were completed, including density measurements, compression tests, computed tomography (CT), and scanning electron microscopy (SEM). The captured morphological images were used for object-oriented finite element (OOF) analysis of the porous copper. Porous copper was formed with porosities in the range of 40-66% and densities from 3 to 5.2 g/cm3. A study of two different methods to measure porosity was completed.
    OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the microstructure of a material and its overall mechanical, dielectric, or thermal properties, using finite element models based on images of the microstructure.
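    The porosity bookkeeping described above follows directly from the measured foam density and the density of solid copper (8.96 g/cm3): the quoted 3-5.2 g/cm3 density window maps onto roughly 40-66% porosity. A minimal sketch:

```python
RHO_CU = 8.96  # g/cm^3, solid copper

def porosity_from_density(rho_foam):
    """Fraction of the foam volume that is pore space."""
    return 1.0 - rho_foam / RHO_CU

def density_from_spacer_fraction(v_spacer):
    """Theoretical foam density if pores exactly replace the space-holder
    volume fraction (full densification of the copper skeleton)."""
    return RHO_CU * (1.0 - v_spacer)

# Reported density window (3 to 5.2 g/cm^3) -> porosity window ~0.42 to ~0.67
print(porosity_from_density(5.2), porosity_from_density(3.0))
```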

  2. Modeling seismic energy propagation in highly scattering environments

    NASA Astrophysics Data System (ADS)

    Blanchette-Guertin, J.-F.; Johnson, C. L.; Lawrence, J. F.

    2015-03-01

    Meteoroid impacts over millions to billions of years can produce a highly fractured and heterogeneous megaregolith layer on planetary bodies such as the Moon that lack effective surface recycling mechanisms. The energy from seismic events on these bodies undergoes scattering in the fractured layer(s) and generates extensive coda wave trains that follow major seismic wave arrivals. The decay properties of these codas are affected by the planetary body's interior structure. To understand the propagation of seismic waves in such media, we model the transmission of seismic energy in highly scattering environments using an adapted phonon method. In this Monte Carlo simulation approach, we track a large number of seismic wavelets as they leave a source and we record the resulting ground deformation each time a wavelet reaches a surface receiver. Our method provides the first numerical global modeling of 3-D scattering, with user-defined power law distributions of scatterer length scales and frequency-dependent intrinsic attenuation, under the assumption of 1-D background velocity models. We model synthetic signals for simple, but highly scattering interior models and vary the model parameters independently to assess their individual effects on the coda. Results show that the magnitude of the decay times is most affected by the background velocity model, in particular the presence of shallow low-velocity layers, the event source depth, and the intrinsic attenuation level. The decay times are also controlled to a lesser extent by the size-frequency distribution of scatterers, the thickness of the scattering layer, and the impedance contrast at the scatterers.
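    A toy Monte Carlo in the spirit of the phonon method can reproduce the qualitative coda behaviour: wavelets random-walk with exponentially distributed free paths between scatterers, intrinsic attenuation damps them along the way, and the energy arriving at a receiver is binned in time. The velocity, mean free path, and Q below are arbitrary demonstration values, not lunar parameters.

```python
import math
import random

random.seed(1)

def coda_envelope(n_wavelets=5000, v=3.0, mfp=10.0, q=300.0,
                  freq=1.0, t_max=200.0, dt=5.0):
    """Crude coda energy envelope: each scattering event deposits energy
    weighted by intrinsic attenuation exp(-pi * f * t / Q)."""
    bins = [0.0] * int(t_max / dt)
    for _ in range(n_wavelets):
        t = 0.0
        while t < t_max:
            # time to next scatterer: exponential free path / velocity
            t += random.expovariate(1.0 / mfp) / v
            if t < t_max:
                bins[int(t / dt)] += math.exp(-math.pi * freq * t / q)
    return bins

env = coda_envelope()
early, late = sum(env[2:6]), sum(env[-4:])
print(early, late)  # coda energy decays from early to late times
```

Lower Q (stronger intrinsic attenuation) shortens the coda decay time, one of the dependencies the abstract reports.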

  3. Modeling Multi-process Transport of Pathogens in Porous Media

    NASA Astrophysics Data System (ADS)

    Cheng, L.; Brusseau, M. L.

    2004-12-01

    The transport behavior of microorganisms in porous media is of interest with regard to the fate of pathogens associated with wastewater recharge, riverbank filtration, and land application of biosolids. This interest has fostered research on the transport of pathogens in the subsurface environment. The factors influencing pathogen transport within the subsurface environment include advection, dispersion, filtration, and inactivation. The filtration process, which mediates the magnitude and rate of pathogen retention, comprises several mechanisms such as attachment to porous-medium surfaces, straining, and sedimentation. We present a mathematical model wherein individual filtration mechanisms are explicitly incorporated along with advection, dispersion, and inactivation. The performance of the model is evaluated by applying it to several data sets obtained from miscible-displacement experiments conducted using various pathogens. Input parameters are obtained to the extent possible from independent means.

  4. Lightweight modeling environment for network-centric systems

    NASA Astrophysics Data System (ADS)

    Ealy, William

    2001-08-01

    Future network-centric systems will rely heavily on telecommunication network technology to provide the connectivity needed to support distributed C4ISR requirements. To develop and validate emerging network-centric concepts, designers will need communication and network M&S tools to assess the ability of large-scale networks to achieve the required communication performance. Current network and communication simulation tools are highly accurate and provide detailed data for communication and network designers. However, they are far too complex and inefficient to model large-scale networks. To model these networks, lighter-weight abstract modeling and simulation (M&S) tools and techniques are required. To meet these requirements, Lockheed Martin Advanced Technology Laboratories (ATL) is applying abstract network-modeling techniques, developed for large-scale signal-processing applications, to model complex, distributed network architectures. Rather than modeling the detailed radio, network protocol, and individual data transactions, our approach uses abstract stochastic models to simulate the low-level radio and protocol functions, significantly reducing complexity and execution times. This paper describes the abstract modeling tools and techniques we are developing, discusses how ATL applied them to the Office of the Deputy Under Secretary of Defense for Science and Technology's (ODUSD S&T) Smart Sensor Web (SSW) network, and how we are planning to extend them.

  5. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including its design, operation, and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
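    The core FFM idea, failure modes propagating along directed paths to observation points, can be sketched as graph reachability: a failure mode is a diagnostic candidate if its propagated effects cover the triggered observations. The component and sensor names below are invented; a real FFM encodes much richer semantics than this.

```python
from collections import defaultdict, deque

# Hypothetical failure-propagation edges: failure mode -> effect -> observation.
edges = [
    ("valve_stuck_closed", "no_downstream_flow"),
    ("pump_degraded", "low_downstream_pressure"),
    ("no_downstream_flow", "flow_sensor_F1_low"),
    ("low_downstream_pressure", "pressure_sensor_P1_low"),
    ("no_downstream_flow", "pressure_sensor_P1_low"),
]
graph = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

def reachable(node):
    """All effects/observations reachable from a node (BFS)."""
    seen, queue = set(), deque([node])
    while queue:
        n = queue.popleft()
        for m in graph[n]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return seen

def candidate_failures(observed_low):
    """Failure modes whose effect set covers every triggered observation."""
    modes = ["valve_stuck_closed", "pump_degraded"]
    return [m for m in modes if observed_low <= reachable(m)]

print(candidate_failures({"flow_sensor_F1_low", "pressure_sensor_P1_low"}))
```

Representing components as reusable subgraphs with standard port names is, in essence, what the generic modeling process standardizes.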

  6. Physical processes and real-time chemical measurement of the insect olfactory environment.

    PubMed

    Riffell, Jeffrey A; Abrell, Leif; Hildebrand, John G

    2008-07-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems.

  7. Physical Processes and Real-Time Chemical Measurement of the Insect Olfactory Environment

    PubMed Central

    Abrell, Leif; Hildebrand, John G.

    2009-01-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems. PMID:18548311

  8. X-ray emission processes in stars and their immediate environment.

    PubMed

    Testa, Paola

    2010-04-20

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible.

  9. X-ray emission processes in stars and their immediate environment

    PubMed Central

    Testa, Paola

    2010-01-01

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible. PMID:20360562

  10. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing, and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  11. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  12. Dynamical modeling of laser ablation processes

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-09-01

    Several physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume; plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms; gas dynamic, hydrodynamic, and collisional descriptions of plume transport; and molecular dynamics models of the interaction of plume particles with the deposition substrate. The complexity of the phenomena involved in the laser ablation process is matched by the diversity of the modeling task, which combines materials science, atomic physics, and plasma physics.

  13. Hencky's model for elastomer forming process

    NASA Astrophysics Data System (ADS)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range under large deformations. It is shown that this material model prolongs Hooke's law from the area of infinitesimal strains to the area of moderate ones. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented in the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulation of equipment for elastomer sheet forming is considered.
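    The "prolongation of Hooke's law" rests on the Hencky (logarithmic) strain: for a deformation gradient F, H = ln V with V = sqrt(F F^T), computed from the logarithms of the principal stretches. The sketch below shows the standard definition only, not the paper's new elasticity-tensor representation.

```python
import numpy as np

def hencky_strain(F):
    """Hencky (logarithmic) strain H = ln V, V = sqrt(F F^T), via the
    eigendecomposition of the left Cauchy-Green tensor b = F F^T."""
    b = F @ F.T
    w, v = np.linalg.eigh(b)          # w: squared principal stretches
    return v @ np.diag(0.5 * np.log(w)) @ v.T

# Uniaxial stretch lambda = 1.5 along x: Hencky strain ln(1.5) in x,
# which reduces to the small-strain value (lambda - 1) as lambda -> 1.
F = np.diag([1.5, 1.0, 1.0])
H = hencky_strain(F)
print(H[0, 0], np.log(1.5))
```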

  14. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal-processing problem, based on state-space representations of the normal-mode vertical velocity and plane-wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  15. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.
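    The control idea can be caricatured with a one-node thermal model: predict the checkerwork temperature over the on-wind period and search for the least heating-phase fuel that keeps the predicted blast temperature above the requirement. Every constant below is illustrative; the paper's model is a detailed heat-transfer model, not this single thermal mass.

```python
def predict_min_blast_temp(fuel, base_temp=1300.0, horizon=60,
                           heat_per_fuel=2.0, draw_per_min=3.0):
    """Lowest predicted blast temperature (deg C) over the on-wind period,
    for a given heating-phase fuel input. One-node toy model."""
    temp = base_temp + fuel * heat_per_fuel   # heating (on-gas) phase
    lowest = temp
    for _ in range(horizon):                  # blast (on-wind) phase
        temp -= draw_per_min                  # heat drawn into the blast air
        lowest = min(lowest, temp)
    return lowest

def min_fuel(required=1250.0):
    """Bisection for the minimum fuel meeting the blast-air requirement."""
    lo, hi = 0.0, 500.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if predict_min_blast_temp(mid) >= required:
            hi = mid
        else:
            lo = mid
    return hi

fuel = min_fuel()
print(fuel, predict_min_blast_temp(fuel))
```

The same predictive loop is where temperature-constraint checks would be added, so corrective action can be taken before a violation occurs.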

  16. Modeling two-spin dynamics in a noisy environment

    SciTech Connect

    Testolin, M. J.; Hollenberg, L. C. L.; Cole, J. H.

    2009-10-15

    We describe how the effect of charge noise on a pair of spins coupled via the exchange interaction can be calculated by modeling charge fluctuations as a random telegraph noise process using probability density functions. We develop analytic expressions for the time-dependent superoperator of a pair of spins as a function of fluctuation amplitude and rate. We show that the theory can be extended to include multiple fluctuators, in particular, spectral distributions of fluctuators. These superoperators can be included in time-dependent analyses of the state of spin systems designed for spintronics or quantum information processing to determine the decohering effects of exchange fluctuations.
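    The random-telegraph-noise picture can be illustrated by direct Monte Carlo rather than the paper's superoperator formalism: let the exchange-coupling fluctuation switch sign at rate gamma, accumulate the resulting phase, and average cos(phase) over trajectories to watch the two-spin coherence decay. Units and parameters are arbitrary demonstration values.

```python
import math
import random

random.seed(7)

def coherence(t_total, dt=2e-3, gamma=2.0, dJ=1.0, n_traj=500):
    """Ensemble-averaged Re<exp(i*phase)> for a phase driven by a
    random telegraph fluctuation of amplitude dJ and switching rate gamma."""
    acc = 0.0
    for _ in range(n_traj):
        s, phase, t = 1.0, 0.0, 0.0
        while t < t_total:
            if random.random() < gamma * dt:   # telegraph switch
                s = -s
            phase += s * dJ * dt               # noise contribution to J * t
            t += dt
        acc += math.cos(phase)
    return acc / n_traj

c_short = coherence(0.1)   # near unity: little phase accumulated
c_long = coherence(5.0)    # partially dephased
print(c_short, c_long)
```

Increasing the switching rate gamma relative to dJ slows the decay (motional narrowing), one of the amplitude/rate dependencies the superoperator expressions capture analytically.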

  17. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  18. The SERIOL2 Model of Orthographic Processing

    ERIC Educational Resources Information Center

    Whitney, Carol; Marton, Yuval

    2013-01-01

    The SERIOL model of orthographic analysis proposed mechanisms for converting visual input into a serial encoding of letter order, which involved hemisphere-specific processing at the retinotopic level. As a test of SERIOL predictions, we conducted a consonant trigram-identification experiment, where the trigrams were briefly presented at various…

  19. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  20. Content, Process, and Product: Modeling Differentiated Instruction

    ERIC Educational Resources Information Center

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  1. Mathematical Modelling of Continuous Biotechnological Processes

    ERIC Educational Resources Information Center

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  2. Knowledge-based approach to fault diagnosis and control in distributed process environments

    NASA Astrophysics Data System (ADS)

    Chung, Kwangsue; Tou, Julius T.

    1991-03-01

    This paper presents a new design approach to knowledge-based decision support systems for fault diagnosis and control for quality assurance and productivity improvement in automated manufacturing environments. Based on the observed manifestations, the knowledge-based diagnostic system hypothesizes a set of the most plausible disorders by mimicking the reasoning process of a human diagnostician. The data integration technique is designed to generate error-free hierarchical category files. A novel approach to diagnostic problem solving has been proposed by integrating the PADIKS (Pattern-Directed Knowledge-Based System) concept and the symbolic model of diagnostic reasoning based on the categorical causal model. The combination of symbolic causal reasoning and pattern-directed reasoning produces a highly efficient diagnostic procedure and generates a more realistic expert behavior. In addition, three distinctive constraints are designed to further reduce the computational complexity and to eliminate non-plausible hypotheses involved in the multiple disorders problem. The proposed diagnostic mechanism, which consists of three different levels of reasoning operations, significantly reduces the computational complexity in the diagnostic problem with uncertainty by systematically shrinking the hypotheses space. This approach is applied to the test and inspection data collected from a PCB manufacturing operation.

  3. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  4. Empirical Modeling of Plant Gas Fluxes in Controlled Environments

    NASA Technical Reports Server (NTRS)

    Cornett, Jessie David

    1994-01-01

    As humans extend their reach beyond the Earth, bioregenerative life support systems must replace the resupply and physical/chemical systems now used. The Controlled Ecological Life Support System (CELSS) will utilize plants to recycle the carbon dioxide (CO2) and excrement produced by humans and return oxygen (O2), purified water, and food. CELSS design requires knowledge of gas flux levels for net photosynthesis (PS(sub n)), dark respiration (R(sub d)), and evapotranspiration (ET). Full-season gas flux data regarding these processes for wheat (Triticum aestivum), soybean (Glycine max), and rice (Oryza sativa) from published sources were used to develop empirical models. Univariate models relating crop age (days after planting) and gas flux were fit by simple regression. Models are either high-order (5th to 8th) or more complex polynomials whose curves describe crop development characteristics. The models provide good estimates of gas flux maxima, but are of limited utility. To broaden the applicability, data were transformed to dimensionless or correlation formats and, again, fit by regression. Polynomials, similar to those in the initial effort, were selected as the most appropriate models. These models indicate that, within a cultivar, gas flux patterns appear remarkably similar prior to maximum flux, but exhibit considerable variation beyond this point. This suggests that more broadly applicable models of plant gas flux are feasible, but univariate models defining gas flux as a function of crop age are too simplistic. Multivariate models using CO2 and crop age were fit for PS(sub n) and R(sub d) by multiple regression. In each case, the selected model is a subset of a full third-order model with all possible interactions. These models are improvements over the univariate models because they incorporate more than the single factor, crop age, as the primary variable governing gas flux. They are still limited, however, by their reliance on the other environmental
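    The univariate modelling step, fitting a polynomial to gas flux versus days after planting and reading off the flux maximum, can be sketched on synthetic data. The noisy bell-shaped season below stands in for the published full-season measurements; a real fit would use those data, and the 6th-order degree is just one of the orders the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(0, 90)
# Synthetic bell-shaped season: peak flux ~40 units around day 45.
ps_true = 40.0 * np.exp(-((days - 45.0) / 20.0) ** 2)
ps_obs = ps_true + rng.normal(0.0, 1.0, days.size)

# Fit a 6th-order polynomial; scale the abscissa to keep it well-conditioned.
x = days / 90.0
coeffs = np.polyfit(x, ps_obs, 6)
fit = np.polyval(coeffs, x)
day_max = days[np.argmax(fit)]
print(day_max, fit.max())   # estimated day and magnitude of peak flux
```

As the abstract notes, such a fit estimates the flux maximum well but generalizes poorly, which motivates the dimensionless and multivariate reformulations.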

  5. The Chandra X-Ray Observatory Radiation Environment Model

    NASA Technical Reports Server (NTRS)

    Blackwell, W. C.; Minow, Joseph I.; Smith, Shawn; Swift, Wesley R.; O'Dell, Stephen L.; Cameron, Robert A.

    2003-01-01

    CRMFLX (Chandra Radiation Model of ion FluX) is an environmental risk mitigation tool for use as a decision aid in planning the operating times for Chandra's Advanced CCD Imaging Spectrometer (ACIS) detector. Accurate prediction of the proton flux environment at energies of 100 - 200 keV is needed in order to protect the ACIS detector against proton degradation. Unfortunately, protons of this energy are abundant in the region of space in which Chandra must operate, and the on-board Electron, Proton, and Helium Instrument (EPHIN) does not measure proton flux levels in the required energy range. In addition to the concerns arising from the radiation belts, substorm injections of plasma from the magnetotail may increase the proton flux by orders of magnitude in this energy range. The Earth's magnetosphere is a dynamic entity, with the size and location of the magnetopause driven by the highly variable solar wind parameters (number density, velocity, and magnetic field components). Operational schedules for the telescope must be set weeks in advance, and these decisions are complicated by the variability of the environment. CRMFLX is an engineering model developed to address these problems; it provides proton flux and fluence statistics for the terrestrial outer magnetosphere, magnetosheath, and solar wind for use in scheduling ACIS operations. CRMFLX implements a number of standard models to predict the bow shock, magnetopause, and plasma sheet boundaries based on the sampling of historical solar wind data sets. Measurements from the GEOTAIL and POLAR spacecraft are used to create the proton flux database. This paper describes the recently released CRMFLX v2 implementation, which includes an algorithm that propagates flux from an observation location to other regions of the magnetosphere based on convective ExB and VB-curvature particle drift motions in electric and magnetic fields. 
This technique has the advantage of more completely filling out the database and makes maximum
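The convective ExB drift used by CRMFLX v2 to propagate flux between regions follows the standard guiding-center formula v = (E x B)/|B|^2, independent of particle charge and mass. A minimal sketch, with illustrative (not mission-specific) field values:

```python
import numpy as np

# E x B drift: a charged particle in crossed electric and magnetic
# fields drifts with v = (E x B) / |B|^2, independent of charge and
# mass.  Field values are illustrative only.
E = np.array([0.0, 1.0e-3, 0.0])    # V/m, convection electric field
B = np.array([0.0, 0.0, 100.0e-9])  # T, northward magnetic field

v_drift = np.cross(E, B) / np.dot(B, B)
print(v_drift)  # m/s; here ~10 km/s in the x direction
```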

  6. A model evaluation checklist for process-based environmental models

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of the one used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. 
(3) Model structural inadequacies, whereby model structure may inadequately represent
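The Nash-Sutcliffe efficiency mentioned in point (1) compares model error against the variance of the observations about their mean. A minimal sketch with hypothetical phosphorus data:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus model error variance over the
    variance of observations about their mean.  1.0 is a perfect fit;
    0.0 means the model is no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    spread = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual / spread

# Hypothetical daily phosphorus concentrations (mg/l, invented)
obs = np.array([0.10, 0.12, 0.30, 0.25, 0.11, 0.09])
sim = np.array([0.11, 0.13, 0.24, 0.28, 0.10, 0.10])
nse = nash_sutcliffe(obs, sim)
print(round(nse, 3))
```

Because the statistic is dominated by errors around the largest observed values, quite different simulations can score similarly, which is the discrimination problem the checklist flags.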

  7. Atomic Layer Deposition - Process Models and Metrologies

    SciTech Connect

    Burgess, D.R. Jr.; Maslar, J.E.; Hurst, W.S.; Moore, E.F.; Kimes, W.A.; Fink, R.R.; Nguyen, N.V.

    2005-09-09

    We report on the status of a combined experimental and modeling study for atomic layer deposition (ALD) of HfO2 and Al2O3. Hafnium oxide films were deposited from tetrakis(dimethylamino)hafnium and water. Aluminum oxide films from trimethyl aluminum and water are being studied through simulations. In this work, both in situ metrologies and process models are being developed. Optically-accessible ALD reactors have been constructed for in situ, high-sensitivity Raman and infrared absorption spectroscopic measurements to monitor gas phase and surface species. A numerical model using computational fluid dynamics codes has been developed to simulate the gas flow and temperature profiles in the experimental reactor. Detailed chemical kinetic models are being developed with assistance from quantum chemical calculations to explore reaction pathways and energetics. This chemistry is then incorporated into the overall reactor models.
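Elementary steps in a detailed kinetic mechanism of this kind are typically expressed in Arrhenius form, k = A exp(-Ea/RT). A minimal sketch with placeholder parameters, not values from the HfO2/Al2O3 mechanisms:

```python
import math

# Arrhenius form k = A * exp(-Ea / (R * T)) for an elementary step;
# A and Ea are placeholder values, not fitted ALD parameters.
R = 8.314    # J/(mol K), gas constant
A = 1.0e13   # 1/s, pre-exponential factor (assumed)
Ea = 80.0e3  # J/mol, activation energy (assumed)

def rate_constant(T):
    return A * math.exp(-Ea / (R * T))

# A 50 K change in deposition temperature shifts the rate markedly.
ratio = rate_constant(500.0) / rate_constant(450.0)
print(round(ratio, 2))
```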

  8. Coal-to-Liquids Process Model

    SciTech Connect

    2006-01-01

    A comprehensive Aspen Plus model has been developed to rigorously model coal-to-liquids processes. This portion was developed under Laboratory Directed Research and Development (LDRD) funding. The model is built in a modular fashion to allow rapid reconfiguration for evaluation of process options. Aspen Plus is the framework in which the model is developed. The coal-to-liquids simulation package is an assembly of Aspen Hierarchy Blocks representing subsections of the plant. Each of these Blocks is considered an individual component of the Copyright, which may be extracted and licensed as an individual component, but which may be combined with one or more other components to model general coal-conversion processes, including the following plant operations: (1) coal handling and preparation, (2) coal pyrolysis, combustion, or gasification, (3) syngas conditioning and cleanup, (4) sulfur recovery using Claus-SCOT unit operations, (5) Fischer-Tropsch liquid fuels synthesis, (6) hydrocracking of high molecular weight paraffins, (7) hydrotreating of low molecular weight paraffins and olefins, (8) gas separations, and (9) power generation representing integrated combined cycle technology.

  10. Process modeling with the regression network.

    PubMed

    van der Walt, T; Barnard, E; van Deventer, J

    1995-01-01

    A new connectionist network topology called the regression network is proposed. The structural and underlying mathematical features of the regression network are investigated. Emphasis is placed on the intricacies of the optimization process for the regression network, and some measures to alleviate these difficulties are proposed and investigated. The ability of the regression network algorithm to perform either nonparametric or parametric optimization, as well as a combination of both, is also highlighted. It is further shown how the regression network can be used to model systems which are poorly understood, on the basis of sparse data. A semi-empirical regression network model is developed for a metallurgical processing operation (a hydrocyclone classifier) by building mechanistic knowledge into the connectionist structure of the regression network model. Poorly understood aspects of the process are provided for by the use of nonparametric regions within the structure of the semi-empirical connectionist model. The performance of the regression network model is compared to the corresponding generalization performance results obtained by some other nonparametric regression techniques.

  11. Development of an interdisciplinary model cluster for tidal water environments

    NASA Astrophysics Data System (ADS)

    Dietrich, Stephan; Winterscheid, Axel; Wyrwa, Jens; Hein, Hartmut; Hein, Birte; Vollmer, Stefan; Schöl, Andreas

    2013-04-01

    Global climate change has a high potential to influence both the persistence and the transport pathways of water masses and their constituents in tidal waters and estuaries. These processes are linked through dispersion, directly influencing the sediment and suspended solid matter budgets, and hence the river morphology. Furthermore, the hydrologic regime has an impact on the transport of nutrients, phytoplankton, suspended matter, and temperature, which determine the oxygen content within water masses, a major parameter describing water quality. This project aims at the implementation of a so-called (numerical) model cluster for tidal waters, which includes the model compartments hydrodynamics, morphology and ecology. Implementing this cluster requires continued integration of different models that work across a wide range of spatial and temporal scales. The model cluster is thus expected to lead to more precise knowledge of the feedback processes between the individual interdisciplinary model compartments. In addition to field measurements, this model cluster will provide a complementary scientific basis for addressing a spectrum of research questions concerning the integral management of estuaries within the Federal Institute of Hydrology (BfG, Germany). This will in particular include aspects such as sediment and water quality management as well as adaptation strategies to climate change. The core of the model cluster will consist of the 3D hydrodynamic model Delft3D (Roelvink and van Banning, 1994); long-term hydrodynamics in the estuaries are simulated with the Hamburg Shelf Ocean Model HAMSOM (Backhaus, 1983; Hein et al., 2012). The simulation results will be compared with the unstructured-grid-based SELFE model (Zhang and Baptista, 2008). The additional coupling of the BfG-developed 1D water quality model QSim (Kirchesch and Schöl, 1999; Hein et al., 2011) with the morphological/hydrodynamic models is an

  12. Process diagnostics for precision grinding brittle materials in a production environment

    SciTech Connect

    Blaedel, K L; Davis, P J; Piscotty, M A

    1999-04-01

    Precision grinding processes are steadily migrating from research laboratory environments into manufacturing production lines as precision machines and processes become increasingly more commonplace throughout industry. Low-roughness, low-damage precision grinding is gaining widespread commercial acceptance for a host of brittle materials including advanced structural ceramics. The development of these processes is often problematic and requires diagnostic information and analysis to harden the processes for manufacturing. This paper presents a series of practical precision grinding tests developed and practiced at Lawrence Livermore National Laboratory that yield important information to help move a new process idea into production.

  13. Using 222Rn as a tracer of geophysical processes in underground environments

    NASA Astrophysics Data System (ADS)

    Lacerda, T.; Anjos, R. M.; Valladares, D. L.; da Silva, A. A. R.; Rizzotto, M.; Velasco, H.; de Rosas, J. P.; Ayub, J. Juri; Yoshimura, E. M.

    2014-11-01

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations in outside temperature. The results also indicate that radon distribution patterns appear to be a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  14. Using {sup 222}Rn as a tracer of geophysical processes in underground environments

    SciTech Connect

    Lacerda, T.; Anjos, R. M.; Silva, A. A. R. da; Yoshimura, E. M.

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations in outside temperature. The results also indicate that radon distribution patterns appear to be a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  15. Performance analysis of no-vent fill process for liquid hydrogen tank in terrestrial and on-orbit environments

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yanzhong; Zhang, Feini; Ma, Yuan

    2015-12-01

    Two finite-difference computer models, aimed at predicting the no-vent fill process in normal gravity and microgravity environments respectively, are developed to investigate the filling performance in a liquid hydrogen (LH2) tank. In the normal-gravity model, the tank/fluid system is divided into five control volumes: ullage, bulk liquid, gas-liquid interface, ullage-adjacent wall, and liquid-adjacent wall. In the microgravity model, a vapor-liquid thermal equilibrium state is maintained throughout the process, and only two nodes representing the fluid and wall regions are applied. To capture the liquid-wall heat transfer accurately, a series of heat transfer mechanisms are considered and modeled successively, including film boiling, transition boiling, nucleate boiling and liquid natural convection. The two models are validated by comparing their predictions with experimental data, which show good agreement. The two models are then used to investigate the performance of no-vent fill under different conditions, and several conclusions are obtained. In the normal-gravity environment, the no-vent fill experiences a continuous pressure rise during the whole process and the maximum pressure occurs at the end of the operation, while the maximum pressure in the microgravity case occurs at the beginning of the process. Moreover, increasing the inlet mass flux has an apparent influence on the pressure evolution of the no-vent fill process in normal gravity but little influence in microgravity. A larger initial wall temperature brings about more significant liquid evaporation during the filling operation, and thus causes higher pressure evolution, whether the filling process occurs under normal-gravity or microgravity conditions. Reducing the inlet liquid temperature can improve the filling performance in normal gravity, but cannot significantly reduce the maximum pressure in microgravity. The presented work benefits the
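The successive liquid-wall heat transfer regimes listed above can be sketched as a simple switch on wall superheat. The thresholds and correlations below are placeholders, not the validated correlations used in the models:

```python
# Regime switch on wall superheat (K); thresholds and correlations are
# placeholders, not the validated models from the paper.
def wall_heat_flux(delta_t):
    """Liquid-wall heat flux (W/m^2) at wall superheat delta_t (K)."""
    if delta_t > 30.0:      # film boiling: vapor blankets the warm wall
        return 300.0 * delta_t
    elif delta_t > 5.0:     # transition/nucleate boiling
        return 1.0e3 * delta_t ** 1.5
    else:                   # liquid natural convection
        return 150.0 * delta_t ** 1.25

# As the wall chills during the fill, the model hands off from film
# boiling through nucleate boiling to natural convection.
for superheat in (60.0, 20.0, 2.0):
    print(superheat, wall_heat_flux(superheat))
```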

  16. [Cellular model of blood coagulation process].

    PubMed

    Bijak, Michał; Rzeźnicka, Paulina; Saluk, Joanna; Nowak, Paweł

    2015-07-01

    Blood coagulation is a process whose main objective is to prevent blood loss when the integrity of a blood vessel is damaged. Over the years, a number of concepts have been presented to characterize the mechanism of thrombus formation. From the 1960s onward, the prevailing view was the cascade model of coagulation, in which formation of the fibrin clot is determined by two pathways, called the extrinsic and intrinsic pathways. In the 1990s, Monroe and Hoffman presented their concept of the blood coagulation process, which complements the cascade model with the participation of cells, especially blood platelets, whose role is to provide a negatively charged phospholipid surface and thereby allow the formation of the coagulation enzymatic complexes. They called this conception the cellular model of coagulation. The aim of this work is to present this model of blood coagulation in detail, including descriptions of its various phases.

  17. Modeling Veterans Healthcare Administration disclosure processes

    SciTech Connect

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  18. Modeling delayed processes in biological systems

    NASA Astrophysics Data System (ADS)

    Feng, Jingchen; Sevier, Stuart A.; Huang, Bin; Jia, Dongya; Levine, Herbert

    2016-09-01

    Delayed processes are ubiquitous in biological systems and are often characterized by delay differential equations (DDEs) and their extension to include stochastic effects. DDEs do not explicitly incorporate intermediate states associated with a delayed process but instead use an estimated average delay time. In an effort to examine the validity of this approach, we study systems with significant delays by explicitly incorporating intermediate steps. We show that such explicit models often yield significantly different equilibrium distributions and transition times as compared to DDEs with deterministic delay values. Additionally, different explicit models with qualitatively different dynamics can give rise to the same DDEs revealing important ambiguities. We also show that DDE-based predictions of oscillatory behavior may fail for the corresponding explicit model.
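The difference between a DDE-style fixed delay and an explicit chain of intermediate steps shows up in the distribution of completion times: N sequential exponential steps with total mean tau give the same mean delay but nonzero variance. A minimal sketch of this point (a simplified illustration, not the paper's models):

```python
import random

# Explicit model: a delay realized as n_steps sequential exponential
# stages with total mean tau.  A DDE would replace this with a fixed
# delay of exactly tau (zero variance).
random.seed(0)
tau, n_steps, n_samples = 10.0, 5, 20000

samples = [sum(random.expovariate(n_steps / tau) for _ in range(n_steps))
           for _ in range(n_samples)]
mean = sum(samples) / n_samples
var = sum((s - mean) ** 2 for s in samples) / n_samples

print(mean)  # close to tau = 10
print(var)   # close to tau**2 / n_steps = 20; a fixed delay has variance 0
```

The completion times follow an Erlang distribution; as the number of stages grows, the variance shrinks and the explicit model approaches the DDE limit.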

  19. An ecological process model of systems change.

    PubMed

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model. PMID:21203829

  20. Environment.

    ERIC Educational Resources Information Center

    White, Gilbert F.

    1980-01-01

    Presented are perspectives on the emergence of environmental problems. Six major trends in scientific thinking are identified including: holistic approaches to examining environments, life support systems, resource management, risk assessment, streamlined methods for monitoring environmental change, and emphasis on the global framework. (Author/SA)

  1. Computer modeling of gas flow and gas loading of rock in a bench blasting environment

    SciTech Connect

    Preece, D.S.; Baer, M.R.; Knudsen, S.D.

    1991-01-01

    Numerical modeling can contribute greatly to an understanding of the physics involved in the blasting process. This paper will describe the latest enhancements to the blast modeling code DMC (Distinct Motion Code) (Taylor and Preece, 1989) and will demonstrate the ability of DMC to model gas flow and rock motion in a bench blasting environment. DMC has been used previously to model rock motion associated with blasting in a cratering environment (Preece and Taylor, 1990) and in confined volume blasting associated with in-situ oil shale retorting (Preece, 1990a, b). These applications of DMC treated the explosive loading as force versus time functions on specific spheres, which were adjusted to obtain correct face velocities. It was recognized that a great need in explosives modeling was the coupling of an ability to simulate gas flow with the rock motion simulation capability of DMC. This was accomplished by executing a finite difference code that computes gas flow through a porous medium (Baer and Gross, 1989) in conjunction with DMC. The marriage of these two capabilities has been documented by Preece and Knudsen (1991). The capabilities that have been added recently to DMC, and which will be documented in this paper, include: (1) addition of a new equation of state for the explosive gases; (2) modeling of gas flow and sphere loading in a bench environment. 8 refs., 5 figs.

  2. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  3. Modelling Of Manufacturing Processes With Membranes

    NASA Astrophysics Data System (ADS)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2015-07-01

    The current objectives of raising quality and efficiency standards in manufacturing processes can be achieved only through the best combination of inputs, independent of the spatial distance between them. This paper proposes modelling production processes based on the membrane structures introduced in [4]. Inspired by biochemistry, membrane computing [4] is based on the concept of a membrane, represented in its formalism by the mathematical concept of a multiset. The manufacturing process is the evolution of a super-cell system from its initial state according to the given aggregation actions. In this paper we consider the action to be the atomic production unit of the process. The actions, and the resources on which the actions operate, are distributed across a virtual network of companies working together. The destination of the output resources is specified by corresponding output events.
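The multiset-rewriting idea behind this formalism can be sketched compactly: an action consumes a multiset of input resources and produces a multiset of outputs. Rule names and resources here are illustrative, not from the paper:

```python
from collections import Counter

# Multiset rewriting: each rule consumes a multiset of input resources
# and produces a multiset of outputs.  Rule names and resources are
# illustrative, not taken from the paper.
rules = {
    "assemble": (Counter({"part": 2, "screw": 4}), Counter({"unit": 1})),
    "package":  (Counter({"unit": 1, "box": 1}), Counter({"product": 1})),
}

def apply_rule(state, rule):
    """Apply a rule if its inputs are available; return None otherwise."""
    inputs, outputs = rule
    if all(state[r] >= n for r, n in inputs.items()):
        return state - inputs + outputs
    return None

state = Counter({"part": 4, "screw": 8, "box": 2})
for name in ("assemble", "assemble", "package", "package"):
    result = apply_rule(state, rules[name])
    if result is not None:
        state = result

print(dict(state))  # the four actions leave only finished products
```

Counter arithmetic conveniently drops exhausted resources, mirroring the consumption of a multiset under a rewriting rule.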

  4. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  5. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area with that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  6. Comparison of the Beta and the Hidden Markov Models of Trust in Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Moe, Marie E. G.; Helvik, Bjarne E.; Knapskog, Svein J.

    Computational trust and reputation models are used to aid the decision-making process in complex dynamic environments, where we are unable to obtain perfect information about our interaction partners. In this paper we present a comparison of our proposed hidden Markov trust model to the Beta reputation system. The hidden Markov trust model takes the time between observations into account; it also distinguishes between system states and uses methods previously applied to intrusion detection to predict which state an agent is in. We show that the hidden Markov trust model performs better when it comes to detecting changes in the behavior of agents, owing to its richer set of model features. This means that our trust model may be more realistic in dynamic environments. However, the increased model complexity also leads to greater challenges in estimating parameter values for the model. We also show that the hidden Markov trust model can be parameterized so that it responds similarly to the Beta reputation system.
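The Beta reputation side of the comparison admits a compact sketch: positive and negative interaction counts parameterize a Beta(r+1, s+1) belief, and a forgetting factor discounts old evidence so the score can track behavior changes. The 0.9 forgetting value below is an illustrative choice, not the paper's parameterization:

```python
# Beta reputation sketch: r positive and s negative observations give a
# Beta(r+1, s+1) belief; forgetting discounts old evidence.  The 0.9
# forgetting factor is an illustrative choice.
def update(r, s, positive, forgetting=0.9):
    r, s = r * forgetting, s * forgetting
    return (r + 1, s) if positive else (r, s + 1)

def reputation(r, s):
    return (r + 1) / (r + s + 2)  # mean of Beta(r+1, s+1)

r = s = 0.0
for outcome in [True] * 10 + [False] * 10:  # agent turns bad halfway
    r, s = update(r, s, outcome)

rep = reputation(r, s)
print(round(rep, 3))  # well below 0.5: the score tracked the change
```

Without forgetting, the ten early positive observations would keep the score near 0.5 much longer, which is the slower change detection the paper attributes to the Beta system.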

  7. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1993-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS (Earth Observing System) timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  8. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Vermillion, Charles H.; Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1992-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  9. Overwriting information: Correlations, physical costs, and environment models

    NASA Astrophysics Data System (ADS)

    Anderson, Neal G.

    2012-03-01

    In this sequel to our previous study of the entropic and energetic costs of information erasure [N.G. Anderson, Phys. Lett. A 372 (2008) 5552], we consider direct overwriting of classical information encoded in a quantum-mechanical memory system interacting with a heat bath. Lower bounds on physical costs of overwriting - in both “single-shot” and “sequential” overwriting scenarios - are obtained from globally unitary quantum dynamics and entropic inequalities alone, all within a referential approach that grounds information content in correlations between physical system states. A heterogeneous environment model, required for consistent treatment of sequential overwriting, is introduced and used to establish and relate bounds for various cases.
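    For orientation, the entropic bounds discussed above generalize Landauer's classic erasure bound; the following statement is standard textbook background, not a formula quoted from the paper itself:

```latex
% Landauer's bound: erasing \Delta I bits of classical information
% into a heat bath at temperature T dissipates at least
\[
  Q \;\ge\; k_B T \ln 2 \cdot \Delta I ,
\]
% and the referenced bounds on overwriting refine this by tracking
% the correlations that ground the information content of the memory.
```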

  10. Machine platform and software environment for rapid optics assembly process development

    NASA Astrophysics Data System (ADS)

    Sauer, Sebastian; Müller, Tobias; Haag, Sebastian; Zontar, Daniel

    2016-03-01

    The assembly of optical components for laser systems is proprietary knowledge and is typically done by well-trained personnel in a clean-room environment, as it has a major impact on the overall laser performance. Rising numbers of laser systems drive laser production toward industrial-level automation solutions, allowing for high volumes while ensuring stable quality, many variants, and low cost. Therefore, an easily programmable, expandable and reconfigurable machine with an intuitive and flexible software environment for process configuration is required. With Fraunhofer IPT's expertise on optical assembly processes, the next step towards industrializing the production of optical systems is made.

  11. ISLE (Image and Signal LISP Environment): A functional language interface for signal and image processing

    SciTech Connect

    Azevedo, S.G.; Fitch, J.P.

    1987-10-21

    Conventional software interfaces that use imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal-processing software (SIG). As an alternative, ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal LISP Environment (ISLE) is an example of an interpreted functional language interface based on common LISP. Advantages of ISLE include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence (AI) software. 10 refs.

  12. ISLE (Image and Signal Lisp Environment): A functional language interface for signal and image processing

    SciTech Connect

    Azevedo, S.G.; Fitch, J.P.

    1987-05-01

    Conventional software interfaces which utilize imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal processing software (SIG). Existing ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal Lisp Environment (ISLE) will be discussed as an example of an interpreted functional language interface based on Common LISP. Additional benefits include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence software.

  13. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    SciTech Connect

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  14. Emerge - A Python environment for the modeling of subsurface transfers

    NASA Astrophysics Data System (ADS)

    Lopez, S.; Smai, F.; Sochala, P.

    2014-12-01

    The simulation of subsurface mass and energy transfers often relies on specific codes that were mainly developed in compiled languages, which usually ensure computational efficiency at the expense of relatively long development times and relatively rigid software. Even if a very detailed, possibly graphical, user interface is developed, the core numerical aspects are rarely accessible and the smallest modification will always need a compilation step. Thus, user-defined physical laws or alternative numerical schemes may be relatively difficult to use. Over the last decade, Python has emerged as a popular and widely used language in the scientific community. There already exist several libraries for the pre- and post-treatment of input and output files for reservoir simulators (e.g. pytough). Development times in Python are considerably reduced compared to compiled languages, and programs can easily be interfaced with libraries written in compiled languages, including several comprehensive numerical libraries that provide sequential and parallel solvers (e.g. PETSc, Trilinos…). The core objective of the Emerge project is to explore the possibility of developing a modeling environment fully in Python. Consequently, we are developing an open Python package with the classes/objects necessary to express, discretize and solve the physical problems encountered in the modeling of subsurface transfers. We relied heavily on Python to have a convenient and concise way of manipulating potentially complex concepts with a few lines of code and a high level of abstraction. The result aims to be a friendly numerical environment targeting both numerical engineers and physicists or geoscientists, with the possibility to quickly specify and handle geometries, arbitrary meshes, spatially or temporally varying properties, PDE formulations, boundary conditions…
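    The abstract does not show Emerge's actual API, so the following toy example only illustrates the general point about interpreted environments: even a bare-bones 1D diffusion solve (a stand-in for a subsurface pressure or heat transfer step) fits in a few lines of plain Python and can be edited and re-run with no compilation step.

```python
# Illustrative only: explicit finite-difference solve of
# du/dt = d * d2u/dx2 with fixed (Dirichlet) boundary values.
# None of this reflects Emerge's real interface.

def diffuse_1d(u, d, dt, dx, steps):
    """Advance the profile `u` by `steps` explicit Euler steps."""
    u = list(u)
    r = d * dt / dx ** 2
    assert r <= 0.5, "explicit scheme is unstable for r > 1/2"
    for _ in range(steps):
        u = ([u[0]] +
             [u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
              for i in range(1, len(u) - 1)] +
             [u[-1]])
    return u

# A unit pressure pulse at one end relaxes toward the linear
# steady-state profile between the two fixed boundaries.
profile = diffuse_1d([1.0] + [0.0] * 9, d=1.0, dt=0.1, dx=1.0, steps=500)
```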

  15. PETRO-SAFE '91 conference papers: Volume 3 (Drilling and production environment and safety), Volume 4 (Transportation and storage environment and safety) and Volume 5 (Processing and refining environment and safety)

    SciTech Connect

    Not Available

    1991-01-01

    This conference provided a forum for the oil, gas, and petrochemical industries to discuss state of the art knowledge in those fields. The following topics were addressed: drilling and production environment and safety; transportation and storage environment and safety; and processing and refining environment and safety. Separate papers are processed for inclusion in the appropriate data bases.

  16. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
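    The classical quantity Cμ in the abstract is concrete enough to compute by hand for a standard textbook example, the Golden Mean process (binary sequences with no consecutive 0s): Cμ is the Shannon entropy of the stationary distribution over the ɛ-machine's causal states. The sketch below assumes the usual two-state ɛ-machine for that process.

```python
# Golden Mean process epsilon-machine:
#   from state A: emit 1 (p = 1/2, stay in A) or emit 0 (p = 1/2, go to B)
#   from state B: emit 1 (p = 1, return to A)

from math import log2

# State-to-state transition probabilities (symbols marginalized out).
T = {("A", "A"): 0.5, ("A", "B"): 0.5, ("B", "A"): 1.0}

def stationary(T, states=("A", "B"), iters=1000):
    """Power-iterate to the stationary distribution over causal states."""
    pi = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        pi = {t: sum(pi[s] * T.get((s, t), 0.0) for s in states)
              for t in states}
    return pi

pi = stationary(T)
C_mu = -sum(p * log2(p) for p in pi.values() if p > 0)
# Here pi = {A: 2/3, B: 1/3}, so C_mu = H(2/3) ~ 0.918 bits, strictly
# above the process's excess entropy E ~ 0.252 bits; a q-machine's Cq
# sits between the two.
print(round(C_mu, 3))
```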

  17. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.
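    A continue/suspend control point of the kind described can be sketched as a simple discounted-savings index. Everything below is a hypothetical illustration: the function names, the discount rate, and the threshold are assumptions, not details of the CAFE Project's actual model.

```python
# Hypothetical go/no-go index: ratio of the discounted expected savings
# from deploying a new sensor to its remaining R&D cost.

def performance_index(annual_savings, remaining_rd_cost,
                      horizon_years, discount_rate=0.08):
    """Present value of projected savings divided by remaining R&D cost.
    A value > 1.0 suggests continuing development; <= 1.0, suspending."""
    pv = sum(annual_savings / (1 + discount_rate) ** t
             for t in range(1, horizon_years + 1))
    return pv / remaining_rd_cost

def decide(index, threshold=1.0):
    return "continue" if index > threshold else "suspend"

# A sensor whose projected savings do not cover its remaining R&D cost.
idx = performance_index(annual_savings=40_000, remaining_rd_cost=250_000,
                        horizon_years=5)
print(decide(idx))
```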

  18. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  19. Glacier lake outburst floods - modelling process chains

    NASA Astrophysics Data System (ADS)

    Schaub, Yvonne; Huggel, Christian; Haeberli, Wilfried

    2013-04-01

    New lakes are forming in high-mountain areas all over the world due to glacier recession. Often they will be located below steep, destabilized flanks and are therefore exposed to impacts from rock-/ice-avalanches. Several events are known worldwide in which an outburst flood was triggered by such an impact. In regions such as the European Alps or the Cordillera Blanca in Peru, where valley bottoms are densely populated, these far-travelling, high-magnitude events can result in major disasters. For appropriate integral risk management it is crucial to gain knowledge on how the processes (rock-/ice-avalanches - impact waves in lake - impact on dam - outburst flood) interact and how the hazard potential related to corresponding process chains can be assessed. Research in natural hazards has so far mainly concentrated on describing, understanding, modeling or assessing single hazardous processes. Some of the above-mentioned individual processes are quite well understood in their physical behavior and some of the process interfaces have also been investigated in detail. Multi-hazard assessments of the entire process chain, however, have only recently become subjects of investigation. Our study aims at closing this gap and providing suggestions on how to assess the hazard potential of the entire process chain in order to generate hazard maps and support risk assessments. Based on the literature, we analyzed different types of models (empirical, analytical, physically based) for each process regarding their suitability for application in hazard assessments of the entire process chain. Results show that for rock-/ice-avalanches, dam breach and outburst floods, only numerical, physically based models are able to provide the required information, whereas the impact wave can be estimated by means of physically based or empirical assessments. We demonstrate how the findings could be applied with the help of a case study of a recent glacier lake outburst event at Laguna

  20. Models of plasticity in spatial auditory processing.

    PubMed

    Shinn-Cunningham, B

    2001-01-01

    Both psychophysical and physiological studies have examined plasticity of spatial auditory processing. While there is a great deal known about how the system computes basic cues that influence spatial perception, less is known about how these cues are integrated to form spatial percepts and how the auditory system adapts and calibrates in order to maintain accurate spatial perception. After summarizing evidence for plasticity in the spatial auditory pathway, this paper reviews a statistical, decision-theory model of short-term plasticity and a system-level model of the spatial auditory pathway that may help elucidate how long- and short-term experiences influence the computations underlying spatial hearing.