Science.gov

Sample records for environment process model

  1. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect the distribution of water, increase the kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and may also change the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near field within the rock mass extending outward from the drift wall.

  2. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, while simplifications are made for the human-in-the-loop. However, the human element has a big impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We will model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network, which can represent incomplete and uncertain socio-cultural information. We will leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer such as node mobility, transmission parameters, etc. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  3. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Many existing communication models assume dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both the large- and small-message limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
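
The service-time function described in the abstract above can be sketched in code. The exact parameterization used by the authors is not reproduced here; the hyperbolic form below is an illustrative assumption chosen only to match the stated limiting behavior (a fixed latency for very small messages, a fixed transfer rate for very large ones):

```python
import math

def service_time(m, latency, rate):
    """Illustrative two-parameter hyperbolic service-time model (an assumed
    form, not necessarily the paper's exact one): approaches `latency` for
    very small messages and m / rate for very large ones, with a smooth
    hyperbolic transition between the two regimes."""
    return math.sqrt(latency**2 + (m / rate) ** 2)

# The two asymptotic limits used to fit the parameters:
small = service_time(1e-6, latency=1e-3, rate=1e6)   # ~ latency
large = service_time(1e9, latency=1e-3, rate=1e6)    # ~ m / rate
```

Fitting `latency` and `rate` in the small- and large-message limits, as the abstract describes, then fixes both parameters of each CB.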

  4. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed in validating models and the thought processes involved in mathematical modeling performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  5. A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Rosu, Serban

    2011-09-01

    This article presents a new approach to some fundamental techniques for solving dynamic programming problems through the use of functional equations. We analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.

  6. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  7. Modeling of autonomous problem solving process by dynamic construction of task models in multiple tasks environment.

    PubMed

    Ohigashi, Yu; Omori, Takashi

    2006-10-01

    Traditional reinforcement learning (RL) supposes a complex but single task to be solved. When a RL agent faces a task similar to a learned one, the agent must re-learn the task from the beginning because it doesn't reuse the past learned results. This is the problem of quick action learning, which is the foundation of decision making in the real world. In this paper, we suppose agents that can solve a set of tasks similar to each other in a multiple tasks environment, where we encounter various problems one after another, and propose a technique of action learning that can quickly solve similar tasks by reusing previously learned knowledge. In our method, a model-based RL uses a task model constructed by combining primitive local predictors for predicting task and environmental dynamics. To evaluate the proposed method, we performed a computer simulation using a simple ping-pong game with variations.

  8. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open-source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007], and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al., 2007; Weerts et al., 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications.
    References: El Serafy G.Y., H. Gerritsen, S. Hummel, A.H. Weerts, A.E. Mynett and M. Tanaka (2007), Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. Van Velzen and Verlaan (2007), COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood
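
OpenDA's actual interfaces are not shown in the abstract above. As a minimal, hedged sketch of the kind of filtering step such an environment wraps, the following implements one analysis update of a linear Kalman filter; the function name and the toy numbers are illustrative assumptions, not OpenDA API:

```python
import numpy as np

def kalman_update(x_f, P_f, y, H, R):
    """One analysis step of a linear Kalman filter: blend a model forecast
    x_f (with covariance P_f) with an observation y (operator H, noise
    covariance R) into an analysis state and covariance."""
    S = H @ P_f @ H.T + R                 # innovation covariance
    K = P_f @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)         # analysis state
    P_a = (np.eye(len(x_f)) - K @ H) @ P_f
    return x_a, P_a

# Toy example: a 2-component state, observing only the first component.
x_f = np.array([1.0, 2.0])
P_f = np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
y = np.array([3.0])
x_a, P_a = kalman_update(x_f, P_f, y, H, R)
```

The observed component is pulled halfway toward the observation (equal forecast and observation variances), while the unobserved component is untouched; generic data assimilation frameworks abstract exactly this model/observation interaction behind interfaces.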

  9. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2006-01-01

    The global dynamics of the ionized and neutral gases in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have already been done using the magnetohydrodynamics (MHD) and the electrodynamics approaches. One of the major results of recent simplified two-fluid model simulations [Saur, J., Neubauer, F.M., Strobel, D.F., Summers, M.E., 2002. J. Geophys. Res. 107 (SMP5), 1-18] was the production of the double-peak structure in the magnetic field signature of the Io flyby. These structures could not be explained before by standard MHD models. In this paper, we present a hybrid simulation for Io with kinetic ions and fluid electrons. This method employs a fluid description for electrons and neutrals, whereas for ions a particle approach is used. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. Our model may provide a much more accurate description of the ion dynamics than previous approaches and allows us to account for the realistic anisotropic ion velocity distribution that cannot be captured in fluid simulations with isotropic temperatures. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper. Comparison with the Galileo Io flyby results shows that this approach provides an accurate physical basis for the interaction and can therefore naturally reproduce all the observed salient features.

  10. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have been done using the MHD and the electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the double-peak structure in the magnetic field signature of the Io flyby, which could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs a fluid description for electrons and neutrals, whereas for ions multilevel, drift-kinetic and particle approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution that cannot be captured in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.

  11. Gaussian process based modeling and experimental design for sensor calibration in drifting environments.

    PubMed

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2015-09-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor's response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP's inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method.
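
The GP calibration model in the abstract above can be illustrated with a minimal Gaussian-process regression. This is a generic sketch, not the authors' code; the squared-exponential kernel, the two-factor exposure condition (concentration, temperature), and all numbers are assumptions for illustration:

```python
import numpy as np

def rbf(A, B, length=1.0):
    """Squared-exponential kernel matrix between row-vector inputs A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

def gp_predict(X, y, Xs, noise=1e-4, length=5.0):
    """GP posterior mean and pointwise variance at test inputs Xs, given
    training data (X, y); the variance supports the uncertainty
    quantification the abstract refers to."""
    K = rbf(X, X, length) + noise * np.eye(len(X))
    Ks = rbf(Xs, X, length)
    Kss = rbf(Xs, Xs, length)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Calibration-style toy data: sensor response vs. (concentration, temperature),
# where the response drifts with temperature.
X = np.array([[0.0, 20.0], [1.0, 20.0], [0.0, 30.0], [1.0, 30.0]])
y = np.array([0.1, 1.1, 0.3, 1.3])
mean, var = gp_predict(X, y, np.array([[0.5, 25.0]]))
```

In the batch-sequential design the abstract describes, the posterior variance would then be used to choose the next calibration conditions to sample.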

  12. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894

  13. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  14. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    PubMed Central

    Issakhov, Alibek

    2014-01-01

    This paper presents a mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered; it is solved with the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is solved by the fractional steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is assumed that the transfer is only due to the pressure gradient. The numerical method captures the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions. PMID:24991644
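
The tridiagonal solve used at the Poisson stage (the Thomas algorithm named in the abstract above) is standard and can be sketched directly; the 3x3 test system below is illustrative:

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(n): a = sub-diagonal, b = main
    diagonal, c = super-diagonal, d = right-hand side. In a Fourier-based
    Poisson solver, one such solve is performed per Fourier mode."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D discrete Poisson system  -x[i-1] + 2 x[i] - x[i+1] = d[i]:
x = thomas([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 0, 1])
```

For this right-hand side the exact solution is x = (1, 1, 1), which the solver recovers to rounding error.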

  15. Modelling Dust Processing and Evolution in Extreme Environments as seen by Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Bocchio, Marco

    2014-09-01

    The main goal of my PhD study is to understand the dust processing that occurs during the mixing between the galactic interstellar medium and the intracluster medium. This process is of particular interest in violent phenomena such as galaxy-galaxy interactions or the ``Ram Pressure Stripping'' due to the infall of a galaxy towards the cluster centre. Initially, I focus my attention on the problem of dust destruction and heating processes, revisiting the available models in the literature. I focus particularly on the cases of extreme environments such as a hot coronal-type gas (e.g., IGM, ICM, HIM) and supernova-generated interstellar shocks. Under these conditions small grains are destroyed on short timescales and large grains are heated by collisions with fast electrons, making the dust spectral energy distribution very different from what is observed in the diffuse ISM. In order to test our models I apply them to the case of an interacting galaxy, NGC 4438. Herschel data of this galaxy indicate the presence of dust with a higher-than-expected temperature. With a multi-wavelength analysis on a pixel-by-pixel basis we show that this hot dust seems to be embedded in a hot ionised gas, therefore undergoing both collisional heating and small-grain destruction. Furthermore, I focus on the long-standing conundrum of the dust destruction and dust formation timescales in the Milky Way. Based on the destruction efficiency in interstellar shocks, previous estimates led to a dust lifetime shorter than the typical timescale for dust formation in AGB stars. Using a recent dust model and an updated dust processing model we re-evaluate the dust lifetime in our Galaxy. Finally, I turn my attention to the phenomenon of ``Ram Pressure Stripping''. The galaxy ESO 137-001 represents one of the best cases to study this effect. Its long H2 tail embedded in a hot and ionised tail raises questions about its possible stripping from the galaxy or formation downstream in the tail.
Based on

  16. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  17. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  18. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering of business processes, as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students but rather must be experienced and learned by them. This…

  19. Modeled near-field environment porosity modifications due to coupled thermohydrologic and geochemical processes

    SciTech Connect

    Glassley, W. E.; Nitao, J. J.

    1998-10-30

    Heat deposited by waste packages in nuclear waste repositories can modify rock properties by instigating mineral dissolution and precipitation along hydrothermal flow pathways. Modeling this reactive transport requires coupling fluid flow to permeability changes resulting from dissolution and precipitation. Modification of the NUFT thermohydrologic (TH) code package to account for this coupling in a simplified geochemical system has been used to model the time-dependent change in porosity, permeability, matrix and fracture saturation, and temperature in the vicinity of waste-emplacement drifts, using conditions anticipated for the potential Yucca Mountain repository. The results show, within a few hundred years, dramatic porosity reduction approximately 10 m above emplacement drifts. Most of this reduction is attributed to deposition of solute load at the boiling front, although some of it also results from decreasing temperature along the flow path. The actual distribution of the nearly sealed region is sensitive to the time-dependent characteristics of the thermal load imposed on the environment and suggests that the geometry of the sealed region can be engineered by managing the waste-emplacement strategy.
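
The abstract above couples porosity change to permeability but does not state the closure used in NUFT. A commonly used porosity-permeability relation, shown here purely as an illustrative assumption, is the Kozeny-Carman scaling:

```python
def kozeny_carman(k0, phi0, phi):
    """Permeability scaling with porosity via the Kozeny-Carman relation
    (an illustrative closure; the actual NUFT coupling may differ):
    k / k0 = (phi / phi0)**3 * ((1 - phi0) / (1 - phi))**2."""
    return k0 * (phi / phi0) ** 3 * ((1 - phi0) / (1 - phi)) ** 2

# Mineral precipitation halving porosity from 10% to 5% cuts permeability
# by roughly an order of magnitude under this closure:
k = kozeny_carman(k0=1e-12, phi0=0.10, phi=0.05)
```

A strong nonlinearity of this kind is what lets modest precipitation at the boiling front produce the nearly sealed region the abstract describes.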

  20. Process migration in UNIX environments

    NASA Technical Reports Server (NTRS)

    Lu, Chin; Liu, J. W. S.

    1988-01-01

    To support process migration in UNIX environments, the main problem is how to encapsulate the location-dependent features of the system in such a way that a host-independent virtual environment is maintained by the migration handlers on behalf of each migrated process. An object-oriented approach is used to describe the interaction between a process and its environment. More specifically, environmental objects were introduced in UNIX systems to carry out the user-environment interaction. The implementation of the migration handlers is based on both the state consistency criterion and the property consistency criterion.

  1. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    Additionally, the verbal sequentiality assumption of Ericsson & Simon's (1984) verbal protocol theory was tested, and found to hold.

  2. Model-based processing for shallow ocean environments: The broadband problem

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-01-31

    Most acoustic sources found in the ocean environment are spatially complex and broadband. When propagating in a shallow ocean, these source characteristics complicate the analysis of received acoustic data considerably. The enhancement of broadband acoustic pressure-field measurements using a vertical array is discussed. Here a model-based approach is developed for a broadband source using a normal-mode propagation model.

  3. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  4. Arthropod model systems for studying complex biological processes in the space environment

    NASA Astrophysics Data System (ADS)

    Marco, Roberto; de Juan, Emilio; Ushakov, Ilya; Hernandorena, Arantxa; Gonzalez-Jurado, Juan; Calleja, Manuel; Manzanares, Miguel; Maroto, Miguel; Garesse, Rafael; Reitz, Günther; Miquel, Jaime

    1994-08-01

    Three arthropod systems are discussed in relation to their complementary and potential use in Space Biology. In an upcoming biosatellite flight, Drosophila melanogaster pre-adapted during several months to different g levels will be flown in an automatic device that separates the parental generation from the first and second generations. In the same flight, flies will be exposed to microgravity conditions in an automatic unit in which fly motility can be recorded. In the International Microgravity Laboratory-2, several groups of Drosophila embryos will be grown in Space and the motility of a male fly population will be video-recorded. In the Biopan, an ESA exobiology facility that can be flown attached to the exterior of a Russian biosatellite, Artemia dormant gastrulae will be exposed to the space environment on the exterior of the satellite, under a normal atmosphere or in the void. Gastrulae will be separated into hit and non-hit populations. The developmental and aging response of these animals will be studied upon recovery. With these experiments we will be able to establish whether exposure to the space environment influences arthropod development and aging, and elaborate on some of the cellular mechanisms involved, which should be tested in future experiments.

  5. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part type or individual serial numbers. Relationships between manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining and web based client/server architectures are discussed in the context of composite material manufacturing.
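
The lot-number indexing described in the abstract above can be sketched with a minimal relational schema. The table and column names are hypothetical, chosen only to illustrate joining process variables to quality-assurance results:

```python
import sqlite3

# Hypothetical minimal schema: process variables and QA measurements live in
# separate tables and are related through a shared lot number.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE process_vars (lot_no TEXT, cure_temp_c REAL, cure_time_min REAL);
    CREATE TABLE qa_results   (lot_no TEXT, void_content_pct REAL, passed INTEGER);
""")
con.executemany("INSERT INTO process_vars VALUES (?, ?, ?)",
                [("L001", 177.0, 120.0), ("L002", 182.5, 110.0)])
con.executemany("INSERT INTO qa_results VALUES (?, ?, ?)",
                [("L001", 0.8, 1), ("L002", 2.4, 0)])

# Correlate process conditions with quality outcomes by joining on lot number,
# e.g. pull the process variables behind every failed lot:
rows = con.execute("""
    SELECT p.lot_no, p.cure_temp_c, q.void_content_pct
    FROM process_vars p JOIN qa_results q ON p.lot_no = q.lot_no
    WHERE q.passed = 0
""").fetchall()
```

The same join pattern scales to the paper's broader use case: once both data streams share an index (lot, part type, or serial number), correlation queries across product types become one-liners.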

  6. Condensation Processes in Astrophysical Environments

    NASA Technical Reports Server (NTRS)

    Nuth, Joseph A., III; Rietmeijer, Frans J. M.; Hill, Hugh G. M.

    2002-01-01

    Astrophysical systems present an intriguing set of challenges for laboratory chemists. Chemistry occurs in regions considered an excellent vacuum by laboratory standards and at temperatures that would vaporize laboratory equipment. Outflows around Asymptotic Giant Branch (AGB) stars have timescales ranging from seconds to weeks, depending on the distance of the region of interest from the star and on the way significant changes in the state variables are defined. The atmospheres of normal stars may only change significantly on several-billion-year timescales. Most laboratory experiments carried out to understand astrophysical processes are not done at conditions that perfectly match the natural suite of state variables or timescales appropriate for natural conditions. Experimenters must make use of simple analog experiments that place limits on the behavior of natural systems, often extrapolating to lower-pressure and/or higher-temperature environments. Nevertheless, we argue that well-conceived experiments will often provide insights into astrophysical processes that are impossible to obtain through models or observations. This is especially true for complex chemical phenomena such as the formation and metamorphism of refractory grains under a range of astrophysical conditions. Data obtained in our laboratory have been surprising in numerous ways, ranging from the composition of the condensates to the thermal evolution of their spectral properties. None of this information could have been predicted from first principles and would not have been credible even if it had.

  7. Modeling microevolution in a changing environment: the evolving quasispecies and the diluted champion process

    NASA Astrophysics Data System (ADS)

    Bianconi, Ginestra; Fichera, Davide; Franz, Silvio; Peliti, Luca

    2011-08-01

    Several pathogens use evolvability as a survival strategy against acquired immunity of the host. Despite their high variability in time, some of them exhibit quite low variability within the population at any given time, a somewhat paradoxical behavior often called the evolving quasispecies. In this paper we introduce a simplified model of an evolving viral population in which the effects of the acquired immunity of the host are represented by the decrease of the fitness of the corresponding viral strains, depending on the frequency of the strain in the viral population. The model exhibits evolving quasispecies behavior in a certain range of its parameters, and suggests how punctuated evolution can be induced by a simple feedback mechanism.
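    The frequency-dependent feedback the abstract describes can be sketched with a minimal Wright-Fisher-style simulation; this is an illustrative stand-in, not the authors' model, and all parameter values (`penalty`, `mut_rate`, population size) are invented.

```python
import random

# Minimal Wright-Fisher-style sketch (not the authors' model): a strain's
# fitness drops with its current frequency, standing in for acquired host
# immunity. All parameter values are illustrative assumptions.
random.seed(1)

N, N_STRAINS = 200, 50

def generation(pop, base_fitness=1.0, penalty=0.8, mut_rate=0.01):
    freq = [pop.count(s) / len(pop) for s in range(N_STRAINS)]
    # frequency-dependent fitness: common strains are penalized
    weights = [max(base_fitness - penalty * freq[s], 1e-6) for s in pop]
    offspring = random.choices(pop, weights=weights, k=len(pop))
    # rare mutation jumps an individual to a random strain
    return [random.randrange(N_STRAINS) if random.random() < mut_rate else s
            for s in offspring]

pop = [0] * N
for _ in range(100):
    pop = generation(pop)

dominant = max(set(pop), key=pop.count)
print(dominant, pop.count(dominant) / N)
```

    Because a strain loses fitness as it becomes common, the population tends to stay concentrated around a current dominant strain that is periodically displaced, the punctuated, low-diversity pattern the abstract calls an evolving quasispecies.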

  8. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  9. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, and modes of heat and species transport. (CCM)

  10. Modeling Multiphase Coastal and Hydraulic Processes in an Interactive Python Environment with the Open Source Proteus Toolkit

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Ahmadia, A. J.; Bakhtyar, R.; Miller, C. T.

    2014-12-01

    Hydrology is dominated by multiphase flow processes, due to the importance of capturing water's interaction with soil and air phases. Unfortunately, many different mathematical model formulations are required to model particular processes and scales of interest, and each formulation often requires specialized numerical methods. The Proteus toolkit is a software package for research on models for coastal and hydraulic processes and improvements in numerics, particularly 3D multiphase processes and parallel numerics. The models considered include multiphase flow, shallow water flow, turbulent free surface flow, and various flow-driven processes. We will discuss the objectives of Proteus and recent evolution of the toolkit's design, as well as present examples of how it has been used to construct computational models of multiphase flows for the US Army Corps of Engineers. Proteus is also an open source toolkit authored primarily within the US Army Corps of Engineers, and used, developed, and maintained by a small community of researchers in both theoretical modeling and computational methods research. We will discuss how open source and community development practices have played a role in the creation of Proteus.

  11. An Integrated Vehicle Modeling Environment

    NASA Technical Reports Server (NTRS)

    Totah, Joseph J.; Kinney, David J.; Kaneshige, John T.; Agabon, Shane

    1999-01-01

    This paper describes an Integrated Vehicle Modeling Environment for estimating aircraft geometric, inertial, and aerodynamic characteristics, and for interfacing with a high fidelity, workstation-based flight simulation architecture. The goals in developing this environment are to aid in the design of next generation intelligent flight control technologies, conduct research in advanced vehicle interface concepts for autonomous and semi-autonomous applications, and provide a value-added capability to the conceptual design and aircraft synthesis process. Results are presented for three aircraft by comparing estimates generated by the Integrated Vehicle Modeling Environment with known characteristics of each vehicle under consideration. The three aircraft are a modified F-15 with moveable canards attached to the airframe, a mid-sized, twin-engine commercial transport concept, and a small, single-engine, uninhabited aerial vehicle. Estimated physical properties and dynamic characteristics are correlated with those known for each aircraft over a large portion of the flight envelope of interest. These results represent the completion of a critical step toward meeting the stated goals for developing this modeling environment.

  12. Quantum process discrimination with information from environment

    NASA Astrophysics Data System (ADS)

    Wang, Yuan-Mei; Li, Jun-Gang; Zou, Jian; Xu, Bao-Ming

    2016-12-01

    In quantum metrology we usually extract information from the reduced probe system but ignore the information lost inevitably into the environment. However, K. Mølmer [Phys. Rev. Lett. 114, 040401 (2015)] showed that the information lost into the environment has an important effect on improving the successful probability of quantum process discrimination. Here we reconsider the model of a driven atom coupled to an environment and distinguish which of two candidate Hamiltonians governs the dynamics of the whole system. We mainly discuss two measurement methods, one of which obtains only the information from the reduced atom state and the other obtains the information from both the atom and its environment. Interestingly, for the two methods the optimal initial states of the atom, used to improve the successful probability of the process discrimination, are different. By comparing the two methods we find that the partial information from the environment is very useful for the discriminations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11274043, 11375025, and 11005008).

  13. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about Genetics Problems Using Virtual Chat

    ERIC Educational Resources Information Center

    Pata, Kai; Sarapuu, Tago

    2006-01-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various terms of reasoning on the learners' problem representation development. Changes in 53 students' problem representations about a genetics issue were analysed while they worked with different…

  14. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  15. Generalized Environment for Modeling Systems

    SciTech Connect

    2012-02-07

    GEMS is an integrated environment that allows technical analysts, modelers, researchers, etc. to integrate and deploy models and/or decision tools with associated data to the internet for direct use by customers. GEMS does not require that the model developer know how to code or script and therefore delivers this capability to a large group of technical specialists. Customers gain the benefit of being able to execute their own scenarios directly without need for technical support. GEMS is a process that leverages commercial software products with specialized codes that add connectivity and unique functions to support the overall capability. Users integrate pre-existing models with a commercial product and store parameters and input trajectories in a companion commercial database. The model is then exposed in a commercial web environment, and a graphical user interface (GUI) is applied by the model developer. Users execute the model through the web-based GUI, and GEMS manages supply of proper inputs, execution of models, routing of data to models, and display of results back to users. GEMS works in layers; the following description proceeds from the bottom up. Modelers create models in the modeling tool of their choice, such as Excel, Matlab, or Fortran. They can also use models from a library of previously wrapped legacy codes (models). Modelers integrate the models (or a single model) by wrapping and connecting them using the Phoenix Integration tool ModelCenter. Using a ModelCenter/SAS plugin (DOE copyright CW-10-08), the modeler gets data from either a SAS or SQL database and sends results back to SAS or SQL. Once the model is working properly, the ModelCenter file is saved and stored in a folder location to which a SharePoint server tool created at INL is pointed. This enables the ModelCenter model to be run from SharePoint. The modeler then goes into Microsoft SharePoint and creates a graphical user interface (GUI) using the ModelCenter WebPart (CW-12

  16. Geospace Environment Modeling Program

    NASA Astrophysics Data System (ADS)

    Dusenbery, Paul B.; Siscoe, George L.

    1992-02-01

    The geospace environment encompasses the highest and largest of the four physical geospheres—lithosphere, hydrosphere, atmosphere, and magnetosphere. Despite its size, its far-reaching structures interconnect and move together in a choreography of organized dynamics, whose complexity is reflected in the intricate movements of the northern lights. The vastness and inaccessibility of geospace, encompassing the plasma environment of the magnetosphere/ionosphere system, and the invisibility of its structures pose great challenges to scientists who want to study its dynamics by obtaining, in effect, video tapes of its globally organized motions. A key component of their strategy is the ability to see nearly all of geospace imaged onto the top of the atmosphere. The geomagnetic field threads the volume of geospace and transmits action, TV-like, from the magnetospheric stage down its lines of force onto the atmospheric screen.

  17. CAUSA - An Environment For Modeling And Simulation

    NASA Astrophysics Data System (ADS)

    Dilger, Werner; Moeller, Juergen

    1989-03-01

    CAUSA is an environment for modeling and simulation of dynamic systems on a quantitative level. The environment provides a conceptual framework including primitives like objects, processes and causal dependencies which allow the modeling of a broad class of complex systems. The facility of simulation allows the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.

  18. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment under development since 1996 at the LASMEA laboratory, Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities for the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is a 3D face-tracking algorithm from appearance.

  19. Photohadronic Processes in Astrophysical Environments

    NASA Astrophysics Data System (ADS)

    Mücke, A.; Rachen, J. P.; Engel, Ralph; Protheroe, R. J.; Stanev, Todor

    1999-08-01

    We discuss the first applications of our newly developed Monte Carlo event generator SOPHIA to multiparticle photoproduction of relativistic protons with thermal and power-law radiation fields. The measured total cross section is reproduced in terms of excitation and decay of baryon resonances, direct pion production, diffractive scattering, and non-diffractive multiparticle production. Non-diffractive multiparticle production is described using a string fragmentation model. We demonstrate that the widely used `Δ-approximation' for the photoproduction cross section is reasonable only for a restricted set of astrophysical applications. The relevance of this result for cosmic ray propagation through the microwave background and hadronic models of active galactic nuclei and gamma-ray bursts is briefly discussed.

  20. Modeling of LDEF contamination environment

    NASA Technical Reports Server (NTRS)

    Carruth, M. Ralph, Jr.; Rantanen, Ray; Gordon, Tim

    1993-01-01

    The Long Duration Exposure Facility (LDEF) satellite was unique in many ways. It was a large structure that was in space for an extended period of time and was stable in orientation relative to the velocity vector. There are obvious and well documented effects of contamination and space environment effects on the LDEF satellite. In order to examine the interaction of LDEF with its environment and the resulting effect on the satellite, the Integrated Spacecraft Environments Model (ISEM) was used to model the LDEF-induced neutral environment at several different times and altitudes during the mission.

  1. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes, giving useful insights into quality and reliability of the design of sustainable processes.

  2. Autonomous environment modeling by a mobile robot

    NASA Astrophysics Data System (ADS)

    Moutarlier, Philippe

    1991-02-01

    Internal geometric representation of the environment is considered. The autonomy of a mobile robot partly relies on its ability to build a reliable representation of its environment. On the other hand, an autonomous environment-building process requires that the model be adapted to plan motions and perception actions. Therefore, the modeling process must be a reversible interface between perception and motion devices and the model itself. Several kinds of models are necessary in order to achieve an autonomous process: sensors give stochastic information on surfaces, navigation needs a free-space representation, and perception planning requires aspect graphs. The functions of stochastic surface modeling, free-space representation, and topological graph computing are presented through the integrated geometric model builder called 'Yaka.' Since all environment data uncertainties are correlated together through the robot location inaccuracy, classical filtering methods are inadequate. A method of computing a linear variance estimator that is adapted to the problem is proposed. This general formalism is validated by a large number of experiments in which the robot incrementally builds a surface representation of its environment. Free space cannot be deduced directly, at each step, from the surface data provided by the sensors. Inaccuracies on object surfaces and uncertainties on the visibility of objects by the sensor, as well as the possible motion of objects, must all be taken into account for building the free space incrementally. Then, motion and perception planning for autonomous environment modeling are achieved using this free-space model and topological location and aspect graphs.
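    The abstract's central point, that all map uncertainties are correlated through the robot's pose error and so cannot be filtered independently, can be seen in a toy one-dimensional joint Kalman update. This is an assumed stand-in for the paper's estimator, with invented numbers throughout.

```python
import numpy as np

# Toy 1-D illustration (not the paper's estimator): a joint Kalman update over
# the stacked [robot, landmark] state shows how a single relative measurement
# correlates landmark error with robot-pose error.
x = np.array([0.0, 0.0])        # [robot position, landmark position]
P = np.diag([1.0, 100.0])       # robot pose roughly known, landmark unknown

H = np.array([[-1.0, 1.0]])     # relative measurement: z = landmark - robot + noise
R = np.array([[0.5]])           # measurement noise variance
z = np.array([5.2])             # observed range to the landmark

S = H @ P @ H.T + R             # innovation covariance
K = P @ H.T @ np.linalg.inv(S)  # Kalman gain on the joint state
x = x + K @ (z - H @ x)
P = (np.eye(2) - K @ H) @ P

# P[0, 1] is now strictly positive: the landmark estimate can no longer be
# updated independently of the robot-pose estimate.
print(x, P[0, 1])
```

    Treating the landmark variance alone (a "classical" independent filter) would discard exactly this off-diagonal term, which is why the paper argues for an estimator over the jointly correlated state.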

  3. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  4. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall

    PubMed Central

    Bridge, Jack C.; Aylott, Jonathan W.; Brightling, Christopher E.; Ghaemmaghami, Amir M.; Knox, Alan J.; Lewis, Mark P.; Rose, Felicity R.A.J.; Morris, Gavin E.

    2015-01-01

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments. PMID:26275100

  5. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall.

    PubMed

    Bridge, Jack C; Aylott, Jonathan W; Brightling, Christopher E; Ghaemmaghami, Amir M; Knox, Alan J; Lewis, Mark P; Rose, Felicity R A J; Morris, Gavin E

    2015-07-31

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments.

  6. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    ERIC Educational Resources Information Center

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  7. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

    Fagg, G.E.; Moore, K.; Dongarra, J.J.; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  8. Modeling the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2006-01-01

    There has been a renaissance of interest in space radiation environment modeling. This has been fueled by the growing need to replace the long-time standard AP-8 and AE-8 trapped particle models, the interplanetary exploration initiative, modern satellite instrumentation that has led to unprecedented measurement accuracy, and the pervasive use of Commercial off the Shelf (COTS) microelectronics that require more accurate predictive capabilities. The objective of this viewgraph presentation was to provide basic understanding of the components of the space radiation environment and their variations, review traditional radiation effects application models, and present recent developments.

  9. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2009-01-01

    ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework which will support the Coastal Modeling Framework...developed in this project. (http://www.cactuscode.org/Development/alpaca) CyberTools: This NSF/BOR-funded project is developing a

  10. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2008-01-01

    http://www.cactuscode.org/Development/xirel) ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework...which will support the Coastal Modeling Framework developed in this project. (http://www.cactuscode.org/Development/alpaca) CyberTools: This NSF/BOR

  11. A Process and Environment Aware Sierra/SolidMechanics Cohesive Zone Modeling Capability for Polymer/Solid Interfaces

    SciTech Connect

    Reedy, E. D.; Chambers, Robert S.; Hughes, Lindsey Gloe; Kropka, Jamie Michael; Stavig, Mark E.; Stevens, Mark J.

    2015-09-01

    The performance and reliability of many mechanical and electrical components depend on the integrity of polymer-to-solid interfaces. Such interfaces are found in adhesively bonded joints, encapsulated or underfilled electronic modules, protective coatings, and laminates. The work described herein was aimed at improving Sandia's finite element-based capability to predict interfacial crack growth by 1) using a high fidelity nonlinear viscoelastic material model for the adhesive in fracture simulations, and 2) developing and implementing a novel cohesive zone fracture model that generates a mode-mixity dependent toughness as a natural consequence of its formulation (i.e., generates the observed increase in interfacial toughness with increasing crack-tip interfacial shear). Furthermore, molecular dynamics simulations were used to study fundamental material/interfacial physics so as to develop a fuller understanding of the connection between molecular structure and failure. Also reported are test results that quantify how joint strength and interfacial toughness vary with temperature.

  12. Modeling the Adoption Process of the Flight Training Synthetic Environment Technology (FTSET) in the Turkish Army Aviation (TUAA)

    DTIC Science & Technology

    2006-12-01

    new technologies is usually explained by the Diffusion of Innovations Model and its S-shaped growth patterns. French sociologist Gabriel Tarde ...Dohme, and Robert T. Nullmeyer, "Optimizing Simulator-Aircraft Mix for US Army Initial Entry Rotary Wing Training," Technical Report 1092 (March 1999): 6... T. Nullmeyer, Optimizing Simulator-Aircraft Mix for US Army Initial Entry Rotary Wing Training (US Army Research Institute for the Behavioral and

  13. Modeling the growth and constraints of thermophiles and biogeochemical processes in deep-sea hydrothermal environments (Invited)

    NASA Astrophysics Data System (ADS)

    Holden, J. F.; Ver Eecke, H. C.; Lin, T. J.; Butterfield, D. A.; Olson, E. J.; Jamieson, J.; Knutson, J. K.; Dyar, M. D.

    2010-12-01

    and contain an abundance of Fe(III) oxide and sulfate minerals, especially on surfaces of pore spaces. Hyperthermophilic iron reducers attach to iron oxide particles via cell wall invaginations and pili and reduce the iron through direct contact. The iron is reduced to magnetite, possibly with a maghemite intermediate. Thus iron reducers could outcompete methanogens in low H2, mildly reducing habitats such as Endeavour. Unlike strain JH146, respiration rates per cell were highest near the optimal growth temperature for the iron reducer Hyperthermus strain Ro04 and decreased near the temperature limits for growth. This study highlights the need to model microbe-metal interactions and improve respiration estimates from pure cultures to refine our in situ bioenergetic and habitat models.

  14. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Allen, Christopher; Chu, S. Reynold

    2008-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements and thus provide a safe and habitable acoustic environment for the crews, and to validate the developed models by building physical mockups and conducting acoustic measurements.

  15. Electronic materials processing and the microgravity environment

    NASA Technical Reports Server (NTRS)

    Witt, A. F.

    1988-01-01

    The nature and origin of deficiencies in bulk electronic materials for device fabrication are analyzed. It is found that gravity-generated perturbations during their formation account largely for the introduction of critical chemical and crystalline defects and, moreover, are responsible for the still existing gap between theory and experiment and thus for excessive reliance on proprietary empiricism in processing technology. Exploration of the potential of a reduced-gravity environment for electronic materials processing is found to be not only desirable but mandatory.

  16. Teaching Process Writing in an Online Environment

    ERIC Educational Resources Information Center

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  17. Distributed Sensing and Processing Adaptive Collaboration Environment (D-SPACE)

    DTIC Science & Technology

    2014-07-01

    MapReduce to scalably query datagraphs in the SHARD graph-store. In Proceedings of the fourth international workshop on Data-intensive distributed ...of distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures...this one-year effort, we developed a model for processing distributed data across multiple heterogeneous computing resources. Our model exploits the

  18. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.
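    The channel-storage exchange described above is commonly quantified with a transient-storage model (e.g., Runkel's OTIS). The following is a minimal explicit finite-difference sketch of that idea, with purely illustrative, uncalibrated parameter values; it is not the OTIS code itself.

```python
import numpy as np

# Minimal explicit sketch of a transient-storage (hyporheic exchange) model in
# the spirit of the OTIS family: upwind advection in the channel plus
# first-order exchange with a storage zone. Parameter values are illustrative.
nx, dx, dt = 100, 10.0, 1.0        # cells, cell size [m], time step [s]
u, alpha, ratio = 0.5, 1e-3, 0.2   # velocity [m/s], exchange rate [1/s], A/As

c = np.zeros(nx)                   # channel concentration
cs = np.zeros(nx)                  # storage-zone concentration
c[0] = 1.0                         # continuous upstream injection

for _ in range(600):
    c[1:] -= u * dt / dx * (c[1:] - c[:-1])   # upwind advection in the channel
    dc = alpha * dt * (cs - c)                # exchange seen by the channel
    c, cs = c + dc, cs - ratio * dc           # storage gains what the channel
                                              # loses, scaled by the area ratio
    c[0] = 1.0                                # hold the boundary concentration

# Behind the advancing front the storage zone lags the channel (cs < c),
# producing the cross-interface concentration gradients the abstract describes.
```

    Even this toy version reproduces the qualitative behavior of interest: the storage zone fills (and later drains) more slowly than the channel, so solutes persist in the hyporheic zone after the main-channel pulse has passed.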

  19. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
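
    The flavor of such a propagation-time calculation can be shown with a single back-of-the-envelope case: how long a total CO2-removal failure takes to drive cabin CO2 to a hazardous partial pressure. Every number below is an assumption for illustration, not a CEPR model value:

```python
# Illustrative time-to-hazard estimate for one ECLSS failure mode: loss of
# CO2 removal. Assumes ideal gas at 294 K, zero initial CO2, and a nominal
# metabolic CO2 output; none of these values come from the CEPR model.
def hours_to_co2_hazard(cabin_volume_m3, crew, hazard_kpa=7.6,
                        co2_kg_per_person_day=1.0):
    """Hours until the CO2 partial pressure reaches hazard_kpa after a
    total scrubber failure."""
    R, T, M = 8.314, 294.0, 0.044  # J/(mol K), K, kg/mol
    mol_per_s = crew * co2_kg_per_person_day / M / 86400.0
    # ideal gas: moles needed so that p_CO2 * V = n * R * T
    n_hazard = hazard_kpa * 1e3 * cabin_volume_m3 / (R * T)
    return n_hazard / mol_per_s / 3600.0

hours = hours_to_co2_hazard(cabin_volume_m3=10.0, crew=4)
```

    Even this toy case yields hours, not seconds, of buffer time, which is the point of replacing the instant-LOC assumption.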

  20. Course Material Model in A&O Learning Environment.

    ERIC Educational Resources Information Center

    Levasma, Jarkko; Nykanen, Ossi

    One of the problematic issues in the content development for learning environments is the process of importing various types of course material into the environment. This paper describes a method for importing material into the A&O open learning environment by introducing a material model for metadata recognized by the environment. The first…

  1. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints on a production process imposed by the lunar or Martian environment on the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.

  2. Galactic cosmic radiation environment models

    NASA Astrophysics Data System (ADS)

    Badhwar, G. D.; O'Neill, P. M.; Troung, A. G.

    2001-02-01

    Models of the radiation environment in free space and in near earth orbits are required to estimate the radiation dose to the astronauts for Mars, Space Shuttle, and the International Space Station missions, and to estimate the rate of single event upsets and latch-ups in electronic devices. Accurate knowledge of the environment is critical for the design of optimal shielding during both the cruise phase and for a habitat on Mars or the Moon. Measurements of the energy spectra of galactic cosmic rays (GCR) have been made for nearly four decades. In the last decade, models have been constructed that can predict the energy spectra of any GCR nuclei to an accuracy of better than 25%. Fresh and more accurate measurements have been made in the last year. These measurements can lead to more accurate models. Improvements in these models can be made in determining the local interstellar spectra and in predicting the level of solar modulation. It is the coupling of the two that defines a GCR model. This paper reviews two of the more widely used models and compares their predictions with new proton and helium data from the Alpha Magnetic Spectrometer (AMS), and with spectra of beryllium to iron in the ~40 to 500 MeV/n range acquired by the Advanced Composition Explorer (ACE) during the 1997-98 solar minimum. Regression equations relating the IMP-8 helium count rate to the solar modulation deceleration parameter calculated using the Climax neutron monitor rate have been developed and may lead to improvements in the predictive capacity of the models.
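
    A standard way such models tie a local interstellar spectrum (LIS) to the solar modulation level is the force-field approximation, which shifts and scales the LIS with a single deceleration parameter phi. The power-law LIS below is a placeholder, not either reviewed model's fitted spectrum:

```python
# Force-field approximation for solar modulation of a GCR proton spectrum.
# The LIS here is a crude placeholder (J ~ p^-2.7 in arbitrary units).
E0 = 938.0  # proton rest energy, MeV

def lis(e_kin):
    """Placeholder local interstellar spectrum, arbitrary units."""
    e_tot = e_kin + E0
    p2 = e_tot ** 2 - E0 ** 2  # (pc)^2 in MeV^2
    return p2 ** (-2.7 / 2.0)

def modulated(e_kin, phi):
    """Force-field: J(E) = J_LIS(E+phi) * E(E+2*E0) / ((E+phi)(E+phi+2*E0)),
    with phi in MV (numerically equal to MeV for Z=1)."""
    ei = e_kin + phi
    return lis(ei) * (e_kin * (e_kin + 2 * E0)) / (ei * (ei + 2 * E0))
```

    The deceleration parameter is what the IMP-8/Climax regressions mentioned above aim to predict; a larger phi suppresses the low-energy flux strongly while leaving the high-energy spectrum nearly untouched.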

  3. The process-based stand growth model Formix 3-Q applied in a GIS environment for growth and yield analysis in a tropical rain forest.

    PubMed

    Ditzer, T.; Glauner, R.; Förster, M.; Köhler, P.; Huth, A.

    2000-03-01

    Managing tropical rain forests is difficult because few long-term field data on forest growth and the impact of harvesting disturbance are available. Growth models may provide a valuable tool for managers of tropical forests, particularly if applied to the extended forest areas of up to 100,000 ha that typically constitute the so-called forest management units (FMUs). We used a stand growth model in a geographic information system (GIS) environment to simulate tropical rain forest growth at the FMU level. We applied the process-based rain forest growth model Formix 3-Q to the 55,000 ha Deramakot Forest Reserve (DFR) in Sabah, Malaysia. The FMU was considered to be composed of single and independent small-scale stands differing in site conditions and forest structure. Field data, which were analyzed with a GIS, comprised a terrestrial forest inventory, site and soil analyses (water, nutrients, slope), the interpretation of aerial photographs of the present vegetation and topographic maps. Different stand types were determined based on a classification of site quality (three classes), slopes (four classes), and present forest structure (four strata). The effects of site quality on tree allometry (height-diameter curve, biomass allometry, leaf area) and growth (increment size) are incorporated into Formix 3-Q. We derived allometric relations and growth factors for different site conditions from the field data. Climax forest structure at the stand level was shown to depend strongly on site conditions. Simulated successional pattern and climax structure were compared with field observations. Based on the current management plan for the DFR, harvesting scenarios were simulated for stands on different sites. The effects of harvesting guidelines on forest structure and the implications for sustainable forest management at Deramakot were analyzed. Based on the stand types and GIS analysis, we also simulated undisturbed regeneration of the logged-over forest in the DFR at
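
    The site-dependent allometry described above (height-diameter curves that shift with site quality) can be sketched with a saturating curve whose asymptote varies by site class. The coefficients are invented for illustration and are not the Formix 3-Q parameter set:

```python
# Hypothetical site-dependent height-diameter allometry: a Michaelis-Menten
# form h = h_max * d / (d + k), with the asymptotic height h_max depending
# on site class (1 = best of the three classes used above).
def tree_height(dbh_cm, site_class=1):
    """Tree height (m) from diameter at breast height (cm)."""
    h_max = {1: 55.0, 2: 45.0, 3: 35.0}[site_class]  # invented asymptotes, m
    k = 25.0  # invented half-saturation diameter, cm
    return h_max * dbh_cm / (dbh_cm + k)
```

    In a stand simulator, relations like this feed the biomass and leaf-area allometry, so mis-specifying the site class propagates directly into growth and yield estimates.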

  4. Microbial processes in fractured rock environments

    NASA Astrophysics Data System (ADS)

    Kinner, Nancy E.; Eighmy, T. Taylor; Mills, M.; Coulburn, J.; Tisa, L.

    Little is known about the types and activities of microbes in fractured rock environments, but recent studies in a variety of bedrock formations have documented the presence of a diverse array of prokaryotes (Eubacteria and Archaea) and some protists. The prokaryotes appear to live in both diffusion-dominated microfractures and larger, more conductive open fractures. Some of the prokaryotes are associated with the surfaces of the host rock and mineral precipitates, while other planktonic forms are floating/moving in the groundwater filling the fractures. Studies indicate that the surface-associated and planktonic communities are distinct, and their importance in microbially mediated processes occurring in the bedrock environment may vary, depending on the availability of electron donors/acceptors and nutrients needed by the cells. In general, abundances of microbes are low compared with other environments, because of the paucity of these substances that are transported into the deeper subsurface where most bedrock occurs, unless there is significant pollution with an electron donor. To obtain a complete picture of the microbes present and their metabolic activity, it is usually necessary to sample formation water from specific fractures (versus open boreholes), and fracture surfaces (i.e., cores). Transport of the microbes through the major fracture pathways can be rapid, but may be quite limited in the microfractures. Very low abundances of small (2-3 μm) flagellated protists, which appear to prey upon planktonic bacteria, have been found in a bedrock aquifer. Much more research is needed to expand the understanding of all microbial processes in fractured rock environments.

  5. Probing protein environment in an enzymatic process: All-electron quantum chemical analysis combined with ab initio quantum mechanical/molecular mechanical modeling of chorismate mutase

    NASA Astrophysics Data System (ADS)

    Ishida, Toyokazu

    2008-09-01

    In this study, we investigated the electronic character of protein environment in enzymatic processes by performing all-electron QM calculations based on the fragment molecular orbital (FMO) method. By introducing a new computational strategy combining all-electron QM analysis with ab initio QM/MM modeling, we investigated the details of molecular interaction energy between a reactive substrate and amino acid residues at a catalytic site. For a practical application, we selected the chorismate mutase catalyzed reaction as an example. Because the computational time required to perform all-electron QM reaction path searches was very large, we employed the ab initio QM/MM modeling technique to construct reliable reaction profiles and performed all-electron FMO calculations for the selected geometries. The main focus of the paper is to analyze the details of electrostatic stabilization, which is considered to be the major feature of enzymatic catalyses, and to clarify how the electronic structure of proteins is polarized in response to the change in electron distribution of the substrate. By performing interaction energy decomposition analysis from a quantum chemical viewpoint, we clarified the relationship between the location of amino acid residues on the protein domain and the degree of electronic polarization of each residue. In particular, in the enzymatic transition state, Arg7, Glu78, and Arg90 are highly polarized in response to the delocalized electronic character of the substrate, and as a result, a large amount of electrostatic stabilization energy is stored in the molecular interaction between the enzyme and the substrate and supplied for transition state stabilization.
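
    The bookkeeping behind such a per-residue decomposition is simple to sketch: each residue (fragment) contributes one substrate-residue interaction energy, and ranking those terms identifies the stabilizing residues. The energies below are invented placeholders, not the paper's FMO results:

```python
# Sketch of ranking per-residue (per-fragment) interaction energies, the
# quantity an FMO analysis tabulates for each enzyme residue against the
# substrate. All values are hypothetical, in kcal/mol; negative = stabilizing.
pair_energies = {"Arg7": -12.4, "Glu78": -9.8, "Arg90": -15.1,
                 "Phe57": -1.2, "Cys75": 0.6}

def top_stabilizers(energies, n=3):
    """Residues sorted from most negative (most stabilizing) interaction."""
    return sorted(energies, key=energies.get)[:n]

total_stabilization = sum(pair_energies.values())
```
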

  6. Probing protein environment in an enzymatic process: All-electron quantum chemical analysis combined with ab initio quantum mechanical/molecular mechanical modeling of chorismate mutase.

    PubMed

    Ishida, Toyokazu

    2008-09-28

    In this study, we investigated the electronic character of protein environment in enzymatic processes by performing all-electron QM calculations based on the fragment molecular orbital (FMO) method. By introducing a new computational strategy combining all-electron QM analysis with ab initio QM/MM modeling, we investigated the details of molecular interaction energy between a reactive substrate and amino acid residues at a catalytic site. For a practical application, we selected the chorismate mutase catalyzed reaction as an example. Because the computational time required to perform all-electron QM reaction path searches was very large, we employed the ab initio QM/MM modeling technique to construct reliable reaction profiles and performed all-electron FMO calculations for the selected geometries. The main focus of the paper is to analyze the details of electrostatic stabilization, which is considered to be the major feature of enzymatic catalyses, and to clarify how the electronic structure of proteins is polarized in response to the change in electron distribution of the substrate. By performing interaction energy decomposition analysis from a quantum chemical viewpoint, we clarified the relationship between the location of amino acid residues on the protein domain and the degree of electronic polarization of each residue. In particular, in the enzymatic transition state, Arg7, Glu78, and Arg90 are highly polarized in response to the delocalized electronic character of the substrate, and as a result, a large amount of electrostatic stabilization energy is stored in the molecular interaction between the enzyme and the substrate and supplied for transition state stabilization.

  7. Space environment and lunar surface processes, 2

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1982-01-01

    The top few millimeters of a surface exposed to space represents a physically and chemically active zone with properties different from those of a surface in the environment of a planetary atmosphere. To meet the need for a quantitative synthesis of the various processes contributing to the evolution of surfaces of the Moon, Mercury, the asteroids, and similar bodies (exposure to solar wind, solar flare particles, galactic cosmic rays, heating from solar radiation, and meteoroid bombardment), the MESS 2 computer program was developed. This program differs from earlier work in that the surface processes are broken down as a function of size scale and treated in three dimensions with good resolution on each scale. The results obtained apply to the development of soil near the surface and are based on lunar conditions. Parameters can be adjusted to describe asteroid regoliths and other space-related bodies.

  8. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
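
    The core group-contribution idea is that a property estimate is a sum of contributions from the functional groups present in the molecule. A minimal sketch using the classic Joback form for normal boiling point (treat the constants as illustrative of the form, not of the GC+ models this work develops):

```python
# Joback-style group-contribution estimate of normal boiling point:
# Tb = 198 + sum of group contributions (K). Contributions here are the
# commonly quoted first-order Joback values for three groups.
GROUP_TB_K = {"CH3": 23.58, "CH2": 22.88, "OH": 92.88}

def estimate_tb(groups):
    """groups: {group_name: count}. Returns estimated Tb in kelvin."""
    return 198.0 + sum(GROUP_TB_K[g] * n for g, n in groups.items())

# 1-propanol (CH3-CH2-CH2-OH): estimate ~360 K vs. ~370 K experimental.
tb_propanol = estimate_tb({"CH3": 1, "CH2": 2, "OH": 1})
```

    GC+ methods extend this scheme with connectivity-index corrections for groups missing from the table, plus uncertainty estimates on each contribution.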

  9. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, Shao-sheng R.; Allen, Christopher S.

    2009-01-01

    carried out by acquiring octave band microphone data simultaneously at ten fixed locations throughout the mockup. SPLs (Sound Pressure Levels) predicted by our SEA model match well with measurements for our CM mockup, with a more complicated shape. Additionally in FY09, background NC noise (Noise Criterion) simulation and MRT (Modified Rhyme Test) were developed and performed in the mockup to determine the maximum noise level in CM habitable volume for fair crew voice communications. Numerous demonstrations of simulated noise environment in the mockup and associated SIL (Speech Interference Level) via MRT were performed for various communities, including members from NASA and Orion prime-/sub-contractors. Also, a new HSIR (Human-Systems Integration Requirement) for limiting pre- and post-landing SIL was proposed.
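
    Per-band levels like these combine into an overall level logarithmically, not arithmetically; a minimal sketch of the arithmetic used when summarizing octave-band measurements:

```python
import math

# Overall sound pressure level from per-band levels: band energies
# (10^(L/10)) add, then the sum is converted back to decibels.
def overall_spl(band_levels_db):
    """Overall SPL (dB) from a list of band SPLs (dB)."""
    return 10.0 * math.log10(sum(10 ** (level / 10.0) for level in band_levels_db))

# Two equal 60 dB bands combine to about 63 dB, not 120 dB.
```
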

  10. Control of the aseptic processing environment.

    PubMed

    Frieben, W R

    1983-11-01

    Methods used by industry with applications to hospital pharmacy for maintaining an aseptic environment in production of sterile pharmaceutical products are discussed. A major source of product contamination is airborne microorganisms. The laminar-airflow workbench with a high-efficiency particulate air filter provides an ultraclean environment for preparation of sterile products. However, the workbench does not guarantee sterility of products and is not effective if not properly installed and maintained or if the operator uses poor aseptic technique. The laminar-airflow workbench should be tested for leaks, airflow velocity, and airflow patterns when installed, and the workbench should be checked periodically thereafter. The workbench should be placed in a cleanroom where traffic and air disturbances that might affect the laminar airflow are eliminated. A major source of airborne microbial contamination in cleanrooms is people. Personnel movement through an area and presence of personnel without lint-free, nonshedding protective garments increase the levels of microbial contaminants in an area. The transport of nonsterile products (bottles, boxes, paper products) into a cleanroom should be minimized. The cleanroom itself should be sanitized and should be immaculate. Microbial or particulate monitoring should be conducted in the cleanroom using a quantitative method, and corrective-action limits should be set. Hospital pharmacists should examine industrial sterile-processing techniques and apply them to the preparation of sterile products.

  11. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
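
    The hypercube topology mentioned above has a compact addressing scheme worth sketching: nodes carry d-bit addresses, and two nodes are linked exactly when their addresses differ in one bit, so messages can be routed by fixing differing bits one at a time:

```python
# Neighbor computation in a d-dimensional hypercube: node addresses are
# d-bit integers, and flipping any single bit yields a directly linked node.
def hypercube_neighbors(node, dim):
    """All neighbors of `node` in a dim-dimensional hypercube."""
    return [node ^ (1 << k) for k in range(dim)]

# In a 3-cube, node 0b101 (5) links to 0b100 (4), 0b111 (7), and 0b001 (1).
```

    In the Hypercluster each such node is itself a shared-memory multiprocessor, so this scheme addresses nodes, while communication within a node uses shared memory instead of message passing.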

  12. Does microbial community structure matter for predicting ecosystem function? Use of statistical models to examine relationships between the environment, community and processes

    NASA Astrophysics Data System (ADS)

    Nemergut, D.; Graham, E. B.

    2014-12-01

    Microorganisms control all major biogeochemical cycles, yet the importance of microbial community structure for ecosystem function is widely debated. Indeed, few nutrient cycling models directly account for variation in community structure, leading some researchers to speculate that this information could provide important and missing explanatory power to predict ecosystem function. However, if variation in environmental variables strongly correlates with variation in microbial community composition, then information on microbial community composition may not improve models. Here, we use a data synthesis approach to ask when and where information on the microbial community matters for predictions of ecosystem function. We collated data from approximately 100 different studies and used statistical approaches to ask if models with data on microbial community composition significantly improved models of ecosystem function based on environmental data alone. We found that only 25% of models of ecosystem processes were significantly improved with the addition of data on microbial community composition. Specifically, we found that for phylogenetically broad processes, diversity indicators yielded more significant increases in explanatory power than abundance data. Our results also demonstrate that for phylogenetically narrow processes, qPCR data on functional genes yielded higher explanatory power than for broad processes. Further, we found that all types of data on microbial community composition explained more variation in obligate processes compared to facultative processes. Overall, our results suggest that trait distributions both within communities and within individuals affect the relative importance of microbial community composition for explaining ecosystem function.
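
    The statistical question posed above is a nested-model comparison: does adding community predictors to an environment-only regression raise explained variance? A sketch on synthetic data (the data and effect sizes are invented, not drawn from the synthesized studies):

```python
import numpy as np

# Compare R^2 of an environment-only regression against a model that adds a
# community-composition predictor, on synthetic data where both matter.
rng = np.random.default_rng(0)
n = 200
env = rng.normal(size=(n, 1))        # environmental predictor
community = rng.normal(size=(n, 1))  # community-composition axis
process = 2.0 * env[:, 0] + 1.0 * community[:, 0] + rng.normal(scale=0.5, size=n)

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

r2_env = r_squared(env, process)
r2_full = r_squared(np.column_stack([env, community]), process)
```

    In practice the gain would be judged with an F-test or information criterion rather than raw R^2; the synthesis above found a significant gain in only about a quarter of cases.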

  13. Near-field environment/processes working group summary

    SciTech Connect

    Murphy, W.M.

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.

  14. Processing Conditions, Rice Properties, Health and Environment

    PubMed Central

    Roy, Poritosh; Orikasa, Takahiro; Okadome, Hiroshi; Nakamura, Nobutaka; Shiina, Takeo

    2011-01-01

    Rice is the staple food for nearly two-thirds of the world’s population. The food components and environmental load of rice depend on the rice form, which results from different processing conditions. Brown rice (BR), germinated brown rice (GBR) and partially-milled rice (PMR) contain more health-beneficial food components than well-milled rice (WMR). Although the arsenic concentration in cooked rice depends on the cooking methods, parboiled rice (PBR) seems to be relatively prone to arsenic contamination compared to untreated rice, if contaminated water is used for parboiling and cooking. A change in consumption patterns from PBR to untreated rice (non-parboiled), and WMR to PMR or BR may conserve about 43–54 million tons of rice and reduce the risk from arsenic contamination in the arsenic prone area. This study also reveals that a change in rice consumption patterns not only supplies more food components but also reduces environmental loads. A switch in production and consumption patterns would improve food security where food grains are scarce, provide more health-beneficial food components, possibly prevent some diseases, and ease the burden on the Earth. However, motivation, awareness of environmental and health effects, and even a nominal incentive may be required for such a switch, which may help in building a sustainable society. PMID:21776212

  15. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  16. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  17. Model for a Healthy Work Environment.

    PubMed

    Blevins, Jamie

    2016-01-01

    The Healthy Work Environment (HWE) Model, considered a model of standards of professional behaviors, was created to help foster an environment that is happy, healthy, realistic, and feasible. The model focuses on areas of PEOPLE and PRACTICE, where each letter of these words identifies core, professional qualities and behaviors to foster an environment amenable and conducive to accountability for one's behavior and action. Each of these characteristics is supported from a Christian, biblical perspective. The HWE Model provides a mental and physical checklist of what is important in creating and sustaining a healthy work environment in education and practice.

  18. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    To simulate and optimize the microwave sintering of silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive, graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume, and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process.
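
    The two plane-wave ingredients named above, reflection at each material interface and absorption in the bulk, have simple normal-incidence forms that are easy to sketch (material values below are illustrative):

```python
import math

# Normal-incidence power reflectance at an interface between media with
# refractive indices n1 and n2, and Beer-Lambert attenuation through a layer.
def interface_reflectance(n1, n2):
    """Fraction of incident power reflected at the n1/n2 interface."""
    r = (n1 - n2) / (n1 + n2)
    return r * r

def transmitted_fraction(n1, n2, alpha_per_m, thickness_m):
    """Power fraction crossing the interface and surviving absorption
    (attenuation coefficient alpha_per_m) over thickness_m."""
    return (1.0 - interface_reflectance(n1, n2)) * math.exp(-alpha_per_m * thickness_m)
```

    Chaining this calculation over the insulation, susceptor, alumina, and part layers gives the radial power deposition profile that drives the heat-transfer part of the model.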

  19. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.
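
    The shape of such an empirical density law can be sketched as a relaxation from the unfoamed density toward the final foamed density, with a temperature-accelerated rate. All parameters below are invented, not the SIERRA/ARIA model's calibration:

```python
import math

# Hypothetical time- and temperature-dependent foam density: exponential
# relaxation from rho0 (unfoamed) to rho_f (fully foamed), with an
# Arrhenius-accelerated rate constant. All values are illustrative.
def foam_density(t_s, temp_k, rho0=1000.0, rho_f=50.0,
                 k0=5.0e3, ea_over_r=5000.0):
    """Density (kg/m^3) at time t_s (s) and temperature temp_k (K)."""
    k = k0 * math.exp(-ea_over_r / temp_k)
    return rho_f + (rho0 - rho_f) * math.exp(-k * t_s)
```

    In the continuum model, the local rate of density decrease is what generates the velocity field that pushes the level-set free surface outward.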

  20. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, of local network clients, of workstations running user interfaces and of data exchange and synchronization tools. PMID:16779049
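
    A minimal version of the synchronization step such a platform needs can be sketched as a version-based merge of two copies of a record set. Real EHR synchronization must also enforce access rights and resolve true conflicts; this sketch ignores both:

```python
# Version-based merge of two replicas: for each record id, keep whichever
# copy carries the higher version number. Records are {id: (version, payload)}.
def synchronize(local, remote):
    """Return the merged record set of two replicas."""
    merged = dict(local)
    for rid, (version, payload) in remote.items():
        if rid not in merged or version > merged[rid][0]:
            merged[rid] = (version, payload)
    return merged
```

    Exchanging only records whose versions differ is also how such a scheme keeps the amount of data sent over the network small.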

  1. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin-film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. Because H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment, the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. Compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
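
    The balance the simplified reaction set must preserve can be caricatured with a single linear production-consumption equation for H2O2; the rate values below are illustrative, not the model's:

```python
import math

# Two-term caricature of the radiolysis scheme: H2O2 is produced at a constant
# radiolytic rate and consumed by a first-order sink,
# d[H2O2]/dt = production_rate - decay_k * [H2O2].
def h2o2_steady_state(production_rate, decay_k):
    """Steady-state concentration of the linear ODE above."""
    return production_rate / decay_k

def h2o2_at(t, production_rate, decay_k, c0=0.0):
    """Analytic solution of the same ODE from initial concentration c0."""
    css = production_rate / decay_k
    return css + (c0 - css) * math.exp(-decay_k * t)
```

    Reaction-set reduction asks which of the ~100 reactions actually move this balance; the result quoted above is that 30-40 suffice at the 10^-5 level.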

  2. Students' Mental Models of the Environment

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-01-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively…

  3. Optimal mutation rates in dynamic environments: The eigen model

    NASA Astrophysics Data System (ADS)

    Ancliff, Mark; Park, Jeong-Man

    2011-03-01

    We consider the Eigen quasispecies model with a dynamic environment. For an environment with sharp-peak fitness in which the most-fit sequence moves by k spin-flips each period T, we find an asymptotic stationary state in which the quasispecies population changes regularly according to the regular environmental change. From this stationary state we estimate the maximum and the minimum mutation rates for a quasispecies to survive under the changing environment and calculate the optimum mutation rate that maximizes the population growth. Interestingly, we find that the optimum mutation rate in the Eigen model is lower than that in the Crow-Kimura model, and at their optimum mutation rates the corresponding mean fitness in the Eigen model is lower than that in the Crow-Kimura model, suggesting that the mutation process, which occurs in parallel with the replication process as in the Crow-Kimura model, gives an adaptive advantage under a changing environment.
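
    The survival bounds estimated above generalize the classic static-environment error threshold, which is short enough to sketch exactly: with a sharp fitness peak of height A and genome length L, the master sequence persists only while its error-free replication probability Q = (1-mu)^L satisfies Q*A > 1:

```python
# Static sharp-peak error threshold of the Eigen model (the baseline the
# dynamic-environment analysis above generalizes).
def master_survives(mu, length, peak_fitness):
    """True if the master sequence is maintained: (1-mu)^L * A > 1."""
    return (1.0 - mu) ** length * peak_fitness > 1.0

def error_threshold(length, peak_fitness):
    """Largest per-site mutation rate compatible with survival:
    mu* = 1 - A**(-1/L), approximately ln(A)/L for small mu*."""
    return 1.0 - peak_fitness ** (-1.0 / length)
```

    Longer genomes tolerate proportionally less mutation; in the dynamic case a lower bound on mu also appears, since a non-mutating population cannot track the moving peak.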

  4. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, S. Reynold; Allen, Chris

    2009-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.

  5. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  6. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  7. Thermal modeling environment for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos

    2010-07-01

    In a previous study we had presented a summary of the TMT Aero-Thermal modeling effort to support thermal seeing and dynamic loading estimates. In this paper a summary of the current status of Computational Fluid Dynamics (CFD) simulations for TMT is presented, with the focus shifted in particular towards the synergy between CFD and the TMT Finite Element Analysis (FEA) structural and optical models, so that the thermal and consequent optical deformations of the telescope can be calculated. To minimize thermal deformations and mirror seeing the TMT enclosure will be air conditioned during day-time to the expected night-time ambient temperature. Transient simulations with closed shutter were performed to investigate the optimum cooling configuration and power requirements for the standard telescope parking position. A complete model of the observatory on Mauna Kea was used to calculate night-time air temperature inside the enclosure (along with velocity and pressure) for a matrix of given telescope orientations and enclosure configurations. Generated records of temperature variations inside the air volume of the optical paths are also fed into the TMT thermal seeing model. The temperature and heat transfer coefficient outputs from both models are used as input surface boundary conditions in the telescope structure and optics FEA models. The results are parameterized so that sequential records several days long can be generated and used by the FEA model to estimate the observing spatial and temporal temperature range of the structure and optics.
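    The CFD-to-FEA handoff described above amounts to applying film temperatures and heat transfer coefficients from the flow solution as convective boundary conditions on the structural model. A minimal lumped-capacitance sketch of that coupling for a single structural node (all numbers illustrative, not TMT values):

```python
def cool_node(T0, T_air, h, area, mass, cp, dt, steps):
    """Explicit update of one structural node with a convective film boundary:
    dT/dt = h*A*(T_air - T) / (m*cp), with h and T_air taken from CFD output."""
    T = T0
    for _ in range(steps):
        T += h * area * (T_air - T) / (mass * cp) * dt
    return T

# a segment conditioned toward the expected night-time ambient during the day
T_end = cool_node(T0=288.0, T_air=275.0, h=5.0, area=2.0,
                  mass=50.0, cp=900.0, dt=60.0, steps=600)
```

A full FEA model does this per surface node, with h and T_air varying over the parameterized records mentioned in the abstract.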

  8. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  9. Propagation modelling in microcellular environments

    NASA Astrophysics Data System (ADS)

    Sharples, P. A.; Mehler, M. J.

    This paper describes a microcellular model, based on ray tracing techniques. Ray tracing is a stationary phase technique which relies on the quasi-optical properties of radio waves in regions where any obstacles are large in terms of a wavelength. The model described is a very versatile implementation which can be used to study both indoor and outdoor propagation phenomena for a number of different types of service. In its fullest form it requires input data of a sophistication that is not commercially available. However, this allows the model to be used to assess the implications in terms of the achievable accuracy when using commercial building databases.
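    The ray-tracing approach described here can be illustrated with the image method: each reflecting surface contributes a ray from a mirrored transmitter, and the received field is the coherent sum of the ray contributions. The sketch below is a 2-D toy with a single vertical wall, an assumed 2.4 GHz carrier, and a fixed reflection coefficient; a real microcellular model traces many more interactions against a building database.

```python
import cmath, math

FREQ = 2.4e9                      # assumed carrier frequency, Hz
LAM = 3e8 / FREQ                  # wavelength, m
K = 2 * math.pi / LAM             # free-space wavenumber

def field(tx, rx, walls, refl_coeff=-0.7):
    """Image-method ray sum in 2-D: the direct ray plus one specular reflection
    per vertical wall at x = x_wall. Amplitudes fall as 1/distance."""
    paths = [(1.0, math.dist(tx, rx))]                 # line-of-sight ray
    for x_wall in walls:
        image = (2 * x_wall - tx[0], tx[1])            # mirror the transmitter
        paths.append((refl_coeff, math.dist(image, rx)))
    return abs(sum(a / d * cmath.exp(-1j * K * d) for a, d in paths))
```

Because the rays are summed with phase, moving the receiver a fraction of a wavelength changes the result, which is exactly the multipath fading behavior such models are built to predict.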

  10. Optimal mutation rates in dynamic environments: The Eigen model

    NASA Astrophysics Data System (ADS)

    Ancliff, Mark; Park, Jeong-Man

    2010-08-01

    We consider the Eigen quasispecies model with a dynamic environment. For an environment with sharp-peak fitness in which the most-fit sequence moves by k spin-flips each period T, we find an asymptotic stationary state in which the quasispecies population changes regularly according to the regular environmental change. From this stationary state we estimate the maximum and the minimum mutation rates for a quasispecies to survive under the changing environment and calculate the optimum mutation rate that maximizes the population growth. Interestingly, we find that the optimum mutation rate in the Eigen model is lower than that in the Crow-Kimura model, and at their optimum mutation rates the corresponding mean fitness in the Eigen model is lower than that in the Crow-Kimura model, suggesting that the mutation process, which occurs in parallel to the replication process as in the Crow-Kimura model, gives an adaptive advantage under a changing environment.

  11. Combustion Processes in the Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Huggett, Clayton

    1969-01-01

    The aerospace environment introduces new and enhanced fire hazards because the special atmosphere employed may increase the frequency and intensity of fires, because the confinement associated with aerospace systems adversely affects the dynamics of fire development and control, and because the hostile external environments limit fire control and rescue operations. Oxygen enriched atmospheres contribute to the fire hazard in aerospace systems by extending the list of combustible fuels, increasing the probability of ignition, and increasing the rates of fire spread and energy release. A system for classifying atmospheres according to the degree of fire hazard, based on the heat capacity of the atmosphere per mole of oxygen, is suggested. A brief exploration of the dynamics of chamber fires shows that such fires will exhibit an exponential growth rate and may grow to dangerous size in a very short time. Relatively small quantities of fuel and oxygen can produce a catastrophic fire in a closed chamber.
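    The classification index suggested above — heat capacity of the atmosphere per mole of oxygen — can be computed directly from mole fractions: a smaller value means less diluent heat sink per unit of oxidizer and hence a more hazardous atmosphere. The molar heat capacities below are standard textbook constants, not values from the paper.

```python
CP = {"O2": 29.4, "N2": 29.1, "He": 20.8}   # molar heat capacities, J/(mol K), approx.

def heat_capacity_per_mole_o2(fractions):
    """Heat capacity of the atmosphere per mole of oxygen: sum of mole-fraction-
    weighted heat capacities, divided by the oxygen mole fraction."""
    total = sum(x * CP[gas] for gas, x in fractions.items())
    return total / fractions["O2"]

air = {"O2": 0.21, "N2": 0.79}
pure_o2 = {"O2": 1.0}
```

By this index, sea-level air scores far higher (safer) than a pure-oxygen cabin atmosphere, consistent with the hazard ranking the abstract describes.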

  12. Improvement of pre- and post-processing environments of the dynamic two-dimensional reservoir model CE-QUAL-W2 based on GIS.

    PubMed

    Ha, S R; Bae, G J; Park, D H; Cho, J H

    2003-01-01

    An Environmental Information System (EIS) coupled with a Geographic Information System (GIS) and water quality models is developed to improve the pre- and post-data-processing functions of CE-QUAL-W2. Since the accuracy of the geometric data describing a diverse water body has a great effect on water quality variables such as velocity, kinetic reactions, and the horizontal and vertical momentum, preparing the bathymetry information has been considered a difficult issue for modellers who intend to use the model. For identifying Cross Section and Profile Information (CSPI), which precisely captures the hydraulic features and geographical configuration of a waterway, an automated CSPI extraction program has been developed using the Avenue language of the PC Arc/view package. The program consists of three major steps: (1) obtaining the digital depth map of a waterway using GIS techniques; (2) creating a CSPI data set of segments in each branch for CE-QUAL-W2 bathymetry input; (3) selecting the optimal set of bathymetry input by which the calculated water volume meets the observed volume of the water body. Through these approaches, it is clear that the model simulation results in terms of water quality as well as reservoir hydraulics rely upon the accuracy of the bathymetry information.
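    Step (3) — choosing bathymetry input so that the computed water volume matches the observed volume — can be sketched as a uniform scaling of cell widths. The data layout below (segments with a length and a list of layer height/width pairs) is a hypothetical simplification, not CE-QUAL-W2's actual input format.

```python
def reservoir_volume(segments):
    """Volume implied by a segmented bathymetry: for each longitudinal segment,
    sum layer_height * layer_width over its vertical layers, times segment length."""
    return sum(seg["length"] * sum(h * w for h, w in seg["layers"])
               for seg in segments)

def calibrate_widths(segments, observed_volume):
    """Uniformly scale cell widths so the modeled volume matches the observed one."""
    factor = observed_volume / reservoir_volume(segments)
    return [{**seg, "layers": [(h, w * factor) for h, w in seg["layers"]]}
            for seg in segments]

segments = [
    {"length": 500.0, "layers": [(2.0, 120.0), (2.0, 90.0), (2.0, 40.0)]},
    {"length": 400.0, "layers": [(2.0, 100.0), (2.0, 60.0)]},
]
calibrated = calibrate_widths(segments, observed_volume=4.0e5)
```

In practice the selection is done per elevation band against a stage-volume curve rather than with a single global factor, but the volume-matching criterion is the same.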

  13. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  14. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  15. Sanitation in the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the past, most regulations regarding egg processing have been concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...

  16. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
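    The three components enumerated above translate directly into three incident-flux terms. The sketch below computes them for an Earth-facing surface; the solar constant is the commonly used nominal value, and the albedo/OLR design points are illustrative placeholders, not ERBE-derived numbers from the report.

```python
SOLAR = 1367.0   # nominal direct solar irradiance at 1 AU, W/m^2

def environment_fluxes(albedo, olr, earth_view_factor=1.0):
    """The three STEM components (W/m^2): direct solar, Earth-reflected
    shortwave (albedo term), and Earth-emitted outgoing longwave radiation."""
    reflected = albedo * SOLAR * earth_view_factor
    longwave = olr * earth_view_factor
    return SOLAR, reflected, longwave

# illustrative hot-case and cold-case design points
hot = environment_fluxes(albedo=0.30, olr=260.0)
cold = environment_fluxes(albedo=0.20, olr=220.0)
```

Selecting paired hot/cold design points like these, rather than independent extremes, is the kind of guidance the companion report addresses.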

  17. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  18. Process Architecture in a Multimodel Environment

    DTIC Science & Technology

    2008-03-01


  19. Rock fracture processes in chemically reactive environments

    NASA Astrophysics Data System (ADS)

    Eichhubl, P.

    2015-12-01

    Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interactions associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip allowing fracture propagation under subcritical stress loading conditions. We are evaluating effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the hostrock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed

  20. Float-zone processing in a weightless environment

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.

    1976-01-01

    Results are reported of investigations to: (1) test the validity of analyses that set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near-weightless environment; (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow-visualization techniques applied to a dimensionally scaled model of the Si melt; (3) revise the estimates of the economic impact of space-produced Si crystal by the float-zone method on the U.S. electronics industry; and (4) devise a rational plan for future work related to crystal growth phenomena wherein low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.

  1. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the research and development of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning, or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or, in the case of telerobots, as interfaces between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with the changes in the environment and to allow exploring entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine and update or generate a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes: Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results shows great promise for dealing with noisy input data. The performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
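    The Void/Full/Unknown voxel scheme described here can be sketched as a sparse occupancy grid updated by ray tracing from the sensor to each range reading: cells along the ray become Void, the hit cell becomes Full, and everything else stays Unknown. This is a minimal illustration, not the paper's algorithm (which also handles noise and positional uncertainty).

```python
UNKNOWN, VOID, FULL = "Unknown", "Void", "Full"

class VoxelGrid:
    """Sparse 3-D voxel grid with the three attributes from the text."""

    def __init__(self, resolution):
        self.res = resolution
        self.cells = {}                          # voxel index triple -> state

    def _index(self, p):
        return tuple(int(c // self.res) for c in p)

    def integrate(self, sensor, hit, samples=100):
        """Fold one range measurement into the model."""
        for i in range(samples):                 # free space along the ray
            t = i / samples
            p = tuple(s + t * (h - s) for s, h in zip(sensor, hit))
            self.cells[self._index(p)] = VOID
        self.cells[self._index(hit)] = FULL      # occupied at the range reading

    def state(self, p):
        return self.cells.get(self._index(p), UNKNOWN)

grid = VoxelGrid(resolution=0.1)
grid.integrate(sensor=(0.0, 0.0, 0.0), hit=(1.0, 0.0, 0.0))
```

Repeating `integrate` from multiple sensing positions is what gradually converts Unknown volume into Void or Full, refining the environment model as the text describes.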

  2. MATLAB/Simulink analytic radar modeling environment

    NASA Astrophysics Data System (ADS)

    Esken, Bruce L.; Clayton, Brian L.

    2001-09-01

    Analytic radar models are simulations based on abstract representations of the radar, the RF environment through which radar signals propagate, and the reflections produced by targets, clutter, and multipath. These models have traditionally been developed in FORTRAN and have evolved over the last 20 years into efficient and well-accepted codes. However, current models are limited in two primary areas. First, by the nature of algorithm-based analytical models, they can be difficult to understand by non-programmers and equally difficult to modify or extend. Second, there is strong interest in re-using these models to support higher-level weapon system and mission level simulations. To address these issues, a model development approach has been demonstrated which utilizes the MATLAB/Simulink graphical development environment. Because the MATLAB/Simulink environment graphically represents model algorithms - thus providing visibility into the model - algorithms can be easily analyzed and modified by engineers and analysts with limited software skills. In addition, software tools have been created that provide for the automatic code generation of C++ objects. These objects are created with well-defined interfaces enabling them to be used by modeling architectures external to the MATLAB/Simulink environment. The approach utilized is generic and can be extended to other engineering fields.

  3. Processing Motion Signals in Complex Environments

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti

    2000-01-01

    Motion information is critical for human locomotion and scene segmentation. Currently we have excellent neurophysiological models that are able to predict human detection and discrimination of local signals. Local motion signals are insufficient by themselves to guide human locomotion and to provide information about depth, object boundaries and surface structure. My research is aimed at understanding the mechanisms underlying the combination of motion signals across space and time. A target moving on an extended trajectory amidst noise dots in Brownian motion is much more detectable than the sum of signals generated by independent motion energy units responding to the trajectory segments. This result suggests that facilitation occurs between motion units tuned to similar directions, lying along the trajectory path. We investigated whether the interaction between local motion units along the motion direction is mediated by contrast. One possibility is that contrast-driven signals from motion units early in the trajectory sequence are added to signals in subsequent units. If this were the case, then units later in the sequence would have a larger signal than those earlier in the sequence. To test this possibility, we compared contrast discrimination thresholds for the first and third patches of a triplet of sequentially presented Gabor patches, aligned along the motion direction. According to this simple additive model, contrast increment thresholds for the third patch should be higher than thresholds for the first patch. The lack of a measurable effect on contrast thresholds for these various manipulations suggests that the pooling of signals along a trajectory is not mediated by contrast-driven signals. Instead, these results are consistent with models that propose that the facilitation of trajectory signals is achieved by a second-level network that chooses the strongest local motion signals and combines them if they occur in a spatio-temporal sequence consistent…

  4. A Formal Environment Model for Multi-Agent Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Paulo Salem; de Melo, Ana C. V.

    Multi-agent systems are employed to model complex systems which can be decomposed into several interacting pieces called agents. In such systems, agents exist, evolve and interact within an environment. In this paper we present a model for the specification of such environments. This Environment Model for Multi-Agent Systems (EMMAS), as we call it, defines both structural and dynamic aspects of environments. Structurally, EMMAS connects agents by a social network, in which the link between agents is specified as the capability that one agent has to act upon another. Dynamically, EMMAS provides operations that can be composed together in order to create a number of different environmental situations and to respond appropriately to agents' actions. These features are founded on a mathematical model that we provide and that defines rigorously what constitutes an environment. Formality is achieved by employing the π-calculus process algebra in order to give the semantics of this model. This allows, in particular, a simple characterization of the evolution of the environment structure. Moreover, owing to this formal semantics, it is possible to perform formal analyses on environments thus described. For the sake of illustration, a concrete example of environment specification using EMMAS is also given.
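    The structural side of EMMAS — a social network in which a directed link records one agent's capability to act upon another — can be made concrete with a small toy class. This is only a flavor of the structural model; the paper's actual semantics is given in the π-calculus, which this sketch does not attempt to reproduce.

```python
class Environment:
    """Toy EMMAS-flavored environment: agents in a directed social network where
    an edge (actor, target) records the capability of actor to act upon target."""

    def __init__(self):
        self.agents = set()
        self.edges = set()

    def add_agent(self, name):
        self.agents.add(name)

    def connect(self, actor, target):
        """Grant actor the capability to act upon target."""
        self.edges.add((actor, target))

    def act(self, actor, target, action):
        """Perform an action; only allowed along an existing capability edge."""
        if (actor, target) not in self.edges:
            raise PermissionError(f"{actor} has no capability to act upon {target}")
        return f"{actor} {action} {target}"
```

The dynamic side of EMMAS — composing operations to build environmental situations and respond to agents' actions — would correspond to sequencing and combining `connect`/`act` style operations over such a structure.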

  5. The Educational Process in the Emerging Information Society: Conditions for the Reversal of the Linear Model of Education and the Development of an Open Type Hybrid Learning Environment.

    ERIC Educational Resources Information Center

    Anastasiades, Panagiotes S.; Retalis, Simos

    The introduction of communications and information technologies in the area of education tends to create a totally different environment, which is marked by a change of the teacher's role and a transformation of the basic components that make up the meaning and content of the learning procedure as a whole. It could be said that, despite any…

  6. Liberty High School Transition Project: Model Process for Assimilating School, Community, Business, Government and Service Groups of the Least Restrictive Environment for Nondisabled and Disabled.

    ERIC Educational Resources Information Center

    Grimes, Michael K.

    The panel presentation traces the development of and describes the operation of a Brentwood (California) project to prepare approximately 75 severely disabled individuals, ages 12-22, to function in the least restrictive recreation/leisure, vocational, and general community environments. Transition Steering Committee developed such project…

  7. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in a machine sensible form. The purpose of this report is to complete the documentation, finally, for AE-8 so that users can understand its construction and see the comparison of the model with the new data used, as well as with the AE-4 model.

  8. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  9. Advanced modeling environment for developing and testing FES control systems.

    PubMed

    Davoodi, R; Brown, I E; Loeb, G E

    2003-01-01

    Realistic models of neuromusculoskeletal systems can provide a safe and convenient environment for the design and evaluation of controllers for functional electrical stimulation (FES) prior to clinical trials. We have developed a set of integrated musculoskeletal modeling tools to facilitate the model building process. Simulink models of musculoskeletal systems are created using two software packages developed in our laboratory, Musculoskeletal Modeling in Simulink (MMS) and virtual muscle, in addition to one software package available commercially, SIMM (Musculographics Inc., USA). MMS converts anatomically accurate musculoskeletal models generated by SIMM into Simulink(R) blocks. It also removes run-time constraints on kinetic simulations in SIMM, and allows the development of complex musculoskeletal models without writing a line of code. Virtual muscle builds realistic Simulink models of muscles responding to either natural recruitment or FES. Models of sensorimotor control systems can be developed using various Matlab (Mathworks Inc., USA) toolboxes and integrated easily with these musculoskeletal blocks in the graphical environment of Simulink.

  10. The national operational environment model (NOEM)

    NASA Astrophysics Data System (ADS)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

    The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting the ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying possible pressure (leverage) points that can help the Commander resolve forecasted instabilities, and by ranking the sensitivities for each leverage point and response. The NOEM can be used to assist decision makers, analysts and researchers in understanding the inner workings of a region or nation state and the consequences of implementing specific policies, and it provides the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of completed modules (e.g. economic, security and social well-being pieces such as critical infrastructure), along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment: primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in pursuit of the eventual integration and balancing of populace needs/demands within their respective operational environment against the capacity to meet those demands. In this paper we provide an overview of the NOEM, the need for it, and a description of its main components.

  11. The dynamic radiation environment assimilation model (DREAM)

    SciTech Connect

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L; Chen, Yue; Henderson, Michael G; Friedel, Reiner H

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
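
The data-assimilation technique named here, Kalman filtering, blends a physics model's prediction with noisy observations each cycle. The scalar sketch below illustrates the general method only; the decay factor and noise variances are assumed for illustration and are not values from the DREAM system:

```python
import random

def kalman_step(x, P, z, F=0.97, Q=0.02, H=1.0, R=0.25):
    """One predict/update cycle of a scalar Kalman filter.
    F models slow electron-flux decay; Q and R are assumed process and
    measurement noise variances (illustrative, not DREAM's)."""
    # Predict: propagate state estimate and its variance through the model
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: weight the observation z by the Kalman gain
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

random.seed(0)
truth, x, P = 10.0, 0.0, 1.0
for _ in range(100):
    truth *= 0.97                        # true flux decays over time
    z = truth + random.gauss(0.0, 0.5)   # noisy satellite measurement
    x, P = kalman_step(x, P, z)
```

The estimate converges toward the true decaying flux even though it starts far from it, which is the property that lets an assimilation model combine imperfect physics with imperfect data.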

  12. Biofilms in the poultry production and processing environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The chapter conveys the importance of biofilm study in the environment of the poultry production and processing industries. Implications for food safety and security are established for sites of occurrences and causes of biofilm formation in poultry environments. Regulations and testing methods th...

  13. Human Information Processing in the Dynamic Environment (HIPDE)

    DTIC Science & Technology

    2008-01-01

    [Record text consists of report documentation page fragments; no abstract available.] Report AFRL-RH-WP-TR-2008-0008, Human Information Processing in the Dynamic Environment (HIPDE). Authors: Richard A. McKinley, Kathy L. Fullerton, Lloyd D. Tripp, Jr., Chuck Goodyear, Jacob Loeffelholz, Robert L. Esken. Contract F41624-97-D-6004.

  14. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios.

  15. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factors studies and the design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE(TM) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  16. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through quantitative description of the complicated pollution processes of a whole watershed system and its parts, such a model can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied in China and abroad, focusing on models of pollutant loading (GWLF and PLOAD), models of water quality in receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loading and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced their structures, principles, and main characteristics, as well as the limitations in their practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trends and application prospects of watershed water environment pollution models were discussed.
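
The simplest members of the pollutant-loading family estimate an annual load as the sum, over land-use classes, of an export coefficient times the class area (the approach used by export-coefficient tools such as PLOAD). The coefficients and areas below are purely illustrative:

```python
def annual_load(land_use):
    """Annual pollutant load (kg/yr) from export coefficients.
    land_use maps class name -> (export coefficient in kg/ha/yr, area in ha)."""
    return sum(coeff * area for coeff, area in land_use.values())

# Hypothetical watershed; coefficients and areas are illustrative only
watershed = {
    "cropland": (2.5, 400.0),    # kg/ha/yr, ha
    "urban":    (1.2, 150.0),
    "forest":   (0.2, 1000.0),
}
total = annual_load(watershed)   # 2.5*400 + 1.2*150 + 0.2*1000 = 1380.0 kg/yr
```

Models such as HSPF or SWAT replace the constant coefficients with process-based simulation, but the accounting structure is the same.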

  17. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  18. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    NASA Astrophysics Data System (ADS)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. The model is based on the continuous-improvement Plan-Do-Check-Act cycle and integrates environmental, risk-prevention and ethical aspects, as well as research, development and innovation (R&D&I) project management, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21 and 166002.

  19. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  20. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  1. Design of Training Systems Utility Assessment. The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment

    DTIC Science & Technology

    1976-05-01

    [Record text consists of report front-matter fragments; no abstract available.] Training Analysis and Evaluation Group, TAEG Report No. 33: DOTS Utility Assessment. The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment. Alfred F. Smode, Ph.D., Director, Training Analysis & Evaluation Group.

  2. THE IMPORTANCE OF CONCURRENT MONITORING AND MODELING FOR UNDERSTANDING MERCURY EXPOSURE IN THE ENVIRONMENT

    EPA Science Inventory

    Understanding the cycling processes governing mercury exposure in the environment requires sufficient process-based modeling and monitoring data. Monitoring provides ambient concentration data for specific sample times and locations. Modeling provides a tool for investigating the...

  3. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  4. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done: actual lived work. Multitasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  5. Models of Cognition in Distributed Learning Environments

    DTIC Science & Technology

    2010-07-13

    [Record text consists of report documentation page fragments; no abstract available.] Models of Cognition in Distributed Learning Environments. Performing organization: Institute for Defense Analyses, 4850 Mark Center Dr., Alexandria, VA 22311.

  6. A model environment for outer zone electrons

    NASA Technical Reports Server (NTRS)

    Singley, G. W.; Vette, J. I.

    1972-01-01

    A brief morphology of outer zone electrons is given to illustrate the nature of the phenomena that we are attempting to model. This is followed by a discussion of the data processing that was done with the various data received from the experimenters before incorporating it into the data base from which this model was ultimately derived. The details of the derivation are given, and several comparisons of the final model with the various experimental measurements are presented.

  7. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; Neergaard Parker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free-field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose-rate-dependent conductivity processes needed to evaluate radiation dose and internal (bulk) charging in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft, in a solar near-polar orbit, provide the particle data over a range of heliospheric latitudes used to derive radiation and charging environments for both the high-inclination 0.5 AU Solar Polar Imager mission and 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  8. Modeling Primary Atomization Processes

    DTIC Science & Technology

    2007-11-02


  9. Aerospace Materials Process Modelling

    DTIC Science & Technology

    1988-08-01


  10. Observations of chemical processing in the circumstellar environment

    NASA Technical Reports Server (NTRS)

    Mundy, L. G.; McMullin, J. P.; Blake, G. A.

    1995-01-01

    High resolution interferometer and single-dish observations of young, deeply embedded stellar systems reveal a complex chemistry in the circumstellar environments of low to intermediate mass stars. Depletions of gas-phase molecules, grain mantle evaporation, and shock interactions actively drive chemical processes in different regions around young stars. We present results for two systems, IRAS 05338-0624 and NGC 1333 IRAS 4, to illustrate the behavior found and to examine the physical processes at work.

  11. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  12. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  13. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-07

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.

  14. Pupils' Problem-Solving Processes in a Complex Computerized Learning Environment.

    ERIC Educational Resources Information Center

    Suomala, Jyrki; Alajaaski, Jarkko

    2002-01-01

    Describes a study that examined fifth-grade Finnish pupils' problem-solving processes in a LEGO/Logo technology-based learning environment. Results indicate that learning model and gender account for group differences in problem solving processes, and are interpreted as supporting the validity of discovery learning. (Author/LRW)

  15. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  16. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  17. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  18. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
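
The point kinetics equations mentioned above couple neutron density to delayed-neutron precursor concentration. A one-delayed-group version, written here in plain Python rather than Simulink and with illustrative parameter values, looks like this:

```python
# Point kinetics with one delayed-neutron group:
#   dn/dt = ((rho - beta)/Lam) * n + lam * C
#   dC/dt = (beta/Lam) * n - lam * C
# Parameter values are illustrative, not from any particular reactor.
beta, Lam, lam = 0.0065, 1e-4, 0.08   # delayed fraction, generation time (s), decay const (1/s)
rho = 0.001                            # step reactivity insertion (rho < beta: delayed-critical)

n, C = 1.0, beta / (Lam * lam)         # start from equilibrium (dC/dt = 0)
dt = 1e-5                              # forward-Euler step, well below the prompt time scale
for _ in range(200000):                # simulate 2 s
    dn = ((rho - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    n += dt * dn
    C += dt * dC
```

Because rho < beta, the neutron density shows the classic prompt jump (to roughly beta/(beta - rho) of its initial value) followed by a slow rise on the delayed-neutron time scale; in Simulink the same equations become integrator and gain blocks.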

  19. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  20. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  1. Three-dimensional environment models from airborne laser radar data

    NASA Astrophysics Data System (ADS)

    Soderman, Ulf; Ahlberg, Simon; Elmqvist, Magnus; Persson, Asa

    2004-09-01

    Detailed 3D environment models for visualization and computer-based analyses are important in many defence and homeland security applications, e.g. crisis management, mission planning and rehearsal, and damage assessment. The high-resolution data from airborne laser radar systems for 3D sensing provide an excellent source of data for obtaining the information needed for many of these models. To utilise the 3D data provided by laser radar systems, however, efficient methods for data processing and environment model construction need to be developed. In this paper we present some results on the development of laser data processing methods, including methods for data classification, bare earth extraction, 3D reconstruction of buildings, and identification of single trees with estimation of their position, height, canopy size and species. We also show how the results can be used for the construction of detailed 3D environment models for military modelling and simulation applications. The methods use data from discrete-return airborne laser radar systems and digital cameras.
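
Bare earth extraction of the kind mentioned can be illustrated with a deliberately crude grid filter: keep only returns close to the lowest return in each ground cell. This is a stand-in for the authors' classification methods, not a reimplementation of them; the cell size and height tolerance are hypothetical:

```python
import math
from collections import defaultdict

def bare_earth_grid(points, cell=2.0, tol=0.5):
    """Crude bare-earth filter for (x, y, z) laser returns: keep points
    within `tol` metres of the lowest return in their grid cell.
    Cell size and tolerance are illustrative assumptions."""
    lowest = defaultdict(lambda: math.inf)
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        lowest[key] = min(lowest[key], z)          # lowest return per cell
    return [(x, y, z) for x, y, z in points
            if z - lowest[(int(x // cell), int(y // cell))] <= tol]

# Flat ground near z = 0 plus one 10 m "building" return
pts = [(0.5, 0.5, 0.0), (1.0, 1.0, 0.1), (1.5, 0.5, 10.0), (3.0, 3.0, 0.05)]
ground = bare_earth_grid(pts)                      # building return is rejected
```

Production methods add slope handling and iterative surface fitting, but the core idea of separating ground from above-ground returns is the same.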

  2. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and in automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is quite high. Most such standards are related to the production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior under different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under space environment impact. The document emphasizes the necessity of applying a multiscale simulation approach and presents recommendations for the choice of the most appropriate method (or group of methods) for computer modeling of the various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes the description of possible

  3. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. The modelling solves the problem of determining the possible equilibrium composition of products that can be obtained from CWS combustion at different temperatures.

  4. Combustion modeling for experimentation in a space environment

    NASA Technical Reports Server (NTRS)

    Berlad, A. L.

    1974-01-01

    The merits of combustion experimentation in a space environment are assessed, and the impact of such experimentation on current theoretical models is considered. It is noted that combustion theory and experimentation for less than normal gravitational conditions are incomplete, inadequate, or nonexistent. Extensive and systematic experimentation in a space environment is viewed as essential for more adequate and complete theoretical models of such processes as premixed flame propagation and extinction limits, premixed flame propagation in droplet and particle clouds, ignition and autoignition in premixed combustible media, and gas jet combustion of unpremixed reactants. Current theories and models in these areas are described, and some combustion studies that can be undertaken in the Space Shuttle Program are proposed, including crossed molecular beam, turbulence, and upper pressure limit (of gases) studies.

  5. Distributed collaborative environments for 21st century modeling and simulation

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2001-09-01

    Distributed collaboration is an emerging technology that will significantly change how modeling and simulation is employed in 21st century organizations. Modeling and simulation (M&S) is already an integral part of how many organizations conduct business and, in the future, will continue to spread throughout government and industry enterprises and across many domains from research and development to logistics to training to operations. This paper reviews research that is focusing on the open standards agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. A distributed collaborative environment is the underlying infrastructure that makes communication between diverse simulations and other assets possible and manages the overall flow of a simulation based experiment. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities employ M&S.

  6. MODELING WIND TURBINES IN THE GRIDLAB-D SOFTWARE ENVIRONMENT

    SciTech Connect

    Fuller, J.C.; Schneider, K.P.

    2009-01-01

    In recent years, the rapid expansion of wind power has resulted in a need to more accurately model the effects of wind penetration on the electricity infrastructure. GridLAB-D is a new simulation environment developed for the U.S. Department of Energy (DOE) by the Pacific Northwest National Laboratory (PNNL), in cooperation with academic and industrial partners. GridLAB-D was originally written and designed to help integrate end-use smart grid technologies, and it is currently being expanded to include a number of other technologies, including distributed energy resources (DER). The specific goal of this project is to create a preliminary wind turbine generator (WTG) model for integration into GridLAB-D. As wind power penetration increases, models are needed to accurately study the effects of increased penetration; this project is a beginning step at examining these effects within the GridLAB-D environment. Aerodynamic, mechanical and electrical power models were designed to simulate the process by which mechanical power is extracted by a wind turbine and converted into electrical energy. The process was modeled using historic atmospheric data, collected over a period of 30 years, as the primary energy input. This input was then combined with preliminary models for synchronous and induction generators. Additionally, basic control methods were implemented, using either constant power factor or constant power modes. The model was then compiled into the GridLAB-D simulation environment, and the power outputs were compared against manufacturers' data; a variation of the IEEE 4 node test feeder was then used to examine the model's behavior. Results showed the designs were sufficient for a prototype model and provided output power similar to the available manufacturers' data. The prototype model is designed as a template for the creation of new modules, with turbine-specific parameters to be added by the user.
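
    As a rough illustration of the aerodynamic-to-electrical conversion stage described above, the sketch below implements the standard wind power relation P = ½ρACpv³ with cut-in, rated and cut-out speeds. All parameter values and names are hypothetical and are not taken from the GridLAB-D WTG module.

    ```python
    import math

    def turbine_power(v, rotor_diameter=77.0, cp=0.45, rho=1.225,
                      v_cut_in=3.0, v_cut_out=25.0, p_rated=1.5e6):
        """Electrical power (W) at wind speed v (m/s) for a hypothetical turbine.

        Below cut-in and above cut-out speed the turbine produces nothing;
        otherwise output follows P = 0.5 * rho * A * Cp * v**3, held at rated
        power by pitch control once the aerodynamic power exceeds it.
        """
        if v < v_cut_in or v > v_cut_out:
            return 0.0
        area = math.pi * (rotor_diameter / 2.0) ** 2   # swept rotor area
        p_aero = 0.5 * rho * area * cp * v ** 3        # aerodynamic power
        return min(p_aero, p_rated)
    ```

    Feeding such a curve with a historical wind-speed time series, as the project does, yields the power input for the generator models downstream.
    
    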

  7. Random Walks and Branching Processes in Correlated Gaussian Environment

    NASA Astrophysics Data System (ADS)

    Aurzada, Frank; Devulder, Alexis; Guillotin-Plantard, Nadine; Pène, Françoise

    2017-01-01

    We study persistence probabilities for random walks in the correlated Gaussian random environment investigated by Oshanin et al. (Phys Rev Lett, 110:100602, 2013). From the persistence results, we deduce properties of critical branching processes whose offspring sizes are geometrically distributed with correlated random parameters. More precisely, we obtain estimates on the tail distributions of the total population size, the maximum population, and the extinction time.
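
    To make the object of study concrete, here is a minimal simulation sketch of a branching process whose geometric offspring parameter is driven by a correlated (AR(1)) Gaussian environment. This is an illustrative stand-in, not the authors' construction; the function name, the AR(1) choice and the population cap are all assumptions.

    ```python
    import math
    import random

    def branching_total_size(n_gen, rho=0.5, seed=0):
        """Simulate a branching process in a correlated Gaussian environment.

        The environment x_k follows a stationary AR(1) process; offspring of
        each individual in generation k are Geometric(p_k) on {0, 1, 2, ...}
        with p_k = 1 / (1 + exp(x_k)), so the mean offspring number is exp(x_k).
        Returns (total population size, extinction generation or None).
        """
        rng = random.Random(seed)
        x = 0.0
        pop, total = 1, 1
        for k in range(n_gen):
            x = rho * x + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
            p = 1.0 / (1.0 + math.exp(x))
            next_pop = 0
            for _ in range(pop):
                children = 0                  # failures before first success
                while rng.random() > p:
                    children += 1
                next_pop += children
            pop = next_pop
            total += pop
            if pop == 0:
                return total, k + 1           # extinction time
            if pop > 100_000:                 # guard against runaway growth
                break
        return total, None
    ```
    
    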

  8. Modeling Extracellular Matrix Reorganization in 3D Environments

    PubMed Central

    Harjanto, Dewi; Zaman, Muhammad H.

    2013-01-01

    Extracellular matrix (ECM) remodeling is a key physiological process that occurs in a number of contexts, including cell migration, and is especially important for cellular form and function in three-dimensional (3D) matrices. However, there have been few attempts to computationally model how cells modify their environment in a manner that accounts for both cellular properties and the architecture of the surrounding ECM. To this end, we have developed and validated a novel model to simulate matrix remodeling that explicitly defines cells in a 3D collagenous matrix. In our simulation, cells can degrade, deposit, or pull on local fibers, depending on the fiber density around each cell. The cells can also move within the 3D matrix. Different cell phenotypes can be modeled by varying key cellular parameters. Using the model we have studied how two model cancer cell lines, of differing invasiveness, modify matrices with varying fiber density in their vicinity by tracking the metric of fraction of matrix occupied by fibers. Our results quantitatively demonstrate that in low density environments, cells deposit more collagen to uniformly increase fibril fraction. On the other hand, in higher density environments, the less invasive model cell line reduced the fibril fraction as compared to the highly invasive phenotype. These results show good qualitative and quantitative agreement with existing experimental literature. Our simulation is therefore able to function as a novel platform to provide new insights into the clinically relevant and physiologically critical process of matrix remodeling by helping identify critical parameters that dictate cellular behavior in complex native-like environments. PMID:23341900

  9. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  10. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. After a brief introduction to the basic theoretical concepts of the digital factory, the communicative features of the digital factory are illustrated. Practical aspects of interpersonal communication were analyzed from a human-oriented view in a pilot project at Volkswagen AG in Wolfsburg. Within the process analysis a modeling method was developed that makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of a communication-analysis inquiry and the process models produced with this method, it was possible to shape the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  11. NG6: Integrated next generation sequencing storage and processing environment

    PubMed Central

    2012-01-01

    Background Next generation sequencing platforms are now well established in sequencing centres and some laboratories. Upcoming smaller-scale machines such as the 454 Junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In this context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. Results We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on the one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies,…) and, on the other hand, a secured web site giving access to the results. The connected user is able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended, and new ones can be added. Ergatis is used as the workflow building, running and monitoring system. Analyses can be run locally or in a cluster environment using Sun Grid Engine. Conclusions NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data. PMID:22958229

  12. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    SciTech Connect

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the "life cycle" of a regulatory model, with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than are used for non-regulatory models.

  13. Model-based description of environment interaction for mobile robots

    NASA Astrophysics Data System (ADS)

    Borghi, Giuseppe; Ferrari, Carlo; Pagello, Enrico; Vianello, Marco

    1999-01-01

    We consider a mobile robot that attempts to accomplish a task by reaching a given goal, and that interacts with its environment through a finite set of actions and observations. The interaction between robot and environment is modeled by Partially Observable Markov Decision Processes (POMDPs). The robot takes its decisions in the presence of uncertainty about the current state by maximizing the reward gained during interactions with the environment. It is able to localize itself in the environment by collecting action and perception histories during navigation. To make the state estimation more reliable, we introduce additional information into the model without adding new states and without discretizing the considered measures. Thus, we associate with the state transition probabilities a continuous metric given by the mean and variance of some significant sensor measurements suitable to be kept in continuous form, such as odometric measurements, showing that even such unreliable data can supply a great deal of information to the robot. The overall control system of the robot is structured as a two-level layered architecture, where the low level implements several collision avoidance algorithms, while the upper level takes care of the navigation problem. In this paper, we concentrate on how to use POMDP models at the upper level.
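
    The state estimation underlying such a navigation layer is the standard POMDP belief update, b'(s') ∝ O(o|s',a) Σ_s T(s'|s,a) b(s). A minimal sketch follows; the dictionary-based representation and the two-state corridor example are hypothetical, not taken from the paper.

    ```python
    def belief_update(belief, action, observation, T, O):
        """One Bayes-filter step for a POMDP.

        belief: dict state -> probability
        T[(s, a)]: dict s' -> P(s' | s, a)   (transition model)
        O[(s', a)]: dict o -> P(o | s', a)   (observation model)
        """
        states_next = {sp for probs in T.values() for sp in probs}
        new_belief = {}
        for s_next in states_next:
            # prediction step: propagate belief through the transition model
            total = sum(belief[s] * T[(s, action)].get(s_next, 0.0) for s in belief)
            # correction step: weight by the observation likelihood
            new_belief[s_next] = O[(s_next, action)].get(observation, 0.0) * total
        norm = sum(new_belief.values())
        return {s: p / norm for s, p in new_belief.items()} if norm > 0 else new_belief

    # Hypothetical two-state corridor: after moving and seeing a door,
    # the belief shifts strongly toward state "B".
    T = {("A", "go"): {"A": 0.2, "B": 0.8}, ("B", "go"): {"B": 1.0}}
    O = {("A", "go"): {"wall": 0.9, "door": 0.1},
         ("B", "go"): {"wall": 0.2, "door": 0.8}}
    b = belief_update({"A": 0.5, "B": 0.5}, "go", "door", T, O)
    ```
    
    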

  14. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  15. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and to the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level, together with methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure, which is itself the output of the model.
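
    The carbon flows enumerated above (photosynthesis into a soluble carbohydrate pool, drains for respiration and growth, and partitioning of growth among organs) can be sketched as a toy Euler-integrated carbon balance. All rate constants and partitioning coefficients below are made up for illustration and do not come from the phytotron model.

    ```python
    def grow(days, par, temp_factor=1.0, dt=1.0):
        """Toy carbon-balance model: a soluble carbohydrate pool is fed by
        photosynthesis and drained by respiration and growth; growth is
        partitioned to leaves, stems, and roots with fixed coefficients.

        par: daily photosynthetically active radiation (arbitrary units)
        Returns accumulated dry matter as (leaves, stems, roots).
        """
        pool = 1.0                               # soluble carbohydrate pool
        leaves, stems, roots = 1.0, 0.5, 0.5
        for _ in range(int(days / dt)):
            photo = 0.05 * par * leaves          # interception & fixation
            resp = 0.02 * temp_factor * (leaves + stems + roots)
            growth = 0.3 * pool                  # flow from pool into tissue
            pool = max(pool + (photo - resp - growth) * dt, 0.0)
            leaves += 0.5 * growth * dt          # fixed partitioning
            stems += 0.3 * growth * dt
            roots += 0.2 * growth * dt
        return leaves, stems, roots
    ```

    Driving such a loop with measured radiation and temperature series is the essence of how the whole-plant model turns environmental inputs into partitioned dry matter.
    
    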

  16. Memory processes and motor control in extreme environments.

    PubMed

    Newman, D J; Lathan, C E

    1999-08-01

    Cognitive-performance and motor-performance activities in multi-task, high-workload environments were assessed during astronaut performance in space flight and in isolation. Data were collected in microgravity on the International Microgravity Laboratory (IML) space shuttle mission (STS-42), and the Canadian Astronaut Program Space Unit Life Simulation (CAPSULS) mission offered an ideal opportunity to collect complementary data for individuals in extreme isolation using similar hardware, software, and experimental protocols. The mental workload and performance experiment (MWPE) was performed during the IML-1 space flight mission, and the memory processes and motor control (MEMO) experiment was performed during the CAPSULS isolation mission. In both experiments, short-term exhaustive memory and the fine motor control associated with human-computer interaction were studied. Memory processes were assessed using a Sternberg-like exhaustive memory search containing 1, 2, 4, or 7 letters. Fine motor control was assessed using velocity-controlled (joystick) and position-controlled (trackball) computer input devices to acquire targets displayed on a computer screen. Subjects repeated the tasks under two conditions that tested perceptual-motor adaptation strategies: (1) during adaptation to the microgravity environment; and (2) while wearing left-right reversing prism goggles during the CAPSULS mission. Both conditions significantly degraded motor performance but not cognitive performance. The data collected during the MEMO and MWPE experiments enhance the knowledge base of human interface technology for human performance in extreme environments.

  17. Hybrid Models for Trajectory Error Modelling in Urban Environments

    NASA Astrophysics Data System (ADS)

    Angelatsa, E.; Parés, M. E.; Colomina, I.

    2016-06-01

    This paper tackles the first step of any strategy aiming to improve the trajectory of terrestrial mobile mapping systems in urban environments. We present an approach to model the error of terrestrial mobile mapping trajectories, combining deterministic and stochastic models. Because of the specific urban environment, the deterministic component is modelled with non-continuous functions composed of linear shifts, drifts or polynomial functions. In addition, we introduce a stochastic error component to model the residual noise of the trajectory error function. The first step in error modelling is to determine the actual trajectory error values for several representative environments. In order to determine the trajectory errors as accurately as possible, (almost) error-free reference trajectories should be estimated using non-semantic features extracted from a sequence of images collected with the terrestrial mobile mapping system together with a full set of ground control points. Once the references are estimated, they are used to determine the actual errors in the terrestrial mobile mapping trajectory. The rigorous analysis of these data sets allows us to characterize the errors of a terrestrial mobile mapping system for a wide range of environments. This information will be of great use in future campaigns to improve the results of 3D point cloud generation. The proposed approach has been evaluated using real data. The data originate from a mobile mapping campaign over an urban, controlled area of Dortmund (Germany) with harmful GNSS conditions. The mobile mapping system, which includes two laser scanners and two cameras, was mounted on a van and driven over the controlled area for around three hours. The results show the suitability of decomposing the trajectory error into non-continuous deterministic and stochastic components.
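
    A minimal sketch of the proposed error decomposition: a piecewise (non-continuous) deterministic shift-plus-drift component, plus white stochastic residual noise. The segment representation and all parameter values are hypothetical, not from the paper.

    ```python
    import random

    def trajectory_error(t, segments, noise_sigma=0.02, seed=42):
        """Evaluate a hybrid trajectory-error model at times t.

        segments: list of (t_start, shift, drift) sorted by t_start; within
        each segment the deterministic error is shift + drift * (t - t_start),
        with discontinuous jumps at segment boundaries. A Gaussian residual
        models the stochastic component.
        """
        rng = random.Random(seed)
        errors = []
        for ti in t:
            shift, drift, t0 = 0.0, 0.0, 0.0
            for t_start, s, d in segments:    # find the active segment
                if ti >= t_start:
                    shift, drift, t0 = s, d, t_start
            deterministic = shift + drift * (ti - t0)
            errors.append(deterministic + rng.gauss(0.0, noise_sigma))
        return errors
    ```

    Fitting the per-segment shifts and drifts against reference trajectories, then characterizing the residual, mirrors the two-component strategy described above.
    
    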

  18. Geant4 models for space radiation environment.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John

    The space radiation environment includes a wide variety of particles, from electrons to heavy ions. In order to correctly predict the dose received by astronauts and devices, the simulation models must have good applicability and produce accurate results from 10 MeV/u up to 10 GeV/u, where the most hazardous particles in the spectra are present. Appropriate models should also provide a good description of electromagnetic interactions down to very low energies (10 eV/u - 10 MeV/u) for understanding the damage mechanisms of long-term low doses. Predictions of biological dose during long interplanetary journeys also need models for hadronic interactions of energetic heavy ions extending to higher energies (10 GeV/u - 100 GeV/u, and possibly up to 1 TeV/u). Geant4 is a powerful toolkit which in some areas well surpasses the needs of space radiation studies, while in other areas it is being developed and/or validated to properly cover the modelling requirements outlined above. Our activities in ESA projects deal with the research and development of both Geant4 hadronic and electromagnetic physics. Recently the scope of verification tests and benchmarks has been extended. Hadronic tests and benchmarks run proton, pion, and ion interactions with matter at various energies. In the Geant4 hadronic sub-libraries, the most accurate cross sections have been identified and selected as the default for all particle types relevant to space applications. Significant developments were carried out for ion/ion interaction models; these now allow one to perform Geant4 simulations for all particle types and energies relevant to space applications. For the validation of ion models, the hadronic testing suite for ion interactions was significantly extended. In this work the results of benchmarking against data over a wide energy range for projectile protons and ions will be shown and discussed, together with the precision of the test runs. Recommendations for Geant4

  19. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze these data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we attempt to show some methods by which user interaction in a virtual reality environment can be visualized, and how this can allow us to gain greater insight into the process of interaction and learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  20. Float zone processing in a weightless environment. [Si crystals

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Strong, P. F.; Rudenberg, G.; Kronauer, R.

    1974-01-01

    Results are given for investigations into: (1) the physical limits which set the maximum practical diameters of Si crystals that can be processed by the float-zone method in a near weightless environment, and (2) the economic impact of large, space-produced Si crystals on the electronics industry. The stability of the melt is evaluated. Heat transfer and fluid flow within the melt as dependent on the crystal size and the degree and type of rotation imparted to the melt are studied. Methods of utilizing the weightless environment for the production of large, stress-free Si crystals of uniform composition are proposed. The economic effect of large size Si crystals, their potential applications, likely utilization and cost advantages in LSI, integrated circuits, and power devices are also evaluated. Foreseeable advantages of larger diameter wafers of good characteristics and the possibilities seen for greater perfection resulting from stress-free growth are discussed.

  1. ISLE (Image and Signal Processing LISP Environment) reference manual

    SciTech Connect

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected as the command interpreter because it already has the features desired in a command interpreter: it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. Full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  2. Physical processes affecting the sedimentary environments of Long Island Sound

    USGS Publications Warehouse

    Signell, R.P.; Knebel, H. J.; List, J.H.; Farris, A.S.; ,

    1997-01-01

    A modeling study was undertaken to simulate the bottom tidal-, wave-, and wind-driven currents in Long Island Sound in order to provide a general physical oceanographic framework for understanding the characteristics and distribution of seafloor sedimentary environments. Tidal currents are important in the funnel-shaped eastern part of the Sound, where a strong gradient of tidal-current speed was found. This current gradient parallels the general westward progression of sedimentary environments from erosion or non-deposition, through bedload transport and sediment sorting, to fine-grained deposition. Wave-driven currents, meanwhile, appear to be important along the shallow margins of the basin, explaining the occurrence of relatively coarse sediments in regions where tidal currents alone are not strong enough to move sediment. Finally, westerly wind events are shown to locally enhance bottom currents along the axial depression of the sound, providing a possible explanation for the relatively coarse sediments found in the depression despite tide- and wave-induced currents below the threshold of sediment movement. The strong correlation between the near-bottom current intensity based on the model results and the sediment response as indicated by the distribution of sedimentary environments provides a framework for predicting the long-term effects of anthropogenic activities.

  3. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems in the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described that involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given involving the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulation is found to reduce the costs and time associated with technological development when incorporated judiciously.

  4. Water related environment modelling on Mars.

    PubMed

    Kereszturi, Akos

    2004-01-01

    During a human Mars exploration, because of the lack of time, astronauts will need fast methods for interpreting unexpected observations, methods that give them flexibility and new, important targets. With in-situ modelling it is possible to obtain information on various past and present processes at the same location across a far wider spectrum than would be possible even during a long mission. This work summarizes the potential technical requirements and benefits of such modelling. Based on a simple estimate, with a 300 kg package and 1-10% of the working time of 1-2 astronauts at the same location, they can obtain plenty of new and important information about past and present Mars. With the proposed five test groups, astronauts will be able to make better and novel interpretations of observations and find better targets and methods during the same mission.

  5. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    NASA Astrophysics Data System (ADS)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

    The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high-speed modeling environments, which offer great potential in this regard as they allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special-purpose accelerators capable of high-speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment is designed to use an inverse method for parameter tuning. Using an inverse method reduces the total amount of parameter space that must be searched to produce a set of parameters describing a system of good fit. Systems of good fit are determined by comparison with reference storm responses. To obtain reference storm responses, we have collected data from a series of data-loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled response to the reference responses, the model parameters can be tuned to quantitatively characterize the geometry, and thus the response, of the karst system.
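
    The inverse-method loop can be illustrated with a drastically reduced stand-in: a linear-reservoir storm response integrated by explicit finite differences, with the decay parameter recovered by a brute-force sweep against a synthetic reference response. The real environment solves a far richer GPGPU finite-difference model; everything below is a hypothetical sketch.

    ```python
    def simulate(k, recharge, h0=1.0, dt=0.1):
        """Explicit finite-difference integration of dh/dt = -k*h + r(t),
        a linear-reservoir stand-in for a karst storm response."""
        h, out = h0, []
        for r in recharge:
            h += dt * (-k * h + r)
            out.append(h)
        return out

    def fit_k(reference, recharge, k_grid):
        """Inverse method by parameter sweep: pick the k whose simulated
        response best matches the reference storm response (least squares)."""
        def misfit(k):
            sim = simulate(k, recharge)
            return sum((a - b) ** 2 for a, b in zip(sim, reference))
        return min(k_grid, key=misfit)

    # Generate a synthetic "reference" response with k = 0.7, then recover it.
    recharge = [1.0] * 20 + [0.0] * 80
    reference = simulate(0.7, recharge)
    best = fit_k(reference, recharge, [0.1 * i for i in range(1, 21)])
    ```

    The GPGPU comes into play because each candidate parameter set can be simulated independently, which is exactly the kind of parallel workload the grid-like architecture accelerates.
    
    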

  6. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  7. An Instructional Method for the AutoCAD Modeling Environment.

    ERIC Educational Resources Information Center

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  8. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from specialists' expertise, that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. To ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  9. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies and is considered the Best Available Control Technology (BACT) for NOx reduction. The NOx emissions problem can be addressed either by modifying the chemical process design and/or by installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  10. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    A manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. In practice, however, machines are usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method for measuring machine performance, and the reliable results it produces can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; the how factor, however, has received little attention, especially the implementation of OEE in a manufacturing process environment. This paper therefore presents a practical framework for implementing OEE, with a case study discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring and then improving the performance of their machines.
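    The OEE calculation underlying such a framework is standard: OEE is the product of availability, performance, and quality rates. A minimal sketch (the shift figures below are illustrative, not taken from the paper's case study):

```python
def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
    """Compute Overall Equipment Effectiveness from shift-level figures."""
    run_time = planned_time - downtime
    availability = run_time / planned_time                  # uptime rate
    performance = ideal_cycle_time * total_count / run_time  # speed rate
    quality = good_count / total_count                      # first-pass yield
    return availability * performance * quality, (availability, performance, quality)

# Illustrative shift: 480 min planned, 60 min down, 1.0 min ideal cycle,
# 380 units produced, 361 of them good.
score, (a, p, q) = oee(480.0, 60.0, 1.0, 380, 361)
print(f"A={a:.3f} P={p:.3f} Q={q:.3f} OEE={score:.3f}")
```

    World-class OEE is often quoted as roughly 0.85; the point of the framework is to trace a low score back to whichever of the three factors is responsible.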

  11. Planning: The Participatory Process Model.

    ERIC Educational Resources Information Center

    McDowell, Elizabeth V.

    The participatory planning process model developed by Peirce Junior College is described in this paper. First, the rationale for shifting from a traditional authoritarian style of institutional leadership to a participatory style which encourages a broader concern for the institution and lessens morale problems is offered. The development of a new…

  12. Group Modeling in Social Learning Environments

    ERIC Educational Resources Information Center

    Stankov, Slavomir; Glavinic, Vlado; Krpan, Divna

    2012-01-01

    Students' collaboration while learning could provide better learning environments. Collaboration assumes social interactions which occur in student groups. Social theories emphasize positive influence of such interactions on learning. In order to create an appropriate learning environment that enables social interactions, it is important to…

  13. Exploring Undergraduate Students' Mental Models of the Environment: Are They Related to Environmental Affect and Behavior?

    ERIC Educational Resources Information Center

    Liu, Shu-Chiu; Lin, Huann-shyang

    2015-01-01

    A draw-and-explain task and questionnaire were used to explore Taiwanese undergraduate students' mental models of the environment and whether and how they relate to their environmental affect and behavioral commitment. We found that students generally held incomplete mental models of the environment, focusing on objects rather than on processes or…

  14. AMBA/D: a new programming environment for image processing

    NASA Astrophysics Data System (ADS)

    Roth, Karl n.; Hufnagl, Peter; Wolf, Guenter

    1992-04-01

    Recent practice in image processing is dominated by heuristic methods used to design practical, relevant algorithms. To ensure high efficiency in the design process, the communication between user and computer should be as direct as possible, so an interactive software system for image processing is required. Interpreter-based systems with high interactivity available on the software market have the drawback of low operation speed. In AMBA/D we combine the performance of a compiler-based system with the interactivity of an interpreter system. The AMBA/D system is an interactive programming environment with integrated facilities to create, compile, execute, and debug programs. In AMBA/D, a compiler language with direct execution is combined with a collection of high-level image processing procedures. The design of a special compiler language was necessary because existing languages such as FORTRAN and C do not fulfill our requirement of interactivity. The system runs on an IBM-compatible personal computer and can be used with different types of commercially available frame grabbers.

  15. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects the processing parameters pressure, temperature, and time have on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength, based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism by which the plies bond to one another. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
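    The reptation-theory scaling invoked above, in which autohesive strength grows with contact time to the one-quarter power until the polymer chains have fully interdiffused, can be sketched as follows (a generic illustration, not the authors' fitted model; `t_r` and `sigma_inf` are assumed parameters for the full-healing time and fully healed strength):

```python
def autohesion_strength(t, t_r, sigma_inf):
    """Autohesive bond strength per reptation theory: strength grows as
    (t / t_r)**(1/4) until the reptation (full-healing) time t_r, then
    saturates at the bulk cohesive strength sigma_inf."""
    if t >= t_r:
        return sigma_inf
    return sigma_inf * (t / t_r) ** 0.25
```

    The t^(1/4) law implies strongly diminishing returns: half the healing time already yields about 84% of full strength, which is why hold time trades off so sharply against temperature in processing windows.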

  16. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall lines or other convective systems that are strongly forced by fronts or other lifting mechanisms; less highly forced systems are difficult to model. The next topic in this colloquium was measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman Lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The subject of using satellite data to provide mesoscale moisture and wind analyses was next. Recent activities in modeling of moist processes in mesoscale systems were reported on; these modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next, the effects of spatial resolution on moisture budgets were discussed; in particular, the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The colloquium concluded with modeling of scale interaction processes.

  17. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    This report describes the research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, the system is not intended to be used for welding process control.

  18. Modeling Low-temperature Geochemical Processes

    NASA Astrophysics Data System (ADS)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications, from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm=1.01325 bar=101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction (redox) transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, over a large range of scales, from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. The strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for
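    One building block common to the dissolution and precipitation modeling described here is the saturation index, SI = log10(IAP/Ksp), which codes such as PHREEQC report for every candidate mineral. A minimal sketch (the gypsum activities below are assumed illustrative values, the water-activity term is ignored, and the log Ksp is an approximate 25 °C value):

```python
import math

def saturation_index(ion_activities, log_ksp):
    """SI = log10(IAP) - log10(Ksp).
    SI < 0: undersaturated (dissolution favored);
    SI > 0: supersaturated (precipitation favored);
    SI ~ 0: equilibrium.
    ion_activities: list of (activity, stoichiometric coefficient) pairs."""
    log_iap = sum(nu * math.log10(act) for act, nu in ion_activities)
    return log_iap - log_ksp

# Hypothetical gypsum (CaSO4.2H2O) check: assumed Ca2+ and SO4^2- activities,
# log Ksp of roughly -4.58 at 25 degrees C.
si = saturation_index([(3.2e-3, 1), (2.5e-3, 1)], -4.58)
```

    A negative SI, as here, says the water would dissolve gypsum; real models compute activities from total concentrations via speciation and activity-coefficient corrections first.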

  19. Modelling the Neutral Atmosphere and Plasma Environment of Saturn

    NASA Technical Reports Server (NTRS)

    Richardson, John D.; Jurac, S.; Johnson, R.; McGrath, M.

    2005-01-01

    The first year of this contract has resulted in two publications with the P.I. and co-I Jurac as lead authors and two publications where these team members are co-authors. These papers discuss modeling work undertaken in preparation for Cassini; the goal was to summarize our current best knowledge of the ion and neutral sources and distributions. One of the major goals of this project is to improve models of the plasma and neutral environment near Saturn. The paper "A self-consistent model of plasma and neutrals at Saturn: Neutral cloud morphology" [Jurac and Richardson, 2005] presents results on the neutral clouds near Saturn using a model which for the first time treats the ions and neutrals self-consistently. We also for the first time include a directly sputtered H source. The Voyager and HST observations are used as model constraints. The neutral source is adjusted to give a good match to the HST observations of OH. For this initial run the ion parameters from Richardson et al. are used; charge exchange with ions is a major neutral loss process. The neutral profile derived from the model is then used in a model of plasma transport and chemistry (with the plasma diffusion rate the only free parameter). This model gives new values of the ion composition which are then fed back into the neutral model. This iteration continues until the values converge.

  20. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.
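    As a rough illustration of the kind of water bookkeeping such a model performs, here is a toy daily soil-moisture bucket for an urban catchment (a generic sketch, not the authors' model; the impervious-fraction routing and the linear moisture limitation on ET are simplifying assumptions, and all parameter names are hypothetical):

```python
def bucket_step(s, rain, pet, s_max, f_imperv):
    """One daily step of a toy urban soil-moisture bucket (all depths in mm).
    An impervious fraction sheds rain directly to runoff; ET is throttled
    linearly by relative soil moisture; saturation excess also runs off."""
    runoff = f_imperv * rain                 # impervious surfaces shed rain
    infil = (1.0 - f_imperv) * rain          # the rest enters the soil store
    et = pet * min(1.0, s / s_max)           # moisture-limited evapotranspiration
    s_new = s + infil - et
    overflow = max(0.0, s_new - s_max)       # saturation excess joins runoff
    s_new = min(max(s_new, 0.0), s_max)
    return s_new, runoff + overflow, et

# Example day: half-full store, 10 mm rain, 4 mm potential ET, 40% impervious.
s_new, runoff, et = bucket_step(s=50.0, rain=10.0, pet=4.0, s_max=100.0, f_imperv=0.4)
```

    The returned fluxes close the water balance (rain equals storage change plus runoff plus ET), which is the first sanity check any such model must pass before its long-term flux partitioning is credible.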

  1. The role of an electromagnetic environment model in spectrum management

    NASA Astrophysics Data System (ADS)

    Feller, A. H.

    1981-04-01

    The role of an electromagnetic (EM) environment model in spectrum management is developed. Spectrum management is traced from electromagnetic compatibility (EMC) considerations in international agreements through related domestic law to the fundamental spectrum management procedures: allocation, allotment, and assignment. The need for a model of the EM environment is derived from the requirements of allocation, allotment, and assignment proceedings. Data elements required to support an EM environment model for spectrum management purposes are reviewed. An outline and derivation of a general EM environment model is given. The ways systems respond to the EM environment are cataloged and reviewed so that specific applications of an EM environment model are readily apparent. Applications and limitations of current models are discussed.
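    One elementary computation any EM environment model must support is aggregating the power a receiver sees from many emitters. A minimal sketch using the standard free-space path loss formula (illustrative only; operational spectrum-management models use far richer propagation data and system response characteristics):

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB for distance in km and frequency in MHz:
    FSPL = 20 log10(d) + 20 log10(f) + 32.44."""
    return 20.0 * math.log10(d_km) + 20.0 * math.log10(f_mhz) + 32.44

def aggregate_dbm(emitters, f_mhz):
    """Aggregate environment power at a receiver from a list of
    (EIRP in dBm, distance in km) emitters: convert each contribution
    to linear milliwatts, sum, and convert back to dBm."""
    total_mw = sum(10.0 ** ((eirp - fspl_db(d, f_mhz)) / 10.0)
                   for eirp, d in emitters)
    return 10.0 * math.log10(total_mw)
```

    Powers must be summed in the linear domain, never in dB: two equal 30 dBm emitters at 1 km raise the aggregate by about 3 dB over one, not by 30 dBm.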

  2. A Process for Technology Prioritization in a Competitive Environment

    NASA Technical Reports Server (NTRS)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment, using the In-Space Propulsion Technology (ISPT) project as an example. The ISPT project focuses on the mid-level Technology Readiness Levels (TRLs) for development, TRLs 4 through 6 (i.e., Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and create ISPT technology development schedules based on these dates, with a minimum of 4 years between flight and pacing mission. The ISPT Project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded total implementation of the plan.

  3. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter

  4. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Meroni, A.; Bahr, T.

    2013-05-01

    Access to SAR data can be critical, especially for disaster mapping. Updating a GIS with contemporary information from SAR data makes it possible to deliver a reliable set of geospatial information to advance civilian operations, e.g. search and rescue missions. We therefore present in this paper the operational processing of SAR data within a GIS environment for rapid disaster mapping, exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.

  5. Construction material processed using lunar simulant in various environments

    NASA Technical Reports Server (NTRS)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties; the mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  6. Self-assembly processes in the prebiotic environment.

    PubMed

    Deamer, David; Singaram, Sara; Rajamani, Sudha; Kompanichenko, Vladimir; Guggenheim, Stephen

    2006-10-29

    An important question guiding research on the origin of life concerns the environmental conditions where molecular systems with the properties of life first appeared on the early Earth. An appropriate site would require liquid water, a source of organic compounds, a source of energy to drive polymerization reactions and a process by which the compounds were sufficiently concentrated to undergo physical and chemical interactions. One such site is a geothermal setting, in which organic compounds interact with mineral surfaces to promote self-assembly and polymerization reactions. Here, we report an initial study of two geothermal sites where mixtures of representative organic solutes (amino acids, nucleobases, a fatty acid and glycerol) and phosphate were mixed with high-temperature water in clay-lined pools. Most of the added organics and phosphate were removed from solution with half-times measured in minutes to a few hours. Analysis of the clay, primarily smectite and kaolin, showed that the organics were adsorbed to the mineral surfaces at the acidic pH of the pools, but could subsequently be released in basic solutions. These results help to constrain the range of possible environments for the origin of life. A site conducive to self-assembly of organic solutes would be an aqueous environment relatively low in ionic solutes, at an intermediate temperature range and neutral pH ranges, in which cyclic concentration of the solutes can occur by transient dry intervals.
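    The removal kinetics reported here, with half-times of minutes to a few hours, correspond to simple first-order behavior. A minimal sketch (assuming ideal first-order adsorption onto mineral surfaces; `t_half` stands in for whatever half-time was measured for a given solute):

```python
def remaining_fraction(t, t_half):
    """Fraction of an organic solute still in solution after time t, assuming
    first-order removal with half-time t_half: C(t)/C0 = 2**(-t / t_half)."""
    return 0.5 ** (t / t_half)

# With an assumed 30-minute half-time, one, two, and four hours of contact:
for minutes in (60, 120, 240):
    print(minutes, remaining_fraction(minutes, 30.0))
```

    The exponential form explains why most of the added organics vanished from solution so quickly: four half-times already remove over 93% of a solute.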

  7. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  8. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built, and while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO, which allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  9. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  10. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-01-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current generation high burnup metallic fuel elements. These are sodium bonded, stainless steel clad fuel pins of U-Zr or U-Pu-Zr composition typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  11. A Process Study of the Development of Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.

    2014-05-01

    In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research, and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and analysis of project documents and online resources. These sources are hand tagged to identify events related to the thematic tracks, to yield a narrative of the project. Results demonstrate the event series of an organization through traditional methods augmented by virtual sources.

  12. Diagnostic Modeling of PAMS VOC Observation on Regional Scale Environment

    NASA Astrophysics Data System (ADS)

    Chen, S.; Liu, T.; Chen, T.; Ou Yang, C.; Wang, J.; Chang, J. S.

    2008-12-01

    While a number of gas-phase chemical mechanisms, such as CBM-Z, RADM2, and SAPRC-07, have been successful in studying gas-phase atmospheric chemical processes, they all use lumped organic species to varying degrees. Photochemical Assessment Monitoring Stations (PAMS) have been in use for over ten years, yet it is not clear how the detailed organic species measured by PAMS compare to the lumped model species under regional-scale transport and chemistry interactions. By developing a detailed mechanism specifically for the PAMS organics and embedding this diagnostic model within a regional-scale transport and chemistry model, we can directly compare PAMS observations with regional-scale model simulations. We modify one regional-scale chemical transport model (Taiwan Air Quality Model, TAQM) by adding a submodel with a chemical mechanism for the interactions of the 56 species observed by PAMS. This submodel then calculates the time evolution of these 56 PAMS species within the environment established by TAQM. It is assumed that TAQM can simulate the overall regional-scale environment, including the impact of regional-scale transport and the time evolution of oxidants and radicals. Therefore we can scale these influences to the PAMS organic species and study their time evolution with their species-specific source functions, meteorological transport, and chemical interactions. Model simulations of each species are compared with PAMS hourly surface measurements. A case study located in a metropolitan area in central Taiwan showed that with wind speeds lower than 3 m/s, when the meteorological simulation is comparable with observation, the diurnal pattern of each species agrees well with PAMS data. It is found that for many observations meteorological transport is an influence and that local emissions of specific species must be represented correctly. At this time there are still species that cannot be modeled properly. We suspect this is mostly due to lack of information on local

  13. Mesoscopic Modeling of Reactive Transport Processes

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Chen, L.; Deng, H.

    2012-12-01

    Reactive transport processes involving precipitation and/or dissolution are pervasive in geochemical, biological and engineered systems. Typical examples include self-assembled patterns such as Liesegang rings or bands, cones of stalactites in limestone caves, biofilm growth in aqueous environments, formation of mineral deposits in boilers and heat exchangers, uptake of toxic metal ions from polluted water by calcium carbonate, and mineral trapping of CO2. Compared to experimental studies, a numerical approach enables a systematic study of the reaction kinetics, mass transport, and mechanisms of nucleation and crystal growth, and hence provides a detailed description of reactive transport processes. In this study, we enhance a previously developed lattice Boltzmann pore-scale model by taking into account the nucleation process, and develop a mesoscopic approach to simulate reactive transport processes involving precipitation and/or dissolution of solid phases. The model is then used to simulate the formation of Liesegang precipitation patterns and investigate the effects of gel on the morphology of the precipitates. It is shown that this model can capture the porous structures of the precipitates and can account for the effects of the gel concentration and material. A wide range of precipitation patterns is predicted under different gel concentrations, including regular bands, treelike patterns, and, for the first time with numerical models, transition patterns from regular bands to treelike patterns. The model is also applied to study the effect of secondary precipitates on the dissolution of the primary mineral. Several types of dissolution and precipitation processes are identified based on the morphology and structures of the precipitates and on the extent to which the precipitates affect the dissolution of the primary mineral. Finally the model is applied to study the formation of pseudomorphs. It is demonstrated for the first time by numerical simulation that a
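    The stream-and-collide structure that lattice Boltzmann models of this kind share can be illustrated with a bare-bones D1Q3 diffusion kernel. This is a deliberately minimal sketch of my own, not the paper's model, which additionally handles reaction, nucleation, and solid-phase growth:

```python
import numpy as np

def lbm_diffusion(C0, tau=1.0, steps=100):
    # D1Q3 lattice Boltzmann for pure diffusion on a periodic 1D lattice.
    # Lattice velocities are (0, +1, -1); diffusivity D = (1/3)*(tau - 1/2).
    w = np.array([2/3, 1/6, 1/6])          # lattice weights
    f = w[:, None] * C0[None, :]           # initialize populations at equilibrium
    for _ in range(steps):
        C = f.sum(axis=0)                  # concentration = sum of populations
        feq = w[:, None] * C[None, :]      # local equilibrium
        f += (feq - f) / tau               # BGK collision (relaxation)
        f[1] = np.roll(f[1], 1)            # stream right-moving population
        f[2] = np.roll(f[2], -1)           # stream left-moving population
    return f.sum(axis=0)
```

    Mass is conserved exactly by construction; an initial concentration spike spreads symmetrically, which is the behavior a reactive-transport extension builds on by adding source/sink terms at solid boundaries.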

  14. Morpheus: a user-friendly modeling environment for multiscale and multicellular systems biology.

    PubMed

    Starruß, Jörn; de Back, Walter; Brusch, Lutz; Deutsch, Andreas

    2014-05-01

    Morpheus is a modeling environment for the simulation and integration of cell-based models with ordinary differential equations and reaction-diffusion systems. It allows rapid development of multiscale models in biological terms and mathematical expressions rather than programming code. Its graphical user interface supports the entire workflow from model construction and simulation to visualization, archiving and batch processing.

  15. Influence of global climatic processes on the environment of the Arctic seas

    NASA Astrophysics Data System (ADS)

    Kholmyansky, Mikhael; Anokhin, Vladimir; Kartashov, Alexandr

    2016-04-01

    One of the most pressing problems of the present is the change in the Arctic environment under the influence of global climatic processes. As a result of work carried out in different areas of the Russian Arctic, the authors have obtained materials characterizing the intensity of these processes. Complex investigations have been carried out in the water areas and coastal zones of the White, Barents, Kara and East Siberian seas, and on lake water areas of the subarctic region, from 1972 to the present. The investigations include hydrophysical and cryological observations, direct temperature measurements, analysis of drilling data, electrometric determination of the parameters of the frozen zone, lithodynamic and geochemical determinations, geophysical borehole logging, and the study of glaciers on the basis of visual observations and analysis of photographs. The data obtained allow us to estimate the temperature change of the water layer, the deposits, and the near-bottom horizon of the atmosphere over the last 25 years. On average these changes amount to 0.38 °C for sea water, 0.23 °C for unconsolidated deposits, and 0.72 °C for the atmosphere. Under the influence of temperature changes in the hydrosphere and lithosphere of the shelf, the characteristics of the cryolithic zone change; a deepening of the position of the cryolithic zone roof can be noted over most of the studied water area. The recent rapid rise in temperature of the ice-rich rocks composing the coast has led to avalanche-like thermodenudation and to an input of material into the sea three times exceeding the 1978 level. The rise in temperature also involves an appreciable retreat of the boundaries of the Arctic glacial covers. Our monitoring measurements show an increase in the oxygen content of the near-bottom area, which is connected with a reduction of the overall salinity of the waters due to fresh water arriving from ice thawing. This, in turn, leads to a change in the biogenic part of the ecosystem. The executed

  16. A Collaborative Model for Ubiquitous Learning Environments

    ERIC Educational Resources Information Center

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  17. Periglacial process and Pleistocene environment in northern China

    SciTech Connect

    Guo Xudong; Liu Dongsheng; Yan Fuhua

    1991-03-01

    At the present time, five kinds of periglacial phenomena have been identified: ice wedges, periglacial involutions, congelifolds, congeliturbations, and loess dunes. From the stratigraphical and geochronological data, the periglacial history is divided into six stages. (1) Guanting periglacial stage, characterized by the congeliturbative deposits developed in the early Pleistocene Guanting loess-like formation; paleomagnetic dating gives 2.43 Ma B.P. (2) Yanchi periglacial stage, characterized by the congelifolds developed in the middle Pleistocene Yanchi Lishi loess formation; paleomagnetic dating gives 0.50 Ma B.P. (3) Zhaitang periglacial stage (II), characterized by the periglacial involutions developed in the lower-middle Pleistocene Lishi loess formation; paleomagnetic dating gives 0.30 Ma B.P. (4) Zhaitang periglacial stage (I), characterized by the ice (soil) wedges developed in the upper-middle Pleistocene Lishi loess formation; paleomagnetic dating gives 0.20 Ma B.P. (5) Qiansangyu periglacial stage (II), characterized by the ice (sand) wedges developed in the late Pleistocene Malan loess formation; paleomagnetic dating gives 0.13 Ma B.P. (6) Qiansangyu periglacial stage (I), characterized by the ice (soil) wedges developed in the late Pleistocene Malan loess-like formation; thermoluminescent dating gives 0.018 Ma B.P. Spore-pollen analysis shows that a savannah steppe environment prevailed in northern China during Pleistocene periglacial periods. These fossilized periglacial phenomena indicate a rather arid and windy periglacial environment with a mean annual temperature estimated to be some 12-15 °C colder than at present.

  18. Collective Properties of a Transcription Initiation Model Under Varying Environment.

    PubMed

    Hu, Yucheng; Lowengrub, John S

    2016-01-01

    The dynamics of gene transcription is tightly regulated in eukaryotes. Recent experiments have revealed various kinds of transcriptional dynamics, such as RNA polymerase II pausing, that involve regulation at the transcription initiation stage, and the choice of regulation pattern is closely related to the physiological functions of the target gene. Here we consider a simplified model of transcription initiation, a process that includes the assembly of the transcription complex and the pausing and releasing of RNA polymerase II. Focusing on collective behaviors at the population level, we explore the potential regulatory functions this model can offer. These functions include fast and synchronized response to environmental change, and long-term memory of transcriptional status. As a proof of concept we also show that, by selecting different control mechanisms, cells can adapt to different environments. These findings may help us better understand the design principles of transcriptional regulation.

  19. Modeling climate related feedback processes

    SciTech Connect

    Elzen, M.G.J. den; Rotmans, J.

    1993-11-01

    In order to assess their impact, the feedbacks which at present can be quantified reasonably are built into the Integrated Model to Assess the Greenhouse Effect (IMAGE). Unlike previous studies, this study describes the scenario- and time-dependent role of biogeochemical feedbacks. A number of simulation experiments are performed with IMAGE to project climate changes. Besides estimates of their absolute importance, the relative importance of individual biogeochemical feedbacks is considered by calculating the gain for each feedback process. This study focuses on feedback processes in the carbon cycle and the methane (semi-)cycle. Modeled feedbacks are then used to balance the past and present carbon budget. This results in substantially lower projections for atmospheric carbon dioxide than the Intergovernmental Panel on Climate Change (IPCC) estimates. The difference is approximately 18% from the 1990 level for the IPCC "Business-as-Usual" scenario. Furthermore, the IPCC's "best guess" value of the CO2 concentration in the year 2100 falls outside the uncertainty range estimated with our balanced modeling approach. For the IPCC "Business-as-Usual" scenario, the calculated total gain of the feedbacks within the carbon cycle appears to be negative, a result of the dominant role of the fertilization feedback. This study also shows that if temperature feedbacks on methane emissions from wetlands, rice paddies, and hydrates do materialize, methane concentrations might be increased by 30% by 2100. 70 refs., 17 figs., 7 tabs.

  20. Shuttle measured contaminant environment and modeling for payloads. Preliminary assessment of the space telescope environment in the shuttle bay

    NASA Technical Reports Server (NTRS)

    Scialdone, J. J.

    1983-01-01

    A baseline gaseous and particulate environment of the Shuttle bay was developed based on the various measurements made during the first four flights of the Shuttle. The environment is described by the time-dependent pressure, density, scattered molecular fluxes, and column densities, including the transient effects of water dumps, engine firings, and the opening and closing of the bay doors. The particulate conditions in the ambient and on surfaces were predicted as a function of mission time based on the available data. This basic Shuttle environment, when combined with the outgassing and particulate contributions of the payloads, can provide a description of the environment of a payload in the Shuttle bay. As an example of this application, the environment of the Space Telescope in the bay, which may be representative of the environment of several payloads, was derived. Among the many findings obtained in the process of modeling the environment, one is that the payload environment in the bay is not substantially different or more objectionable than the self-generated environment of a large payload or spacecraft. It is, however, more severe during ground facility operations, during the first 15 to 20 hours of the flight, during and for a short period after water is dumped overboard, and while the reaction control engines are being fired.

  1. Causal Model Progressions as a Foundation for Intelligent Learning Environments.

    DTIC Science & Technology

    1987-11-01

    Barbara Y. White and John R. Frederiksen. ...the architecture of a new type of learning environment that incorporates features of microworlds and of intelligent tutoring systems. The environment is based on... The design principles underlying the creation of one type of causal model are then given (for zero-order models of electrical circuit behavior); and

  2. Modeling Stem Cell Induction Processes

    PubMed Central

    Grácio, Filipe; Cabral, Joaquim; Tidor, Bruce

    2013-01-01

    Technology for converting human cells to pluripotent stem cells using induction processes has the potential to revolutionize regenerative medicine. However, the production of these so-called iPS cells is still quite inefficient and may be dominated by stochastic effects. In this work we build mass-action models of the core regulatory elements controlling stem cell induction and maintenance. The models include not only the network of transcription factors NANOG, OCT4, and SOX2, but also important epigenetic regulatory features of DNA methylation and histone modification. We show that the network topology reported in the literature is consistent with the observed experimental behavior of bistability and inducibility. Based on simulations of stem cell generation protocols, and in particular focusing on changes in epigenetic cellular states, we show that cooperative and independent reaction mechanisms have experimentally identifiable differences in the dynamics of reprogramming, and we analyze such differences and their biological basis. It had been argued that stochastic and elite models of stem cell generation represent distinct fundamental mechanisms. Work presented here suggests an alternative possibility: that they represent differences in the amount of information we have about the distribution of cellular states before and during reprogramming protocols. We show further that unpredictability and variation in reprogramming decrease as the cell progresses along the induction process, and that identifiable groups of cells with elite-seeming behavior can come about by a stochastic process. Finally we show how different mechanisms and kinetic properties impact the prospects of improving the efficiency of iPS cell generation protocols. PMID:23667423

  3. Development, validation and application of numerical space environment models

    NASA Astrophysics Data System (ADS)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere, where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of near-Earth space, similarly to ordinary weather, which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades, and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed-memory hardware is also becoming crucial. This thesis consists of an introduction and four peer-reviewed articles, and describes the process of developing numerical space environment/weather models and the use of such models to study near-Earth space. A complete model development chain is presented, starting from initial planning and design to distributed-memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed-memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  4. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for the engineering design of drilling operations under different geotechnical conditions.

  5. Gravity Modeling for Variable Fidelity Environments

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2006-01-01

    Aerospace simulations can model worlds, such as the Earth, with differing levels of fidelity. The simulation may represent the world as a plane, a sphere, an ellipsoid, or a high-order closed surface. The world may or may not rotate. The user may select lower fidelity models based on computational limits, a need for simplified analysis, or comparison to other data. However, the user will also wish to retain a close semblance of behavior to the real world. The effects of gravity on objects are an important component of modeling real-world behavior. Engineers generally equate the term gravity with the observed free-fall acceleration. However, free-fall acceleration is not the same for all observers. To observers on the surface of a rotating world, free-fall acceleration is the sum of gravitational attraction and the centrifugal acceleration due to the world's rotation. On the other hand, free-fall acceleration equals gravitational attraction to an observer in inertial space. Surface-observed simulations (e.g. aircraft), which use non-rotating world models, may choose to model observed free-fall acceleration as the gravity term; such a model actually combines gravitational attraction with centrifugal acceleration due to the Earth's rotation. However, this modeling choice invites confusion as one evolves the simulation to higher fidelity world models or adds inertial observers. Care must be taken to model gravity in concert with the world model to avoid degrading the fidelity of modeling observed free fall. The paper goes into greater depth on gravity modeling and the physical disparities and synergies that arise when coupling specific gravity models with world models.
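    The distinction between gravitational attraction and observed free fall can be made concrete with a spherical rotating-Earth sketch: the outward centrifugal term reduces the radial free-fall magnitude, most strongly at the equator. This is a rough illustration under a spherical approximation, not the paper's model; a real ellipsoidal Earth adds further latitude dependence:

```python
import math

GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R = 6_378_137.0          # equatorial radius used as a spherical radius, m
OMEGA = 7.2921150e-5     # Earth's rotation rate, rad/s

def observed_free_fall(lat_deg):
    # Observed free-fall magnitude = gravitational attraction minus the
    # radial component of centrifugal acceleration, omega^2 * R * cos^2(lat).
    phi = math.radians(lat_deg)
    g_attraction = GM / R**2
    centrifugal_radial = OMEGA**2 * R * math.cos(phi)**2
    return g_attraction - centrifugal_radial
```

    At the pole the centrifugal term vanishes and observed free fall equals gravitational attraction (about 9.798 m/s^2 here); at the equator it is reduced by roughly 0.034 m/s^2, which is why a non-rotating world model that uses observed free fall as its gravity term has silently folded the rotation in.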

  6. Exactly constructing model of quantum mechanics with random environment

    SciTech Connect

    Gevorkyan, A. S.

    2010-02-15

    Dissipation and decoherence, interaction with random media, continuous measurements and many other complicated problems of open quantum systems result from the interaction of a quantum system with a random environment. Mathematically, these problems are described in terms of complex probabilistic processes (CPP). Note that a CPP satisfies a stochastic differential equation (SDE) of Langevin-Schrödinger (L-Sch) type, and is defined on the extended space R^1 ⊗ R_{γ}, where R^1 and R_{γ} are the Euclidean and the functional spaces, respectively. For simplicity, the model of a 1D quantum harmonic oscillator (QHO) with a stochastic environment is considered. On the basis of orthogonal CPP, the method of the stochastic density matrix (SDM) is developed. By the SDM method, the thermodynamical potentials, such as the nonequilibrium entropy and the energy of the 'ground state', are constructed in closed form. Expressions for the uncertainty relations and the Wigner function are obtained as functions of the interaction constant between the 1D QHO and the environment.

  7. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    forging effect of the shoulder. The energy balance at the boundary of the plastic region with the environment required that energy flow away from the boundary in both radial directions. One resolution to this problem may be to introduce a time dependency into the process model, allowing the energy flow to oscillate across this boundary. Finally, experimental measurements are needed to verify the concepts used here and to aid in improving the model.

  8. Arctic mosses govern below-ground environment and ecosystem processes.

    PubMed

    Gornall, J L; Jónsdóttir, I S; Woodin, S J; Van der Wal, R

    2007-10-01

    Mosses dominate many northern ecosystems and their presence is integral to soil thermal and hydrological regimes which, in turn, dictate important ecological processes. Drivers such as climate change and increasing herbivore pressure affect the moss layer; thus, assessment of the functional role of mosses in determining soil characteristics is essential. Field manipulations conducted in high arctic Spitsbergen (78 degrees N), creating shallow (3 cm), intermediate (6 cm) and deep (12 cm) moss layers over the soil surface, had an immediate impact on soil temperature in terms of both average temperatures and amplitude of fluctuations. In soil under deep moss, temperature was substantially lower and organic layer thaw occurred 4 weeks later than in other treatment plots; the growing season for vascular plants was thereby reduced by 40%. Soil moisture was also reduced under deep moss, reflecting the influence of local heterogeneity in moss depth, over and above the landscape-scale topographic control of soil moisture. Data from field and laboratory experiments show that moss-mediated effects on the soil environment influenced microbial biomass and activity, resulting in warmer and wetter soil under thinner moss layers containing more plant-available nitrogen. In arctic ecosystems, which are limited by soil temperature, growing season length and nutrient availability, spatial and temporal variation in the depth of the moss layer has significant repercussions for ecosystem function. Evidence from our mesic tundra site shows that any disturbance causing a reduction in the depth of the moss layer will alleviate temperature and moisture constraints and therefore profoundly influence a wide range of ecosystem processes, including nutrient cycling and energy transfer.

  9. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  10. Interoperation Modeling for Intelligent Domotic Environments

    NASA Astrophysics Data System (ADS)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that allows explicit representation of device capabilities, states, and commands, and supports abstract modeling of device inter-operation.

  11. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    PubMed Central

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have superior prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970
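    The two kernels contrasted in the abstract can be sketched from a marker matrix X (n lines × p markers). This is a generic construction of the linear (GBLUP) and Gaussian (GK) kernels, not CIMMYT's exact code; the median-distance normalization of the Gaussian kernel is one common convention:

```python
import numpy as np

def gblup_kernel(X):
    # Linear (GBLUP) kernel: K = Xc Xc' / p with centered, scaled markers.
    s = X.std(axis=0)
    s[s == 0] = 1.0                        # guard against monomorphic markers
    Xc = (X - X.mean(axis=0)) / s
    return Xc @ Xc.T / X.shape[1]

def gaussian_kernel(X, h=1.0):
    # Gaussian kernel: K_ij = exp(-h * d_ij^2 / median(d^2)), where d_ij is
    # the Euclidean distance between marker profiles of lines i and j.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    scale = np.median(d2[d2 > 0])
    return np.exp(-h * d2 / scale)
```

    Either kernel then plays the role of the genomic covariance among lines, and the multi-environment models in the abstract combine it (via a Kronecker product) with the genetic correlation matrix between environments.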

  12. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    SciTech Connect

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next–Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  13. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergo several physical processes including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.
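    The deposition and resuspension processes reviewed are commonly written with the classical Krone and Partheniades flux formulas; the sketch below is a generic statement of those two relations, with all parameter values illustrative:

```python
def deposition_flux(c, ws, tau_b, tau_cd):
    # Krone-type deposition: settling flux ws*c, reduced linearly as bed shear
    # stress tau_b approaches the critical stress for deposition tau_cd;
    # no deposition once tau_b exceeds tau_cd.
    return ws * c * (1.0 - tau_b / tau_cd) if tau_b < tau_cd else 0.0

def erosion_flux(M, tau_b, tau_ce):
    # Partheniades-type erosion: resuspension proportional to the excess of
    # tau_b over the critical stress for erosion tau_ce (M is an erodibility
    # constant); no erosion below tau_ce.
    return M * (tau_b / tau_ce - 1.0) if tau_b > tau_ce else 0.0
```

    A bed-exchange model then steps the suspended concentration with the net flux (erosion minus deposition) at each time step, alongside the flocculation and consolidation processes the review discusses.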

  14. Network Modeling and Simulation Environment (NEMSE)

    DTIC Science & Technology

    2012-07-01

    Table-of-contents fragments only: ...Loop Using OPNET Modeler Demo; COPE Demo; OPNET; Figure 15: Antenna Pattern in OpNet; Figure 16: NEMSE Box

  15. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  16. Realistic Modeling of Wireless Network Environments

    DTIC Science & Technology

    2015-03-01

    ...the FPGA. The memory can be used for a number of tasks, including capturing samples, storing samples for replay, and storing parameters for channel... models. We also increased the size of the memory available on the DSP card so longer traces can be stored and replayed. We replaced the... the channel state. We also added large memories to the SCM and DSP card, allowing us to accurately model interference from various types of devices

  17. Entity Modeling and Immersive Decision Environments

    DTIC Science & Technology

    2011-09-01

    XCITE lifeform entities to detect and track moving or stationary objects is research and development work that should continue. The WRSTP team has... target identification scenario. A system dynamics model was developed to predict those results. Research limitations/Implications – While decision delays... target identification scenario. Many other decision models lack this time component and are therefore of limited use in time-critical situations. Take

  18. [Applying analytical hierarchy process to assess eco-environment quality of Heilongjiang province].

    PubMed

    Li, Song; Qiu, Wei; Zhao, Qing-liang; Liu, Zheng-mao

    2006-05-01

    The analytical hierarchy process (AHP) was adopted to study the index system for eco-province construction, and the index system was set up accordingly. The comparison matrix was constructed on the basis of experts' questionnaire investigations. MATLAB 6.5 was used to determine the weights of the indices. The general environment quality index model was used to grade the environment quality and to assess the progress of eco-province construction in Heilongjiang province. The results indicate that it is feasible to apply the AHP to quantitatively assess ecological environmental quality province-wide. The ecological environmental quality of Heilongjiang province has improved markedly since the beginning of eco-province construction.
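
    The AHP mechanics summarized above (pairwise comparison matrix, weight derivation, consistency check) can be sketched in a few lines. This is an illustrative sketch only: the matrix values below are hypothetical, not taken from the paper, which used MATLAB 6.5 and expert questionnaires.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix on the Saaty 1-9 scale.
# Entry A[i][j] states how much more important criterion i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP weights are the principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# where RI is Saaty's random index (0.58 for n = 3).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58
print(w, CR)  # CR below 0.1 indicates acceptable consistency
```

A matrix with CR >= 0.1 would normally be sent back to the experts for revision before its weights are used.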

  19. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for the processing of spatial data which integrates web, desktop, and mobile platforms and combines the volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities, and communication between researchers. Using this innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach, it will be possible to organize research and the representation of results at a new technological level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing spatial distributed information, for which we suggest the implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  20. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
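
    A predictor-corrector step of the kind referenced above can be sketched on a small linear state-space model. This is a minimal illustration, not the report's algorithm or its turbofan model: it pairs an explicit Euler predictor with a trapezoidal corrector, and the matrices are illustrative assumptions.

```python
import numpy as np

# Illustrative linear model x' = A x + B u (not the report's engine model).
A = np.array([[-2.0, 1.0], [0.0, -1.0]])
B = np.array([[0.0], [1.0]])

def f(x, u):
    return A @ x + B @ u

def pc_step(x, u, h):
    # Predictor: explicit Euler estimate of the next state.
    xp = x + h * f(x, u)
    # Corrector: trapezoidal rule using the predicted end-point slope.
    return x + 0.5 * h * (f(x, u) + f(xp, u))

x = np.array([[1.0], [0.0]])  # initial state
u = np.array([[1.0]])         # constant input
h = 0.01                      # integration step size
for _ in range(int(1.0 / h)): # integrate from t = 0 to t = 1
    x = pc_step(x, u, h)
```

Repeating the run with larger h shows the step-size/accuracy trade-off the abstract refers to; distributing the state derivative evaluations across processors is where interprocessor communication costs enter.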

  1. A new security model for collaborative environments

    SciTech Connect

    Agarwal, Deborah; Lorch, Markus; Thompson, Mary; Perry, Marcia

    2003-06-06

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans build trust naturally, the model can support a wide variety of usage scenarios. Its particular strength lies in the support for ad-hoc and dynamic collaborations and the ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.

  2. Modeling Obscurants in an Urban Environment

    DTIC Science & Technology

    2007-12-01

    cascades over the inertial subrange of the atmosphere, the Hurst parameter H = 1/3. For uncorrelated Brownian motion H = 1/2. ... Pollock, Editor, SPIE Optical Engineering Press, Chapter 6, pp. 359-493. Hoock, Donald W. Jr., 2002a: “New Transmission Algorithms for Modeling

  3. Modelling Three-Dimensional Sound Propagation in Wedge Environments

    NASA Astrophysics Data System (ADS)

    Austin, Melanie Elizabeth

    Ocean environments with sloped seafloors can give rise to sound paths that do not remain in a constant plane of propagation. Numerical modelling of sound fields in such environments requires the use of computer models that fully account for out-of-plane sound propagation effects. The inclusion of these three-dimensional effects can be computationally intensive and the effects are often neglected in computer sound propagation codes. The current state-of-the-art in sound propagation modelling has seen the development of models that can fully account for out-of-plane sound propagation. Such a model has been implemented in this research to provide acoustic consultants JASCO Applied Sciences with an important tool for environmental noise impact assessment in complicated marine environments. The model is described and validation results are shown for benchmark test cases. The model is also applied to study three-dimensional propagation effects in measured data from a realistic ocean environment. Particular analysis techniques assist in the interpretation of the modelled sound field for this physical test environment, providing new insight into the characteristics of the test environment.

  4. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources play an important role, yet in practice their activities are usually not considered by Internet resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, the goals of their informational search, more quickly. The paper presents a model that analyzes user audience behavior to determine users' goals within a particular hypertext segment and finds optimal routes for reaching those goals in terms of route length and informational value. The main potential application of the proposed model is in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.

  5. Adaptive User Model for Web-Based Learning Environment.

    ERIC Educational Resources Information Center

    Garofalakis, John; Sirmakessis, Spiros; Sakkopoulos, Evangelos; Tsakalidis, Athanasios

    This paper describes the design of an adaptive user model and its implementation in an advanced Web-based Virtual University environment that encompasses combined and synchronized adaptation between educational material and well-known communication facilities. The Virtual University environment has been implemented to support a postgraduate…

  6. LIGHT-INDUCED PROCESSES AFFECTING ENTEROCOCCI IN AQUATIC ENVIRONMENTS

    EPA Science Inventory

    Fecal indicator bacteria such as enterococci have been used to assess contamination of freshwater and marine environments by pathogenic microorganisms. Various past studies have shown that sunlight plays an important role in reducing concentrations of culturable enterococci and ...

  7. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings, and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  8. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  9. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  10. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which not only could have quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used must not only be clearly and unambiguously defined but must also be subject to peer review. If a model meets all of these requirements then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; however, most importantly it allows something which so far has been impossible without ambiguities and disagreement, namely a comparison of the results of the various models. To date one of the problems (if not the major one) in comparing the results of the various different SEPE

  11. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in an optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time. The program has also been combined with ''neural network'' programs to enable very easy scanning of a wide range of furnace operations. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the ''Cupola Handbook'', Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  12. Exascale Co-design for Modeling Materials in Extreme Environments

    SciTech Connect

    Germann, Timothy C.

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  13. Sensitivity of UO2 Stability in a Reducing Environment on Radiolysis Model Parameters

    SciTech Connect

    Wittman, Richard S.; Buck, Edgar C.

    2012-09-01

    Results for a radiolysis model sensitivity study of radiolytically produced H2O2 are presented as they relate to Spent (or Used) Light Water Reactor uranium oxide (UO2) nuclear fuel (UNF) oxidation in a low oxygen environment. The model builds on previous reaction kinetic studies to represent the radiolytic processes occurring at the nuclear fuel surface. Hydrogen peroxide (H2O2) is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment.

  14. Integrated approaches to the application of advanced modeling technology in process development and optimization

    SciTech Connect

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  15. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  16. Use of terrestrial laser scanning (TLS) for monitoring and modelling of geomorphic processes and phenomena at a small and medium spatial scale in Polar environment (Scott River — Spitsbergen)

    NASA Astrophysics Data System (ADS)

    Kociuba, Waldemar; Kubisz, Waldemar; Zagórski, Piotr

    2014-05-01

    The application of Terrestrial Laser Scanning (TLS) for precise modelling of land relief and quantitative estimation of spatial and temporal transformations can contribute to better understanding of catchment-forming processes. Experimental field measurements utilising the 3D laser scanning technology were carried out within the Scott River catchment located in the NW part of the Wedel Jarlsberg Land (Spitsbergen). The measurements concerned the glacier-free part of the Scott River valley floor with a length of 3.5 km and width from 0.3 to 1.5 km and were conducted with a state-of-the-art medium-range stationary laser scanner, a Leica Scan Station C10. A complex set of measurements of the valley floor were carried out from 86 measurement sites interrelated by the application of 82 common 'target points'. During scanning, from 5 to 19 million measurements were performed at each of the sites, and a point-cloud constituting a 'model space' was obtained. By merging individual 'model spaces', a Digital Surface Model (DSM) of the Scott River valley was obtained, with a co-registration error not exceeding ± 9 mm. The accuracy of the model permitted precise measurements of dimensions of landforms of varied scales on the main valley floor and slopes and in selected sub-catchments. The analyses verified the efficiency of the measurement system in Polar meteorological conditions of Spitsbergen in mid-summer.

  17. An Extended Stochastic Petri Nets Modeling Method for Collaborative Workflow Process

    NASA Astrophysics Data System (ADS)

    Yi, Yang

    Workflow process modeling is important for BPR, but some classic process modeling methods have defects such as weak descriptive ability and high modeling complexity. In this paper, we explore an extended stochastic Petri Net modeling method based on basic Petri Nets. This method can model concurrent, collaborative workflow processes in stochastic environments.

  18. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.

  19. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  20. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  1. Electric discharge processes in the ISS plasma environment

    NASA Astrophysics Data System (ADS)

    Tverdokhlebova, E. M.; Korsun, A. G.; Gabdullin, F. F.; Karabadzhak, G. F.

    We consider the behaviour of the electric discharges which can be initiated between structural elements of the International Space Station (ISS) by the electric field of high-voltage solar arrays (HVSA). The characteristics of the ISS plasma environment are evaluated taking into account the influence of space ionizing fluxes, the Earth's magnetic field, and the HVSA's electric field. We present the formulation of the space experiment "Plasma-ISS", the aim of which is to investigate, using optical emission characteristics, the parameters of the ISS plasma environment formed during operation of both the onboard engines and other plasma sources.

  2. Time of arrival through interacting environments: Tunneling processes

    NASA Astrophysics Data System (ADS)

    Aoki, Ken-Ichi; Horikoshi, Atsushi; Nakamura, Etsuko

    2000-08-01

    We discuss the propagation of wave packets through interacting environments. Such environments generally modify the dispersion relation or shape of the wave function. To study such effects in detail, we define the distribution function PX(T), which describes the arrival time T of a packet at a detector located at point X. We calculate PX(T) for wave packets traveling through a tunneling barrier and find that our results actually explain recent experiments. We compare our results with Nelson's stochastic interpretation of quantum mechanics and resolve a paradox previously apparent in Nelson's viewpoint about the tunneling time.

  3. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture.

    PubMed

    Rooney, Kevin K; Condia, Robert J; Loschky, Lester C

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  4. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture

    PubMed Central

    Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one’s fist at arm’s length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  5. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  6. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  7. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2008-01-01

    This report presents a pilot study of an integration of particle swarm algorithm, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight and understanding of social group knowledge discovering and strategic searching. A new adaptive environment model, which dynamically reacts to the group collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not the necessary requirement for whole self-organized groups to achieve the efficient collective searching behavior in the adaptive environment.
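
    The particle swarm metaphor applied above can be illustrated with a minimal sketch. This is a generic textbook particle swarm optimizer, not the report's implementation; the function names, parameters, and test objective are our own:

```python
import random

def pso_search(objective, dim, n_particles=20, iters=100,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: each particle's velocity is
    pulled toward its own best-known position and the group's best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Usage: minimize the sphere function; the swarm converges near the origin.
best, best_val = pso_search(lambda p: sum(x * x for x in p), dim=2)
```

    Note that `gbest` is shared among all particles, which is precisely the inter-group communication whose necessity the report questions.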

  8. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2007-01-01

    This report presents a pilot study of an integration of particle swarm algorithm, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight and understanding of social group knowledge discovering and strategic searching. A new adaptive environment model, which dynamically reacts to the group collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not the necessary requirement for whole self-organized groups to achieve the efficient collective searching behavior in the adaptive environment.

  9. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  10. Intelligent sensing in dynamic environments using Markov decision process.

    PubMed

    Nanayakkara, Thrishantha; Halgamuge, Malka N; Sridhar, Prasanna; Madni, Asad M

    2011-01-01

    In a network of low-powered wireless sensors, it is essential to capture as many environmental events as possible while still preserving the battery life of the sensor node. This paper focuses on a real-time learning algorithm to extend the lifetime of a sensor node to sense and transmit environmental events. A common method that is generally adopted in ad-hoc sensor networks is to periodically put the sensor nodes to sleep. The purpose of the learning algorithm is to couple the sensor's sleeping behavior to the natural statistics of the environment so that it remains in harmony with environmental changes: the sensor can sleep when the environment is steady and stay awake when it is turbulent. This paper presents theoretical and experimental validation of a reward-based learning algorithm that can be implemented on an embedded sensor. The key contribution of the proposed approach is the design and implementation of a reward function that satisfies a trade-off between the above two mutually contradictory objectives, and a linear critic function to approximate the discounted sum of future rewards in order to perform policy learning.
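
    The trade-off described above, a reward balancing captured events against energy spent awake, with a critic tracking discounted future reward, can be sketched in a toy form. All names, constants, and the policy-update rule below are our own simplifications, not the paper's algorithm:

```python
def simulate(event_rate_schedule, alpha=0.1, gamma=0.9):
    """Toy reward-driven duty cycling: a TD(0) critic estimates the
    discounted sum of future rewards, and the node sleeps more when
    staying awake is not paying off (negative TD error)."""
    value = 0.0           # critic's estimate of discounted future reward
    sleep_fraction = 0.5  # policy parameter: fraction of each period asleep
    history = []
    for rate in event_rate_schedule:
        awake = 1.0 - sleep_fraction
        # reward = expected events captured minus energy cost of being awake
        reward = rate * awake - 0.2 * awake
        td_error = reward + gamma * value - value  # TD(0) error
        value += alpha * td_error
        # Sleep more when TD error is negative, less when positive.
        sleep_fraction = min(0.9, max(0.1, sleep_fraction - 0.05 * td_error))
        history.append(sleep_fraction)
    return history

# A busy environment drives the node toward staying awake; a quiet one
# drives it toward sleeping, matching the behavior the abstract describes.
busy = simulate([2.0] * 50)
quiet = simulate([0.0] * 50)
```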

  11. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    ERIC Educational Resources Information Center

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  12. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal

    2001-04-16

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. 
The model development, input data, sensitivity and validation studies described in this AMR are required

  13. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. 
The model development, input data, sensitivity and validation studies described in

  14. Econobiophysics - game of choosing. Model of selection or election process with diverse accessible information

    PubMed Central

    2011-01-01

    We propose several models applicable to both selection and election processes in which each selecting or electing subject has access to different information about the objects to choose from. We wrote special software to simulate these processes. We consider both the case in which the environment is neutral (a natural process) and the case in which the environment is involved (a controlled process). PMID:21892959

  15. Supporting Inquiry Processes with an Interactive Learning Environment: Inquiry Island

    NASA Astrophysics Data System (ADS)

    Eslinger, Eric; White, Barbara; Frederiksen, John; Brobst, Joseph

    2008-12-01

    This research addresses the effectiveness of an interactive learning environment, Inquiry Island, as a general-purpose framework for the design of inquiry-based science curricula. We introduce the software as a scaffold designed to support the creation and assessment of inquiry projects, and describe its use in a middle-school genetics unit. Students in the intervention showed significant gains in inquiry skills. We also illustrate the power of the software to gather and analyze qualitative data about student learning.

  16. Active microrheology of a model of the nuclear micromechanical environment

    NASA Astrophysics Data System (ADS)

    Byrd, Henry; Kilfoil, Maria

    2014-03-01

    In order to successfully complete the final stages of chromosome segregation, eukaryotic cells require the motor enzyme topoisomerase II, which can resolve topological constraints between entangled strands of duplex DNA. We created an in vitro model closely approximating the nuclear micromechanical environment in terms of DNA mass and entanglement density, and investigated the influence of this motor enzyme on the DNA mechanics. Topoisomerase II is a non-processive ATPase which we found significantly increases the motions of embedded microspheres in the DNA network. Because of this activity, we study the mechanical properties of our model system by active microrheology using optical trapping. We test the limits of the fluctuation-dissipation theorem (FDT) under this type of activity by comparing the active microrheology to passive measurements, in which thermal motion alone drives the beads. We can relate any departure from the FDT to the timescale of topoisomerase II activity in the DNA network. These experiments provide insight into the physical necessity of this motor enzyme in the cell.

  17. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
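
    As a concrete instance of the partial difference equations such a language builds on, a 1-D diffusion process (a standard discrete model of smoothing processes such as hillslope sediment transport) can be stepped explicitly. This sketch is illustrative only and is not taken from the paper:

```python
def diffuse(height, kappa=0.25, steps=50):
    """Explicit partial difference equation for 1-D diffusion:
        h[i] += kappa * (h[i-1] - 2*h[i] + h[i+1])
    Boundary cells are held fixed; kappa <= 0.5 keeps the scheme stable."""
    h = list(height)
    for _ in range(steps):
        new = h[:]
        for i in range(1, len(h) - 1):
            new[i] = h[i] + kappa * (h[i - 1] - 2 * h[i] + h[i + 1])
        h = new
    return h

# A sharp peak spreads out symmetrically and flattens over time.
profile = diffuse([0, 0, 0, 10, 0, 0, 0])
```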

  18. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  19. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both space flight manned and unmanned payload simulation and training. Significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model to accepting future modification. Results of this effort have suggested that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  20. Heuristic and Linear Models of Judgment: Matching Rules and Environments

    ERIC Educational Resources Information Center

    Hogarth, Robin M.; Karelaia, Natalia

    2007-01-01

    Much research has highlighted incoherent implications of judgmental heuristics, yet other findings have demonstrated high correspondence between predictions and outcomes. At the same time, judgment has been well modeled in the form of as if linear models. Accepting the probabilistic nature of the environment, the authors use statistical tools to…

  1. Inquiry, play, and problem solving in a process learning environment

    NASA Astrophysics Data System (ADS)

    Thwaits, Anne Y.

    What is the nature of art/science collaborations in museums? How do art objects and activities contribute to the successes of science centers? Based on the premise that art exhibitions and art-based activities engage museum visitors in different ways than do strictly factual, information-based displays, I address these questions in a case study that examines the roles of visual art and artists in the Exploratorium, a museum that has influenced exhibit design and professional practice in many of the hands-on science centers in the United States and around the world. The marriage of art and science in education is not a new idea---Leonardo da Vinci and other early polymaths surely understood how their various endeavors informed one another, and some 20th century educators understood the value of the arts and creativity in the learning and practice of other disciplines. When, in 2010, the National Science Teachers Association added an A to the federal government's ubiquitous STEM initiative and turned it into STEAM, art educators nationwide took notice. With a heightened interest in the integration of and collaboration between disciplines comes an increased need for models of best practice for educators and educational institutions. With the intention to understand the nature of such collaborations and the potential they hold, I undertook this study. I made three site visits to the Exploratorium, where I took photos, recorded notes in a journal, interacted with exhibits, and observed museum visitors. I collected other data by examining the institution's website, press releases, annual reports, and fact sheets; and by reading popular and scholarly articles written by museum staff members and by independent journalists. 
I quickly realized that the Exploratorium was not created in the way that most museums are, and the history of its founding and the ideals of its founder illuminate what was then and continues now to be different about this museum from most others in the

  2. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  3. Report of the 2014 Programming Models and Environments Summit

    SciTech Connect

    Heroux, Michael; Lethin, Richard

    2016-09-19

    Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  4. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
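
    A minimal sketch of such a first-order birth/death chain, including the kind of input error checking the summary highlights, might look like the following (the interface is hypothetical, not the original program's):

```python
import random

def simulate_population(n0, birth_rate, death_rate, steps, seed=0):
    """First-order Markov birth/death sketch: each individual independently
    gives birth with probability birth_rate and dies with probability
    death_rate per step. Inputs are checked before simulation begins."""
    if n0 < 0 or steps < 0:
        raise ValueError("population and steps must be non-negative")
    if not (0 <= birth_rate <= 1 and 0 <= death_rate <= 1):
        raise ValueError("rates must be probabilities in [0, 1]")
    rng = random.Random(seed)
    n = n0
    trajectory = [n]
    for _ in range(steps):
        births = sum(rng.random() < birth_rate for _ in range(n))
        deaths = sum(rng.random() < death_rate for _ in range(n))
        n = max(0, n + births - deaths)
        trajectory.append(n)
    return trajectory
```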

  5. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  6. Charged Particle Environment Definition for NGST: Model Development

    NASA Technical Reports Server (NTRS)

    Blackwell, William C.; Minow, Joseph I.; Evans, Steven W.; Hardage, Donna M.; Suggs, Robert M.

    2000-01-01

    NGST will operate in a halo orbit about the L2 point, 1.5 million km from the Earth, where the spacecraft will periodically travel through the magnetotail region. There are a number of tools available to calculate the high energy, ionizing radiation particle environment from galactic cosmic rays and from solar disturbances. However, space environment tools are not generally available to provide assessments of charged particle environment and its variations in the solar wind, magnetosheath, and magnetotail at L2 distances. An engineering-level phenomenology code (LRAD) was therefore developed to facilitate the definition of charged particle environments in the vicinity of the L2 point in support of the NGST program. LRAD contains models tied to satellite measurement data of the solar wind and magnetotail regions. The model provides particle flux and fluence calculations necessary to predict spacecraft charging conditions and the degradation of materials used in the construction of NGST. This paper describes the LRAD environment models for the deep magnetotail (XGSE < -100 Re) and solar wind, and presents predictions of the charged particle environment for NGST.

  7. Genomic Prediction of Genotype × Environment Interaction Kernel Regression Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Soberanis, Víctor; Pérez-Elizalde, Sergio; Pérez-Rodríguez, Paulino; Campos, Gustavo de Los; Montesinos-López, O A; Burgueño, Juan

    2016-11-01

    In genomic selection (GS), genotype × environment interaction (G × E) can be modeled by a marker × environment interaction (M × E). The G × E may be modeled through a linear kernel or a nonlinear (Gaussian) kernel. In this study, we propose using two nonlinear Gaussian kernels: the reproducing kernel Hilbert space with kernel averaging (RKHS KA) and the Gaussian kernel with the bandwidth estimated through an empirical Bayesian method (RKHS EB). We performed single-environment analyses and extended to account for G × E interaction (GBLUP-G × E, RKHS KA-G × E and RKHS EB-G × E) in wheat (Triticum aestivum L.) and maize (Zea mays L.) data sets. For single-environment analyses of wheat and maize data sets, RKHS EB and RKHS KA had higher prediction accuracy than GBLUP for all environments. For the wheat data, the RKHS KA-G × E and RKHS EB-G × E models showed up to 60 to 68% superiority over the corresponding single environment for pairs of environments with positive correlations. For the wheat data set, the models with Gaussian kernels had accuracies up to 17% higher than that of GBLUP-G × E. For the maize data set, the prediction accuracy of RKHS EB-G × E and RKHS KA-G × E was, on average, 5 to 6% higher than that of GBLUP-G × E. The superiority of the Gaussian kernel models over the linear kernel is due to more flexible kernels that account for small, more complex marker main effects and marker-specific interaction effects.
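
    The Gaussian kernel at the heart of such RKHS models can be computed directly from a marker matrix. This sketch uses a median-squared-distance scaling as a simplified stand-in for the kernel-averaging and empirical Bayes bandwidth choices the study actually compares:

```python
import math

def gaussian_kernel(markers, h=1.0):
    """Gaussian kernel matrix K[i][j] = exp(-h * d2[i][j] / scale), where
    d2 is the squared Euclidean distance between individuals' marker
    vectors and scale is the median off-diagonal squared distance."""
    n = len(markers)
    d2 = [[sum((a - b) ** 2 for a, b in zip(markers[i], markers[j]))
           for j in range(n)] for i in range(n)]
    off = [d2[i][j] for i in range(n) for j in range(n) if i != j]
    scale = sorted(off)[len(off) // 2] or 1.0  # median squared distance
    return [[math.exp(-h * d2[i][j] / scale) for j in range(n)]
            for i in range(n)]
```

    The resulting matrix is symmetric with ones on the diagonal and can be plugged into a kernel regression (e.g. GBLUP-style mixed model) in place of the linear genomic relationship matrix.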

  8. Modeling and Performance Simulation of the Mass Storage Network Environment

    NASA Technical Reports Server (NTRS)

    Kim, Chan M.; Sang, Janche

    2000-01-01

    This paper describes the application of modeling and simulation in evaluating and predicting the performance of the mass storage network environment. Network traffic is generated to mimic the realistic pattern of file transfer, electronic mail, and web browsing. The behavior and performance of the mass storage network and a typical client-server Local Area Network (LAN) are investigated by modeling and simulation. Performance characteristics in throughput and delay demonstrate the important role of modeling and simulation in network engineering and capacity planning.
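
    The throughput/delay behavior such simulations measure can be reproduced at toy scale with a single-server queue (M/M/1). This sketch is generic queueing simulation, unrelated to the paper's actual network model:

```python
import random

def mm1_mean_delay(arrival_rate, service_rate, n_jobs=20000, seed=0):
    """Simulate an M/M/1 queue (exponential interarrival and service
    times, first-come first-served) and return the mean sojourn time
    (waiting plus service) per job."""
    rng = random.Random(seed)
    t_arrive = 0.0
    server_free = 0.0
    total_delay = 0.0
    for _ in range(n_jobs):
        t_arrive += rng.expovariate(arrival_rate)
        start = max(t_arrive, server_free)      # wait if server is busy
        server_free = start + rng.expovariate(service_rate)
        total_delay += server_free - t_arrive
    return total_delay / n_jobs

# Delay grows sharply as utilization approaches 1 (theory: 1/(mu - lambda)).
light = mm1_mean_delay(0.5, 1.0)
heavy = mm1_mean_delay(0.9, 1.0)
```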

  9. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  10. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed.

  11. A comparative analysis of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    There are many business process modelling techniques in use today. This article examines the differences among them, explaining the definition and structure of each technique. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation, and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serve as the basis for evaluating further modelling techniques.

  12. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    DTIC Science & Technology

    1986-01-01

    Research Note 86-06, "The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package," Ronald G... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle; BIFV. Abstract continued in the companion volume, "The Analytic Process Model for

  13. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participants, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…
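
    The underlying process-dissociation equations (L. L. Jacoby, 1991) are simple enough to state directly; the hierarchical model's contribution is to replace the aggregated probabilities fed into them with participant- and item-level distributions:

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Solve the standard process-dissociation equations
        P(inclusion) = R + (1 - R) * A
        P(exclusion) = (1 - R) * A
    for R (controlled recollection) and A (automatic influence)."""
    r = p_inclusion - p_exclusion
    if r >= 1.0:
        raise ValueError("A is undefined when inclusion - exclusion >= 1")
    a = p_exclusion / (1.0 - r)
    return r, a

# Example: inclusion 0.8, exclusion 0.3 gives R = 0.5, A = 0.6.
r, a = process_dissociation(0.8, 0.3)
```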

  14. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures by the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, modeling the character's decision making, modeling the character's movements, modeling the character's interaction with the environment and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human vision, smell, touch and hearing, and the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  15. Commercial applications in biomedical processing in the microgravity environment

    NASA Astrophysics Data System (ADS)

    Johnson, Terry C.; Taub, Floyd

    1995-01-01

    A series of studies have shown that a purified cell regulatory sialoglycopeptide (CeReS) that arrests cell division and induces cellular differentiation is fully capable of functionally interacting with target insect and mammalian cells in the microgravity environment. Data from several shuttle missions suggest that the signal transduction events that are known to be associated with CeReS action function as well in microgravity as in ground-based experiments. The molecular events known to be associated with CeReS include an ability to interfere with Ca2+ metabolism, the subsequent alkalinization of cell cytosol, and the inhibition of the phosphorylation of the nuclear protein product encoded by the retinoblastoma (RB) gene. The ability of CeReS to function in microgravity opens a wide variety of applications in space life sciences.

  16. Using process groups to implement failure detection in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1991-01-01

    Agreement on the membership of a group of processes in a distributed system is a basic problem that arises in a wide range of applications. Such groups occur when a set of processes cooperate to perform some task, share memory, monitor one another, subdivide a computation, and so forth. The group membership problem is discussed as it relates to failure detection in asynchronous, distributed systems. A rigorous, formal specification for group membership is presented under this interpretation. A solution is then presented for this problem.
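
    A timeout-based failure detector is the local building block beneath such membership services. The function and data layout below are hypothetical; the paper's contribution is the agreement protocol layered on top of local suspicion, which this sketch deliberately omits:

```python
def update_membership(members, last_heartbeat, now, timeout):
    """Return the locally suspected membership view: any member with no
    heartbeat within `timeout` time units is suspected failed and dropped.
    In a truly asynchronous system such suspicion can be wrong (a slow
    process looks like a crashed one), which is why group membership
    requires agreement among processes rather than local decisions alone."""
    return sorted(p for p in members
                  if now - last_heartbeat.get(p, float("-inf")) <= timeout)

# Member "b" has been silent for 8 time units and is dropped from the view.
view = update_membership({"a", "b", "c"},
                         {"a": 9.5, "b": 2.0, "c": 9.9},
                         now=10.0, timeout=3.0)
```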

  17. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  18. Dissolving decision making? Models and their roles in decision-making processes and policy at large.

    PubMed

    Zeiss, Ragna; van Egmond, Stans

    2014-12-01

    This article studies the roles that three science-based models play in Dutch policy and decision-making processes. Key to this is the interaction between model construction and environment: the political and scientific environments of the models form contexts that shape their roles in policy decision making. Attention is paid to three aspects of the wider context of the models: a) the history of the construction process; b) (changes in) the political and scientific environments; and c) the use in policy processes over longer periods of time. Models are used more successfully when they are constructed in a stable political and scientific environment. Stability and certainty within a scientific field appear to be key predictors of the usefulness of models for policy making. The economic model is more disputed than the ecology-based model and the model that has its theoretical foundation in physics and chemistry. The roles models play in policy processes are too complex for models to be considered straightforward technocratic powers.

  19. A Practical Environment to Apply Model-Driven Web Engineering

    NASA Astrophysics Data System (ADS)

    Escalona, Maria Jose; Gutiérrez, J. J.; Morero, F.; Parra, C. L.; Nieto, J.; Pérez, F.; Martín, F.; Llergo, A.

    The application of the model-driven paradigm to the development of Web systems has yielded very good research results. Several research groups are defining metamodels, transformations, and tools which offer a suitable environment, known as model-driven Web engineering (MDWE). However, there are very few practical experiences in real Web system developments using real development teams. This chapter presents a practical MDWE environment based on the use of NDT (navigational development techniques) and Java Web systems, and it provides a practical evaluation of its application within a real project: Diraya.

  20. Large urban fire environment. Trends and model city predictions

    SciTech Connect

    Larson, D.A.; Small, R.D.

    1982-01-01

    The urban fire environment that would result from a megaton-yield nuclear weapon burst is considered. The dependence of temperatures and velocities on fire size, burning intensity, turbulence, and radiation is explored, and specific calculations for three model urban areas are presented. In all cases, high velocity fire winds are predicted. The model-city results show the influence of building density and urban sprawl on the fire environment. Additional calculations consider large-area fires with the burning intensity reduced in a blast-damaged urban center.

  1. Multi-Environment Model Estimation for Motility Analysis of Caenorhabditis elegans

    PubMed Central

    Sznitman, Raphael; Gupta, Manaswi; Hager, Gregory D.; Arratia, Paulo E.; Sznitman, Josué

    2010-01-01

    The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays. PMID:20661478
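The MOG idea at the core of MEME can be caricatured in a few lines: learn intensity statistics for the background and the nematode from labeled pixels, then segment a new image by comparing class likelihoods pixel-wise. This is a simplified single-Gaussian-per-class sketch with made-up numbers, not the MEME implementation:

```python
import numpy as np

# Sketch only: one Gaussian per class on grayscale intensities (the actual
# MEME framework uses richer mixture models and a single labeled image).

def fit_gaussian(pixels):
    return float(np.mean(pixels)), float(np.std(pixels) + 1e-6)

def log_likelihood(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

# Toy training data: dark worm (~0.2) on a bright background (~0.8).
rng = np.random.default_rng(0)
bg_mu, bg_sig = fit_gaussian(rng.normal(0.8, 0.05, 1000))
worm_mu, worm_sig = fit_gaussian(rng.normal(0.2, 0.05, 1000))

# Segment a tiny "image": each pixel goes to the likelier class.
image = np.array([[0.82, 0.79, 0.21],
                  [0.78, 0.19, 0.80]])
mask = log_likelihood(image, worm_mu, worm_sig) > log_likelihood(image, bg_mu, bg_sig)
```

A skeleton for motility quantification would then be extracted from the connected foreground region of `mask`.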

  2. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.
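A timed Petri net of the kind used here can be sketched minimally: places hold tokens, and firing a transition consumes input tokens, produces output tokens, and advances the clock by the transition's duration. The class and the toy development pipeline below are illustrative assumptions, not the paper's model:

```python
# Minimal timed Petri net sketch (assumed structure, for illustration only).

class TimedPetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.clock = 0.0

    def enabled(self, inputs):
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, inputs, outputs, duration):
        # Consume input tokens, produce output tokens, advance time.
        assert self.enabled(inputs)
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        self.clock += duration

# A toy development pipeline: requirements -> design -> code.
net = TimedPetriNet({"requirements": 1})
net.fire(["requirements"], ["design"], duration=5.0)   # design phase
net.fire(["design"], ["code"], duration=12.0)          # coding phase
```

In a real process model, transition durations would be random variables and the simulation would sample many runs to estimate cost and time-to-market distributions.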

  3. Ada COCOMO and the Ada Process Model

    DTIC Science & Technology

    1989-01-01

    …language, the use of incremental development, and the use of the Ada process model, capitalizing on the strengths of Ada to improve the efficiency of software development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model. The remainder of this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software…

  4. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  5. Compound Cue Processing in Linearly and Nonlinearly Separable Environments

    ERIC Educational Resources Information Center

    Hoffrage, Ulrich; Garcia-Retamero, Rocio; Czienskowski, Uwe

    2008-01-01

    Take-the-best (TTB) is a fast and frugal heuristic for paired comparison that has been proposed as a model of bounded rationality. This heuristic has been criticized for not taking compound cues into account to predict a criterion, although such an approach is sometimes required to make accurate predictions. By means of computer simulations, it is…
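For reference, the TTB heuristic itself is simple to state in code: inspect cues in descending order of validity and decide on the first cue that discriminates between the two objects. A minimal sketch with hypothetical binary cues:

```python
# Take-the-best (TTB) for paired comparison; cue values are assumed binary
# (1 = cue present, 0 = absent) and cues are ordered by validity.

def take_the_best(obj_a, obj_b, cues_by_validity):
    """Return 'a', 'b', or 'tie' using the first discriminating cue."""
    for cue in cues_by_validity:
        va, vb = obj_a[cue], obj_b[cue]
        if va != vb:
            return "a" if va > vb else "b"
    return "tie"

# Hypothetical city-size comparison with made-up cue values.
cues = ["capital", "airport", "university"]
city_a = {"capital": 0, "airport": 1, "university": 1}
city_b = {"capital": 0, "airport": 0, "university": 1}
winner = take_the_best(city_a, city_b, cues)   # 'airport' discriminates
```

The criticism discussed in the article is that this search stops at single cues, whereas a compound-cue strategy would combine several cues before deciding.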

  6. Containerless processing of single crystals in low-G environment

    NASA Technical Reports Server (NTRS)

    Walter, H. U.

    1974-01-01

    Experiments on containerless crystal growth from the melt were conducted during Skylab missions SL3 and SL4 (Skylab Experiment M-560). Six samples of InSb were processed, one of them heavily doped with selenium. The concept of the experiment is discussed and related to general crystal growth methods and their merits as techniques for containerless processing in space. The morphology of the crystals obtained is explained in terms of volume changes associated with solidification and wetting conditions during solidification. All samples exhibit extremely well developed growth facets. Analysis by X-ray topographical methods and chemical etching shows that the crystals are of high structural perfection. Average dislocation density as revealed by etching is of the order of 100 per sq cm; no dislocation clusters could be observed in the space-grown samples. A sequence of striations that is observed in the first half of the selenium-doped sample is explained as being caused by periodic surface breakdown.

  7. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    …Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture… planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program…

  8. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms running on different types of hardware and written in languages other than that of VHP. The VHP system is of particular interest for applications in which the visualization of remote processes is required, such as robotics for telescience.

  9. Employing Noisy Environments to Support Quantum Information Processing

    DTIC Science & Technology

    2007-11-02

    Authors: Martin B. Plenio and Susana F. Huelga. Funding number: DAAD19-02-1-0161. Approved for public release; distribution unlimited. …entanglement dynamics can be achieved in such a system. The results of this work have been published in E. Jané, M.B. Plenio and D. Jonathan, "Quantum…

  10. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, and the perceived risk in the UK appears to have increased in recent years as surface water flood events have become more severe and frequent. Surface water flood risk currently accounts for one third of all UK flood risk, with approximately two million people living in urban areas at risk of a 1-in-200-year flood event. Research often focuses on using numerical modelling techniques to understand the extent, depth, and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, the field data available for model calibration and validation are limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data on which numerical models are based are often erroneous and inconclusive. Physical models offer a novel, alternative environment for data collection, creating a controlled, closed system in which independent variables can be altered individually to investigate cause-and-effect relationships. A physical modelling environment provides a suitable platform to investigate the rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. 
Scaled laboratory experiments using a 9 m2, two-tiered, 1:100 physical model consisting of (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled
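The CUC figure quoted above is the Christiansen uniformity coefficient, computed from catch depths measured at gauges distributed across the plot; a sketch of the formula with made-up readings:

```python
# Christiansen uniformity coefficient:
#   CUC = 100 * (1 - sum(|x_i - mean|) / (n * mean))
# where x_i are catch depths across the plot. The readings below are
# hypothetical, not data from the study.

def christiansen_cuc(depths):
    n = len(depths)
    mean = sum(depths) / n
    return 100.0 * (1.0 - sum(abs(d - mean) for d in depths) / (n * mean))

catch_depths_mm = [9.5, 10.2, 10.0, 9.8, 10.5]  # hypothetical gauge readings
cuc = christiansen_cuc(catch_depths_mm)          # >75% meets the stated criterion
```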

  11. Sensitivity of membranes to their environment. Role of stochastic processes.

    PubMed Central

    Offner, F F

    1984-01-01

    Ionic flow through biomembranes often exhibits a sensitivity to the environment, which is difficult to explain by classical theory, that usually assumes that the free energy available to change the membrane permeability results from the environmental change acting directly on the permeability control mechanism. This implies, for example, that a change delta V in the trans-membrane potential can produce a maximum free energy change, delta V X q, on a gate (control mechanism) carrying a charge q. The analysis presented here shows that when stochastic fluctuations are considered, under suitable conditions (gate cycle times rapid compared with the field relaxation time within a channel), the change in free energy is limited, not by the magnitude of the stimulus, but by the electrochemical potential difference across the membrane, which may be very much greater. Conformational channel gates probably relax more slowly than the field within the channel; this would preclude appreciable direct amplification of the stimulus. It is shown, however, that the effect of impermeable cations such as Ca++ is to restore the amplification of the stimulus through its interaction with the electric field. The analysis predicts that the effect of Ca++ should be primarily to affect the number of channels that are open, while only slightly affecting the conductivity of an open channel. PMID:6093903
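The prediction that Ca++ primarily changes the number of open channels, rather than single-channel conductance, can be pictured with a toy two-state stochastic gate whose open/close rates set the equilibrium open probability. This is an illustrative Markov-chain sketch, not the paper's model:

```python
import random

# Two-state channel gate as a discrete-time Markov chain (illustrative only):
# transitions closed->open at rate k_open and open->closed at rate k_close.
# The equilibrium open probability is k_open / (k_open + k_close).

def simulate_open_fraction(k_open, k_close, steps, dt, seed=1):
    rng = random.Random(seed)
    open_state, open_time = False, 0
    for _ in range(steps):
        if open_state:
            if rng.random() < k_close * dt:
                open_state = False
        else:
            if rng.random() < k_open * dt:
                open_state = True
        open_time += open_state
    return open_time / steps

p_open = simulate_open_fraction(k_open=2.0, k_close=2.0, steps=200000, dt=0.01)
# With equal rates, the equilibrium open probability is 0.5.
```

A stimulus that shifts `k_open` relative to `k_close` changes the fraction of open channels without touching the conductance of any individual open channel.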

  12. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-08-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot

  13. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-11-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 μm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  14. Applicability of the protein environment equilibrium approximation for describing ultrafast biophysical processes

    NASA Astrophysics Data System (ADS)

    Poddubnyy, V. V.; Glebov, I. O.; Sudarkova, S. M.

    2015-06-01

    The theoretical description of ultrafast processes in biological systems, in particular electron transfer in photosynthetic reaction centers, is an important problem in modern biological physics. Because these processes occur in a protein medium with which energy exchange is possible, methods of the quantum theory of open systems must be used to describe them. But because of the high process rates and the specifics of the protein environment, the basic approximations of this theory might be inapplicable. We study the applicability of the approximation that the protein environment (bath) state is invariant for the dissipative dynamics of charge transfer between pigment molecules contained in reaction centers. For this, we use model systems whose parameters are close to real ones. We conclude that this approximation can be used to describe both the monotonic and the oscillating dynamics of the reaction subsystem in large biological molecules. We consider various mechanisms for bath thermalization and show that bath thermalization occurs not because of the intramolecular redistribution of vibrational energy in the bath but because of its coupling to the reaction subsystem.

  15. Modeling integrated sensor/actuator functions in realistic environments

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Wan; Varadan, Vasundara V.; Varadan, Vijay K.

    1993-07-01

    Smart materials are expected to adapt to their environment and provide a useful response to changes in the environment. Both the sensor and actuator functions, with the appropriate feedback mechanism, must be integrated and comprise the `brains' of the material. Piezoelectric ceramics have proved to be effective as both sensors and actuators for a wide variety of applications. Thus, realistic simulation models are needed that can predict the performance of smart materials that incorporate piezoceramics. The environment may include the structure on which the transducers are mounted, the fluid medium, and material damping. In all cases, the smart material should sense the change and make a useful response. A hybrid numerical method is used, involving finite element modeling in the plate structure and transducer region and a plane wave representation in the fluid region. Simulations of the performance of such smart materials are then performed.

  16. Multiscale simulation of molecular processes in cellular environments

    NASA Astrophysics Data System (ADS)

    Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone

    2016-11-01

    We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  17. Multiscale simulation of molecular processes in cellular environments.

    PubMed

    Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone

    2016-11-13

    We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  18. Modelling between Epistemological Beliefs and Constructivist Learning Environment

    ERIC Educational Resources Information Center

    Çetin-Dindar, Ayla; Kirbulut, Zübeyde Demet; Boz, Yezdan

    2014-01-01

    The purpose of this study was to model the relationship between pre-service chemistry teachers' epistemological beliefs and their preference to use constructivist-learning environment in their future class. The sample was 125 pre-service chemistry teachers from five universities in Turkey. Two instruments were used in this study. One of the…

  19. A Tutoring and Student Modelling Paradigm for Gaming Environments.

    ERIC Educational Resources Information Center

    Burton, Richard R.; Brown, John Seely

    This paper describes a paradigm for tutorial systems capable of automatically providing feedback and hints in a game environment. The paradigm is illustrated by a tutoring system for the PLATO game "How the West Was Won." The system uses a computer-based "Expert" player to evaluate a student's moves and construct a "differential model" of the…

  20. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application's execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution can find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generating drivers and stubs based on values collected during runtime instead of default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
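The record-then-replay idea can be sketched generically (all names and the mechanism below are illustrative assumptions, not the JPF-Android implementation): wrap a library call so its return values are logged during a normal run, then generate a stub that replays those values during verification instead of returning defaults:

```python
import functools

# Illustrative sketch only: instrument a call to record (args, result) pairs,
# then build a stub that replays recorded values during verification.

recorded = {}

def record(fn):
    @functools.wraps(fn)
    def wrapper(*args):
        result = fn(*args)
        recorded.setdefault(fn.__name__, []).append((args, result))
        return result
    return wrapper

def make_stub(name):
    """Return a stub that replays recorded values instead of defaults."""
    calls = {args: result for args, result in recorded.get(name, [])}
    return lambda *args: calls.get(args)  # None when nothing was recorded

@record
def get_device_id(kind):        # hypothetical stand-in for a native library call
    return {"imei": "490154203237518"}.get(kind)

get_device_id("imei")           # instrumented run populates the log
stub = make_stub("get_device_id")
replayed = stub("imei")
```

An empty stub would have returned a default value here; the replayed value lets verification reach code guarded by checks on the real return value.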

  1. Prevalence and concentration of Salmonella and Campylobacter in the processing environment of small-scale pastured broiler farms

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A growing niche in the locally grown food movement is the small scale production of broiler chickens using the pasture-raised poultry production model. Little research exists that focuses on Salmonella and Campylobacter contamination in the environment associated with on-farm processing of pasture-r...

  2. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance, showing how beneficial modeling can be in optimizing complex chromatographic processes in an industrial environment. The target peptide's elution profile was modeled with a two-site adsorption equilibrium isotherm exhibiting two inflection points, accounting for the variation of the isotherm parameters with the modifier concentration. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out to select the optimal batch process.
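As a hedged illustration of a two-site adsorption model, here is the bi-Langmuir isotherm, a common two-site form; the paper's actual isotherm, which exhibits two inflection points, may differ, and the parameter values below are made up:

```python
# Bi-Langmuir (two-site) adsorption isotherm sketch:
#   q(c) = a1*c / (1 + b1*c) + a2*c / (1 + b2*c)
# where q is the adsorbed amount and c the liquid-phase concentration.
# Parameters are illustrative, not fitted to any data.

def bi_langmuir(c, a1, b1, a2, b2):
    """Adsorbed amount q as a function of concentration c."""
    return a1 * c / (1.0 + b1 * c) + a2 * c / (1.0 + b2 * c)

q_low = bi_langmuir(0.1, a1=5.0, b1=0.5, a2=1.0, b2=10.0)
q_high = bi_langmuir(10.0, a1=5.0, b1=0.5, a2=1.0, b2=10.0)
```

In the inverse method mentioned in the abstract, parameters like `a1, b1, a2, b2` would be regressed by matching simulated elution profiles to measured chromatograms.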

  3. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  4. The Icelandic volcanic aeolian environment: Processes and impacts - A review

    NASA Astrophysics Data System (ADS)

    Arnalds, Olafur; Dagsson-Waldhauserova, Pavla; Olafsson, Haraldur

    2016-03-01

    Iceland has the largest area of volcaniclastic sandy desert on Earth, some 22,000 km2. The sand has been mostly produced by glacio-fluvial processes, leaving behind fine-grained unstable sediments which are later re-distributed by repeated aeolian events. Volcanic eruptions add to this pool of unstable sediments, often from subglacial eruptions. Icelandic desert surfaces are divided into sand fields, sandy lavas and sandy lag gravel, each with separate aeolian surface characteristics such as threshold velocities. Storms are frequent due to Iceland's location on the North Atlantic storm track. Dry winds occur on the leeward sides of mountains and glaciers, in spite of the high moisture content of the Atlantic cyclones. Surface winds often move hundreds to more than 1000 kg m-1 per annum, and more than 10,000 kg m-1 have been measured in a single storm. Desertification occurs as aeolian processes push sand fronts forward; these have destroyed many previously fully vegetated ecosystems since the settlement of Iceland in the late ninth century. There are about 135 dust events per annum, ranging from minor storms to >300,000 t of dust emitted in single storms. Dust production is on the order of 30-40 million tons annually, some traveling over 1000 km and deposited on land and sea. Dust deposited on deserts tends to be re-suspended during subsequent storms. High PM10 concentrations occur during major dust storms. They are more frequent in the wake of volcanic eruptions, such as after the Eyjafjallajökull 2010 eruption. Airborne dust affects human health, with negative effects enhanced by the tubular morphology of the grains and the basaltic composition with its high metal content. Dust deposition on snow and glaciers intensifies melting. Moreover, the dust production probably also influences atmospheric conditions and parameters that affect climate change.

  5. Framework for Modeling the Cognitive Process

    DTIC Science & Technology

    2005-06-16

    Yaworsky, Air Force Research Laboratory/IFSB, Rome, NY. Keywords: cognitive process modeling, cognition, conceptual framework, information… center of our conceptual framework and will distinguish our use of terms within the context of this framework. 3. A Conceptual Framework for Modeling the Cognitive Process: We will describe our conceptual framework using graphical examples to help illustrate main points. We form the two…

  6. An Extension to the Weibull Process Model

    DTIC Science & Technology

    1981-11-01

    …indicating its importance to applications. AN EXTENSION TO THE WEIBULL PROCESS MODEL. 1. INTRODUCTION. Recent papers by Bain and Engelhardt (1980) and Crow…
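For context, the Weibull (power-law) process named in the title is the standard reliability-growth model with intensity lambda(t) = (beta/theta)(t/theta)^(beta-1) and mean function m(t) = (t/theta)^beta; a small sketch with illustrative parameter values:

```python
# Weibull (power-law) process sketch: the intensity function and the
# expected number of failures by time t. Parameters are illustrative.

def intensity(t, beta, theta):
    """Failure intensity lambda(t) of the power-law process."""
    return (beta / theta) * (t / theta) ** (beta - 1)

def expected_failures(t, beta, theta):
    """Mean function m(t) = (t / theta) ** beta."""
    return (t / theta) ** beta

beta, theta = 0.5, 10.0        # beta < 1: intensity decreases, reliability grows
m_100 = expected_failures(100.0, beta, theta)
```

With beta < 1 the intensity falls over time, the reliability-growth case; beta > 1 would model a deteriorating system.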

  7. Hybrid modelling of anaerobic wastewater treatment processes.

    PubMed

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach to the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network describing the bacterial kinetics with a priori knowledge based on the mass balances of the process components. We consider an architecture which incorporates the neural network as a static model of unmeasured process parameters (the kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper describes the training procedure for the neural network component. The performance of this approach is illustrated with experimental data.
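The hybrid structure can be sketched as a mass balance with a pluggable kinetic term: the balance equations are written down explicitly, while the growth rate mu(S) is a black box that would be the trained network in the paper. Here a simple Monod-like stand-in is used purely to show the architecture, and all numbers are illustrative:

```python
# Hybrid-model sketch: explicit mass balance + black-box kinetics.
# In the paper the kinetic term mu(S) is a trained neural network; here it
# is a Monod-like stand-in. Parameters (D, S_in, Y) are illustrative.

def step(X, S, mu, dt, D=0.05, S_in=5.0, Y=0.5):
    """One Euler step of a chemostat-style mass balance.

    X: biomass concentration, S: substrate concentration,
    D: dilution rate, S_in: feed concentration, Y: yield coefficient.
    """
    growth = mu(S) * X
    dX = growth - D * X                  # biomass balance
    dS = D * (S_in - S) - growth / Y     # substrate balance
    return X + dt * dX, S + dt * dS

mu_blackbox = lambda S: 0.3 * S / (1.0 + S)  # stand-in for the neural net

X, S = 0.1, 4.0
for _ in range(1000):                        # integrate to t = 10
    X, S = step(X, S, mu_blackbox, dt=0.01)
```

Swapping `mu_blackbox` for a trained network is the only change needed to go from this toy to the hybrid architecture described in the abstract.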

  8. ESO C Library for an Image Processing Software Environment (eclipse)

    NASA Astrophysics Data System (ADS)

    Devillard, N.

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems. Running on all Unix-like platforms, eclipse is portable. A high-level interface to Python is foreseen that would allow programmers to prototype their applications much faster than through C programs.

  9. Eclipse: ESO C Library for an Image Processing Software Environment

    NASA Astrophysics Data System (ADS)

    Devillard, Nicolas

    2011-12-01

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.

  10. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
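
    An equilibrium treatment of incremental batch distillation like the one described above is classically expressed by the Rayleigh equation, ln(W/W0) = ∫ dx/(y − x), where x and y are the liquid and equilibrium-vapor mole fractions of the volatile species. A numerical sketch follows; the constant relative volatility is an illustrative assumption, not data from the Argonne cathode processor.

```python
import math

def rayleigh_remaining_fraction(x0, x_final, alpha, steps=10000):
    """Fraction W/W0 of the initial charge remaining when the volatile-species
    mole fraction in the still falls from x0 to x_final."""
    total = 0.0
    dx = (x_final - x0) / steps     # negative: x decreases during distillation
    x = x0
    for _ in range(steps):
        y = alpha * x / (1.0 + (alpha - 1.0) * x)  # equilibrium vapor composition
        total += dx / (y - x)
        x += dx
    return math.exp(total)

# usage: distil the volatile salt away from uranium until x drops from 0.9 to 0.1
frac = rayleigh_remaining_fraction(0.9, 0.1, alpha=50.0)
```

    For cathode processing the chloride salt is the volatile species and the uranium stays in the still, so the remaining fraction tracks how much charge must be boiled off to reach a target salt content.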

  11. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid paradigms can be distinguished, e.g. advanced and adaptive case management. This article focuses on the less-exposed declarative approach to process modelling. An outline of declarative process modelling and the modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.

  12. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
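
    The controller described above chooses set-point adjustments by minimizing a cost that combines weighted control effort with squared tracking error. A brute-force sketch of that minimization follows; the linearized plant response, weights, and grid are illustrative assumptions, not the paper's fitted MPR models or optimizer.

```python
import numpy as np

def cost(u, r, y0, gain, w_effort):
    """Weighted control effort plus squared error between response and reference."""
    y = y0 + gain @ u                      # hypothetical linearized crop response
    return float(w_effort @ (u ** 2) + np.sum((r - y) ** 2))

def best_setpoint_adjustment(r, y0, gain, w_effort, grid):
    """Pick the light/temperature/CO2 adjustment minimizing the cost; a tiny
    grid search stands in for a numerical optimizer."""
    return min((np.array(u, dtype=float) for u in grid),
               key=lambda u: cost(u, r, y0, gain, w_effort))

# usage: nudge three set points to pull crop growth back onto the reference
gain = np.eye(3)
grid = [(a, b, c) for a in (-1, 0, 1) for b in (-1, 0, 1) for c in (-1, 0, 1)]
u_opt = best_setpoint_adjustment(r=np.array([1.0, 1.0, 1.0]),
                                 y0=np.zeros(3), gain=gain,
                                 w_effort=np.full(3, 0.1), grid=grid)
# with unit gain and a small effort weight, the full correction (1, 1, 1) wins
```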

  13. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.

  14. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect information from these entities, it is necessary to design formal models that help designers organize and give meaning to the gathered data. In this paper, we analyze several solutions in the literature for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also highlight ongoing standardization work in this area, discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works. PMID:24643006

  15. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect information from these entities, it is necessary to design formal models that help designers organize and give meaning to the gathered data. In this paper, we analyze several solutions in the literature for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also highlight ongoing standardization work in this area, discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works.

  16. Strengthening the weak link: Built Environment modelling for loss analysis

    NASA Astrophysics Data System (ADS)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variation between modelled losses and actual claims values. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures.
Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution

  17. Threat processing: models and mechanisms.

    PubMed

    Bentz, Dorothée; Schiller, Daniela

    2015-01-01

    The experience of fear is closely linked to the survival of species. Fear can be conceptualized as a brain state that orchestrates defense reactions to threats. To avoid harm, an organism must be equipped with neural circuits that allow learning, detecting, and rapidly responding to threats. Past experience with threat can transform neutral stimuli present at the time of experience into learned threat-related stimuli via associative learning. Pavlovian threat conditioning is the central experimental paradigm for studying associative learning. Once learned, these stimulus-response associations are not always expressed; their expression depends on context and on new experiences with the conditioned stimuli. Neural circuits mediating threat learning have the inherent plasticity to adapt to changing environmental threats. Encounters devoid of danger pave the way for extinction or reconsolidation to occur. Extinction and reconsolidation can both lead to changes in the expression of threat-induced defense responses, but they differ in stability and neural basis. This review presents the behavioral models and system-level neural mechanisms of threat learning and its modulation in animals and humans.
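
    The acquisition and extinction of conditioned threat responses described above are classically captured by the Rescorla-Wagner associative-learning update, V ← V + αβ(λ − V). The sketch below is an illustrative textbook stand-in, not a model proposed in this review.

```python
def rescorla_wagner(trials, alpha=0.3, beta=1.0, lam=1.0, v0=0.0):
    """Associative strength over repeated CS-US pairings. Extinction is the
    same update with lam = 0 (the US is omitted)."""
    v, history = v0, []
    for _ in range(trials):
        v += alpha * beta * (lam - v)   # prediction-error-driven update
        history.append(v)
    return history

# usage: acquisition of a conditioned threat response, then its extinction
acq = rescorla_wagner(10)                        # grows toward lam = 1
ext = rescorla_wagner(10, lam=0.0, v0=acq[-1])   # decays toward 0
```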

  18. Discriminating Tectonic Tremor from Magmatic Processes in Observationally Challenging Environments

    NASA Astrophysics Data System (ADS)

    Brown, J. R.; Beroza, G. C.

    2011-12-01

    Deep tectonic tremor is a long-duration, low amplitude signal that has been shown to consist of low frequency earthquakes (LFEs) on the plate interface in subduction zones. Detecting LFEs from tremor-like signals in subduction settings can be challenging due to the combination of volcanic seismicity and sparse station geometry. This is particularly true for island arcs such as the Alaska-Aleutian subduction zone where the islands are small and noise levels are high. We have detected and located LFEs within tremor signals along the Alaska-Aleutian Arc in four locations: Kodiak Island, Alaska Peninsula, eastern Aleutians, and the Andreanof Islands. In all areas, the LFEs are located 10-40 km trenchward of the volcanic chain at depths ranging from 45-70 km. Location errors are significant (+/- 20 km in depth) due to sparse station geometry such that there is the possibility that the tremor could be associated with nearby volcanoes. Since most documented volcanic tremor is located in the shallow crust, it can often be discriminated from tectonic tremor simply based on location. However, deep volcanic tremor has been documented in Hawaii to depths of 40 km and could be more widespread. In the Aleutian arc, deep long period events (DLPs), which are thought to result from the movement of magma and volatiles, have been located as deep as 45 km and sometimes resemble tremor-like signals. The spectral character is another potential discriminant. We compare the cepstra (Fourier transform of the logarithmic power spectrum of a time series) of the tectonic tremor-like signals/LFEs and DLPs associated with volcanoes. Source characteristics of DLPs (non-shear slip) and tectonic tremor/LFEs (shear slip) are distinct and should be noticeable in the cepstral domain. This approach of using tremor locations and cepstral analysis could be useful for detecting and differentiating tectonic tremor from deep volcanic processes in other island arcs as well.
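
    The cepstral discriminant used above, the (inverse) Fourier transform of the logarithmic power spectrum of a time series, takes only a few lines of NumPy. The echoed test signal and its lag are synthetic, chosen to show the characteristic cepstral peak.

```python
import numpy as np

def real_cepstrum(signal):
    """Inverse FFT of the log power spectrum of the signal."""
    spectrum = np.fft.fft(signal)
    log_power = np.log(np.abs(spectrum) ** 2 + 1e-20)  # guard against log(0)
    return np.real(np.fft.ifft(log_power))

# usage: an echoed signal produces a cepstral peak at the echo delay
rng = np.random.default_rng(1)
s = rng.normal(size=1024)
echoed = s.copy()
echoed[50:] += 0.9 * s[:-50]       # add an echo at lag 50 samples
c = real_cepstrum(echoed)
# the quefrency bin near lag 50 stands out against its neighbours
```

    Periodic source pulses (as in some volcanic signals) and shear-slip tremor leave different signatures in this quefrency domain, which is the basis of the comparison the authors propose.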

  19. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environment for Modeling, Simulation and Integration (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems that interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS System Integration Lab and the STIF architecture are reviewed, the functional components of ARTEMIS are outlined, and an overview of the models is presented with a block diagram.

  20. Thermal modeling of carbon-epoxy laminates in fire environments.

    SciTech Connect

    McGurn, Matthew T. (Buffalo, NY); DesJardin, Paul Edward (Buffalo, NY); Dodd, Amanda B.

    2010-10-01

    A thermal model is developed for the response of carbon-epoxy composite laminates in fire environments. The model is based on a porous-media description that includes the effects of gas transport within the laminate along with swelling. Model comparisons are conducted against the data of Quintiere et al. Simulations are conducted for both coupon-level and intermediate-scale one-sided heating tests. Comparisons of the heat release rate (HRR) as well as the final products (mass fractions, volume percentages, porosity, etc.) are conducted. Overall, the agreement between the available data and the model is excellent considering the simplified approximations used to account for flame heat flux. A sensitivity study using a newly developed swelling model shows the importance of accounting for laminate expansion in the prediction of burnout. Excellent agreement is observed between the model and data for the final product composition, including porosity, mass fractions and volume expansion ratio.

  1. An information processing model of anxiety: automatic and strategic processes.

    PubMed

    Beck, A T; Clark, D A

    1997-01-01

    A three-stage schema-based information processing model of anxiety is described that involves: (a) the initial registration of a threat stimulus; (b) the activation of a primal threat mode; and (c) the secondary activation of more elaborative and reflective modes of thinking. The defining elements of automatic and strategic processing are discussed with the cognitive bias in anxiety reconceptualized in terms of a mixture of automatic and strategic processing characteristics depending on which stage of the information processing model is under consideration. The goal in the treatment of anxiety is to deactivate the more automatic primal threat mode and to strengthen more constructive reflective modes of thinking. Arguments are presented for the inclusion of verbal mediation as a necessary but not sufficient component in the cognitive and behavioral treatment of anxiety.

  2. Modeling Gene-Environment Interactions With Quasi-Natural Experiments.

    PubMed

    Schmitz, Lauren; Conley, Dalton

    2017-02-01

    This overview develops new empirical models that can effectively document Gene × Environment (G×E) interactions in observational data. Current G×E studies are often unable to support causal inference because they use endogenous measures of the environment or fail to adequately address the nonrandom distribution of genes across environments, confounding estimates. Comprehensive measures of genetic variation are incorporated into quasi-natural experimental designs to exploit exogenous environmental shocks or isolate variation in environmental exposure to avoid potential confounders. In addition, we offer insights from population genetics that improve upon extant approaches to address problems from population stratification. Together, these tools offer a powerful way forward for G×E research on the origin and development of social inequality across the life course.

  3. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    NASA Astrophysics Data System (ADS)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China and will meet increasing demands in fundamental research and technical applications, both domestic and overseas. A new distributed data processing and analysis environment has been developed, offering generic functionality for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the C/S paradigm, and data analysis and visualization software providing 2D/3D display of experimental data. This environment will be widely applied at CSNS for live data processing.

  4. A new Mars radiation environment model with visualization

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clowdsley, M. S.; Singleterry, R. C.; Wilson, J. W.

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar-modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model, version 2001 (Mars-GRAM 2001). The altitude used to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center.

  5. Three Models for the Curriculum Development Process

    ERIC Educational Resources Information Center

    O'Hanlon, James

    1973-01-01

    Presents descriptions of the management, systematic, and open-access curriculum development models to identify the decisionmaking bases, operational processes, evaluation requirements, and curriculum control methods of each model. A possible relationship among these models is then suggested. (Author/DN)

  6. Modelling the near-Earth space environment using LDEF data

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares the results of actual impact crater measurement data and the Space Environment (SPENV) Program developed in-house at POD, to theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm in diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is
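
    Debris and micrometeoroid environment models of the kind compared above are commonly expressed as a cumulative flux of impacts exceeding a given crater diameter, often following a power law. The sketch below uses an illustrative constant and exponent, not the Kessler or Cour-Palais coefficients.

```python
def cumulative_flux(diameter_cm, f0=1.0e-5, d0=0.05, p=2.5):
    """Toy power-law environment model: impacts per m^2 per year producing
    craters larger than diameter_cm (f0 is the flux at reference size d0)."""
    return f0 * (diameter_cm / d0) ** (-p)

# usage: small craters dominate the impact rate
small = cumulative_flux(0.05)
large = cumulative_flux(0.5)
# small / large equals 10**2.5, i.e. roughly 316x more small impacts
```

    This steep size dependence is why the small, non-trackable population drives the hazard assessment discussed in the abstract.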

  7. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3- Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
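
    The permeability characterization above rests on Darcy's law; for a 1-D constant-pressure infusion the flow front advances as x(t) = sqrt(2 K ΔP t / (φ μ)), so the fill time scales with the square of the preform length. The property values in the sketch are illustrative, not the measured data for the carbon fabric in this study.

```python
def fill_time(length_m, permeability_m2, porosity, viscosity_pa_s, delta_p_pa):
    """Time (s) for the resin front to traverse a 1-D preform under constant
    pressure, from the Darcy front equation t = phi*mu*L^2 / (2*K*dP)."""
    return (porosity * viscosity_pa_s * length_m ** 2
            / (2.0 * permeability_m2 * delta_p_pa))

# usage: 0.3 m preform, K = 1e-10 m^2, phi = 0.5, mu = 0.2 Pa.s,
# and the one-atmosphere driving pressure (101.3 kPa) available to VARTM
t = fill_time(0.3, 1e-10, 0.5, 0.2, 101.3e3)   # seconds
```

    The quadratic dependence on length is the practical reason preform permeability must be characterized accurately before simulating full-scale panels.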

  8. Quantum jump model for a system with a finite-size environment.

    PubMed

    Suomela, S; Kutvonen, A; Ala-Nissila, T

    2016-06-01

    Measuring the thermodynamic properties of open quantum systems poses a major challenge. A calorimetric detection has been proposed as a feasible experimental scheme to measure work and fluctuation relations in open quantum systems. However, the detection requires a finite size for the environment, which influences the system dynamics. This process cannot be modeled with the standard stochastic approaches. We develop a quantum jump model suitable for systems coupled to a finite-size environment. We use the method to study the common fluctuation relations and prove that they are satisfied.

  9. Complex Unsaturated Zone Flow and Thermohydrologic Processes in a Regulatory Environment: A Perspective on Uncertainty

    NASA Astrophysics Data System (ADS)

    Fedors, R. W.; Manepally, C.; Justus, P. S.; Basagaoglu, H.; Pensado, O.; Dubreuilh, P.

    2007-12-01

    An important part of a risk-informed, performance-based regulatory review of a potential license application for disposal of high-level radioactive waste at Yucca Mountain, Nevada, is the consideration of alternative interpretations and models of risk-significant physical processes. The Nuclear Regulatory Commission (NRC) expects that simplified models will be abstracted from complex process-level models to conduct total-system performance assessments. There are several phases or steps to developing an abstracted model and its supporting basis from more detailed and complicated models for each area of the total system. For complex ambient and thermally perturbed flow in fractured tuffs of the unsaturated zone at Yucca Mountain, these steps can be summarized as (i) site characterization and observation, (ii) field and laboratory tests, (iii) conceptual model development, (iv) process-level numerical modeling, and (v) abstraction development. Each step is affected by uncertainty in (i) assessing parameters for models and (ii) conceptualization and understanding of governing processes. Because of the complexity and uncertainty, alternative interpretations and models become important aspects in the regulatory environment. NRC staff gain confidence in performance assessment model results through understanding the uncertainty in the various models. An example of a complex process in the unsaturated zone is seepage into drifts, which leads to liquid water potentially contacting waste packages. Seepage is a risk-important process for the unsaturated zone at Yucca Mountain because of its potential effect on waste package integrity and transport of potentially released radionuclides. 
Complexities for seepage include (i) characterization of the fractures that carry flow, (ii) the effect of small- to intermediate-scale structural features on flow, (iii) consideration of the diverse flow regimes (rivulets, film flow, capillarity) in fractures, (iv) the effect of vapor transport associated

  10. Periglacial process research for improved understanding of climate change in periglacial environments

    NASA Astrophysics Data System (ADS)

    Hvidtfeldt Christiansen, Hanne

    2010-05-01

    Periglacial landscapes extend widely outside the glaciated areas, across areas underlain by permafrost and affected by seasonal frost. Yet recent cryosphere research related to periglacial geomorphology has concentrated on a direct climate-permafrost relationship, focusing on the permafrost thermal state and the thickness of the active layer and often simplifying how these two key conditions are climatically controlled. There has been less focus on understanding and quantifying the different periglacial processes, which largely control the consequences of changing climatic conditions for permafrost and seasonal frost across periglacial environments. It is the complex relationship between climate, micro-climate and local geomorphological, geological and ecological conditions that controls periglacial processes. In several cases local erosion or deposition will affect rates of landform change significantly more than any climate change. Detailed periglacial process studies will therefore refine predictions of how periglacial landscapes can be expected to respond to climatic changes, and can be built into Earth System Modelling. In particular, combining direct field observations and measurements with remote sensing and geochronological studies of periglacial landforms enables a significantly improved understanding of periglacial process rates. An overview of the state of research in key periglacial processes is given, focusing on ice-wedges, solifluction landforms and seasonal ground thermal dynamics, all with examples from the high Arctic in Svalbard. Thermal contraction cracking and its seasonal meteorological control is presented, and potential thermal erosion of ice-wedges leading to development of thermokarst is discussed. Local and meteorological controls on solifluction rates are presented and their climatic control indicated. 
Seasonal ground thermal processes and their dependence on local

  11. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    NASA Technical Reports Server (NTRS)

    Klein, S. A.

    1994-01-01

    The Distributed Processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network or expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic trade-offs, as well as interactions, within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories, a number of intermediate cost and technical calculation results, as well as graphical presentation of Costs vs. Percent Distribution for various parameters. The model was developed in 1986 and implemented on an IBM PC using the Lotus 1-2-3 spreadsheet environment. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.

  12. Road environment perception algorithm based on object semantic probabilistic model

    NASA Astrophysics Data System (ADS)

    Liu, Wei; Wang, XinMei; Tian, Jinwen; Wang, Yong

    2015-12-01

    This article develops an object-category semantic probabilistic model (OSPM) based on statistical analysis and applies it to a forward road environment perception algorithm covering on-road object recognition and detection. First, an image is represented as a set of words (local feature regions). The probability distribution relating images, local regions and object semantic categories is then derived from the model. In training, the parameters of the object model are estimated by expectation-maximization in a maximum-likelihood setting. In recognition, the model classifies images in a Bayesian manner; in detection, the posterior is calculated to detect typical on-road objects. Experiments show good performance for object recognition and detection against urban street backgrounds.
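
    The Bayesian classification step above, scoring an image's bag of visual words against per-category word distributions learned by EM, can be sketched as follows. The two categories, four-word vocabulary and probability tables are toy assumptions, not the trained OSPM parameters.

```python
import numpy as np

def posterior(word_counts, prior, word_given_class):
    """p(c | image) from log p(c) + sum_w n_w log p(w | c), normalized."""
    log_post = np.log(prior) + word_counts @ np.log(word_given_class.T)
    log_post -= log_post.max()          # stabilize before exponentiating
    p = np.exp(log_post)
    return p / p.sum()

# usage: two categories (car, pedestrian) over a 4-word visual vocabulary
p_w_given_c = np.array([[0.7, 0.1, 0.1, 0.1],    # car
                        [0.1, 0.7, 0.1, 0.1]])   # pedestrian
counts = np.array([8.0, 1.0, 1.0, 0.0])          # image dominated by word 0
p = posterior(counts, prior=np.array([0.5, 0.5]), word_given_class=p_w_given_c)
# p[0] > p[1]: the image is classified as "car"
```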

  13. Martian Radiation Environment: Model Calculations and Recent Measurements with "MARIE"

    NASA Technical Reports Server (NTRS)

    Saganti, P. B.; Cucinotta, F. A.; Zeitlin, C. J.; Cleghorn, T. F.

    2004-01-01

    The Galactic Cosmic Ray spectra in Mars orbit were generated with the recently expanded HZETRN (High Z and Energy Transport) and QMSFRG (Quantum Multiple-Scattering theory of nuclear Fragmentation) model calculations. These model calculations are compared with the first eighteen months of measured data from the MARIE (Martian Radiation Environment Experiment) instrument onboard the 2001 Mars Odyssey spacecraft that is currently in Martian orbit. The dose rates observed by the MARIE instrument are within 10% of the model calculated predictions. Model calculations are compared with the MARIE measurements of dose, dose-equivalent values, along with the available particle flux distribution. Model calculated particle flux includes GCR elemental composition of atomic number, Z = 1-28 and mass number, A = 1-58. Particle flux calculations specific for the current MARIE mapping period are reviewed and presented.

  14. Modeling of the Adiabatic and Isothermal Methanation Process

    NASA Astrophysics Data System (ADS)

    Porubova, Jekaterina; Bazbauers, Gatis; Markova, Darja

    2011-01-01

    Increased use of biomass offers one of the ways to reduce the anthropogenic impact on the environment. Various biomass conversion processes yield different types of fuels: solid (e.g. bio-carbon), liquid (e.g. biodiesel and ethanol), and gaseous (e.g. biomethane). Biomethane can be used in the transport and energy sectors, and the total methane production efficiency can reach 65%. By modeling the adiabatic and isothermal methanation processes, the more effective of the two from the methane-production point of view is identified, and the influence of the process parameters on the overall efficiency of methane production is determined.

  15. Processes controlling the physico-chemical micro-environments associated with Pompeii worms

    NASA Astrophysics Data System (ADS)

    Le Bris, N.; Zbinden, M.; Gaill, F.

    2005-06-01

    Alvinella pompejana is a tube-dwelling polychaete colonizing hydrothermal smokers of the East Pacific Rise. Extreme temperature, low pH and millimolar sulfide levels have been reported in its immediate surroundings. The conditions experienced by this organism and its associated microbes are, however, poorly known and the processes controlling the physico-chemical gradients in this environment remain to be elucidated. Using miniature in situ sensors coupled with close-up video imagery, we have characterized fine-scale pH and temperature profiles in the biogeoassemblage constituting A. pompejana colonies. Steep discontinuities at both the individual and the colony scale were highlighted, indicating a partitioning of the vent fluid-seawater interface into chemically and thermally distinct micro-environments. The comparison of geochemical models with these data furthermore reveals that temperature is not a relevant tracer of the fluid dilution at these scales. The inner-tube micro-environment is expected to be supplied from the seawater-dominated medium overlying tube openings and to undergo subsequent conductive heating through the tube walls. Its neutral pH is likely to be associated with moderately oxidative conditions. Such a model provides an explanation of the atypical thermal and chemical patterns that were previously reported for this medium from discrete samples and in situ measurements. Conversely, the medium surrounding the tubes is shown to be dominated by the fluid venting from the chimney wall. This hot fluid appears to be gradually cooled (120-30 °C) as it passes through the thickness of the worm colony, as a result of a thermal exchange mechanism induced by the tube assemblage. Its pH, however, remains very low (pH ~4), and reducing conditions can be expected in this medium.
Such a thermal and chemical buffering mechanism is consistent with the mineralogical anomalies previously highlighted and provides a first explanation of the exceptional ability of

  16. Modeling cellular processes in 3D.

    PubMed

    Mogilner, Alex; Odde, David

    2011-12-01

    Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated we must address the issue of modeling cellular processes in 3D. Here, we highlight recent advances related to 3D modeling in cell biology. While some processes require full 3D analysis, we suggest that others are more naturally described in 2D or 1D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling.

  17. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among the various software life-cycle development activities and management decision-making processes. The model is designed to be a planning tool for examining trade-offs of cost, schedule, and functionality, and for testing the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.

  18. Job Aiding/Training Decision Process Model

    DTIC Science & Technology

    1992-09-01

    AL-CR-1992-0004; AD-A256 947. Job Aiding/Training Decision Process Model. John P. Zenyuh; Phillip C... Report period: March 1990 - April 1990. Funding: contract F33615-86-C-0545; PE 62205F; PR 1121. From the table of contents: Components to Process Model Decision and Selection Points; Summary of Subject Recommendations for Aiding Approaches.

  19. Genetic line by environment interaction on rainbow trout growth and processing traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Genetic line-by-environment (GxE) interactions were determined for growth and processing traits of five genetic lines of rainbow trout reared in four environments. Genetic lines included 1) mixed pool of 109 families selectively bred for improved growth (Growth Line) at the USDA National Center fo...

  20. A Delineation of the Cognitive Processes Manifested in a Social Annotation Environment

    ERIC Educational Resources Information Center

    Li, S. C.; Pow, J. W. C.; Cheung, W. C.

    2015-01-01

    This study aims to examine how students' learning trajectories progress in an online social annotation environment, and how their cognitive processes and levels of interaction correlate with their learning outcomes. Three different types of activities (cognitive, metacognitive and social) were identified in the online environment. The time…

  1. Gene-Environment Processes Linking Aggression, Peer Victimization, and the Teacher-Child Relationship

    ERIC Educational Resources Information Center

    Brendgen, Mara; Boivin, Michel; Dionne, Ginette; Barker, Edward D.; Vitaro, Frank; Girard, Alain; Tremblay, Richard; Perusse, Daniel

    2011-01-01

    Aggressive behavior in middle childhood is at least partly explained by genetic factors. Nevertheless, estimations of simple effects ignore possible gene-environment interactions (G x E) or gene-environment correlations (rGE) in the etiology of aggression. The present study aimed to simultaneously test for G x E and rGE processes between…

  2. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of Campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of Campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
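The stage-level mechanics described above can be sketched as follows. The survival and transfer parameters are illustrative, not the paper's estimates; the point is the qualitative behavior, i.e. the environment term sets a floor that makes log-output nonlinear in log-input:

```python
def processing_stage(n_carcass, n_env, p_surv=0.5, p_shed=0.2, p_pickup=0.05):
    """One processing stage in the spirit of the basic model (parameter
    values invented): inactivation/removal of bacteria on the carcass,
    transfer carcass -> environment, and cross-contamination back."""
    survivors = n_carcass * p_surv      # inactivation and removal
    shed = survivors * p_shed           # transfer to machinery/environment
    gained = n_env * p_pickup           # cross-contamination from environment
    return survivors - shed + gained, n_env + shed - gained

# High initial load: inactivation/removal dominates the outcome.
# Low initial load: the environment term dominates, setting a floor.
hi_out, _ = processing_stage(1e6, n_env=1e4)
lo_out, _ = processing_stage(1e1, n_env=1e4)
```

The output-to-input ratio is far larger for the lightly contaminated carcass, which is exactly the departure from log-linearity the abstract highlights.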

  3. MODEL OF DIFFUSERS / PERMEATORS FOR HYDROGEN PROCESSING

    SciTech Connect

    Hang, T.; Jacobs, W.

    2007-08-27

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper.
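The full model solves the diffusion differential equations with operating boundary conditions; a steady-state sketch using Sieverts'-law permeation conveys the qualitative behavior (flux rises with temperature and differential pressure). The permeability prefactor and activation energy below are order-of-magnitude assumptions, not values fitted to the diffuser data:

```python
import math

def h2_flux(T_K, p_feed, p_perm, thickness=1.0e-4,
            phi0=2.2e-7, E_act=15.7e3):
    """Steady-state hydrogen flux (mol m^-2 s^-1) through a Pd-Ag membrane
    via Sieverts'-law permeation: J = (Phi(T)/t) * (sqrt(p_feed) - sqrt(p_perm)).
    phi0 (mol m^-1 s^-1 Pa^-0.5) and E_act (J/mol) are illustrative values."""
    R = 8.314                                   # gas constant, J mol^-1 K^-1
    phi = phi0 * math.exp(-E_act / (R * T_K))   # Arrhenius permeability
    return (phi / thickness) * (math.sqrt(p_feed) - math.sqrt(p_perm))

# A hotter membrane passes hydrogen faster at the same pressure differential.
flux_hot = h2_flux(673.0, p_feed=2e5, p_perm=1e4)
flux_warm = h2_flux(523.0, p_feed=2e5, p_perm=1e4)
```

The square-root pressure dependence reflects dissociation of H2 into atoms within the metal, which is what lets the membrane reject inert species such as nitrogen, argon, or helium.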

  4. Stochastic model of the residual acceleration environment in microgravity

    NASA Technical Reports Server (NTRS)

    Vinals, Jorge

    1994-01-01

    We describe a theoretical investigation of the effects that stochastic residual accelerations (g-jitter) onboard spacecraft can have on experiments conducted in a microgravity environment. We first introduce a stochastic model of the residual acceleration field, and develop a numerical algorithm to solve the equations governing fluid flow that allow for a stochastic body force. We next summarize our studies of two generic situations: stochastic parametric resonance and the onset of convective flow induced by a fluctuating acceleration field.

  5. Celiac disease: a model disease for gene-environment interaction.

    PubMed

    Uibo, Raivo; Tian, Zhigang; Gershwin, M Eric

    2011-03-01

    Celiac sprue remains a model autoimmune disease for dissection of genetic and environmental influences on disease progression. The 2010 Congress of Autoimmunity included several key sessions devoted to genetics and environment. Several papers from these symposia were selected for in-depth discussion and publication. This issue is devoted to this theme. The goal is not to discuss genetic and environmental interactions, but rather to focus on key elements of diagnosis, the inflammatory response and the mechanisms of autoimmunity.

  6. Celiac disease: a model disease for gene–environment interaction

    PubMed Central

    Uibo, Raivo; Tian, Zhigang; Gershwin, M Eric

    2011-01-01

    Celiac sprue remains a model autoimmune disease for dissection of genetic and environmental influences on disease progression. The 2010 Congress of Autoimmunity included several key sessions devoted to genetics and environment. Several papers from these symposia were selected for in-depth discussion and publication. This issue is devoted to this theme. The goal is not to discuss genetic and environmental interactions, but rather to focus on key elements of diagnosis, the inflammatory response and the mechanisms of autoimmunity. PMID:21317918

  7. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for environment requires consideration of several indexes of environmental impact including ozone depletion and global warming potentials, human and aquatic toxicity, and photochemical oxidation, and acid rain potentials. Current methodologies like t...

  8. A model for dispersion of contaminants in the subway environment

    SciTech Connect

    Coke, L. R.; Sanchez, J. G.; Policastro, A. J.

    2000-05-03

    Although subway ventilation has been studied extensively, very little has been published on dispersion of contaminants in the subway environment. This paper presents a model that predicts dispersion of contaminants in a complex subway system. It accounts for the combined transient effects of train motion, station airflows, train car air exchange rates, and source release properties. Results are presented for a range of typical subway scenarios. The effects of train piston action and train car air exchange are discussed. The model could also be applied to analyze the environmental impact of hazardous materials releases such as chemical and biological agents.

  9. Forest Canopy Processes in a Regional Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Makar, Paul; Staebler, Ralf; Akingunola, Ayodeji; Zhang, Junhua; McLinden, Chris; Kharol, Shailesh; Moran, Michael; Robichaud, Alain; Zhang, Leiming; Stroud, Craig; Pabla, Balbir; Cheung, Philip

    2016-04-01

    Forest canopies have typically been absent or highly parameterized in regional chemical transport models. Some forest-related processes are often considered - for example, biogenic emissions from the forests are included as a flux lower boundary condition on vertical diffusion, as is deposition to vegetation. However, real forest canopies comprise a much more complicated set of processes, at scales below the "transport model-resolved scale" of vertical levels usually employed in regional transport models. Advective and diffusive transport within the forest canopy typically scale with the height of the canopy, and the former process tends to dominate over the latter. Emissions of biogenic hydrocarbons arise from the foliage, which may be located tens of metres above the surface, while emissions of biogenic nitric oxide from decaying plant matter are located at the surface - in contrast to the surface flux boundary condition usually employed in chemical transport models. Deposition, similarly, is usually parameterized as a flux boundary condition, but may be differentiated between fluxes to vegetation and fluxes to the surface when the canopy scale is considered. The chemical environment also changes within forest canopies: shading and changes in temperature and relative humidity with height within the canopy may influence chemical reaction rates. These processes have been observed in a host of measurement studies, and have been simulated using site-specific one-dimensional forest canopy models. Their influence on regional-scale chemistry has been unknown until now. In this work, we describe the results of the first attempt to include complex canopy processes within a regional chemical transport model (GEM-MACH). The original model core was subdivided into "canopy" and "non-canopy" subdomains.
In the former, three additional near-surface layers based on spatially and seasonally varying satellite-derived canopy height and leaf area index were added to the original model

  10. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.
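The lower bounds mentioned above come from marked-graph analysis: the steady-state iteration period can be no smaller than the worst ratio of circuit compute time to circulating tokens. A minimal sketch, with the circuits given directly as (time, tokens) pairs rather than extracted from a Petri net:

```python
from fractions import Fraction

def min_iteration_period(circuits):
    """Lower bound on the achievable iteration period of a marked graph:
    the maximum, over directed circuits, of total node compute time divided
    by the number of tokens circulating in that circuit. Circuits are given
    as precomputed (time, tokens) pairs -- a simplification of the full
    ATAMM graph analysis."""
    return max(Fraction(time) / tokens for time, tokens in circuits)

# Three circuits of a hypothetical decomposed algorithm:
bound = min_iteration_period([(12, 1), (30, 2), (40, 4)])
```

Here the 30-time-unit circuit with 2 tokens is the bottleneck, so no operating strategy can push the iteration period below 15 time units; adding tokens (buffering) to the critical circuit is the standard way to relax the bound.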

  11. [Biological processes of the human environment regeneration within the Martian crew life support systems].

    PubMed

    Sychev, V N; Levinskikh, M A; Shepelev, E Ia; Podol'skiĭ, I G

    2003-01-01

    Five ground-based experiments at RF SRC-IBMP made a thorough investigation of a model human-unicellular algae-mineralization life support system. The system measured 15 m3 and contained 45 liters of algal suspension; the dry alga density was 10 to 12 g/l, and the water volume (including the algal suspension) amounted to 59 l. More sophisticated LSS models, in which the algae were replaced by higher plants (the crop area in the greenhouse equaled 15 m2), were investigated in three experiments lasting from 1.5 to 2 months. It was found that the alga-containing LSS was able to fulfill not only its macrofunction (air and water regeneration) but also several additional functions (air purification, establishment of microbial cenosis, etc.), providing an adequate human environment. This polyfunctionality of the biological regenerative processes is a weighty argument for their integration into space LSSs. Another important aspect is that systems containing unicellular algae are highly reliable owing to the huge number of cells: should part of the population die, it will be quickly recovered, and the functionality of the LSS autotrophic component will be restored before long. For an extended period of time the Martian crew will have no communication with the Earth's biosphere, which implies that the LSS should be absolutely reliable and redundant. Redundancy can be achieved by installing aboard the vehicle two systems constructed on different principles of regeneration, i.e. physical-chemical and biological. Each of the LSSs should have the capacity to satisfy all needs of the crew. The best option is two systems functioning in parallel, sharing responsibility for the human environment. Redundancy in this case means that in the event of failure or a drastic decrease in performance of one system, the other will make up for the loss by increasing its share in the overall regeneration process.

  12. Space Environment Effects: Low-Altitude Trapped Radiation Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Pfitzer, K. A.

    1998-01-01

    Accurate models of the Earth's trapped energetic proton environment are required for both piloted and robotic space missions. For piloted missions, the concern is mainly total dose to the astronauts, particularly on long-duration missions and during extravehicular activity (EVA). As astronomical and remote-sensing detectors become more sensitive, the proton flux can also induce unwanted backgrounds in these instruments. These concerns motivated the development of a new model of the low-altitude trapped proton environment, based on nearly 20 years of data from the TIROS/NOAA weather satellites. The model, designated NOAAPRO (for NOAA protons), predicts the integral omnidirectional proton flux in three energy ranges: >16, >36, and >80 MeV. It contains a true solar cycle variation, accounts for the secular variation in the Earth's magnetic field, and extends to lower values of the magnetic L parameter than does AP8. Thus, the model addresses the major shortcomings of AP8.

  13. Predicting Material Performance in the Space Environment from Laboratory Test Data, Static Design Environments, and Space Weather Models

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Edwards, David L.

    2008-01-01

    Qualifying materials for use in the space environment is typically accomplished with laboratory exposures to simulated UV/EUV, atomic oxygen, and charged particle radiation environments with in-situ or subsequent measurements of material properties of interest to the particular application. Choice of environment exposure levels are derived from static design environments intended to represent either mean or extreme conditions that are anticipated to be encountered during a mission. The real space environment however is quite variable. Predictions of the on orbit performance of a material qualified to laboratory environments can be done using information on 'space weather' variations in the real environment. This presentation will first review the variability of space environments of concern for material degradation and then demonstrate techniques for using test data to predict material performance in a variety of space environments from low Earth orbit to interplanetary space using historical measurements and space weather models.

  14. Evolution of quantum-like modeling in decision making processes

    SciTech Connect

    Khrennikova, Polina

    2012-12-18

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schroedinger equation to describe the evolution of people's mental states. A shortcoming of Schroedinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  15. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schrödinger equation to describe the evolution of people's mental states. A shortcoming of Schrödinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
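The master-equation step can be sketched numerically. Below is a minimal Euler integration of a Lindblad equation for a two-state 'decision' system under pure dephasing; the Hamiltonian, dephasing operator, rate, and time scale are invented for illustration. The decay of the off-diagonal coherence mimics the environmental 'bath' driving the mental state toward a classical probability mixture:

```python
import numpy as np

def evolve_decision_state(rho0, H, L, gamma, t_final, dt=1e-3):
    """Euler integration of the Lindblad master equation
        d(rho)/dt = -i[H, rho] + gamma * (L rho L+ - 1/2 {L+ L, rho}),
    a standard open-quantum-system form; all parameters here are toy values."""
    rho = rho0.astype(complex)
    Ld = L.conj().T
    LdL = Ld @ L
    for _ in range(int(t_final / dt)):
        comm = -1j * (H @ rho - rho @ H)
        diss = gamma * (L @ rho @ Ld - 0.5 * (LdL @ rho + rho @ LdL))
        rho = rho + dt * (comm + diss)
    return rho

# Two 'decision' basis states; sigma_z dephasing destroys the coherence
# of an equal superposition while leaving the outcome probabilities intact.
H = np.zeros((2, 2))
L = np.diag([1.0, -1.0])          # sigma_z dephasing operator
rho0 = np.full((2, 2), 0.5)       # equal superposition, full coherence
rho_t = evolve_decision_state(rho0, H, L, gamma=1.0, t_final=3.0)
```

After evolution the off-diagonal elements are nearly zero while the diagonal probabilities remain 0.5, illustrating decoherence of the mental state by its 'context'.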

  16. Method of moment solutions to scattering problems in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Partee, Jonathan; Patterson, Jean

    1991-01-01

    This paper describes the implementation of a parallelized method of moments (MOM) code into an interactive workstation environment. The workstation allows interactive solid body modeling and mesh generation, MOM analysis, and the graphical display of results. After describing the parallel computing environment, the implementation and results of parallelizing a general MOM code are presented in detail.

  17. Precipitates/Salts Model Calculations for Various Drift Temperature Environments

    SciTech Connect

    P. Marnier

    2001-12-20

    The objective and scope of this calculation is to assist Performance Assessment Operations and the Engineered Barrier System (EBS) Department in modeling the geochemical effects of evaporation within a repository drift. This work is developed and documented using procedure AP-3.12Q, Calculations, in support of ''Technical Work Plan For Engineered Barrier System Department Modeling and Testing FY 02 Work Activities'' (BSC 2001a). The primary objective of this calculation is to predict the effects of evaporation on the abstracted water compositions established in ''EBS Incoming Water and Gas Composition Abstraction Calculations for Different Drift Temperature Environments'' (BSC 2001c). A secondary objective is to predict evaporation effects on observed Yucca Mountain waters for subsequent cement interaction calculations (BSC 2001d). The Precipitates/Salts model is documented in an Analysis/Model Report (AMR), ''In-Drift Precipitates/Salts Analysis'' (BSC 2001b).

  18. Radiation Belt Environment Model: Application to Space Weather and Beyond

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching H.

    2011-01-01

    Understanding the dynamics and variability of the radiation belts are of great scientific and space weather significance. A physics-based Radiation Belt Environment (RBE) model has been developed to simulate and predict the radiation particle intensities. The RBE model considers the influences from the solar wind, ring current and plasmasphere. It takes into account the particle drift in realistic, time-varying magnetic and electric field, and includes diffusive effects of wave-particle interactions with various wave modes in the magnetosphere. The RBE model has been used to perform event studies and real-time prediction of energetic electron fluxes. In this talk, we will describe the RBE model equation, inputs and capabilities. Recent advancement in space weather application and artificial radiation belt study will be discussed as well.

  19. Threshold dynamics of a malaria transmission model in periodic environment

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Teng, Zhidong; Zhang, Tailei

    2013-05-01

    In this paper, we propose a malaria transmission model with a periodic environment. The basic reproduction number R0 is computed for the model, and it is shown that the disease-free periodic solution of the model is globally asymptotically stable when R0 < 1, that is, the disease goes extinct when R0 < 1, while the disease is uniformly persistent and there is at least one positive periodic solution when R0 > 1. This indicates that R0 is the threshold value determining the extinction and the uniform persistence of the disease. Finally, some examples are given to illustrate the main theoretical results. The numerical simulations show that, when the disease is uniformly persistent, different dynamic behaviors may be found in this model, such as global attractivity and chaotic attractors.
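The threshold behavior can be illustrated with a far simpler periodic-environment model than the paper's vector-host system. The toy SIS model below (all parameters invented) persists when its time-averaged reproduction number exceeds 1 and dies out otherwise:

```python
import math

def simulate_sis(beta0, gamma, amp=0.4, period=365.0, i0=0.01,
                 years=20, dt=0.1):
    """Toy SIS model with a seasonally periodic transmission rate:
        dI/dt = beta(t) * I * (1 - I) - gamma * I,
        beta(t) = beta0 * (1 + amp * cos(2*pi*t / period)).
    For this model the time-averaged reproduction number is beta0 / gamma."""
    i = i0
    for k in range(int(years * period / dt)):
        t = k * dt
        beta = beta0 * (1.0 + amp * math.cos(2.0 * math.pi * t / period))
        i += dt * (beta * i * (1.0 - i) - gamma * i)
        i = max(i, 0.0)
    return i

endemic = simulate_sis(beta0=0.2, gamma=0.1)   # averaged R0 = 2
extinct = simulate_sis(beta0=0.05, gamma=0.1)  # averaged R0 = 0.5
```

In the persistent case the infected fraction settles onto a seasonal oscillation rather than a fixed equilibrium, a simple analogue of the periodic solutions the paper establishes.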

  20. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory, and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.
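The quality-cost coupling at the heart of the QCM approach can be sketched in a few lines; everything below is illustrative, not the paper's tape-casting or plasma-spray models:

```python
def unit_cost(batch_cost, batch_size, defect_rate):
    """Skeleton of a quantitative cost model: a predictive process model
    (not shown) maps process conditions to a defect rate; the cost per
    acceptable part then follows from the yield. Numbers are invented."""
    yield_fraction = 1.0 - defect_rate
    return batch_cost / (batch_size * yield_fraction)

# Tightening the quality requirement rejects more marginal material,
# lowering yield and raising the effective unit cost:
loose = unit_cost(50_000.0, 100, defect_rate=0.05)
tight = unit_cost(50_000.0, 100, defect_rate=0.30)
```

Sweeping the quality threshold in such a model traces out a quality-cost curve of the kind the abstract describes.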

  1. Virulence of Listeria monocytogenes isolates from humans and smoked salmon, peeled shrimp, and their processing environments.

    PubMed

    Gudmundsdóttir, Sigrún; Roche, Sylvie M; Kristinsson, Karl G; Kristjánsson, Már

    2006-09-01

    The virulence of 82 Listeria monocytogenes isolates from human cases and from cold-smoked salmon, cooked peeled shrimp, and their production environments was assessed using the plaque-forming assay and a subcutaneous inoculation test in mice. These isolates were previously typed using serotyping and pulsed-field gel electrophoresis. The isolates from food-production environments were collected in several surveys over a period of 5 years. Sixty-eight (98.6%) of 69 isolates tested from food and food-processing environments were considered virulent, while only one was avirulent. All 13 clinical isolates were highly virulent. The isolates were from raw materials, final products, and the production environment. This stresses the importance of hygiene in the processing environment, as well as among personnel, to avoid contamination of the final product.

  2. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as a means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process consisting of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning; therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
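
    The spiral-versus-waterfall contrast described above lends itself to a tiny Monte Carlo sketch. All cost figures, probabilities, and function names here are hypothetical and not taken from the PATT model: a waterfall run risks a heavy rework penalty when requirements change late, while a spiral run instead pays a fixed overhead at every iteration.

```python
import random

def waterfall_cost(rng, base=100.0, churn_prob=0.3, rework=80.0):
    """One waterfall project: requirements are fixed up front, so late
    requirement churn (probability churn_prob) forces expensive rework."""
    cost = base
    if rng.random() < churn_prob:
        cost += rework
    return cost

def spiral_cost(rng, base=100.0, iterations=4, overhead=8.0):
    """One spiral project: churn is absorbed at each iteration for a fixed
    assessment/evaluation overhead (rng unused; the outcome is deterministic)."""
    return base + iterations * overhead

def mean_cost(model, trials=10_000, seed=1):
    """Average project cost over many simulated runs."""
    rng = random.Random(seed)
    return sum(model(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    print(f"waterfall mean cost: {mean_cost(waterfall_cost):.1f}")
    print(f"spiral    mean cost: {mean_cost(spiral_cost):.1f}")
```

    With these assumed numbers the spiral process averages out more expensive (132 versus roughly 124), mirroring the abstract's point that a spiral process may cost more while handling requirement churn gracefully.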

  3. Estimation, modeling, and simulation of patterned growth in extreme environments.

    PubMed

    Strader, B; Schubert, K E; Quintana, M; Gomez, E; Curnutt, J; Boston, P

    2011-01-01

    In the search for life on Mars and other extraterrestrial bodies or in our attempts to identify biological traces in the most ancient rock record of Earth, one of the biggest problems facing us is how to recognize life or the remains of ancient life in a context very different from our planet's modern biological examples. Specific chemistries or biological properties may well be inapplicable to extraterrestrial conditions or ancient Earth environments. Thus, we need to develop an arsenal of techniques that are of broader applicability. The notion of patterning created in some fashion by biological processes and properties may provide such a generalized property of biological systems no matter what the incidentals of chemistry or environmental conditions. One approach to recognizing these kinds of patterns is to look at apparently organized arrangements created and left by life in extreme environments here on Earth, especially at various spatial scales, different geologies, and biogeochemical circumstances.

  4. Modeling a healthy and a person with heart failure conditions using the object-oriented modeling environment Dymola.

    PubMed

    Heinke, Stefanie; Pereira, Carina; Leonhardt, Steffen; Walter, Marian

    2015-10-01

    Several mathematical models of different physiological systems are spread throughout the literature. They serve as tools that improve the understanding of (patho-)physiological processes, may support clinical decisions, and can even enhance medical therapies. These models are typically implemented in a signal-flow-oriented simulation environment and focus on the behavior of one specific subsystem. Neglecting other physiological subsystems and using a technical description of the physiology hinders the exchange with, and acceptance by, clinicians. By contrast, this paper presents a new model implemented in a physical, object-oriented modeling environment which includes the cardiovascular, respiratory, and thermoregulatory systems. Simulation results for a healthy subject at rest and at the onset of exercise are given, showing the validity of the model. Finally, simulation results showing the interaction of the cardiovascular system with a ventricular assist device in case of heart failure are presented, demonstrating the flexibility and power of the model and the simulation environment. Thus, we present a new model including three important physiological systems and one medical device, implemented in an innovative simulation environment.

  5. The (Mathematical) Modeling Process in Biosciences.

    PubMed

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  6. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  7. Computational models of molecular self-organization in cellular environments.

    PubMed

    LeDuc, Philip; Schwartz, Russell

    2007-01-01

    The cellular environment creates numerous obstacles to efficient chemistry, as molecular components must navigate through a complex, densely crowded, heterogeneous, and constantly changing landscape in order to function at the appropriate times and places. Such obstacles are especially challenging to self-organizing or self-assembling molecular systems, which often need to build large structures in confined environments and typically have high-order kinetics that should make them exquisitely sensitive to concentration gradients, stochastic noise, and other non-ideal reaction conditions. Yet cells nonetheless manage to maintain a finely tuned network of countless molecular assemblies constantly forming and dissolving with a robustness and efficiency generally beyond what human engineers currently can achieve under even carefully controlled conditions. Significant advances in high-throughput biochemistry and genetics have made it possible to identify many of the components and interactions of this network, but its scale and complexity will likely make it impossible to understand at a global, systems level without predictive computational models. It is thus necessary to develop a clear understanding of how the reality of cellular biochemistry differs from the ideal models classically assumed by simulation approaches and how simulation methods can be adapted to accurately reflect biochemistry in the cell, particularly for the self-organizing systems that are most sensitive to these factors. In this review, we present approaches that have been undertaken from the modeling perspective to address various ways in which self-organization in the cell differs from idealized models.

  8. Program Development and Evaluation: A Modeling Process.

    ERIC Educational Resources Information Center

    Green, Donald W.; Corgiat, RayLene

    A model of program development and evaluation was developed at Genesee Community College, utilizing a system theory/process of deductive and inductive reasoning to ensure coherence and continuity within the program. The model links activities to specific measurable outcomes. Evaluation checks and feedback are built in at various levels so that…

  9. A Process Model for Water Jug Problems

    ERIC Educational Resources Information Center

    Atwood, Michael E.; Polson, Peter G.

    1976-01-01

    A model is developed and evaluated for use in the water jug task, in which subjects are required to find a sequence of moves that produces a specified amount of water in each jug. Results indicate that the model presented correctly predicts the difficulties of different problems and describes the behavior of subjects in the process of problem…
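
    The legal move space of the water jug task can be enumerated mechanically. The breadth-first search below is only a sketch of that move space, not the authors' process model, which predicts which moves human subjects actually choose:

```python
from collections import deque

def moves(capacities, state):
    """All states reachable in one legal move: fill, empty, or pour."""
    out = []
    n = len(state)
    for i in range(n):
        filled = list(state)
        filled[i] = capacities[i]          # fill jug i to the brim
        out.append(tuple(filled))
        emptied = list(state)
        emptied[i] = 0                     # empty jug i
        out.append(tuple(emptied))
        for j in range(n):
            if i != j:                     # pour i into j until full or empty
                amount = min(state[i], capacities[j] - state[j])
                poured = list(state)
                poured[i] -= amount
                poured[j] += amount
                out.append(tuple(poured))
    return out

def solve(capacities, start, goal):
    """Breadth-first search: a shortest move sequence from start to goal."""
    start, goal = tuple(start), tuple(goal)
    parent = {start: None}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            path = []
            while state is not None:       # walk parents back to the start
                path.append(state)
                state = parent[state]
            return path[::-1]
        for nxt in moves(capacities, state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None

if __name__ == "__main__":
    # Classic instance: split 8 units evenly using jugs of size 8, 5, and 3.
    for step in solve((8, 5, 3), (8, 0, 0), (4, 4, 0)):
        print(step)
```

    Problem difficulty in the paper's sense is not just path length; two problems with equally short solutions can differ greatly in how easily subjects find them.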

  10. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.

  11. Ant-mediated ecosystem processes are driven by trophic community structure but mainly by the environment.

    PubMed

    Salas-Lopez, Alex; Houadria, Mickal; Menzel, Florian; Orivel, Jérôme

    2017-01-01

    The diversity and functional identity of organisms are known to be relevant to the maintenance of ecosystem processes but can be variable in different environments. In particular, it is uncertain whether ecosystem processes are driven by complementary effects or by dominant groups of species. We investigated how community structure (i.e., the diversity and relative abundance of biological entities) explains the community-level contribution of Neotropical ant communities to different ecosystem processes in different environments. Ants were attracted with food resources representing six ant-mediated ecosystem processes in four environments: ground and vegetation strata in cropland and forest habitats. The exploitation frequencies of the baits were used to calculate the taxonomic and trophic structures of ant communities and their contribution to ecosystem processes considered individually or in combination (i.e., multifunctionality). We then investigated whether community structure variables could predict ecosystem processes and whether such relationships were affected by the environment. We found that forests presented a greater biodiversity and trophic complementarity and lower dominance than croplands, but this did not affect ecosystem processes. In contrast, trophic complementarity was greater on the ground than on vegetation and was followed by greater resource exploitation levels. Although ant participation in ecosystem processes can be predicted by means of trophic-based indices, we found that variations in community structure and performance in ecosystem processes were best explained by environment. We conclude that determining the extent to which the dominance and complementarity of communities affect ecosystem processes in different environments requires a better understanding of resource availability to different species.

  12. U.S. Coast Guard Human Systems Integration (HSI) Process Model.

    DTIC Science & Technology

    1994-04-01

    acquisitions. This report provides a recommended "Process Model" for integrating the various elements of HSI (i.e., Manpower, Personnel, Training, Human Factors...whether elements of existing programs could be used in the Coast Guard environment. Based on this review, a process model was developed to integrate HSI into the Coast Guard acquisition process.

  13. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    SciTech Connect

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer; Ronald L. Boring; Alan R. Mecham

    2007-08-01

    3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing an understanding of human actions, movements, and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where manikins have the potential to be used to simulate more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels, and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load, and increased complexity), call for further research. Baseline assessment of novel control room equipment and configurations needs to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper discusses recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool for the evaluation of next generation control room human factors and ergonomics. Though this discussion primarily focuses on control room design, the application of such a merger between 3D visualization and cognitive modeling can be extended to other areas of focus, such as training and scenario planning.

  14. Stochastic model for supersymmetric particle branching process

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanyuan; Chan, Aik Hui; Oh, Choo Hiap

    2017-01-01

    We develop a stochastic branching model to describe the jet evolution of supersymmetric (SUSY) particles. This model is a modified two-phase branching process, or more precisely, a two-phase simple birth process plus a Poisson process. Two scenarios are considered: jets initiated by pure SUSY partons and jets initiated by SUSY plus ordinary partons. The stochastic branching equations are established and the multiplicity distributions (MDs) are derived for these two scenarios. We also fit the distribution for the general case (SUSY plus ordinary partons initiated jets) to experimental data. The fit indicates that SUSY particles have not yet participated in branching at current collision energies.
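
    The "simple birth process plus Poisson process" structure named above is straightforward to simulate. The sketch below collapses the two phases into one and uses made-up rates, so it only illustrates the mechanism, not the paper's fitted model:

```python
import random

def multiplicity(rng, t=1.0, birth_rate=0.8, poisson_rate=0.5):
    """One jet: Yule (simple birth) offspring plus independent Poisson emissions."""
    # Simple birth process: with n particles present, the next split
    # arrives after an exponential waiting time with rate n * birth_rate.
    n, clock = 1, 0.0
    while True:
        clock += rng.expovariate(n * birth_rate)
        if clock > t:
            break
        n += 1
    # Independent Poisson process of extra emissions over [0, t].
    emissions, clock = 0, rng.expovariate(poisson_rate)
    while clock <= t:
        emissions += 1
        clock += rng.expovariate(poisson_rate)
    return n + emissions

def mean_multiplicity(trials=20_000, seed=7):
    """Monte Carlo estimate of the mean of the multiplicity distribution."""
    rng = random.Random(seed)
    return sum(multiplicity(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    print(f"simulated mean multiplicity: {mean_multiplicity():.2f}")
```

    For a simple birth process started from one particle the mean after time t is e^(birth_rate·t), so with these rates the simulated mean should land near e^0.8 + 0.5 ≈ 2.73.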

  15. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  16. Filament winding cylinders. I - Process model

    NASA Technical Reports Server (NTRS)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.

  17. Using Perspective to Model Complex Processes

    SciTech Connect

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historic record of the entire process, not just the end result of the process. A historic record should facilitate backtracking and re-start of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historic record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.

  18. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets, storyboards, and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  19. Hydrogeochemical processes in ground water in a tropical karst environment of Southern Mexico

    NASA Astrophysics Data System (ADS)

    Mota, Sandra; Escolero, Oscar

    2015-04-01

    Karstic aquifers are of great strategic importance in many regions of the world. These aquifers occur in carbonate formations that have been affected by fissuration and dissolution (karstification) processes. The specific organization of flow in this type of aquifer determines the methodologies to be used in its exploration, although much is still unknown about the processes occurring in tropical environments. The overall aim of this research is to identify the hydrogeochemical processes affecting groundwater in the Rio Grande Basin of Comitan, in the state of Chiapas, Mexico. Within the Rio Grande Basin, 54 sub-basins are delimited, covering an area of 6126.67 km2. The geology of the area is dominated by Mesozoic sedimentary rocks of the Lower Cretaceous series; clastic and carbonate rocks of the limestone-dolomite type are the oldest. Another lithological association present in the area is limestone-shale. Consistent with the previous unit, a deposit of sediments consisting of shale, sandstone, and limestone occurs. On these earlier formations, layers of siltstone and sandstone with interbedded limestone were deposited. Deep and shallow wells used to supply water to the population were used to establish a monitoring network aimed at identifying the types of groundwater and the processes occurring in the karstic aquifer. A sampling campaign was carried out in September 2014, in which 50 sites used for groundwater extraction were sampled, of which 20 are deep wells and 30 are shallow wells. The physicochemical parameters were measured in the field, while the chemical constituents were analyzed in the laboratory. From the data obtained, diagrams were drawn to identify the hydrogeochemical facies of the groundwater sampled, along with contour maps of chemical content and some measured parameters. Likewise, the field data have been interpreted with the help of hydrogeochemical models to identify the processes that may be changing water quality in the

  20. Improving Model Performance through Process-Based Diagnostics

    NASA Astrophysics Data System (ADS)

    Clune, T.; Kuo, K.; Schmidt, G. A.; Bauer, M. P.; Oloso, A. O.

    2013-12-01

    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models -- most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses) thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
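
    The filtering idea behind PROBE can be illustrated with a toy example. The proxy variable, threshold, and relationship below are hypothetical stand-ins for PROBE's user-defined filters, not anything from the system itself:

```python
import random

# Sketch of a process-based diagnostic: rather than a bulk (e.g., monthly)
# mean, restrict the statistic to the times/places where the targeted
# process is active, here flagged by a simple threshold on a proxy variable.

def process_filtered_mean(pairs, threshold=1000.0):
    """Mean of the field restricted to samples where the process proxy fires."""
    selected = [field for field, proxy in pairs if proxy > threshold]
    return sum(selected) / len(selected) if selected else float("nan")

rng = random.Random(0)
samples = []
for _ in range(50_000):
    proxy = rng.uniform(0.0, 2000.0)             # e.g., a convection proxy
    field = 0.002 * proxy + rng.gauss(0.0, 0.5)  # toy model output tied to it
    samples.append((field, proxy))

bulk_mean = sum(field for field, _ in samples) / len(samples)
filtered_mean = process_filtered_mean(samples)
print(f"bulk mean: {bulk_mean:.2f}  process-filtered mean: {filtered_mean:.2f}")
```

    The filtered mean differs from the bulk mean precisely because it isolates the active-process subsample, which is what makes a model-observation mismatch attributable to a specific process.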

  1. Modeling the VARTM Composite Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

    A comprehensive simulation model of the Vacuum Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during the infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model, and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.

  2. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th 2010. The underlying algorithms are provided by three different sources: Geocoding & radiometric calibration (2) is a standard functionality from the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted with IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These 3 processing steps are combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, based on SAR data. Moreover, this model can be dissolved from its desktop environment and published to users across the ArcGIS Server enterprise. Thus disaster zones, e.g. after severe flooding, can be automatically identified and mapped to support local task forces - using an operational workflow for SAR image analysis, which can be executed by the responsible operators without SAR expert knowledge.
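
    The change-detection step (3) can be mimicked without any SAR toolchain. The synthetic intensities and the -3 dB threshold below are stand-ins for illustration only; the actual workflow runs calibrated ENVI/IDL routines on real scenes:

```python
import math

# Toy change detection for flood mapping: open water darkens SAR
# backscatter, so a log-ratio of post- over pre-event intensity below a
# negative dB threshold flags likely flooded cells.

def flood_mask(pre, post, threshold_db=-3.0):
    """Flag cells whose backscatter dropped by more than |threshold_db| dB."""
    return [[10.0 * math.log10(b / a) < threshold_db
             for a, b in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre, post)]

pre  = [[1.00, 0.90], [0.80, 1.10]]   # pre-event intensity (synthetic)
post = [[0.95, 0.10], [0.09, 1.05]]   # post-event: two cells went dark (water)
mask = flood_mask(pre, post)
flooded = sum(cell for row in mask for cell in row)
print(f"flooded cells: {flooded} of 4")
```

    In the operational model this logic runs on whole geocoded scenes, and the resulting mask is the flood-extent map handed to the task forces.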

  3. Mathematical modeling of the coating process.

    PubMed

    Toschkoff, Gregor; Khinast, Johannes G

    2013-12-05

    Coating of tablets is a common unit operation in the pharmaceutical industry. In most cases, the final product must meet strict quality requirements; to meet them, a detailed understanding of the coating process is required. To this end, numerous experimental studies have been performed. However, to acquire a mechanistic understanding, experimental data must be interpreted in the light of mathematical models. In recent years, a combination of analytical modeling and computational simulations has enabled deeper insights into the nature of the coating process. This paper presents an overview of modeling and simulation approaches for the coating process, covering various relevant aspects from scale-up considerations to coating mass uniformity investigations and models for drop atomization. The most important analytical and computational concepts are presented and the findings are compared.
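
    Coating mass uniformity, one of the modeled aspects mentioned above, can be sketched with a toy Monte Carlo. The pass counts and the uniform spray-share model are illustrative assumptions, not a validated pan-coater model:

```python
import random

def coating_masses(n_tablets=2000, n_passes=200, spray_per_pass=1.0, seed=3):
    """Accumulate a random spray share per tablet on each pass through the
    spray zone; more passes average out the per-pass randomness."""
    rng = random.Random(seed)
    masses = [0.0] * n_tablets
    for _ in range(n_passes):
        for i in range(n_tablets):
            masses[i] += spray_per_pass * rng.random()
    return masses

def cv(values):
    """Coefficient of variation, the usual coating-uniformity metric."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return (var ** 0.5) / mean

cv_short = cv(coating_masses(n_passes=20))
cv_long = cv(coating_masses(n_passes=200))
print(f"CV after 20 passes: {cv_short:.3f}   after 200 passes: {cv_long:.3f}")
```

    Under this assumption the CV falls roughly as 1/sqrt(passes), the qualitative behavior that mechanistic uniformity models aim to predict quantitatively.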

  4. Recent Developments in the Radiation Belt Environment Model

    NASA Technical Reports Server (NTRS)

    Fok, M.-C.; Glocer, A.; Zheng, Q.; Horne, R. B.; Meredith, N. P.; Albert, J. M.; Nagai, T.

    2010-01-01

    The fluxes of energetic particles in the radiation belts are found to be strongly controlled by the solar wind conditions. In order to understand and predict the radiation particle intensities, we have developed a physics-based Radiation Belt Environment (RBE) model that considers the influences from the solar wind, ring current and plasmasphere. Recently, an improved calculation of wave-particle interactions has been incorporated. In particular, the model now includes cross diffusion in energy and pitch-angle. We find that the exclusion of cross diffusion could cause significant overestimation of electron flux enhancement during storm recovery. The RBE model is also connected to MHD fields so that the response of the radiation belts to fast variations in the global magnetosphere can be studied. We are able to reproduce the rapid flux increase during a substorm dipolarization on 4 September 2008. This timing is much shorter than the time scale of wave-associated acceleration.

  5. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  6. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone, and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics as well as the prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  7. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    SciTech Connect

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining four describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: the Water Distribution and Removal Model, the Physical and Chemical Environment Model, the Radionuclide Transport Model, and the Multiscale Thermohydrologic Model. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. The Project considered a number of alternative EBS designs that could provide better performance than the design analyzed for the Viability Assessment; the design concept selected was Enhanced Design Alternative II (EDA II).

  8. Database integration in a multimedia-modeling environment

    SciTech Connect

    Dorow, Kevin E.

    2002-09-02

    Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to let modelers and database owners collaborate by defining this metadata in a way that allows their components to interact. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution and data retrieval, and a central clearinghouse for metadata and modeling/database resources. Cross-platform compatibility (using Java) and standard communications protocols (http/https) allow these parts to run in a wide variety of computing environments (local area networks, the Internet, etc.). The framework provides several benefits. Because of the specific data relationships described in the metadata, the amount of data transferred is kept to a minimum: only the data that fulfill a specific request are provided, as opposed to transferring the complete contents of a data source. This allows real-time data extraction from the actual source. The framework also assigns collaborative responsibilities so that each type of participant controls the area in which they have domain knowledge: modelers are responsible for defining the data relevant to their models, while database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism provides the ability to control access to the data and what data are made available.
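
    As an illustration of the metadata idea described above, the hypothetical sketch below shows a modeler-declared variable list, an owner-declared mapping to database columns, and the minimal extraction query such a framework could generate. Every table, column, and variable name here is invented for illustration and does not come from the report.

```python
# Hypothetical sketch of metadata-driven extraction: the modeler declares the
# variables a model needs, the database owner maps those variables to columns,
# and the framework generates a query that transfers only the requested data.

MODEL_NEEDS = {"soil_ph": "dimensionless", "depth": "m"}      # declared by the modeler

DB_METADATA = {                                               # declared by the DB owner
    "soil_ph": {"table": "site_chemistry", "column": "ph"},
    "depth":   {"table": "site_chemistry", "column": "sample_depth_m"},
}

def build_extraction_plan(needs, metadata, site_id):
    """Return a SQL string selecting only the columns the model asked for."""
    cols = [metadata[var]["column"] for var in needs]
    table = metadata[next(iter(needs))]["table"]
    return f"SELECT {', '.join(cols)} FROM {table} WHERE site_id = {site_id}"

plan = build_extraction_plan(MODEL_NEEDS, DB_METADATA, site_id=42)
print(plan)  # only the two mapped columns are requested, nothing more
```

    Because the query is generated from the metadata, only the data that fulfill a specific request cross the network, mirroring the minimal-transfer property described above.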

  9. Examining Student Research Choices and Processes in a Disintermediated Searching Environment

    ERIC Educational Resources Information Center

    Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie

    2013-01-01

    Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…

  10. Molecular Characterization and Serotyping of Salmonella Isolated from the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    ABSTRACT BODY: Introduction: Salmonellosis may be contracted by the consumption of raw or undercooked eggs. In order to develop effective sanitation practices it is helpful to understand the location of Salmonella reservoirs in processing environments. Shell egg processing reservoirs for Salmonella...

  11. Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances

    NASA Astrophysics Data System (ADS)

    Erhard, D.; den Hollander, F.; Maillard, G.

    2016-06-01

    The parabolic Anderson model is defined as the partial differential equation ∂u(x, t)/∂t = κΔu(x, t) + ξ(x, t)u(x, t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x, 0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (-ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0, t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0, t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ_𝓚, where 𝓚 = {𝓚(x, y) : x, y ∈ ℤ^d, x ∼ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚
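
    As a concrete toy version of the equation above, the sketch below integrates the parabolic Anderson equation on a discrete ring with a single frozen realisation of a bounded random potential, using an explicit Euler step. This is an illustrative numerical experiment, not the analysis in the paper; the static uniform potential and all parameter values are assumptions.

```python
import math
import random

random.seed(0)
N, kappa, dt, steps = 50, 1.0, 0.01, 200
xi = [random.uniform(-1.0, 1.0) for _ in range(N)]  # frozen ("quenched") potential
u = [1.0] * N                                       # initial condition u(x, 0) = 1

for _ in range(steps):
    # Explicit Euler step for du/dt = kappa * (discrete Laplacian) u + xi * u
    u = [u[x] + dt * (kappa * (u[(x - 1) % N] - 2 * u[x] + u[(x + 1) % N])
                      + xi[x] * u[x])
         for x in range(N)]

# Crude finite-time proxy for the quenched growth rate (1/t) log u(0, t)
t = steps * dt
rate = math.log(u[0]) / t
print(rate)
```

    With the potential bounded by 1 in absolute value, this finite-time growth rate stays within roughly [-1, 1]; the Lyapunov exponents in the paper require the limit t → ∞ (and, for the annealed exponents, averaging over the environment) that this toy does not perform.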

  12. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR) and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under NAVOCEANO funding, the model was implemented and tested for NAVOCEANO use. With the current emphasis on ocean tidal modeling, CAST has adopted Colorado University's numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its validity. This report documents the model validation results and provides a brief description of the Graphical User Interface (GUI).

  13. The deterministic SIS epidemic model in a Markovian random environment.

    PubMed

    Economou, Antonis; Lopez-Herrero, Maria Jesus

    2016-07-01

    We consider the classical deterministic susceptible-infective-susceptible epidemic model, where the infection and recovery rates depend on a background environmental process that is modeled by a continuous time Markov chain. This framework is able to capture several important characteristics that appear in the evolution of real epidemics in large populations, such as seasonality effects and environmental influences. We propose computational approaches for the determination of various distributions that quantify the evolution of the number of infectives in the population.
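
    A minimal simulation sketch of this setup, not the authors' computational approach: the deterministic SIS equation dI/dt = β(t)SI/N − γ(t)I is integrated with Euler steps while the infection and recovery rates switch according to a two-state continuous-time Markov chain. All rate values are illustrative assumptions.

```python
import random

random.seed(1)
N = 1000.0
beta = {0: 0.30, 1: 0.10}        # infection rate in each environmental state
gamma = {0: 0.10, 1: 0.15}       # recovery rate in each environmental state
switch_rate = {0: 0.5, 1: 0.5}   # CTMC rates for leaving each state

def simulate(I0=10.0, T=200.0, dt=0.01):
    """Euler-integrate the SIS ODE with Markov-modulated rates."""
    I, t, env = I0, 0.0, 0
    next_switch = random.expovariate(switch_rate[env])
    while t < T:
        if t >= next_switch:                     # environment jumps to other state
            env = 1 - env
            next_switch = t + random.expovariate(switch_rate[env])
        S = N - I
        I += dt * (beta[env] * S * I / N - gamma[env] * I)
        t += dt
    return I

I_final = simulate()
print(I_final)
```

    In state 0 the ratio β/γ is 3 and the infection grows; in state 1 it is below 1 and the infection declines, so the number of infectives fluctuates with the environment, capturing the seasonality effects the abstract mentions.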

  14. Modeling the effect of outdoor particle concentrations on indoor concentrations in a heated environment

    SciTech Connect

    Pandian, M.D.

    1988-01-01

    Exposure to suspended particulate matter in the home or workplace can produce adverse human health effects. Sources of suspended particulate matter include cigarette smoke, consumer spray products, and dust from cement manufacture, metal processing, and coal-fired power generation. The particle concentrations in these indoor environments can be determined from experimental studies or modeling techniques. Many experimental studies have been conducted to determine the mass concentration of total suspended particulate matter, usually expressed in µg/m³, and the elemental composition of particulate matter in these environments. However, there is not much reported data on particle size distributions in indoor environments. One of the early indoor modeling efforts was undertaken by Shair and Heitner, who conducted a theoretical analysis relating indoor pollutant concentrations to those outdoors. The author describes the theoretical analysis and compares it to results obtained from experiments on conditioned cigarette smoke particle concentrations in a room at 20°C and 60%.
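
    The indoor-outdoor relationship analyzed above is commonly written as a single-zone mass balance; the sketch below solves its steady state. The model form is the standard textbook one, and every parameter value here is illustrative rather than taken from the study.

```python
def indoor_steady_state(c_out, air_exchange, penetration, deposition,
                        source_rate, volume):
    """Steady state of the single-zone balance
    dC_in/dt = penetration*air_exchange*C_out + source_rate/volume
               - (air_exchange + deposition)*C_in = 0,
    with concentrations in ug/m^3 and rates in 1/h."""
    supply = penetration * air_exchange * c_out + source_rate / volume
    return supply / (air_exchange + deposition)

# Outdoor particles at 50 ug/m^3, 0.5 air changes/h, 30 m^3 room, no indoor source
baseline = indoor_steady_state(50.0, 0.5, 0.8, 0.2, 0.0, 30.0)
# Same room with a hypothetical indoor source emitting 500 ug/h (e.g. smoking)
with_source = indoor_steady_state(50.0, 0.5, 0.8, 0.2, 500.0, 30.0)
print(baseline, with_source)
```

    The first term is infiltration of outdoor particles, the second is indoor generation, and the denominator combines removal by ventilation and by deposition, which is the basic structure of the Shair and Heitner style analysis described above.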

  15. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher Order Software (HOS) is a methodology for the specification and verification of large-scale, complex, real-time systems. The HOS methodology was implemented as FAME (Front-end Analysis and Modeling Environment), a microprocessor-based system for interactively developing, analyzing, and displaying system models in a low-cost, user-friendly environment. The nature of the model is such that, when completed, it can be the basis for projection to a variety of forms such as structured design diagrams, Petri nets, data flow diagrams, and PSL/PSA source code. The analyzer's interface is easily recognized by any current user of a structured modeling approach, so extensive training is unnecessary. Furthermore, when all the system capabilities are used, one can check for proper usage of data types, functions, and control structures, thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  16. Building intuition of iron evolution during solar cell processing through analysis of different process models

    NASA Astrophysics Data System (ADS)

    Morishige, Ashley E.; Laine, Hannu S.; Schön, Jonas; Haarahiltunen, Antti; Hofstetter, Jasmin; del Cañizo, Carlos; Schubert, Martin C.; Savin, Hele; Buonassisi, Tonio

    2015-09-01

    An important aspect of Process Simulators for photovoltaics is prediction of defect evolution during device fabrication. Over the last twenty years, these tools have accelerated process optimization, and several Process Simulators for iron, a ubiquitous and deleterious impurity in silicon, have been developed. The diversity of these tools can make it difficult to build intuition about the physics governing iron behavior during processing. Thus, in one unified software environment and using self-consistent terminology, we combine and describe three of these Simulators. We vary structural defect distribution and iron precipitation equations to create eight distinct Models, which we then use to simulate different stages of processing. We find that the structural defect distribution influences the final interstitial iron concentration ([Feᵢ]) more strongly than the iron precipitation equations. We identify two regimes of iron behavior: (1) diffusivity-limited, in which iron evolution is kinetically limited and bulk [Feᵢ] predictions can vary by an order of magnitude or more, and (2) solubility-limited, in which iron evolution is near thermodynamic equilibrium and the Models yield similar results. This rigorous analysis provides new intuition that can inform Process Simulation, material, and process development, and it enables scientists and engineers to choose an appropriate level of Model complexity based on wafer type and quality, processing conditions, and available computation time.

  17. Comparing Two Types of Model Progression in an Inquiry Learning Environment with Modelling Facilities

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton

    2011-01-01

    The educational advantages of inquiry learning environments that incorporate modelling facilities are often challenged by students' poor inquiry skills. This study examined two types of model progression as means to compensate for these skill deficiencies. Model order progression (MOP), the predicted optimal variant, gradually increases the…

  18. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  19. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  20. Virtual building environments (VBE) - Applying information modeling to buildings

    SciTech Connect

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a ''place'' where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software applications operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up, and operating characteristics. It also describes the VBE Initiative and the benefits observed in early VBE projects.

  1. Modeling the Parasitic Filariasis Spread by Mosquito in Periodic Environment

    PubMed Central

    Wang, Xiaoyun; Pan, Qiuhui

    2017-01-01

    In this paper a mosquito-borne parasitic infection model in a periodic environment is considered. The threshold parameter R0 is defined via the linear next-infection operator and determines the dynamic behavior of the system. We show that when R0 < 1 the disease-free periodic solution is globally asymptotically stable, and that when R0 > 1, by means of the Poincaré map, the disease is uniformly persistent. Numerical simulations support these results, and a sensitivity analysis shows the effects of the parameters on R0, providing a reference for seeking optimal measures to control the transmission of lymphatic filariasis. PMID:28280518

  2. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and spreads via short-distance movement rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization, including short- vs. long-distance dispersal, habitat quality, and distance from source populations.
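
    The small-scale rules described above (colonization driven by occupied neighbors, a long-distance dispersal term, and local extinction) can be sketched as a lattice simulation. This toy omits the detectability component that distinguishes the authors' occupancy framework, and all probabilities are invented.

```python
import random

random.seed(2)
SIZE, STEPS = 20, 30
long_dist, per_neighbor, extinction = 0.01, 0.15, 0.05  # illustrative probabilities

grid = [[0] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = 1                          # single founding population

def occupied_neighbors(g, i, j):
    """Count occupied rook-neighbors on a torus."""
    return sum(g[(i + di) % SIZE][(j + dj) % SIZE]
               for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

for _ in range(STEPS):
    nxt = [[0] * SIZE for _ in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            if grid[i][j]:      # occupied site: persists unless it goes extinct
                nxt[i][j] = int(random.random() >= extinction)
            else:               # empty site: neighbors plus long-distance dispersal
                p = long_dist + per_neighbor * occupied_neighbors(grid, i, j)
                nxt[i][j] = int(random.random() < min(p, 1.0))
    grid = nxt

occupied = sum(map(sum, grid))
print(occupied, "of", SIZE * SIZE, "sites occupied")
```

    Setting long_dist to zero confines the spread to a growing contiguous patch, which is one way to probe the short- versus long-distance dispersal question the study addresses.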

  3. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences using this notation to model Pathology processes in Spain or elsewhere. We present our experience in elaborating conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of Anatomic Pathology processes using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model in which management and improvements are more easily implemented by health professionals. PMID:18673511

  4. Stochastic differential equation model to Prendiville processes

    NASA Astrophysics Data System (ADS)

    Granita; Bahar, Arifah

    2015-10-01

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous-time Markov chain (CTMC) taking integer values in a finite interval. The continuous-time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work starts with the forward Kolmogorov equation of the continuous-time Markov chain of the Prendiville process, which is then formulated as a central-difference approximation. The approximation is then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process is obtained from the stochastic differential equation, from which the mean and variance functions of the Prendiville process can easily be found.

  5. Stochastic differential equation model to Prendiville processes

    SciTech Connect

    Granita; Bahar, Arifah

    2015-10-22

    The Prendiville process is a variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous-time Markov chain (CTMC) taking integer values in a finite interval. The continuous-time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work starts with the forward Kolmogorov equation of the continuous-time Markov chain of the Prendiville process, which is then formulated as a central-difference approximation. The approximation is then used in the Fokker-Planck equation to obtain the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process is obtained from the stochastic differential equation, from which the mean and variance functions of the Prendiville process can easily be found.
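
    A minimal sketch of the CTMC-to-SDE approximation discussed in the two records above, assuming one common Prendiville-style parameterization with birth rate b(N − n) and death rate d·n (the records do not give the exact rates). The diffusion approximation then reads dX = [b(N − X) − dX] dt + sqrt(b(N − X) + dX) dW, integrated here by Euler-Maruyama.

```python
import math
import random

random.seed(3)
N, b, d = 100, 0.6, 0.4        # illustrative rates: birth b*(N - n), death d*n
dt, T, paths = 0.01, 10.0, 500

def euler_maruyama(x0=10.0):
    """One Euler-Maruyama path of the diffusion approximation."""
    x = x0
    for _ in range(int(T / dt)):
        drift = b * (N - x) - d * x
        diffusion = math.sqrt(max(b * (N - x) + d * x, 0.0))
        x += drift * dt + diffusion * math.sqrt(dt) * random.gauss(0.0, 1.0)
        x = min(max(x, 0.0), float(N))   # keep the path in the finite interval
    return x

mean_final = sum(euler_maruyama() for _ in range(paths)) / paths
print(mean_final)  # drift vanishes at b*N/(b + d) = 60, so paths settle near 60
```

    The drift is the CTMC's mean jump rate and the squared diffusion its total jump rate, which is exactly the structure the central-difference/Fokker-Planck derivation described above produces.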

  6. Predicting plants -modeling traits as a function of environment

    NASA Astrophysics Data System (ADS)

    Franklin, Oskar

    2016-04-01

    A central problem in understanding and modeling vegetation dynamics is how to represent the variation in plant properties and function across different environments. Addressing this problem, there is a strong trend towards trait-based approaches, where vegetation properties are functions of the distributions of functional traits rather than of species. Recently there has been enormous progress in quantifying trait variability and its drivers and effects (Van Bodegom et al. 2012; Adler et al. 2014; Kunstler et al. 2015), based on wide-ranging datasets on a small number of easily measured traits, such as specific leaf area (SLA), wood density and maximum plant height. However, plant function depends on many other traits, and while the commonly measured trait data are valuable, they are not sufficient for driving predictive and mechanistic models of vegetation dynamics, especially under novel climate or management conditions. For this purpose we need a model to predict functional traits, including those not easily measured, and how they depend on the plants' environment. Here I present such a mechanistic model based on fitness concepts and focused on traits related to water and light limitation of trees, including wood density, drought response, allocation to defense, and leaf traits. The model is able to predict observed patterns of variability in these traits in relation to growth and mortality, and their responses to a gradient of water limitation. The results demonstrate that it is possible to mechanistically predict plant traits as a function of the environment based on an eco-physiological model of plant fitness. References Adler, P.B., Salguero-Gómez, R., Compagnoni, A., Hsu, J.S., Ray-Mukherjee, J., Mbeau-Ache, C. et al. (2014). Functional traits explain variation in plant life-history strategies. Proc. Natl. Acad. Sci. U. S. A., 111, 740-745. Kunstler, G., Falster, D., Coomes, D.A., Hui, F., Kooyman, R.M., Laughlin, D.C. et al. (2015). Plant functional traits

  7. Chain binomial models and binomial autoregressive processes.

    PubMed

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation.
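
    The chain-binomial connection can be illustrated with the standard binomial AR(1) recursion, in which α acts as a survival probability and β as a colonization probability. This is a generic sketch of the model class, not the authors' estimators, and the parameter values are illustrative.

```python
import random

random.seed(4)

def binom(n, p):
    """Draw from Binomial(n, p) with stdlib tools only."""
    return sum(random.random() < p for _ in range(n))

def binomial_ar1(N, alpha, beta, T, x0=0):
    """Binomial AR(1): survivors of X_{t-1} plus colonizers among the rest."""
    xs, x = [], x0
    for _ in range(T):
        x = binom(x, alpha) + binom(N - x, beta)   # thinning + recruitment
        xs.append(x)
    return xs

N, alpha, beta = 50, 0.7, 0.2
xs = binomial_ar1(N, alpha, beta, T=5000)
pi = beta / (1 - alpha + beta)     # stationary occupancy probability
print(sum(xs) / len(xs), N * pi)   # sample mean sits close to the stationary mean
```

    The stationary marginal of this recursion is Binomial(N, π) with π = β/(1 − α + β), the kind of closed-form moment that makes simple parameter estimation tractable for this model class.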

  8. Process simulation and modeling for gas processing plant

    NASA Astrophysics Data System (ADS)

    Alhameli, Falah Obaid Kenish Mubarak

    Natural gas is one of the major energy sources and its demand is increasing rapidly due to its environmental and economic advantages over other fuels. Gas processing is an essential component of the natural gas system. In this work, a gas processing plant is introduced with the objective of meeting pipeline gas quality. It consists of separation, sweetening and dehydration units. The separation unit contains phase separators along with a stabilizer (a conventional distillation column). The sweetening unit is an amine process with MDEA (Methyl DiEthanol Amine) solvent. The dehydration unit is glycol absorption with TEG (TriEthyleneGlycol) solvent. ProMax 3.2 was used to simulate the plant. A Box-Behnken design was applied to build a black-box model using design of experiments (DoE). Minitab 15 was used to generate and analyse the design. Ten variables, representing the gas feed conditions and unit parameters, were chosen for the model. The resulting 170 runs were successfully implemented and analysed. Models for the total energy of the plant and the water content of the product gas were obtained. A case study was conducted to investigate the impact of an increase in H2S composition in the feed gas. The models were used for the case study with the objective of minimizing total energy under a constraint of 4 lb/MMscf for water content in the product gas. Lingo 13 was used for the optimization. It was observed that the feed pressure had the highest influence among the parameters. Finally, some recommendations were made for future work.
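
    For reference, the textbook Box-Behnken construction (each pair of factors at ±1 in coded units with all other factors at the midpoint, plus center runs) can be generated as below. Note that for 10 factors this naive pairwise construction would give C(10,2)·4 = 180 edge runs plus centers, so the 170-run design reported above presumably uses a more economical blocked variant of the kind statistical software generates.

```python
from itertools import combinations, product

def box_behnken(k, center_points=3):
    """Full pairwise Box-Behnken design in coded units (-1, 0, +1), k >= 3."""
    runs = []
    for pair in combinations(range(k), 2):         # each pair of factors...
        for levels in product((-1, 1), repeat=2):  # ...at all +/-1 combinations
            run = [0] * k                          # remaining factors at midpoint
            run[pair[0]], run[pair[1]] = levels
            runs.append(run)
    runs.extend([[0] * k for _ in range(center_points)])
    return runs

design = box_behnken(3)
print(len(design))  # 12 edge runs + 3 center runs = 15 for three factors
```

    Because every run sits on an edge midpoint or at the center, Box-Behnken designs avoid the simultaneous extremes of a full factorial, which is attractive when corner conditions of a process are costly or infeasible to simulate.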

  9. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed-loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high-quality material. Aside from demonstrating the usefulness of the QCM concept, a main focus of the present research program is to compare processes for making continuous-fiber-reinforced metal matrix composites (MMCs). Two processes, low pressure plasma spray deposition and tape casting, are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  10. Physical Processes and Real-Time Chemical Measurement of the Insect Olfactory Environment

    PubMed Central

    Abrell, Leif; Hildebrand, John G.

    2009-01-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems. PMID:18548311

  11. Affective Responses and Cognitive Models of the Computing Environment.

    ERIC Educational Resources Information Center

    Wallace, Andrew R.; Sinclair, Kenneth E.

    New electronic technologies provide powerful tools for managing and processing the rapidly increasing amounts of information available for learning; teachers, however, have often been slow in integrating computers into the curriculum. This study addresses the question of how prospective teachers construct affective and cognitive models about…

  12. X-ray emission processes in stars and their immediate environment.

    PubMed

    Testa, Paola

    2010-04-20

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible.

  13. X-ray emission processes in stars and their immediate environment

    PubMed Central

    Testa, Paola

    2010-01-01

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible. PMID:20360562

  14. Session on modeling of radiative transfer processes

    NASA Technical Reports Server (NTRS)

    Flatau, Piotr

    1993-01-01

    The session on modeling of radiative transfer processes is reviewed. Six critical issues surfaced in the discussion concerning scale-interactive radiative processes relevant to mesoscale convective systems (MCS's). These issues are the need to expand basic knowledge of how MCS's influence climate through extensive cloud shields and increased humidity in the upper troposphere; to improve radiation parameterizations used in mesoscale models and general circulation models (GCMs); to improve our basic understanding of the influence of radiation on MCS dynamics due to diabatic heating, production of condensate, and vertical and horizontal heat fluxes; to quantify our understanding of radiative impacts of MCS's on the surface and free-atmosphere energy budgets; to quantify and identify radiative and microphysical processes important in the evolution of MCS's; and to improve the capability to remotely sense MCS radiative properties from space and ground-based systems.

  15. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence.

  16. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.
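
    The clutter-rejection idea described here, tracking echo trajectories with prior information on their temporal evolution, can be caricatured with a simple prediction gate (a one-dimensional, constant-velocity toy, not the paper's geometrical multipath filter; all values are illustrative):

```python
# Toy sketch of the prediction-gate idea behind model-based echo tracking:
# an observation is accepted only if it falls near the value predicted from
# the track so far, which rejects clutter from interfering echoes.
# The constant-velocity prediction and gate width are illustrative only.

def track_with_gate(observations, gate=2.0):
    """Filter a 1D direction-of-arrival track, rejecting outliers."""
    track = [observations[0]]
    velocity = 0.0
    for z in observations[1:]:
        predicted = track[-1] + velocity
        if abs(z - predicted) <= gate:   # inside the gate: accept and update
            velocity = z - track[-1]
            track.append(z)
        else:                            # outside the gate: hold prediction
            track.append(predicted)
    return track

# The outlier 25.0 (an interfering echo) is replaced by the prediction.
smoothed = track_with_gate([10.0, 10.5, 11.0, 25.0, 12.0])
```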

  17. A process algebra model of QED

    NASA Astrophysics Data System (ADS)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  18. Modeling Kanban Processes in Systems Engineering

    DTIC Science & Technology

    2012-06-01

    Modeling Kanban Processes in Systems Engineering. Richard Turner, School of Systems and Enterprises, Stevens Institute of Technology, Hoboken, NJ...dingold@usc.edu, jolane@usc.edu Abstract—Systems engineering processes using pull scheduling methods (kanban) are being evaluated with hybrid...development projects incrementally evolve capabilities of existing systems and/or systems of systems. A kanban-based scheduling system was defined and

  19. Knowledge-based approach to fault diagnosis and control in distributed process environments

    NASA Astrophysics Data System (ADS)

    Chung, Kwangsue; Tou, Julius T.

    1991-03-01

    This paper presents a new design approach to knowledge-based decision support systems for fault diagnosis and control for quality assurance and productivity improvement in automated manufacturing environments. Based on the observed manifestations, the knowledge-based diagnostic system hypothesizes a set of the most plausible disorders by mimicking the reasoning process of a human diagnostician. The data integration technique is designed to generate error-free hierarchical category files. A novel approach to diagnostic problem solving has been proposed by integrating the PADIKS (Pattern-Directed Knowledge-Based System) concept and the symbolic model of diagnostic reasoning based on the categorical causal model. The combination of symbolic causal reasoning and pattern-directed reasoning produces a highly efficient diagnostic procedure and generates a more realistic expert behavior. In addition, three distinctive constraints are designed to further reduce the computational complexity and to eliminate non-plausible hypotheses involved in the multiple disorders problem. The proposed diagnostic mechanism, which consists of three different levels of reasoning operations, significantly reduces the computational complexity in the diagnostic problem with uncertainty by systematically shrinking the hypotheses space. This approach is applied to the test and inspection data collected from a PCB manufacturing operation.
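
    The notion of systematically shrinking the hypothesis space against observed manifestations can be sketched with a toy fault table (the disorders, manifestations, and set-inclusion criterion below are invented for illustration; PADIKS itself combines categorical causal and pattern-directed reasoning):

```python
# Toy sketch of hypothesis-space shrinking: each candidate disorder covers
# a set of manifestations, and only disorders that explain every observed
# manifestation survive. The fault/manifestation table is invented.

CAUSES = {
    "cold solder joint": {"intermittent contact", "open circuit"},
    "cracked trace": {"open circuit"},
    "wrong component": {"out-of-spec output"},
}

def plausible_disorders(observed):
    """Keep only disorders whose coverage includes all observations."""
    return {d for d, covered in CAUSES.items() if observed <= covered}

# Each additional manifestation shrinks the hypothesis space.
hyps_one = plausible_disorders({"open circuit"})
hyps_two = plausible_disorders({"open circuit", "intermittent contact"})
```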

  20. Using ²²²Rn as a tracer of geophysical processes in underground environments

    SciTech Connect

    Lacerda, T.; Anjos, R. M.; Silva, A. A. R. da; Yoshimura, E. M.

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations of the outside temperature. The results also indicate that the radon distribution pattern appears to be a good means of localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  1. Modelling vehicle colour and pattern for multiple deployment environments

    NASA Astrophysics Data System (ADS)

    Liggins, Eric; Moorhead, Ian R.; Pearce, Daniel A.; Baker, Christopher J.; Serle, William P.

    2016-10-01

    Military land platforms are often deployed around the world in very different climate zones. Procuring vehicles in a large range of camouflage patterns and colour schemes is expensive and may limit the environments in which they can be effectively used. As such, this paper reports a modelling approach for use in the optimisation and selection of a colour palette, to support operations in diverse environments and terrains. Three different techniques were considered, based upon the differences between vehicle and background in L*a*b* colour space, to predict the optimum (initially single) colour to reduce the vehicle signature in the visible band. Calibrated digital imagery was used as backgrounds and a number of scenes were sampled. The three approaches used and reported here are (a) background averaging behind the vehicle, (b) background averaging in the area surrounding the vehicle, and (c) use of the spatial extension to CIE L*a*b*, S-CIELAB (Zhang and Wandell, Society for Information Display Symposium Technical Digest, vol. 27, pp. 731-734, 1996). Results are compared with natural scene colour statistics. The models used showed good agreement in the colour predictions for individual and multiple terrains or climate zones. A further development of the technique examines the effect of different patterns and colour combinations on the S-CIELAB spatial colour difference metric, when scaled for appropriate viewing ranges.

  2. Process diagnostics for precision grinding brittle materials in a production environment

    SciTech Connect

    Blaedel, K L; Davis, P J; Piscotty, M A

    1999-04-01

    Precision grinding processes are steadily migrating from research laboratory environments into manufacturing production lines as precision machines and processes become increasingly more commonplace throughout industry. Low-roughness, low-damage precision grinding is gaining widespread commercial acceptance for a host of brittle materials including advanced structural ceramics. The development of these processes is often problematic and requires diagnostic information and analysis to harden the processes for manufacturing. This paper presents a series of practical precision grinding tests developed and practiced at Lawrence Livermore National Laboratory that yield important information to help move a new process idea into production.

  3. Rocks in the River: The Challenge of Piloting the Inquiry Process in Today's Learning Environment

    ERIC Educational Resources Information Center

    Lambusta, Patrice; Graham, Sandy; Letteri-Walker, Barbara

    2014-01-01

    School librarians in Newport News, Virginia, are meeting the challenges of integrating an Inquiry Process Model into instruction. In the original model the process began by asking students to develop questions to start their inquiry journey. As this model was taught it was discovered that students often did not have enough background knowledge to…

  4. Retort process modelling for Indian traditional foods.

    PubMed

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer-pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate cold point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot-based sweet product) and Upama (wheat-based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods.
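
    A minimal sketch of the lumped-parameter idea, treating the cold point as a single thermal node heated by the retort through a first-order lag (the time constant, temperatures, and step sizes are invented for illustration; the paper's model additionally accounts for % solid content and initial process conditions):

```python
# Minimal lumped-parameter sketch: the cold point is a single thermal node
# obeying dT/dt = (T_retort - T) / tau, integrated with explicit Euler.
# tau, the temperatures, and the step sizes are invented for illustration.

def cold_point_profile(t_initial, t_retort, tau, dt, steps):
    """Return the cold-point temperature trajectory (deg C)."""
    temps = [t_initial]
    t = t_initial
    for _ in range(steps):
        t += dt * (t_retort - t) / tau   # first-order approach to T_retort
        temps.append(t)
    return temps

# One hour of heating toward a 121 deg C retort with a 15-minute time constant.
profile = cold_point_profile(t_initial=30.0, t_retort=121.0,
                             tau=900.0, dt=10.0, steps=360)
```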

  5. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  6. Deterministic geologic processes and stochastic modeling

    SciTech Connect

    Rautman, C.A.; Flint, A.L.

    1991-12-31

    Recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial distribution of measured values and geostatistical measures of spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. These deterministic features have their origin in the complex, yet logical, interplay of a number of deterministic geologic processes, including magmatic evolution; volcanic eruption, transport, and emplacement; post-emplacement cooling and alteration; and late-stage (diagenetic) alteration. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, an understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling. It is unlikely that any single representation of physical properties at the site will be suitable for all modeling purposes. Instead, the same underlying physical reality will need to be described many times, each in a manner conducive to assessing specific performance issues.

  7. Performance analysis of no-vent fill process for liquid hydrogen tank in terrestrial and on-orbit environments

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yanzhong; Zhang, Feini; Ma, Yuan

    2015-12-01

    Two finite difference computer models, aiming at process predictions of no-vent fill in normal gravity and microgravity environments respectively, are developed to investigate the filling performance in a liquid hydrogen (LH2) tank. In the normal gravity case model, the tank/fluid system is divided into five control volumes including ullage, bulk liquid, gas-liquid interface, ullage-adjacent wall, and liquid-adjacent wall. In the microgravity case model, vapor-liquid thermal equilibrium state is maintained throughout the process, and only two nodes representing fluid and wall regions are applied. To capture the liquid-wall heat transfer accurately, a series of heat transfer mechanisms are considered and modeled successively, including film boiling, transition boiling, nucleate boiling and liquid natural convection. The two models are validated by comparing their predictions with experimental data, which shows good agreement. The two models are then used to investigate the performance of no-vent fill in different conditions and several conclusions are obtained. It is shown that in the normal gravity environment the no-vent fill experiences a continuous pressure rise during the whole process and the maximum pressure occurs at the end of the operation, while the maximum pressure of the microgravity case occurs at the beginning stage of the process. Moreover, increasing inlet mass flux has an apparent influence on the pressure evolution of the no-vent fill process in normal gravity but little influence in microgravity. A larger initial wall temperature brings about more significant liquid evaporation during the filling operation, and thereby higher pressure evolution, whether the filling process occurs under normal gravity or microgravity conditions. Reducing inlet liquid temperature can improve the filling performance in normal gravity, but cannot significantly reduce the maximum pressure in microgravity. The presented work benefits the

  8. Attrition and abrasion models for oil shale process modeling

    SciTech Connect

    Aldis, D.F.

    1991-10-25

    As oil shale is processed, fine particles much smaller than the original shale are created. This process is called attrition or, more accurately, abrasion. In this paper, models of abrasion are presented for oil shale being processed in several unit operations. Two of these unit operations, a fluidized bed and a lift pipe, are used in the Lawrence Livermore National Laboratory Hot-Recycle-Solid (HRS) process being developed for the above-ground processing of oil shale. In two reports, studies were conducted on the attrition of oil shale in unit operations used in the HRS process. Carley reported results for attrition in a lift pipe for oil shale which had been pre-processed either by retorting or by retorting and then burning. The second paper, by Taylor and Beavers, reported results for fluidized bed processing of oil shale. Taylor and Beavers studied raw shale, retorted shale, and shale which had been retorted and then burned. In this paper, empirical models are derived from those experimental studies for the processes occurring in the HRS process. The derived models are presented along with comparisons with experimental results.

  9. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  10. Modeling of a thermoplastic pultrusion process

    SciTech Connect

    Astroem, B.T.; Pipes, R.B.

    1991-07-01

    To obtain a fundamental understanding of the effects of processing parameters and die geometry in a pultrusion process, a mathematical model is essential in order to minimize the number of trial-and-error experiments. Previous investigators have suggested a variety of more or less complete models for thermoset pultrusion, while little effort seems to have been spent modeling its less well-understood thermoplastic equivalent. Hence, a set of intricately related models to describe the temperature and pressure distributions, as well as the matrix flow, in a thermoplastic composite as it travels through the pultrusion die is presented. An approach to calculate the accumulated pulling force is also explored, and the individual mechanisms contributing to the pulling force are discussed. The pressure model incorporates a matrix viscosity that varies with shear rate, temperature, and pressure. Comparisons are made between shear-rate-dependent and Newtonian viscosity representations, indicating the necessity of including non-Newtonian fluid behavior when modeling thermoplastic pultrusion. The governing equations of the models are stated in general terms, and simplifications are implemented in order to obtain solutions without extensive numerical efforts. Pressure, temperature, cooling rate, and pulling force distributions are presented for carbon-fiber-reinforced polyetheretherketone. Pulling force predictions are compared to data obtained from preliminary experiments conducted with a model pultrusion line that was built solely for the pultrusion of thermoplastic matrix composites, and the correlation is found to be qualitatively satisfactory.

  11. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a "shared nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data is represented by collections of netCDF files stored in a hierarchy of directories in the framework of a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, according to the data storage and processing model, a metadata database is developed. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
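
    The caching-and-reuse approach can be sketched as a product cache keyed by dataset, operation and parameters (an illustrative toy; the actual VRE stores netCDF products in a directory hierarchy and tracks them in the metadata database, and the names and values below are invented):

```python
# Toy sketch of caching and reuse of previously obtained products: results
# are keyed by (dataset, operation, parameters) so a repeated request is
# served without recomputation. Names and values are illustrative; the
# actual system stores netCDF products and tracks them in a metadata DB.

class ProductCache:
    def __init__(self):
        self._store = {}
        self.computations = 0   # number of actual (non-cached) computations

    def get(self, dataset, operation, params, compute):
        key = (dataset, operation, params)
        if key not in self._store:
            self.computations += 1
            self._store[key] = compute()
        return self._store[key]

cache = ProductCache()
first = cache.get("t2m_1950_2000", "time_mean", ("1961", "1990"),
                  lambda: 281.4)   # placeholder for a real computation
second = cache.get("t2m_1950_2000", "time_mean", ("1961", "1990"),
                   lambda: 281.4)  # identical request: served from cache
```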

  12. Therapeutic Process During Exposure: Habituation Model

    PubMed Central

    Benito, Kristen G.; Walther, Michael

    2015-01-01

    The current paper outlines the habituation model of exposure process, which is a behavioral model emphasizing the use of individually tailored functional analysis during exposures. This is a model of therapeutic process rather than one meant to explain the mechanism of change underlying exposure-based treatments. Habituation, or a natural decrease in anxiety level in the absence of anxiety-reducing behavior, might be best understood as an intermediate treatment outcome that informs therapeutic process, rather than as a mechanism of change. The habituation model purports that three conditions are necessary for optimal benefit from exposures: 1) fear activation, 2) minimization of anxiety-reducing behaviors, and 3) habituation. We describe prescribed therapist and client behaviors as those that increase or maintain anxiety level during an exposure (and therefore facilitate habituation), and proscribed therapist and client behaviors as those that decrease anxiety during an exposure (and therefore impede habituation). We illustrate model-consistent behaviors in the case of Monica, outline the existing research support, and call for additional research to further test the tenets of the habituation model as described in this paper. PMID:26258012

  13. Multiscale retinocortical model of contrast processing

    NASA Astrophysics Data System (ADS)

    Moorhead, Ian R.; Haig, Nigel D.

    1996-04-01

    Visual performance models have, in the past, typically been empirical, relying on the user to supply numerical values such as target contrast and background luminance to describe the performance of the visual system when undertaking a specified task. However, it is becoming increasingly easy to obtain computer images using, for example, digital cameras, scanners, and imaging photometers and radiometers. We have therefore been examining the possibility of producing a quantitative model of human vision that is capable of directly processing images in order to provide predictions of performance. We are particularly interested in being able to process images of 'real' scenes. The model is inspired by human vision and its components have analogies with parts of the human visual system, but their properties are governed primarily by existing psychophysical data. The first stage of the model generates a multiscale, difference-of-Gaussian (DoG) representation of the image (Burton, Haig and Moorhead), with a central foveal region of high resolution and a resolution that declines with eccentricity as the scale of the filter increases. Incorporated into this stage is a gain control process which ensures that the contrast sensitivity is consistent with the psychophysical data of van Nes and Bouman. The second stage incorporates a model of perceived contrast proposed by Cannon and Fullenkamp. Their model assumes the image is analyzed by oriented (Gabor) filters and produces a representation of the image in terms of perceived contrast.
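
    The first stage described here, a multiscale difference-of-Gaussians decomposition, can be sketched as follows (the sigmas and test image are illustrative choices; the published model's foveal sampling and gain control are omitted):

```python
# Sketch of a multiscale difference-of-Gaussians (DoG) decomposition: the
# image is blurred at several scales and adjacent scales are subtracted to
# give band-pass channels. The sigmas and test image are illustrative;
# foveal sampling and gain control from the published model are omitted.
import numpy as np

def gaussian_kernel(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur via two 1D convolutions."""
    k = gaussian_kernel(sigma)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def dog_pyramid(img, sigmas=(1.0, 2.0, 4.0)):
    """Band-pass channels: blur(sigma_i) - blur(sigma_{i+1})."""
    blurred = [blur(img, s) for s in sigmas]
    return [fine - coarse for fine, coarse in zip(blurred, blurred[1:])]

rng = np.random.default_rng(0)
image = rng.random((32, 32))
bands = dog_pyramid(image)   # two band-pass channels from three scales
```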

  14. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in a resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful service description are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach, and the desirable features of the approach are discussed.

  15. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  16. Representation of planetary magnetospheric environment with the paraboloid model

    NASA Astrophysics Data System (ADS)

    Kalegaev, V. V.; Alexeev, I. I.; Belenkaya, E. S.; Mukhametdinova, L. R.; Khodachenko, M. L.; Génot, V.; Kallio, E. J.; Al-Ubaidi, T.; Modolo, R.

    2013-09-01

    The paraboloid model of the Earth's magnetosphere has been developed at Moscow State University to correctly represent the electrodynamic processes in near-Earth space [1]. The model calculates the magnetic field generated by a variety of current systems located on and within the boundaries of the Earth's magnetosphere, under a wide range of quiet and disturbed environmental conditions affected by solar-terrestrial interactions: solar activity such as solar flares and related phenomena induces terrestrial magnetic disturbances such as magnetic storms. The model depends on a small set of physical input parameters, which characterize the intensity of the large-scale magnetospheric current systems and their location. Among these parameters are the geomagnetic dipole tilt angle, the distance to the subsolar point of the magnetosphere, etc. The input parameters depend on real- or quasi-real-time empirical data, including solar wind and IMF data as well as geomagnetic indices. A generalized paraboloid model was implemented to represent the magnetospheres of other magnetized planets, e.g. Saturn [2], Jupiter [3] and Mercury [4]. Interactive models of the Earth's, Kronian and Mercurian magnetospheres, which take into account specific features of the modeled objects, have been realized at the Space Monitoring Data Center of SINP MSU [5]. The real-time model of the Earth's magnetosphere is currently running at the SINP MSU Space Weather Web site [6]. Data from different sources (satellite measurements, simulation databases and online services) are accumulated in a digital framework developed within the FP7 project IMPEx. The paraboloid model of the magnetospheres (PMM) is part of this infrastructure. A set of Web services has been created to provide access to PMM calculations and to enable post-processing of the modeling data under the SOAP protocol. These will be implemented for easy data exchange within the IMPEx infrastructure.
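
    Of the current systems the model superposes, the internal dipole term is the simplest and can be sketched directly. This is only an illustrative fragment: the full paraboloid model adds tail current, ring current and magnetopause shielding terms that are not reproduced here, and the constants below are standard textbook values, not PMM parameters.

```python
import math

# Centred-dipole field sketch in SI units (illustrative fragment only;
# the paraboloid model superposes several further current systems).
B0 = 3.12e-5   # Earth's equatorial surface dipole field, tesla (textbook value)
RE = 6.371e6   # Earth radius, m

def dipole_field(r, theta):
    """Spherical components (B_r, B_theta) of a centred dipole
    at geocentric distance r (m) and colatitude theta (rad)."""
    Br = -2.0 * B0 * (RE / r) ** 3 * math.cos(theta)
    Bt = -B0 * (RE / r) ** 3 * math.sin(theta)
    return Br, Bt

# Field magnitude in the equatorial plane at geostationary distance:
Br, Bt = dipole_field(6.6 * RE, math.pi / 2)
print(abs(Bt) * 1e9)  # ~108.5 nT
```

    Storm-time deviations from this baseline are exactly what the additional PMM current systems, driven by the solar wind and IMF inputs, are designed to capture.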

  17. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes rigid standardisation inappropriate, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at the local level by a specialised rehabilitation centre. The model describes the organisation of rehabilitation delivery and facilitates the monitoring of recovery during the process. Indeed, a software system was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating as the process evolves.

  18. Machine platform and software environment for rapid optics assembly process development

    NASA Astrophysics Data System (ADS)

    Sauer, Sebastian; Müller, Tobias; Haag, Sebastian; Zontar, Daniel

    2016-03-01

    The assembly of optical components for laser systems is proprietary knowledge and is typically done by well-trained personnel in a clean-room environment, as it has a major impact on the overall laser performance. Rising numbers of laser systems drive laser production towards industrial-level automation solutions that allow for high volumes while ensuring stable quality, a large number of variants and low cost. Therefore, an easily programmable, expandable and reconfigurable machine with an intuitive and flexible software environment for process configuration is required. With Fraunhofer IPT's expertise on optical assembly processes, the next step towards industrializing the production of optical systems is made.

  19. Hencky's model for elastomer forming process

    NASA Astrophysics Data System (ADS)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range under large deformations. It is shown that this material model extends Hooke's law from the region of infinitesimal strains to that of moderate strains. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. The simulation of equipment for elastomer sheet forming is considered.

  20. Dynamical modeling of laser ablation processes

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-09-01

    Several physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume; plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms; gas dynamic, hydrodynamic, and collisional descriptions of plume transport; and molecular dynamics models of the interaction of plume particles with the deposition substrate. The complexity of the phenomena involved in the laser ablation process is matched by the diversity of the modeling task, which combines materials science, atomic physics, and plasma physics.

  1. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system, including its design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and its impact discussed.
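
    The core idea of an FFM, propagating failure effects along directed paths from failure modes to observation points, can be sketched as a graph traversal. The node names below are invented for illustration and have no connection to the AGSM/IHM models.

```python
# Hedged sketch of failure-effect propagation over a directed graph,
# the qualitative core of an FFM (node names are hypothetical).
from collections import deque

edges = {                     # failure mode -> downstream effects
    "valve_stuck": ["low_flow"],
    "low_flow": ["pump_cavitation", "low_pressure_obs"],
    "pump_cavitation": ["vibration_obs"],
}

def propagate(failure_mode):
    """Breadth-first walk returning every effect and observation point
    reachable from the given failure mode."""
    seen, queue = set(), deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

print(propagate("valve_stuck"))
# ['low_flow', 'low_pressure_obs', 'pump_cavitation', 'vibration_obs']
```

    A diagnostic engine inverts this mapping: given which observation points have triggered, it ranks the failure modes whose propagation sets best explain them.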

  2. The SERIOL2 Model of Orthographic Processing

    ERIC Educational Resources Information Center

    Whitney, Carol; Marton, Yuval

    2013-01-01

    The SERIOL model of orthographic analysis proposed mechanisms for converting visual input into a serial encoding of letter order, which involved hemisphere-specific processing at the retinotopic level. As a test of SERIOL predictions, we conducted a consonant trigram-identification experiment, where the trigrams were briefly presented at various…

  3. STBRSIM. Oil Shale Retorting Process Model

    SciTech Connect

    Braun, R.L.; Diaz, J.C.

    1992-03-02

    STBRSIM simulates an aboveground oil-shale retorting process that utilizes two reactors: a staged, fluidized-bed retort and a lift-pipe combustor. The model calculates the steady-state operating conditions for the retorting system, taking into account the chemical and physical processes occurring in the two reactors and auxiliary equipment. Chemical and physical processes considered in modeling the retort include: kerogen pyrolysis, bound-water release, fluidization of the solids mixture, and bed pressure drop. Processes accounted for by the combustor model include: combustion of residual organic carbon and hydrogen, combustion of pyrite and pyrrhotite, combustion of nonpyrolyzed kerogen, decomposition of dolomite and calcite, pneumatic transport, heat transfer between solids and gas streams, pressure drop and change in void fraction, and particle attrition. The release of mineral water and the pyrolysis of kerogen take place in the retort when raw shale is mixed with hot, partially burned shale, and the partial combustion of residual char and sulfur takes place in the combustor as the shale particles are transported pneumatically by preheated air. Auxiliary equipment is modeled to determine its effect on the system. This equipment includes blowers and heat exchangers for the recycle gas to the retort and the air to the combustor, as well as a condenser for the product stream from the retort. Simulation results include stream flow rates, temperatures and pressures, bed dimensions, and heater, cooling, and compressor power requirements.
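
    The key coupling step, mixing raw shale with hot, partially burned shale to drive pyrolysis, rests on a steady-state enthalpy balance. The sketch below shows that balance in its simplest adiabatic form; the flow rates and heat capacities are illustrative assumptions, not STBRSIM values, and the real model adds reaction heats and gas-solid heat transfer.

```python
# Hedged sketch of the adiabatic mixing temperature when raw shale meets
# hot recycled shale (illustrative numbers, no reaction enthalpies).

def mix_temperature(m_raw, cp_raw, T_raw, m_hot, cp_hot, T_hot):
    """Steady-state enthalpy balance:
    (m1*cp1*T1 + m2*cp2*T2) / (m1*cp1 + m2*cp2)."""
    return (m_raw * cp_raw * T_raw + m_hot * cp_hot * T_hot) / (
        m_raw * cp_raw + m_hot * cp_hot)

# 1 kg/s raw shale at 25 C mixed with 2 kg/s burned shale at 700 C,
# assuming equal heat capacities:
T = mix_temperature(1.0, 1.0, 25.0, 2.0, 1.0, 700.0)
print(round(T))  # 475
```

    Whether this mixed temperature is high enough for complete kerogen pyrolysis is exactly the kind of constraint the retort model resolves iteratively together with the combustor side.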

  4. STBRSIM. Oil Shale Retorting Process Model

    SciTech Connect

    Eyberger, L.R.

    1992-03-02

    STBRSIM simulates an aboveground oil-shale retorting process that utilizes two reactors: a staged, fluidized-bed retort and a lift-pipe combustor. The model calculates the steady-state operating conditions for the retorting system, taking into account the chemical and physical processes occurring in the two reactors and auxiliary equipment. Chemical and physical processes considered in modeling the retort include: kerogen pyrolysis, bound-water release, fluidization of the solids mixture, and bed pressure drop. Processes accounted for by the combustor model include: combustion of residual organic carbon and hydrogen, combustion of pyrite and pyrrhotite, combustion of nonpyrolyzed kerogen, decomposition of dolomite and calcite, pneumatic transport, heat transfer between solids and gas streams, pressure drop and change in void fraction, and particle attrition. The release of mineral water and the pyrolysis of kerogen take place in the retort when raw shale is mixed with hot, partially burned shale, and the partial combustion of residual char and sulfur takes place in the combustor as the shale particles are transported pneumatically by preheated air. Auxiliary equipment is modeled to determine its effect on the system. This equipment includes blowers and heat exchangers for the recycle gas to the retort and the air to the combustor, as well as a condenser for the product stream from the retort. Simulation results include stream flow rates, temperatures and pressures, bed dimensions, and heater, cooling, and compressor power requirements.

  5. Content, Process, and Product: Modeling Differentiated Instruction

    ERIC Educational Resources Information Center

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  6. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  7. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  8. Empirical Modeling of Plant Gas Fluxes in Controlled Environments

    NASA Technical Reports Server (NTRS)

    Cornett, Jessie David

    1994-01-01

    As humans extend their reach beyond the earth, bioregenerative life support systems must replace the resupply and physical/chemical systems now used. The Controlled Ecological Life Support System (CELSS) will utilize plants to recycle the carbon dioxide (CO2) and excrement produced by humans and return oxygen (O2), purified water and food. CELSS design requires knowledge of gas flux levels for net photosynthesis (PS(sub n)), dark respiration (R(sub d)) and evapotranspiration (ET). Full season gas flux data regarding these processes for wheat (Triticum aestivum), soybean (Glycine max) and rice (Oryza sativa) from published sources were used to develop empirical models. Univariate models relating crop age (days after planting) and gas flux were fit by simple regression. Models are either high order (5th to 8th) or more complex polynomials whose curves describe crop development characteristics. The models provide good estimates of gas flux maxima, but are of limited utility. To broaden the applicability, data were transformed to dimensionless or correlation formats and, again, fit by regression. Polynomials, similar to those in the initial effort, were selected as the most appropriate models. These models indicate that, within a cultivar, gas flux patterns appear remarkably similar prior to maximum flux, but exhibit considerable variation beyond this point. This suggests that more broadly applicable models of plant gas flux are feasible, but univariate models defining gas flux as a function of crop age are too simplistic. Multivariate models using CO2 and crop age were fit for PS(sub n), and R(sub d) by multiple regression. In each case, the selected model is a subset of a full third order model with all possible interactions. These models are improvements over the univariate models because they incorporate more than the single factor, crop age, as the primary variable governing gas flux. They are still limited, however, by their reliance on the other environmental
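
    The univariate approach described above, fitting a high-order polynomial of gas flux against crop age, can be sketched as follows. The data here are synthetic (a bell-shaped net photosynthesis curve), not the published wheat, soybean or rice measurements, and the fifth-order fit is one of the orders the abstract mentions.

```python
import numpy as np

# Hedged sketch: 5th-order polynomial fit of gas flux vs crop age,
# on synthetic data standing in for the published CELSS crop measurements.
age = np.arange(0, 81, 5, dtype=float)             # days after planting
flux = 40.0 * np.exp(-((age - 45.0) / 20.0) ** 2)  # synthetic PSn curve

coeffs = np.polyfit(age, flux, deg=5)   # least-squares polynomial fit
fitted = np.polyval(coeffs, age)

peak_day = age[np.argmax(fitted)]       # estimated day of maximum flux
print(peak_day)
```

    As the abstract notes, such fits reproduce the flux maxima well but generalize poorly, which motivates the dimensionless transforms and the multivariate CO2-by-age models tried next.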

  9. Process modeling with the regression network.

    PubMed

    van der Walt, T; Barnard, E; van Deventer, J

    1995-01-01

    A new connectionist network topology called the regression network is proposed. The structural and underlying mathematical features of the regression network are investigated. Emphasis is placed on the intricacies of the optimization process for the regression network and some measures to alleviate these difficulties of optimization are proposed and investigated. The ability of the regression network algorithm to perform either nonparametric or parametric optimization, as well as a combination of both, is also highlighted. It is further shown how the regression network can be used to model systems which are poorly understood on the basis of sparse data. A semi-empirical regression network model is developed for a metallurgical processing operation (a hydrocyclone classifier) by building mechanistic knowledge into the connectionist structure of the regression network model. Poorly understood aspects of the process are provided for by use of nonparametric regions within the structure of the semi-empirical connectionist model. The performance of the regression network model is compared to the corresponding generalization performance results obtained by some other nonparametric regression techniques.

  10. Coal-to-Liquids Process Model

    SciTech Connect

    2006-01-01

    A comprehensive Aspen Plus model has been developed to rigorously model coal-to-liquids processes. This portion was developed under Laboratory Directed Research and Development (LDRD) funding. The model is built in a modular fashion to allow rapid reconfiguration for the evaluation of process options. Aspen Plus is the framework in which the model is developed. The coal-to-liquids simulation package is an assembly of Aspen Hierarchy Blocks representing subsections of the plant. Each of these Blocks is considered an individual component of the Copyright, which may be extracted and licensed as an individual component, but which may also be combined with one or more other components to model general coal-conversion processes, including the following plant operations: (1) coal handling and preparation, (2) coal pyrolysis, combustion, or gasification, (3) syngas conditioning and cleanup, (4) sulfur recovery using Claus-SCOT unit operations, (5) Fischer-Tropsch liquid fuels synthesis, (6) hydrocracking of high molecular weight paraffins, (7) hydrotreating of low molecular weight paraffins and olefins, (8) gas separations, and (9) power generation representing integrated combined cycle technology.

  11. Development of an interdisciplinary model cluster for tidal water environments

    NASA Astrophysics Data System (ADS)

    Dietrich, Stephan; Winterscheid, Axel; Wyrwa, Jens; Hein, Hartmut; Hein, Birte; Vollmer, Stefan; Schöl, Andreas

    2013-04-01

    Global climate change has a high potential to influence both the persistence and the transport pathways of water masses and their constituents in tidal waters and estuaries. These processes are linked through dispersion, directly influencing the sediment and suspended solid matter budgets, and thus the river morphology. Furthermore, the hydrologic regime has an impact on the transport of nutrients, phytoplankton and suspended matter, and on the temperature that determines the oxygen content within water masses, a major parameter describing water quality. This project aims at the implementation of a so-called (numerical) model cluster for tidal waters, which includes the model compartments hydrodynamics, morphology and ecology. The implementation of this cluster requires the continued integration of different models that work over a wide range of spatial and temporal scales. The model cluster is thus expected to lead to more precise knowledge of the feedback processes between the individual interdisciplinary model compartments. In addition to field measurements, this model cluster will provide a complementary scientific basis required to address a spectrum of research questions concerning the integral management of estuaries within the Federal Institute of Hydrology (BfG, Germany). This will in particular include aspects like sediment and water quality management as well as adaptation strategies to climate change. The core of the model cluster will consist of the 3D hydrodynamic model Delft3D (Roelvink and van Banning, 1994); long-term hydrodynamics in the estuaries are simulated with the Hamburg Shelf Ocean Model HAMSOM (Backhaus, 1983; Hein et al., 2012). The simulation results will be compared with the unstructured-grid-based SELFE model (Zhang and Bapista, 2008). The additional coupling of the BfG-developed 1D water quality model QSim (Kirchesch and Schöl, 1999; Hein et al., 2011) with the morphological/hydrodynamic models is an

  12. Modelling the Release, Transport and Fate of Engineered Nanoparticles in the Aquatic Environment - A Review.

    PubMed

    Markus, Adriaan A; Parsons, John R; Roex, Erwin W M; de Voogt, Pim; Laane, Remi W P M

    2016-12-28

    Engineered nanoparticles, that is, particles of up to 100 nm in at least one dimension, are used in many consumer products. Their release into the environment as a consequence of their production and use has raised concern about the possible consequences. While they are made of ordinary substances, their size gives them properties that are not manifest in larger particles. It is precisely these properties that make them useful. For instance, titanium dioxide nanoparticles are used in transparent sunscreens because they are large enough to scatter ultraviolet light but too small to scatter visible light. To investigate the occurrence of nanoparticles in the environment, we require practical methods to detect their presence and to measure their concentrations, as well as adequate modelling techniques. Modelling provides both a complement to the available detection and measurement methods and the means to understand and predict the release, transport and fate of nanoparticles. Many different modelling approaches have been developed, but it is not always clear for which questions regarding nanoparticles in the environment these approaches can be applied. No modelling technique can be used for every possible aspect of the release of nanoparticles into the environment; hence it is important to understand which technique to apply in what situation. This article provides an overview of the techniques involved, with their strengths and weaknesses. Two points need to be stressed here. First, the modelling of processes like dissolution, the surface activity of nanoparticles (possibly under the influence of ultraviolet light) and chemical transformation has so far received relatively little attention. Second, the uncertainties surrounding nanoparticles in general (the amount of nanoparticles used in consumer products, what constitutes the appropriate measure of concentration, mass or numbers, and which processes are relevant) should be explicitly considered as part of the modelling.
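
    The simplest class of fate models reviewed in such work treats removal from the water column as first-order processes, for example sedimentation (with hetero-aggregation lumped in) plus dissolution. The sketch below shows that form; the rate constants are illustrative assumptions, not values from the review.

```python
import math

# Hedged sketch: first-order removal of nanoparticles from a water column
# by sedimentation and dissolution (rate constants are hypothetical).
k_sed = 0.05   # 1/day, loss to sediment (aggregation + settling, lumped)
k_dis = 0.02   # 1/day, loss by dissolution

def concentration(c0, t_days):
    """Closed-form solution of dC/dt = -(k_sed + k_dis) * C."""
    return c0 * math.exp(-(k_sed + k_dis) * t_days)

c10 = concentration(1.0, 10.0)   # e.g. ug/L remaining after 10 days
print(round(c10, 3))  # 0.497
```

    More elaborate approaches in the literature resolve the aggregation state explicitly (free, homo- and hetero-aggregated particles), which is where the choice of concentration measure, mass or particle number, starts to matter.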

  13. Geomagnetic Environment Modeling at the Community Coordinated Modeling Center: Successes, Challenges and Perspectives.

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Maria; Toth, Gabor; Hesse, Michael; Rastaetter, Lutz; Glocer, Alex

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) hosts an expanding collection of modern space science and space weather models developed by the international space science community. The goals of the CCMC are to support the research and developmental work necessary to substantially increase the present-day space environment modeling capability and to maximize the scientific return on investments in model development. CCMC serves models through interactive web-based systems, supports community-wide research projects and designs displays and tools customized for specific applications. The presentation will review the current state of geomagnetic environment modeling, highlight recent progress, and showcase the role of state-of-the-art magnetosphere models in advancing our understanding of fundamental phenomena in magnetospheric plasma physics.

  14. Modeling Veterans Healthcare Administration disclosure processes

    SciTech Connect

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  15. An ecological process model of systems change.

    PubMed

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  16. A model evaluation checklist for process-based environmental models

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
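
    The Nash-Sutcliffe statistic criticized above has a compact definition worth stating: efficiency is one minus the ratio of residual variance to the variance of the observations around their mean. The sketch below computes it on invented data; it is the standard formula, not code from the study.

```python
# Hedged sketch of the Nash-Sutcliffe efficiency (NSE); the observed and
# simulated series below are illustrative, not the Scottish catchment data.

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1 is a perfect fit; 0 means no better than predicting the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
print(round(nash_sutcliffe(obs, sim), 3))  # 0.989
```

    Because the denominator rewards matching high-variance features, an unrealistic simulation that tracks the seasonal mean can still score well, which is one reason the study found NSE alone insufficient to separate realistic from unrealistic parameterisations.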

  17. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high-resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are becoming more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
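
    The steer-and-intervene loop described above can be illustrated with a toy compartmental model: run the dynamics, inspect the state, and apply an intervention mid-course. This is a hedged sketch only; the actual environment wraps a high-resolution individual-based model, not the discrete-time SIR fragment below, and all parameters here are invented.

```python
# Hedged sketch: discrete-time SIR dynamics with a mid-run intervention
# (parameters illustrative; the real system is individual-based).

def sir_step(s, i, r, beta, gamma, intervention=1.0):
    """One day of SIR dynamics; intervention < 1 scales down transmission."""
    new_inf = intervention * beta * s * i
    new_rec = gamma * i
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.99, 0.01, 0.0
for day in range(30):
    # analyst-style steering: a 50% contact reduction applied from day 10
    s, i, r = sir_step(s, i, r, beta=0.3, gamma=0.1,
                       intervention=0.5 if day >= 10 else 1.0)
print(round(s + i + r, 6))  # 1.0 (population is conserved)
```

    Pausing, rolling back and re-running from a saved (s, i, r) state with a different intervention schedule is the compartmental analogue of the interactive state assessment the environment provides.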

  18. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  19. Comparison of the Beta and the Hidden Markov Models of Trust in Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Moe, Marie E. G.; Helvik, Bjarne E.; Knapskog, Svein J.

    Computational trust and reputation models are used to aid the decision-making process in complex dynamic environments, where we are unable to obtain perfect information about the interaction partners. In this paper we present a comparison of our proposed hidden Markov trust model to the Beta reputation system. The hidden Markov trust model takes the time between observations into account; it also distinguishes between system states and uses methods previously applied to intrusion detection to predict which state an agent is in. We show that the hidden Markov trust model performs better at detecting changes in the behavior of agents, owing to its richer set of model features. This means that our trust model may be more realistic in dynamic environments. However, the increased model complexity also leads to greater challenges in estimating parameter values for the model. We also show that the hidden Markov trust model can be parameterized so that it responds similarly to the Beta reputation system.
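
    The contrast between the two models can be made concrete with their update rules. A Beta reputation score is the expected value of a Beta posterior over outcome counts, while a hidden Markov trust model runs a forward filter over hidden behaviour states. The sketch below uses illustrative transition and emission probabilities, not the parameter values from the paper.

```python
def beta_reputation(pos, neg):
    """Beta reputation score: expected value of Beta(pos+1, neg+1)."""
    return (pos + 1) / (pos + neg + 2)

def hmm_filter(belief, obs, trans, emit):
    """One forward-filter step: predict with the transition matrix,
    then weight by the likelihood of the observation."""
    predicted = [sum(trans[j][i] * belief[j] for j in range(2)) for i in range(2)]
    weighted = [emit[i][obs] * predicted[i] for i in range(2)]
    z = sum(weighted)
    return [w / z for w in weighted]

# Two hidden states: 0 = trustworthy, 1 = untrustworthy (illustrative values).
trans = [[0.95, 0.05], [0.05, 0.95]]   # states switch slowly
emit = [[0.9, 0.1], [0.2, 0.8]]        # P(cooperate), P(defect) per state

obs = [0] * 20 + [1] * 5               # 20 cooperations, then 5 defections
belief = [0.5, 0.5]
for o in obs:
    belief = hmm_filter(belief, o, trans, emit)

beta_trust = beta_reputation(pos=20, neg=5)
hmm_trust = belief[0]
# The filtered belief reacts to the behaviour change much faster than
# the count-based Beta score:
assert hmm_trust < 0.1 < beta_trust
```

    Because the Beta score weighs all past outcomes equally, twenty early cooperations keep it high; the filtered state belief collapses after a few defections, which is the faster change detection the comparison reports.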

  20. 19 Gene × Environment Interaction Models in Psychiatric Genetics

    PubMed Central

    Karg, Katja; Sen, Srijan

    2013-01-01

    Gene-environment (G×E) interaction research is an emerging area in psychiatry, with the number of G×E studies growing rapidly in the past two decades. This article aims to give a comprehensive introduction to the field, with an emphasis on central theoretical and practical problems that are worth considering before conducting a G×E interaction study. On the theoretical side, we discuss two fundamental, but controversial questions about (1) the validity of statistical models for biological interaction and (2) the utility of G×E research for psychiatric genetics. On the practical side, we focus on study characteristics that potentially influence the outcome of G×E interaction studies and discuss strengths and pitfalls of different study designs, including recent approaches like Genome-Environment Wide Interaction Studies (GEWIS). Finally, we discuss recent developments in G×E interaction research on the most heavily investigated example in psychiatric genetics, the interaction between a serotonin transporter gene promoter variant (5-HTTLPR) and stress on depression. PMID:22241248

  1. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, that model does not account for multipass welds, microstructural evolution, distortion, or residual stresses. Additionally, it requires large amounts of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  2. Modeling three-dimensional propagation in a continental shelf environment.

    PubMed

    Ballard, Megan S

    2012-03-01

    An acoustic propagation model is applied to predict measurements of three-dimensional (3-D) effects recorded off the southeast coast of Florida. The measured signal is produced by a low frequency source that is towed north parallel to the shelf from a fixed receiving array. The acoustic data show the direct path arrival at the bearing of the tow ship and a second refracted path arrival as much as 30° inshore of the direct arrival. Notably, the refracted arrival has a received level more than 25 dB greater than that of the direct arrival. A geoacoustic model of the environment is created to explain the data. It is shown that the topography of the seafloor plays the largest role in controlling horizontal refraction effects, whereas the range-dependent sediment properties have the most influence on the received level. The modeling approach is based on a 3-D adiabatic mode technique in which the horizontal refraction equation is solved using a parabolic equation in Cartesian coordinates. A modal decomposition of the field provides insight into the variability in the arrival angle and received level of the measured signal.

  3. Modelling the appearance of chromatic environment using hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Fomins, S.; Ozolinsh, M.

    2013-11-01

    The color of objects is determined by the spectral composition of the incident light source, the reflection properties of the object itself, and the spectral tuning of the eye. Light sources with different spectral characteristics can produce metameric representations of color; however, vision itself is the most variable factor in this regard. The pigments of color vision are continuously bleached by different stimuli, changing the optical density of the pigment, while steady conditions provide adaptation and the perception of white. A special case is color vision deficiency, which affects almost 8% of the male population in Europe. Hyperspectral imaging makes it possible to obtain the spectra of the environment and to model dichromatic, anomalous trichromatic, and normal trichromatic adapted behavior. First, the CRI Nuance hyperspectral imaging system was spectrally calibrated for natural continuous-spectrum illumination of high color rendering index and for narrow-band fluorescent light sources. Full-scale images of color deficiency tests were acquired in the range of 420 to 720 nm to evaluate the modelling capacity for dichromatic and anomalous trichromatic vision. Hyperspectral images were converted to cone excitation images according to Stockman and Sharpe (2000). The model was then extended to anomalous trichromacy: cone sensitivity spectra were shifted by 4 nm according to each anomaly type, and LWS and SWS cone signals were balanced in each condition to provide the appropriate appearance of colors in the CIE system.
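
    The pipeline from hyperspectral pixels to cone excitations is essentially an inner product of each pixel's spectrum with the cone sensitivity functions, and the anomalous-trichromacy variant shifts a cone fundamental by 4 nm. The sketch below uses crude Gaussian stand-ins for the Stockman and Sharpe (2000) fundamentals; the peak wavelengths and bandwidths are illustrative only.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Sampled wavelengths matching the 420-720 nm acquisition range.
wavelengths = list(range(420, 721, 10))

def cone_fundamental(peak_nm, width_nm=40.0):
    """Crude Gaussian stand-in for a cone sensitivity curve."""
    return [gaussian(w, peak_nm, width_nm) for w in wavelengths]

L = cone_fundamental(565.0)
M = cone_fundamental(540.0)
S = cone_fundamental(445.0)
M_anomalous = cone_fundamental(540.0 + 4.0)   # 4 nm shift, as in the model

def excitation(spectrum, fundamental):
    """Cone excitation: inner product of a pixel spectrum with a cone
    sensitivity, summed over the sampled wavelengths."""
    return sum(s * f for s, f in zip(spectrum, fundamental))

# A toy 'pixel' whose reflected spectrum peaks in the green region:
pixel = [gaussian(w, 550.0, 30.0) for w in wavelengths]
lms = (excitation(pixel, L), excitation(pixel, M), excitation(pixel, S))
lms_anom = (lms[0], excitation(pixel, M_anomalous), lms[2])
assert lms[1] > lms[2]          # a green pixel drives M far more than S
assert lms_anom[1] > lms[1]     # the shifted M cone responds differently
```

    A real implementation would replace the Gaussians with the tabulated Stockman and Sharpe functions and apply the inner product per pixel across the whole image cube.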

  4. The spa as a model of an optimal healing environment.

    PubMed

    Frost, Gary J

    2004-01-01

    "Spa" is an acronym for salus per aqua, or health through water. There currently are approximately 10,000 spas of all types in the United States. Most now focus on eating and weight programs, with subcategories of sports activities and nutrition most prominent. The main reasons clients state for their use are stress reduction, specific medical or other health issues, eating and weight loss, rest and relaxation, fitness and exercise, and pampering and beauty. A detailed description of the Canyon Ranch, a spa facility in Tucson, AZ, is presented as a case study in this paper. It appears that the three most critical factors in creating an optimal healing environment in a spa venue are (1) a dedicated, caring staff at all levels, (2) a mission-driven organization that will not compromise, and (3) a sound business model and leadership that will ensure permanency.

  5. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM)MODELS

    SciTech Connect

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  6. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  7. Upgrading Preschool Environment in a Swedish Municipality: Evaluation of an Implementation Process.

    PubMed

    Altin, Carolina; Kvist Lindholm, Sofia; Wejdmark, Mats; Lättman-Masch, Robert; Boldemann, Cecilia

    2015-07-01

    Redesigning outdoor preschool environment may favorably affect multiple factors relevant to health and reach many children. Cross-sectional studies in various landscapes at different latitudes have explored the characteristics of preschool outdoor environment considering the play potential triggering combined physical activity and sun-protective behavior due to space, vegetation, and topography. Criteria were pinpointed to upgrade preschool outdoor environment for multiple health outcomes to be applied in local government in charge of public preschools. Purposeful land use policies and administrative management of outdoor land use may serve to monitor the quality of preschool outdoor environments (upgrading and planning). This study evaluates the process of implementing routines for upgrading outdoor preschool environments in a medium-sized municipality, Sweden, 2008-2011, using qualitative and quantitative analysis. Recorded written material (logs and protocols) related to the project was processed using thematic analysis. Quantitative data (m² of flat/multileveled surface, overgrown/naked surface, and fraction of free visible sky) were analyzed to assess the impact of implementation (surface, topography, greenery integrated in play). The preschool outdoor environments were upgraded accordingly. The quality of implementation was assessed using the theory of policy streams approach. Though long-term impact remains to be confirmed, the process seems to have changed work routines in the interior management for purposeful upgrading of preschool outdoor environments. The aptitude and applicability of inexpensive methods for assessing, selecting, and upgrading preschool land at various latitudes, climates, and outdoor play policies (including gender aspects and staff policies) should be further discussed, as well as the compilation of data for monitoring and evaluation.

  8. A mixed-model quantitative trait loci (QTL) analysis for multiple-environment trial data using environmental covariables for QTL-by-environment interactions, with an example in maize.

    PubMed

    Boer, Martin P; Wright, Deanne; Feng, Lizhi; Podlich, Dean W; Luo, Lang; Cooper, Mark; van Eeuwijk, Fred A

    2007-11-01

    Complex quantitative traits of plants as measured on collections of genotypes across multiple environments are the outcome of processes that depend in intricate ways on genotype and environment simultaneously. For a better understanding of the genetic architecture of such traits as observed across environments, genotype-by-environment interaction should be modeled with statistical models that use explicit information on genotypes and environments. The modeling approach we propose explains genotype-by-environment interaction by differential quantitative trait locus (QTL) expression in relation to environmental variables. We analyzed grain yield and grain moisture for an experimental data set composed of 976 F5 maize testcross progenies evaluated across 12 environments in the U.S. corn belt during 1994 and 1995. The strategy we used was based on mixed models and started with a phenotypic analysis of multi-environment data, modeling genotype-by-environment interactions and associated genetic correlations between environments, while taking into account intraenvironmental error structures. The phenotypic mixed models were then extended to QTL models via the incorporation of marker information as genotypic covariables. A majority of the detected QTL showed significant QTL-by-environment interactions (QEI). The QEI were further analyzed by including environmental covariates into the mixed model. Most QEI could be understood as differential QTL expression conditional on longitude or year, both consequences of temperature differences during critical stages of growth.
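
    The idea of explaining QTL-by-environment interaction through an environmental covariate can be illustrated with a deliberately simplified two-stage sketch: estimate a QTL effect per environment, then regress those effects on the covariate. The real analysis uses mixed models with structured genetic correlations; everything below (effect sizes, sample sizes, the temperature covariate) is simulated for illustration only.

```python
import random

random.seed(42)

# Simulated toy data: the true QTL effect grows linearly with an
# environmental covariate (temperature) -- a simple QEI pattern.
temps = [16.0, 18.0, 20.0, 22.0, 24.0, 26.0]

def qtl_effect_in_env(temp, n=200):
    """Per-environment estimate of the QTL effect: difference in mean
    phenotype between the two marker classes (hypothetical trait units)."""
    effect_true = 0.3 * (temp - 16.0)
    y0 = [random.gauss(10.0, 0.5) for _ in range(n)]
    y1 = [random.gauss(10.0 + effect_true, 0.5) for _ in range(n)]
    return sum(y1) / n - sum(y0) / n

effects = [qtl_effect_in_env(t) for t in temps]

# Stage two: ordinary least-squares regression of the per-environment
# QTL effect on the covariate.
mt = sum(temps) / len(temps)
me = sum(effects) / len(effects)
slope = (sum((t - mt) * (e - me) for t, e in zip(temps, effects))
         / sum((t - mt) ** 2 for t in temps))
assert abs(slope - 0.3) < 0.1   # recovers the simulated QEI coefficient
```

    A significant slope in the second stage corresponds to a QEI that is conditional on the covariate, analogous to the temperature-dependent QTL expression reported for the maize data.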

  9. Thermal modeling of an epoxy encapsulation process

    SciTech Connect

    Baca, R.G.; Schutt, J.A.

    1991-01-01

    The encapsulation of components is a widely used process at Sandia National Laboratories for packaging components to withstand structural loads. Epoxy encapsulants are also used for their outstanding dielectric strength characteristics. The production of high voltage assemblies requires the encapsulation of ceramic and electrical components (such as transformers). Separation of the encapsulant from internal contact surfaces, or voids within the encapsulant itself in regions near the mold base, has caused high voltage breakdown failures during production testing. In order to understand the failure mechanisms, a methodology was developed to predict both the thermal response and the gel front progression of the epoxy during the encapsulation process. A thermal model constructed with PATRAN Plus (1) and solved with the P/THERMAL (2) analysis system was used to predict the thermal response of the encapsulant. This paper discusses the incorporation of an Arrhenius kinetics model into Q/TRAN (2) to model the complex volumetric heat generation of the epoxy during the encapsulation process. As the epoxy begins to cure, it generates heat and shrinks. The total cure time of the encapsulant (its transformation from a viscous liquid to a solid) depends on both the initial temperature and the entire temperature history. Because the rate of cure is temperature dependent, the cure rate accelerates with a temperature increase and, likewise, is quenched if the temperature is reduced. The temperature and conversion predictions compared well against experimental data. The thermal simulation results were used to modify the temperature cure process of the encapsulant and improve production yields.
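
    The coupled behaviour described above, a cure rate that accelerates with temperature while heat release tracks conversion, can be sketched with an nth-order Arrhenius cure model integrated explicitly. The kinetic parameters below are illustrative placeholders, not the epoxy data from the study, and conduction losses are ignored (adiabatic assumption).

```python
import math

# Illustrative nth-order Arrhenius cure kinetics; these parameter values
# are placeholders, not the epoxy data from the study.
A = 1.0e5              # pre-exponential factor, 1/s
Ea = 6.0e4             # activation energy, J/mol
R = 8.314              # gas constant, J/(mol K)
dT_adiabatic = 120.0   # temperature rise for full cure, K
n = 1.0                # reaction order

def simulate_cure(T0, dt=1.0, t_end=50000.0):
    """Explicit-Euler integration of conversion alpha and temperature T,
    assuming adiabatic self-heating (conduction losses ignored)."""
    alpha, T, t = 0.0, T0, 0.0
    while t < t_end and alpha < 0.99:
        rate = A * math.exp(-Ea / (R * T)) * (1.0 - alpha) ** n
        alpha = min(1.0, alpha + rate * dt)
        T = T0 + dT_adiabatic * alpha   # heat release tracks conversion
        t += dt
    return alpha, T, t

# A higher initial temperature shortens the cure time; a cooler start
# quenches the rate, matching the qualitative behaviour in the abstract.
a_cool, _, t_cool = simulate_cure(T0=330.0)
a_hot, _, t_hot = simulate_cure(T0=350.0)
assert a_cool >= 0.99 and a_hot >= 0.99
assert t_hot < t_cool
```

    The production model additionally couples this source term to conduction through the mold and potting geometry, which is what the P/THERMAL solve provides.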

  10. Marketing the use of the space environment for the processing of biological and pharmaceutical materials

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The perceptions of U.S. biotechnology and pharmaceutical companies concerning the potential use of the space environment for the processing of biological substances were examined. Physical phenomena that may be important in space-based processing of biological materials are identified and discussed in the context of past and current experiment programs. The capabilities of NASA to support future research and development, and to engage in cooperative risk-sharing programs with industry, are discussed. Meetings were held with several biotechnology and pharmaceutical companies to provide data for an analysis of the attitudes and perceptions of these industries toward the use of the space environment. Recommendations are made for actions that might be taken by NASA to facilitate the marketing of the use of the space environment, and in particular the Space Shuttle, to the biotechnology and pharmaceutical industries.

  11. Emerge - A Python environment for the modeling of subsurface transfers

    NASA Astrophysics Data System (ADS)

    Lopez, S.; Smai, F.; Sochala, P.

    2014-12-01

    The simulation of subsurface mass and energy transfers often relies on specific codes, mainly developed in compiled languages, which usually ensure computational efficiency at the expense of relatively long development times and relatively rigid software. Even if a very detailed, possibly graphical, user interface is developed, the core numerical aspects are rarely accessible, and the smallest modification always requires a compilation step. Thus, user-defined physical laws or alternative numerical schemes may be relatively difficult to use. Over the last decade, Python has emerged as a popular and widely used language in the scientific community. Several libraries already exist for the pre- and post-processing of input and output files for reservoir simulators (e.g. pytough). Development times in Python are considerably shorter than in compiled languages, and programs can be easily interfaced with libraries written in compiled languages, including several comprehensive numerical libraries that provide sequential and parallel solvers (e.g. PETSc, Trilinos…). The core objective of the Emerge project is to explore the possibility of developing a modeling environment fully in Python. Consequently, we are developing an open Python package with the classes/objects necessary to express, discretize and solve the physical problems encountered in the modeling of subsurface transfers. We rely heavily on Python to provide a convenient and concise way of manipulating potentially complex concepts in a few lines of code with a high level of abstraction. The result aims to be a friendly numerical environment targeting both numerical engineers and physicists or geoscientists, with the possibility to quickly specify and handle geometries, arbitrary meshes, spatially or temporally varying properties, PDE formulations, boundary conditions…
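
    The abstract does not show the Emerge API, but the level of abstraction it targets, a transfer problem specified and solved in a few lines of Python, can be sketched generically. The function below is a stand-alone explicit finite-difference solve of 1-D transient diffusion with Dirichlet boundaries; every name in it is hypothetical.

```python
# Generic sketch of a concise problem setup in pure Python: a 1-D
# transient diffusion solve. This is NOT the Emerge API.

def solve_diffusion(n=50, length=1.0, diffusivity=1e-2, dt=1e-3,
                    steps=500, left=1.0, right=0.0):
    """Explicit finite-difference solve of u_t = D * u_xx on a uniform
    mesh with fixed (Dirichlet) boundary values."""
    dx = length / (n - 1)
    r = diffusivity * dt / dx**2
    assert r < 0.5                      # explicit-scheme stability bound
    u = [0.0] * n
    u[0], u[-1] = left, right
    for _ in range(steps):
        # Discrete Laplacian from the old field, then Jacobi-style update.
        lap = [u[i - 1] - 2 * u[i] + u[i + 1] for i in range(1, n - 1)]
        for i, l in enumerate(lap, start=1):
            u[i] += r * l
    return u

u = solve_diffusion()
# The profile decreases monotonically from the hot to the cold boundary:
assert all(u[i] >= u[i + 1] for i in range(len(u) - 1))
```

    In an environment like the one described, the mesh, material properties and boundary conditions would be first-class objects rather than positional arguments, but the conciseness argument is the same.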

  12. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to encode longer strings, resulting in greater compression. With sufficiently long code-words, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for the compression of classical information in quantum computing and quantum communication technology.
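
    The statistical complexity Cμ mentioned here is the Shannon entropy of the ɛ-machine's stationary state distribution. A minimal numeric sketch, using a toy two-state unifilar machine whose topology is chosen for illustration (not taken from the talk):

```python
import math

def stationary(P, iters=200):
    """Stationary distribution of a row-stochastic matrix, by power
    iteration (sufficient here since the chain is ergodic)."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[j] * P[j][i] for j in range(len(P)))
              for i in range(len(P))]
    return pi

def entropy_bits(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# State-to-state transitions of a toy two-state unifilar machine:
# state A emits 0 or 1 with equal probability (staying in A on 0,
# moving to B on 1); state B always emits 0 and returns to A.
P = [[0.5, 0.5],
     [1.0, 0.0]]

pi = stationary(P)           # converges to [2/3, 1/3]
C_mu = entropy_bits(pi)      # statistical complexity of this machine
assert abs(pi[0] - 2 / 3) < 1e-9
assert abs(C_mu - 0.9183) < 1e-3
```

    For this machine Cμ ≈ 0.918 bits; per the relation quoted above, a q-machine for the same process would have Cq between the excess entropy E and this classical value.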

  13. Investigation of the Relationship between Learning Process and Learning Outcomes in E-Learning Environments

    ERIC Educational Resources Information Center

    Yurdugül, Halil; Menzi Çetin, Nihal

    2015-01-01

    Problem Statement: Learners can access and participate in online learning environments regardless of time and geographical barriers. This brings up the umbrella concept of learner autonomy that contains self-directed learning, self-regulated learning and the studying process. Motivation and learning strategies are also part of this umbrella…

  14. Self-Processes and Learning Environment as Influences in the Development of Expertise in Instructional Design

    ERIC Educational Resources Information Center

    Ge, Xun; Hardre, Patricia L.

    2010-01-01

    A major challenge for learning theories is to illuminate how particular kinds of learning experiences and environments promote the development of expertise. Research has been conducted into novice-expert differences in various domains, but few studies have examined the processes involved in learners' expertise development. In an attempt to…

  15. A Virtual Environment for Process Management. A Step by Step Implementation

    ERIC Educational Resources Information Center

    Mayer, Sergio Valenzuela

    2003-01-01

    This paper presents a virtual organizational environment conceived by integrating three computer programs: a manufacturing simulation package, business process automation (workflow) software, and business intelligence (Balanced Scorecard) software. It was created as a supporting tool for teaching IE; its purpose is to give…

  16. Antibiotic Resistance in Listeria Species Isolated from Catfish Fillets and Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The susceptibility of 221 Listeria spp. (86 Listeria monocytogenes, 41 Listeria innocua and 94 Listeria seeligeri-Listeria welshimeri-Listeria ivanovii) isolated from catfish fillets and processing environment to 15 antibiotics was determined. Listeria isolates were analysed by disc-diffusion assay...

  17. Corpora Processing and Computational Scaffolding for a Web-Based English Learning Environment: The CANDLE Project

    ERIC Educational Resources Information Center

    Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng

    2006-01-01

    This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English…

  18. ARSENIC UPTAKE PROCESSES IN REDUCING ENVIRONMENTS: IMPLICATIONS FOR ACTIVE REMEDIATION AND NATURAL ATTENUATION

    EPA Science Inventory

    Reductive dissolution of iron oxyhydr(oxides) and release of adsorbed or coprecipitated arsenic is often implicated as a key process that controls the mobility and bioavailability of arsenic in anoxic environments. Yet a complete assessment of arsenic transport and fate requires...

  19. Students' Expectations of the Learning Process in Virtual Reality and Simulation-Based Learning Environments

    ERIC Educational Resources Information Center

    Keskitalo, Tuulikki

    2012-01-01

    Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…

  20. Process Structure of Parent-Child-Environment Relations and the Prevention of Children's Injuries.

    ERIC Educational Resources Information Center

    Valsiner, Jaan; Lightfoot, Cynthia

    1987-01-01

    The reasoning of caregivers is discussed in the context of preventing childhood accidents. This reasoning process, which uses knowledge about children's behavior in an environment, leads to appropriate preventive actions on the part of the caregiver. Illustrative examples of parents interacting with children are presented. (VM)

  1. One Program's Journey: Using the Change Process To Implement Service in Natural Environments.

    ERIC Educational Resources Information Center

    Brault, Linda M. J.; Ashley, Melinda; Gallo, Jan

    2001-01-01

    This article profiles the Hope Infant Family Support Program in San Diego, California, and its shift to providing educational services in the natural environment. How the program managed this complex change process is discussed, including the development of a vision statement, incentives, skill development, resource allocation changes, and action…

  2. Journey into the Problem-Solving Process: Cognitive Functions in a PBL Environment

    ERIC Educational Resources Information Center

    Chua, B. L.; Tan, O. S.; Liu, W. C.

    2016-01-01

    In a PBL environment, learning results from learners engaging in cognitive processes pivotal in the understanding or resolution of the problem. Using Tan's cognitive function disc, this study examines the learner's perceived cognitive functions at each stage of PBL, as facilitated by the PBL schema. The results suggest that these learners…

  3. Coupled process modeling and waste package performance

    SciTech Connect

    McGrail, B.P.; Engel, D.W.

    1992-11-01

    The interaction of borosilicate waste glasses with water has been studied extensively, and reasonably good models are available that describe the reaction kinetics and solution chemical effects. Unfortunately, these models have not been utilized in performance assessment analyses, except in estimating radionuclide solubilities at the waste form surface. A geochemical model has been incorporated in the AREST code to examine the coupled processes of glass dissolution and transport within the engineered barrier system. Our calculations show that the typical assumptions used in performance assessment analyses, such as fixed solubilities or a constant reaction rate at the waste form surface, do not always give conservative or realistic predictions of radionuclide release. Varying the transport properties of the waste package materials is shown to give counterintuitive effects on the release rates of some radionuclides. The use of noncoupled performance assessment models could lead a repository designer to an erroneous conclusion regarding the relative benefit of one waste package design or host rock setting over another.

  4. Developing expertise in gynecologic surgery: reflective perspectives of international experts on learning environments and processes.

    PubMed

    Hardré, Patricia L; Nihira, Mikio; LeClaire, Edgar L

    2017-01-01

    Research in medical education does not provide a clear understanding of how professional expertise develops among surgeons and what experiential factors contribute to that development. To address this gap, the researchers interviewed 16 international experts in female pelvic medicine and reconstructive surgery to assess their reflective perceptions of what specific opportunities and experiences initiated and supported their development toward expertise in their field. Characteristics and influences explaining the speed and quality of expertise development were sorted into the following themes: the dynamic process of expertise development, internal and personal characteristics, general aptitudes and preparatory skills, role modeling and interpersonal influences, opportunities to learn and practice, and roles and reference points. Across the narratives and perspectives of these expert surgeons, both individual characteristics and choices, and contextual activities and opportunities were necessary and important. Experiences with greatest impact on quality of expertise development included those provided by the environment and mentors, as well as those sought out by learners themselves, to elaborate and supplement existing opportunities. The ideal combination across experts was interaction and integration of individual characteristics with experiential opportunities. Grounded in theory and research in expertise development, these findings can support improvement of medical education, both for individual mentors and strategic program development. As surgery evolves at a continuously increasing pace, effective mentoring of promising surgical trainees will be critical to ensure that future generations of gynecologic surgeons will remain excellent. Effective, efficient surgical expertise development requires identifying trainees with the appropriate characteristics and providing them with the best development opportunities.

  5. Control of Listeria monocytogenes in the processing environment by understanding biofilm formation and resistance to sanitizers.

    PubMed

    Manios, Stavros G; Skandamis, Panagiotis N

    2014-01-01

    Listeria monocytogenes can colonize the food processing environment and thus pose a greater risk of cross-contamination to food. One of the proposed mechanisms that facilitates such colonization is biofilm formation. It is hypothesized that, as part of a biofilm, L. monocytogenes can survive sanitization procedures. In addition, biofilms are difficult to remove and may require additional physical and chemical measures to reduce their presence and occurrence. The initial stage of biofilm formation is attachment to surfaces, and it is therefore important to be able to determine the ability of L. monocytogenes strains to attach to various inert surfaces. In this chapter, methods to study bacterial attachment to surfaces are described. Attachment is commonly induced by bringing planktonic cells into contact with plastic, glass, or stainless steel surfaces, with or without food residues ("soil"), in batch or continuous (e.g., with constant flow of nutrients) culture. Measurement of the biofilm formed is carried out by detaching cells (with various mechanical methods) and measuring viable counts, or by measuring the total attached biomass. Resistance of biofilms to sanitizers is commonly assessed by exposing the whole model surface bearing the attached cells to a sanitizer solution, followed by measuring the survivors as described above.

  6. Developing expertise in gynecologic surgery: reflective perspectives of international experts on learning environments and processes

    PubMed Central

    Hardré, Patricia L; Nihira, Mikio; LeClaire, Edgar L

    2017-01-01

    Research in medical education does not provide a clear understanding of how professional expertise develops among surgeons and what experiential factors contribute to that development. To address this gap, the researchers interviewed 16 international experts in female pelvic medicine and reconstructive surgery to assess their reflective perceptions of what specific opportunities and experiences initiated and supported their development toward expertise in their field. Characteristics and influences explaining the speed and quality of expertise development were sorted into the following themes: the dynamic process of expertise development, internal and personal characteristics, general aptitudes and preparatory skills, role modeling and interpersonal influences, opportunities to learn and practice, and roles and reference points. Across the narratives and perspectives of these expert surgeons, both individual characteristics and choices, and contextual activities and opportunities were necessary and important. Experiences with greatest impact on quality of expertise development included those provided by the environment and mentors, as well as those sought out by learners themselves, to elaborate and supplement existing opportunities. The ideal combination across experts was interaction and integration of individual characteristics with experiential opportunities. Grounded in theory and research in expertise development, these findings can support improvement of medical education, both for individual mentors and strategic program development. As surgery evolves at a continuously increasing pace, effective mentoring of promising surgical trainees will be critical to ensure that future generations of gynecologic surgeons will remain excellent. Effective, efficient surgical expertise development requires identifying trainees with the appropriate characteristics and providing them with the best development opportunities. PMID:28123313

  7. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of websites prepared for education and training purposes has become an important process. This study addresses the teaching process of online learning environments created to teach Turkish in web-based environments, and…

  8. Peer Review Process and Accreditation of Models

    DTIC Science & Technology

    1990-02-02

    Peer Review Process and Accreditation of Models (ASQBG-A-89-010), Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS), 2 February 1990. DTIC accession AD-A268 573. [Scanned report; the copy furnished to DTIC contained a significant number of pages that do not reproduce legibly. Table of contents: Executive Summary; 1 Introduction (1.1 Purpose, 1.2 Background, 1.3 Current Issues); 2 Previous Peer Reviews (2.1 General, 2.2 …).]

  9. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven elements (see Fig. 1) critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.
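    The abstract does not give the CAFE Project's actual formulation, but the idea of an economic index used as a control point for continue/suspend decisions can be sketched. The discount rate, threshold, and function names below are assumptions for illustration only:

```python
def development_index(rd_cost, annual_savings, horizon_years, discount_rate=0.08):
    """Illustrative economic index: discounted savings over current operations,
    divided by the remaining R&D cost (not the CAFE Project's actual formula)."""
    pv_savings = sum(annual_savings / (1.0 + discount_rate) ** t
                     for t in range(1, horizon_years + 1))
    return pv_savings / rd_cost

def decide(rd_cost, annual_savings, horizon_years, threshold=1.0):
    """Control point: continue development only if the index clears the threshold."""
    idx = development_index(rd_cost, annual_savings, horizon_years)
    return "continue" if idx >= threshold else "suspend"
```

    For example, a sensor costing $500k to finish that would save $200k/yr over five years clears an index threshold of 1.0, while one saving only $50k/yr over three years would be suspended.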

  10. A New Fractal Model of Chromosome and DNA Processes

    NASA Astrophysics Data System (ADS)

    Bouallegue, K.

    Dynamic chromosome structure remains unknown. Can fractals and chaos be used as new tools to model, identify and generate a structure of chromosomes? Fractals and chaos offer a rich environment for exploring and modeling the complexity of nature. In a sense, fractal geometry is used to describe, model, and analyze the complex forms found in nature. Fractals have also been used widely not only in biology but also in medicine. To this effect, a fractal is considered an object that displays self-similarity under magnification and can be constructed using a simple motif (an image repeated on ever-reduced scales). It is worth noting that identifying which model a chromosome belongs to has become a challenge. Several different models (hierarchical coiling, folded fiber, and radial loop) have been proposed for the mitotic chromosome, but none has yet yielded a dynamic model. This paper is an attempt to solve topological problems involved in the modeling of chromosome and DNA processes. By combining the fractal Julia process and a numerical dynamical system, we have arrived at four main points. First, we have developed not only a model of the chromosome but also a model of mitosis and one of meiosis. Equally important, we have identified the centromere position through the numerical model captured below. More importantly, in this paper, we have described the processes of the cell divisions of both mitosis and meiosis. All in all, the results show that this work could have a strong impact on the welfare of humanity and can lead to a cure for genetic diseases.
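    The abstract does not detail how the Julia process is combined with the dynamical system, but the basic Julia-set escape-time iteration it builds on can be sketched. The constant c, grid, and bailout parameters here are illustrative assumptions, not the paper's values:

```python
def julia_escape_time(z, c, max_iter=100, bailout=2.0):
    """Iterate z -> z**2 + c; return the number of iterations until |z|
    exceeds the bailout radius (max_iter means the orbit stayed bounded)."""
    for n in range(max_iter):
        if abs(z) > bailout:
            return n
        z = z * z + c
    return max_iter

# Sample the iteration over a small grid of starting points (illustrative only)
c = complex(-0.4, 0.6)
grid = [[julia_escape_time(complex(x / 10, y / 10), c)
         for x in range(-15, 16)] for y in range(-15, 16)]
```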

  11. Glacier lake outburst floods - modelling process chains

    NASA Astrophysics Data System (ADS)

    Schaub, Yvonne; Huggel, Christian; Haeberli, Wilfried

    2013-04-01

    New lakes are forming in high-mountain areas all over the world due to glacier recession. Often they will be located below steep, destabilized flanks and are therefore exposed to impacts from rock-/ice-avalanches. Several events worldwide are known, where an outburst flood has been triggered by such an impact. In regions such as in the European Alps or in the Cordillera Blanca in Peru, where valley bottoms are densely populated, these far-travelling, high-magnitude events can result in major disasters. For appropriate integral risk management it is crucial to gain knowledge on how the processes (rock-/ice-avalanches - impact waves in lake - impact on dam - outburst flood) interact and how the hazard potential related to corresponding process chains can be assessed. Research in natural hazards so far has mainly concentrated on describing, understanding, modeling or assessing single hazardous processes. Some of the above mentioned individual processes are quite well understood in their physical behavior and some of the process interfaces have also been investigated in detail. Multi-hazard assessments of the entire process chain, however, have only recently become subjects of investigations. Our study aims at closing this gap and providing suggestions on how to assess the hazard potential of the entire process chain in order to generate hazard maps and support risk assessments. We analyzed different types of models (empirical, analytical, physically based) for each process regarding their suitability for application in hazard assessments of the entire process chain based on literature. Results show that for rock-/ice-avalanches, dam breach and outburst floods, only numerical, physically based models are able to provide the required information, whereas the impact wave can be estimated by means of physically based or empirical assessments. We demonstrate how the findings could be applied with the help of a case study of a recent glacier lake outburst event at Laguna

  12. Vertical distribution, migration rates, and model comparison of actinium in a semi-arid environment.

    PubMed

    McClellan, Y; August, R A; Gosz, J R; Gann, S; Parmenter, R R; Windsor, M

    2006-01-01

    Vertical soil characterization and migration of radionuclides were investigated at four radioactively contaminated sites on Kirtland Air Force Base (KAFB), New Mexico to determine the vertical downward migration of radionuclides in a semi-arid environment. The surface soils (0-15 cm) were intentionally contaminated with Brazilian sludge (containing (232)Thorium and other radionuclides) approximately 40 years ago, in order to simulate the conditions resulting from a nuclear weapons accident. Site grading consisted of manually raking or machine disking the sludge. The majority of the radioactivity was found in the top 15 cm of soil, with retention ranging from 69 to 88%. Two models, a compartment diffusion model and leach rate model, were evaluated to determine their capabilities and limitations in predicting radionuclide behavior. The migration rates of actinium were calculated with the diffusion compartment and the leach rate models for all sites, and ranged from 0.009 to 0.1 cm/yr increasing with depth. The migration rates calculated with the leach rate models were similar to those using the diffusion compartment model and did not increase with depth (0.045-0.076, 0.0 cm/yr). The research found that the physical and chemical properties governing transport processes of water and solutes in soil provide a valid radionuclide transport model. The evaluation also showed that the physical model has fewer limitations and may be more applicable to this environment.
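    The leach-rate style of model the study evaluates can be sketched as a discrete compartment cascade in which a fixed fraction of each layer's inventory moves one layer down per year. The annual leach fraction and layer thickness below are illustrative assumptions, not the paper's fitted values, though with these numbers the surface retention and migration rate land in the reported ranges:

```python
def leach_compartments(n_layers, leach_fraction, years):
    """First-order leach-rate compartment model (illustrative): each year a
    fixed fraction of every layer's inventory moves one layer downward."""
    inv = [1.0] + [0.0] * (n_layers - 1)   # all activity starts in the surface layer
    for _ in range(years):
        new = inv[:]
        for i in range(n_layers - 1):
            moved = inv[i] * leach_fraction
            new[i] -= moved
            new[i + 1] += moved
        inv = new
    return inv

def mean_depth(inv, layer_cm=15.0):
    """Inventory-weighted mean depth (layer midpoints), in cm."""
    return sum(f * (i + 0.5) * layer_cm for i, f in enumerate(inv))

inv = leach_compartments(n_layers=4, leach_fraction=0.005, years=40)
surface_retention = inv[0]                         # compare to the 69-88% reported
migration_cm_per_yr = (mean_depth(inv) - 7.5) / 40  # shift from the initial mean depth
```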

  13. Designing a Collaborative Problem Solving Environment for Integrated Water Resource Modeling

    SciTech Connect

    Thurman, David A.; Cowell, Andrew J.; Taira, Randal Y.; Frodge, Jonathan

    2004-06-14

    We report on our approach for designing a collaborative problem solving environment for hydrologists, water quality planners and natural resource managers, all roles within a natural resource management agency and stakeholders in an integrated water resource management process. We describe our approach in the context of the Integrated Water Resource Modeling System (IWRMS), under development by Pacific Northwest National Laboratory for the Department of Natural Resources and Parks in King County, Washington. This system will integrate a collection of water resource models (watersheds, rivers, lakes, estuaries) to provide the ability to address water, land use, and other natural resource management decisions and scenarios, with the goal of developing an integrated modeling capability to address future land use and resource management scenarios and provide scientific support to decision makers. Here, we discuss the five-step process used to ascertain the (potentially opposing) needs and interests of stakeholders and provide results and summaries from our experiences. The results of this process guide user interface design efforts to create a collaborative problem solving environment supporting multiple users with differing scientific backgrounds and modeling needs. We conclude with a discussion of participatory interface design methods used to encourage stakeholder involvement and acceptance of the system, as well as the lessons learned to date.

  14. Institute works on modeling thermonuclear plasma processes

    NASA Astrophysics Data System (ADS)

    Vatsek, D.

    1985-07-01

    Results of nuclear-physics research are discussed. Principles of a theory of spectra of atoms and ions were studied. Results of the development of mathematical methods for the study of complex atoms and ions - methods which can be used in astrophysics for ascertaining the structure and properties of the sun - are summarized. Research with applications in molecular biology, metal working and environmental protection using lasers and conventional methods is discussed. Laser-aided research into extremely high-speed processes in molecules, which can be used in the study of living cells, is outlined. Laser cutting of steel rods and sheets, laser hardening of products, and other industrial uses of lasers are studied. Equipment for analyzing the composition of the atmosphere and detecting sources of pollution, an automatic device for monitoring microclimate parameters (the EOL-I), and an instrument for measuring small concentrations of mercury in the natural environment and indoors were developed.

  15. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important topic in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is feasible. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859
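    The abstract names a measure built from Euclidean distance and a Gaussian function, with a new place cell recruited when no existing cell fires above a threshold. A minimal sketch of that recruitment rule (the function names, sigma, and threshold value are assumptions, not the paper's parameters):

```python
import math

def gaussian_similarity(observation, cell_center, sigma=1.0):
    """Firing rate: Euclidean distance passed through a Gaussian kernel."""
    d2 = sum((a - b) ** 2 for a, b in zip(observation, cell_center))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def update_place_cells(cells, observation, sigma=1.0, firing_threshold=0.3):
    """Fire the best-matching cell, or recruit a new one when every existing
    cell's rate falls below the threshold (hypothetical FRT-style rule)."""
    rates = [gaussian_similarity(observation, c, sigma) for c in cells]
    if not cells or max(rates) < firing_threshold:
        cells.append(list(observation))   # recruit a new visual place cell
        return len(cells) - 1, 1.0        # a new cell fires maximally at its center
    best = max(range(len(rates)), key=lambda i: rates[i])
    return best, rates[best]

cells = []
update_place_cells(cells, [0.0, 0.0])   # first observation recruits cell 0
update_place_cells(cells, [0.1, 0.1])   # a nearby observation reuses cell 0
```

    Raising sigma widens each cell's firing field, and raising the threshold forces denser recruitment, mirroring the AFFF and FRT adjustments described above.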

  16. Theoretical model of crystal growth shaping process

    NASA Astrophysics Data System (ADS)

    Tatarchenko, V. A.; Uspenski, V. S.; Tatarchenko, E. V.; Nabot, J. Ph.; Duffar, T.; Roux, B.

    1997-10-01

    A theoretical investigation of the crystal growth shaping process is carried out on the basis of the dynamic stability concept. The capillary dynamic stability of shaped crystal growth processes for various forms of the liquid menisci is analyzed using the mathematical model of the phenomena in the axisymmetric case. The catching boundary condition of the capillary boundary problem is considered and the limits of its application for shaped crystal growth modeling are discussed. The static stability of a liquid free surface is taken into account by means of the Jacobi equation analysis. The result is that a large number of menisci having drop-like shapes are statically unstable. A few new non-traditional liquid meniscus shapes (e.g., bubbles and related shapes) are proposed for the case of a catching boundary condition.

  17. A Workflow Environment for Reactive Transport Modeling with Application to a Mixing- Controlled Precipitation Experiment

    NASA Astrophysics Data System (ADS)

    Schuchardt, K. L.; Sun, L.; Chase, J. M.; Elsethagen, T. O.; Freedman, V. L.; Redden, G. D.; Scheibe, T. D.

    2007-12-01

    Advances in subsurface modeling techniques such as multi-scale methods, hybrid models, and inverse modeling, combined with petascale computing capabilities, will result in simulations that run over longer time scales, cover larger geographic regions, and model increasingly detailed physical processes. This will lead to significantly more data of increased complexity, creating challenges to already strained processes for parameterizing and running models, organizing and tracking data, and visualizing outputs. To support effective development and utilization of next-generation simulators, we are developing a process integration framework that combines and extends leading edge technologies for process automation, data and metadata management, and large-scale data visualization. Our process integration framework applies workflow techniques to integrate components for accessing and preparing inputs, running simulations, and analyzing results. Data management and provenance middleware enables sharing and community development of data sources and stores full information about data and processes. In the hands of modelers, experimentalists, and developers, the process integration framework will improve efficiency, accuracy and confidence in results, and broaden the array of theories available. In this poster (which will include a live computer demo of the workflow environment) we will present a prototype of the process integration framework, developed to address a selected benchmark problem. The prototype is being used to perform simulations of an intermediate-scale experiment in which a solid mineral is precipitated from the reaction of two mixing solutes. A range of possible experimental configurations are being explored to support design of a planned set of experiments incorporating heterogeneous media. The prototype provides a user interface to specify parameter ranges, runs the required simulations on a user-specified machine, and automatically manages the input and output
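    The parameter-range sweep such a workflow performs can be sketched generically. The directory layout, file naming, and `simulate` callback below are assumptions for illustration, not the framework's real interface:

```python
import itertools
import json
import pathlib

def run_sweep(param_ranges, simulate, out_dir="runs"):
    """Run one simulation per combination of parameter values and record each
    run's inputs and outputs as JSON (a simple, hypothetical provenance trail)."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    results = []
    for i, combo in enumerate(itertools.product(*param_ranges.values())):
        params = dict(zip(param_ranges, combo))
        record = {"run": i, "params": params, "result": simulate(params)}
        (out / f"run_{i}.json").write_text(json.dumps(record))  # per-run provenance
        results.append(record)
    return results
```

    For example, `run_sweep({"k": [0.1, 0.2]}, my_model)` would launch two runs and leave `run_0.json` and `run_1.json` behind recording what was run and with which inputs.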

  18. Expert Models and Modeling Processes Associated with a Computer-Modeling Tool

    ERIC Educational Resources Information Center

    Zhang, BaoHui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-01-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using "think aloud" technique…

  19. Analytical Modeling of High Rate Processes.

    DTIC Science & Technology

    2007-11-02

    Final report (01 Sep 94 - 31 Aug 97), dated 13 Apr 98. From: S. E. Jones, University Research Professor, Department of Aerospace Engineering and Mechanics, University of Alabama. Subject: final report on Analytical Modeling of High Rate Processes. [Scanned report; much of the text does not reproduce legibly. Legible fragments note graduate students Mr. Sandor Augustus and Mr. Jeffrey A. Drinkard, no outstanding commitments, and an account balance of $102,916.42 as of July 31, 1997.]

  20. Characterizing Pluto's plasma environment through multifluid MHD modelling

    NASA Astrophysics Data System (ADS)

    Hale, J. M.; Paty, C. S.

    2013-12-01

    We will report on preliminary results from simulations of Pluto's plasma environment using a refined version of the global multifluid MHD model which has been successfully used to simulate numerous planetary systems, including Ganymede [Paty et al., 2008], Pluto [Harnett et al., 2005], Saturn [Kidder et al., 2012], and Titan [Snowden et al., 2011a,b], among others. This initial study focuses on exploring the exospheric and solar wind parameter space local to Pluto. We explore multiple system geometries including a simulation in which Pluto has no ionosphere, as appears to be the case due to freezing when Pluto resides at apoapsis, as well as several scenarios with different ionospheric and exospheric densities. Ionospheric densities are based on chemical modeling reported in Krasnopolsky and Cruikshank [1999] and solar wind conditions are based on system geometry at periapsis, apoapsis, and at the time of the New Horizons system flyby. We examine the role of the ionosphere and exosphere in determining the location and structure of the bow shock, as well as characterizing the impact of the variability of solar wind pressure and magnetic field throughout Pluto's orbit. This work supports the characterization of the magnetospheric environment of the Pluto system in preparation for the New Horizons encounter in 2015.