Science.gov

Sample records for environment process model

  1. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and potentially changes the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  2. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space; joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  3. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
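
    The two asymptotic limits quoted above (pure latency for very small messages, pure transfer rate for very large ones) can be made concrete with a simple hyperbolic function; the sketch below is illustrative only, is not the paper's exact parameterization, and its function and parameter names are invented for the example.

        import math

        def hyperbolic_service_time(m_bytes, t0_latency, r_rate):
            """Illustrative two-parameter hyperbolic service time for a single CB.

            t0_latency : asymptotic service time for very small messages (s)
            r_rate     : asymptotic transfer rate for very large messages (bytes/s)
            """
            # sqrt(t0^2 + (m/r)^2) tends to t0 as m -> 0 and to m/r as m -> infinity.
            return math.sqrt(t0_latency**2 + (m_bytes / r_rate)**2)

        # Small messages are latency-dominated, large ones are rate-dominated.
        print(hyperbolic_service_time(1, 1e-3, 1e6))           # ~1e-3 s
        print(hyperbolic_service_time(10_000_000, 1e-3, 1e6))  # ~10 s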

  4. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network centric environments has been on the physical layer while making simplifications for the human-in-the-loop. However, the human element has a big impact on the capabilities of network centric systems. Taking into account the socio-behavioral aspects of processes such as team building, group decision-making, etc. is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We will model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network, which can represent incomplete and uncertain socio-cultural information. We will leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer such as node mobility, transmission parameters, etc. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  5. Broadband model-based processing for shallow ocean environments

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1998-07-01

    Most acoustic sources found in the ocean environment are spatially complex and broadband. In the case of shallow water propagation, these source characteristics complicate the analysis of received acoustic data considerably. A common approach to the broadband problem is to decompose the received signal into a set of narrow-band lines. This then allows the problem to be treated as a multiplicity of narrow-band problems. Here a model-based approach is developed for the processing of data received on a vertical array from a broadband source where it is assumed that the propagation is governed by the normal-mode model. The goal of the processor is to provide an enhanced (filtered) version of the pressure at the array and the modal functions. Thus a pre-processor is actually developed, since one could think of several applications for these enhanced quantities such as localization, modal estimation, etc. It is well-known that in normal-mode theory a different modal structure evolves for each temporal frequency; thus it is not surprising that the model-based solution to this problem results in a scheme that requires a "bank" of narrow-band model-based processors, each with its own underlying modal structure for the narrow frequency band it operates over. The "optimal" Bayesian solution to the broadband pressure field enhancement and modal function extraction problem is developed. It is shown how this broadband processor can be implemented (using a suboptimal scheme) in pseudo real time due to its inherent parallel structure. A set of noisy broadband data is synthesized to demonstrate how to construct the processor and achieve a minimum variance (optimal Bayesian) design. It is shown that both broadband pressure-field and modal function estimates can be extracted illustrating the feasibility of this approach. © 1998 Acoustical Society of America.
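
    For reference, the frequency-dependent modal structure mentioned above enters through the standard far-field normal-mode representation of the pressure field (a textbook form, not taken from this paper), in which both the modal functions and the horizontal wavenumbers depend on frequency; this is why the processor needs a separate narrow-band filter per frequency bin:

        p(r,z;\omega) \approx \frac{i\,e^{-i\pi/4}}{\rho(z_s)\sqrt{8\pi r}}
            \sum_m \frac{\phi_m(z_s;\omega)\,\phi_m(z;\omega)\,e^{\,i k_m(\omega) r}}{\sqrt{k_m(\omega)}}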

  6. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for validation of the model and the thought processes involved in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  7. Hot Plasma Environment Model (HPEM): An empirical model for describing time-dependent processes of the jovian energetic electron environment

    NASA Astrophysics Data System (ADS)

    Roussos, E.; Krupp, N.; Fraenz, M.; Kollmann, P.; Truscott, P.; Futaana, Y.

    2015-10-01

    HPEM is a model designed to provide time-series of energetic electron differential or integral energy-flux spectra for Jupiter's magnetosphere which can be used as input for internal charging studies of the JUICE spacecraft. The model describes the electron distribution function from 150 keV up to ~50 MeV. It is designed to be applicable from the orbit of Europa (9.5 Rj) out to 30 Rj, which is near Callisto's orbit, and in a latitude range of 40 degrees from the planetary equatorial plane, but it can be extended to larger distances and latitudes. The model is constructed with a goal to describe the time variability that a spacecraft can encounter in Jupiter's energetic electron environment. This variability can have two components: the first comes from the motion of the spacecraft within a spatially-varying jovian magnetosphere. For this purpose an average radiation belt model for the differential electron energy-flux spectra was constructed based on Galileo EPD/LEMMS observations, dependent on L, magnetospheric local time and equatorial pitch angle. The second component includes an empirical description of magnetospheric transients that result from dynamics in the magnetosphere. For this purpose, the probability for a given spectrum to deviate from the average one (at a given location) has been modeled with log-normal distributions and such probabilities are obtained with a Monte-Carlo approach. Temporal changes in the electron spectra are constrained by the L- or time gradients observed with Galileo's EPD/LEMMS detector so as to prevent extreme and unrealistic changes between sequential spectra of the model's output. The model is able to reproduce both the statistical scatter of energetic electron fluxes observed with Galileo/EPD, as well as the lifetimes/time scales and the occurrence probability of extreme flux enhancements (temporal radiation belts) that Galileo encountered. An application to the JUICE mission is also shown.
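
    The abstract gives no equations, but the described Monte-Carlo treatment of transients (an average spectrum per location, multiplied by log-normally distributed deviation factors) can be sketched as follows; every name and number below is a hypothetical placeholder, not a value from HPEM.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical average differential-flux spectrum for one (L, local time, pitch angle) bin.
        energies_mev = np.array([0.15, 0.5, 1.0, 5.0, 10.0, 50.0])
        avg_flux = 1e6 * energies_mev**-2.0            # placeholder power law

        # Deviations from the average modeled as log-normal multiplicative factors.
        sigma_log = 0.5                                # hypothetical spread parameter
        deviation = rng.lognormal(mean=0.0, sigma=sigma_log,
                                  size=(1000, energies_mev.size))
        sampled_spectra = avg_flux * deviation         # 1000 Monte-Carlo realizations

        # Sample mean of the multiplicative factor is ~exp(sigma^2/2), about 1.13 here.
        print(sampled_spectra.mean(axis=0) / avg_flux)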

  8. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  9. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data Assimilation techniques are essential elements in state-of-the-art development of models and their optimization with data in the field of groundwater, surface water and soil systems. They are essential tools in calibration of complex modelling systems and improvement of model forecasts. OpenDA is a new and generic open source data assimilation environment for application to a choice of physical process models, applied to case dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan; 2007] and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of the OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood

  10. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. The stationary simulation of this problem was done using the MHD and the electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the structure of the double-peak in the magnetic field signature of the Io flyby that could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs the fluid description for electrons and neutrals whereas for ions multilevel (drift-kinetic and particle) approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description for ion dynamics and allows us to take into account the realistic anisotropic ion distribution that cannot be done in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.

  11. Effects of kinetic processes in shaping Io's global plasma environment: A 3D hybrid model

    NASA Astrophysics Data System (ADS)

    Lipatov, Alexander S.; Combi, Michael R.

    2006-02-01

    The global dynamics of the ionized and neutral gases in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have already been done using the magnetohydrodynamics (MHD) and the electrodynamics approaches. One of the major results of recent simplified two-fluid model simulations [Saur, J., Neubauer, F.M., Strobel, D.F., Summers, M.E., 2002. J. Geophys. Res. 107 (SMP5), 1-18] was the production of the structure of the double-peak in the magnetic field signature of the Io flyby. These could not be explained before by standard MHD models. In this paper, we present a hybrid simulation for Io with kinetic ions and fluid electrons. This method employs a fluid description for electrons and neutrals, whereas for ions a particle approach is used. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. Our model may provide a much more accurate description for the ion dynamics than previous approaches and allows us to account for the realistic anisotropic ion velocity distribution that cannot be done in fluid simulations with isotropic temperatures. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper. Comparison with the Galileo Io flyby results shows that this approach provides an accurate physical basis for the interaction and can therefore naturally reproduce all the observed salient features.

  12. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2006-01-01

    The global dynamics of the ionized and neutral gases in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have already been done using the magnetohydrodynamics (MHD) and the electrodynamics approaches. One of the major results of recent simplified two-fluid model simulations [Saur, J., Neubauer, F.M., Strobel, D.F., Summers, M.E., 2002. J. Geophys. Res. 107 (SMP5), 1-18] was the production of the structure of the double-peak in the magnetic field signature of the Io flyby. These could not be explained before by standard MHD models. In this paper, we present a hybrid simulation for Io with kinetic ions and fluid electrons. This method employs a fluid description for electrons and neutrals, whereas for ions a particle approach is used. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. Our model may provide a much more accurate description for the ion dynamics than previous approaches and allows us to account for the realistic anisotropic ion velocity distribution that cannot be done in fluid simulations with isotropic temperatures. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper. Comparison with the Galileo Io flyby results shows that this approach provides an accurate physical basis for the interaction and can therefore naturally reproduce all the observed salient features.

  13. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  14. Mathematical modelling of thermal process to aquatic environment with different hydrometeorological conditions.

    PubMed

    Issakhov, Alibek

    2014-01-01

    This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler with different hydrometeorological conditions is considered, which is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is solved by the fractional steps method. At the second stage, the three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is expected that the transfer is only due to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions. PMID:24991644
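
    The three stages described above follow the usual fractional-step (projection) structure; written out only to make them concrete (a generic scheme, not equations copied from the paper), one time step reads:

        \frac{\mathbf{u}^{*}-\mathbf{u}^{n}}{\Delta t} = -(\mathbf{u}^{n}\cdot\nabla)\mathbf{u}^{n} + \nu\nabla^{2}\mathbf{u}^{n},
        \qquad
        \nabla^{2} p^{\,n+1} = \frac{\rho}{\Delta t}\,\nabla\cdot\mathbf{u}^{*},
        \qquad
        \mathbf{u}^{n+1} = \mathbf{u}^{*} - \frac{\Delta t}{\rho}\,\nabla p^{\,n+1},

    so that the corrected velocity field is divergence-free by construction.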

  15. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    PubMed Central

    Issakhov, Alibek

    2014-01-01

    This paper presents the mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler with different hydrometeorological conditions is considered, which is solved by the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion. The intermediate velocity field is solved by the fractional steps method. At the second stage, the three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, it is expected that the transfer is only due to the pressure gradient. The numerical method determines the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions. PMID:24991644

  16. Modelling Dust Processing and Evolution in Extreme Environments as seen by Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Bocchio, Marco

    2014-09-01

    The main goal of my PhD study is to understand the dust processing that occurs during the mixing between the galactic interstellar medium and the intracluster medium. This process is of particular interest in violent phenomena such as galaxy-galaxy interactions or the ``Ram Pressure Stripping'' due to the infalling of a galaxy towards the cluster centre. Initially, I focus my attention on the problem of dust destruction and heating processes, re-visiting the available models in the literature. I place particular stress on the cases of extreme environments such as a hot coronal-type gas (e.g., IGM, ICM, HIM) and supernova-generated interstellar shocks. Under these conditions small grains are destroyed on short timescales and large grains are heated by the collisions with fast electrons, making the dust spectral energy distribution very different from what is observed in the diffuse ISM. In order to test our models I apply them to the case of an interacting galaxy, NGC 4438. Herschel data of this galaxy indicate the presence of dust with a higher-than-expected temperature. With a multi-wavelength analysis on a pixel-by-pixel basis we show that this hot dust seems to be embedded in a hot ionised gas, therefore undergoing both collisional heating and small-grain destruction. Furthermore, I focus on the long-standing conundrum about the dust destruction and dust formation timescales in the Milky Way. Based on the destruction efficiency in interstellar shocks, previous estimates led to a dust lifetime shorter than the typical timescale for dust formation in AGB stars. Using a recent dust model and an updated dust processing model we re-evaluate the dust lifetime in our Galaxy. Finally, I turn my attention to the phenomenon of ``Ram Pressure Stripping''. The galaxy ESO 137-001 represents one of the best cases to study this effect. Its long H2 tail embedded in a hot and ionised tail raises questions about its possible stripping from the galaxy or formation downstream in the tail. Based on

  17. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  18. The LONI Pipeline Processing Environment.

    PubMed

    Rex, David E; Ma, Jeffrey Q; Toga, Arthur W

    2003-07-01

    The analysis of raw data in neuroimaging has become a computationally entrenched process with many intricate steps run on increasingly larger datasets. Many software packages exist that provide either complete analyses or specific steps in an analysis. These packages often possess diverse input and output requirements, utilize different file formats, run in particular environments, and have limited abilities with certain types of data. The combination of these packages to achieve more sensitive and accurate results has become a common tactic in brain mapping studies but requires much work to ensure valid interoperation between programs. The handling, organization, and storage of intermediate data can prove difficult as well. The LONI Pipeline Processing Environment is a simple, efficient, and distributed computing solution to these problems enabling software inclusion from different laboratories in different environments. It is used here to derive a T1-weighted MRI atlas of the human brain from 452 normal young adult subjects with fully automated processing. The LONI Pipeline Processing Environment's parallel processing efficiency using an integrated client/server dataflow model was 80.9% when running the atlas generation pipeline from a PC client (Acer TravelMate 340T) on 48 dedicated server processors (Silicon Graphics Inc. Origin 3000). The environment was 97.5% efficient when the same analysis was run on eight dedicated processors. PMID:12880830
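
    Assuming the standard definition of parallel efficiency, the quoted figures imply the following speedups (simple arithmetic on the numbers above, not additional data from the paper):

        E_p = \frac{T_1}{p\,T_p} = \frac{S_p}{p}, \qquad
        S_{48} \approx 0.809 \times 48 \approx 38.8, \qquad
        S_{8} \approx 0.975 \times 8 \approx 7.8 .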

  19. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  20. Model-integrated program synthesis environment for parallel/real-time image processing

    NASA Astrophysics Data System (ADS)

    Moore, Michael S.; Sztipanovitz, Janos; Karsai, Gabor; Nichols, James A.

    1997-09-01

    In this paper, it is shown that, through the use of model-integrated program synthesis (MIPS), parallel real-time implementations of image processing data flows can be synthesized from high level graphical specifications. The complex details inherent to parallel and real-time software development become transparent to the programmer, enabling the cost-effective exploitation of parallel hardware for building more flexible and powerful real-time imaging systems. The model-integrated real-time image processing system (MIRTIS) is presented as an example. MIRTIS employs the multigraph architecture (MGA), a framework and set of tools for building MIPS systems, to generate parallel real-time image processing software which runs under the control of a parallel run-time kernel on a network of Texas Instruments TMS320C40 DSPs (C40s). The MIRTIS models contain graphical declarations of the image processing computations to be performed, the available hardware resources, and the timing constraints of the application. The MIRTIS model interpreter performs the parallelization, scaling, and mapping of the computations to the resources automatically or determines that the timing constraints cannot be met with the available resources. MIRTIS is a clear example of how parallel real-time image processing systems can be built which are (1) cost-effectively programmable, (2) flexible, (3) scalable, and (4) built from commercial off-the-shelf (COTS) components.

  1. OpenDA: Open Source Generic Data Assimilation Environment and its Application in Geophysical Process Models

    NASA Astrophysics Data System (ADS)

    Weerts, A.; van Velzen, N.; Verlaan, M.; Sumihar, J.; Hummel, S.; El Serafy, G.; Dhondia, J.; Gerritsen, H.; Vermeer-Ooms, S.; Loots, E.; Markus, A.; Kockx, A.

    2011-12-01

    Data Assimilation techniques are essential elements in state-of-the-art development of models and their use in operational forecasting and real-time control in the fields of groundwater, surface water and soil systems. In meteorological and atmospheric sciences, steady improvements in numerical weather forecasting and climate prediction over the last couple of decades have been enabled to a large degree by the development of community-based models and data assimilation systems. The hydrologic community should learn from the experiences of the meteorological and atmospheric communities by accelerating the transition of hydrologic DA research into operations and developing community-supported, open-source modeling and forecasting systems and data assimilation tools. In 2010, a community-based open-source initiative named OpenDA was started. The OpenDA initiative bears similarities to the well-known OpenMI initiative. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modeling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water flow, rainfall-runoff, unsaturated flow, groundwater flow, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications.
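
    The interfaces themselves are not reproduced in the abstract; as a loose illustration of the "generic interfacing protocol" idea (a time-stepping model exposed through a small state interface, consumed by any assimilation algorithm), a hypothetical Python sketch might look like the following. None of these class or method names are OpenDA's actual API, and the nudging update merely stands in for the real filtering and calibration algorithms.

        from typing import Protocol
        import numpy as np

        class StochModel(Protocol):
            """Hypothetical minimal model interface any time-stepping model could implement."""
            def get_state(self) -> np.ndarray: ...
            def set_state(self, x: np.ndarray) -> None: ...
            def step(self, dt: float) -> None: ...

        def nudge(model: StochModel, obs: np.ndarray, H: np.ndarray,
                  dt: float, n_steps: int, gain: float = 0.5) -> np.ndarray:
            """Toy assimilation loop (simple nudging) written only against the interface."""
            for _ in range(n_steps):
                model.step(dt)
                x = model.get_state()
                x = x + gain * H.T @ (obs - H @ x)   # pull the state toward the observations
                model.set_state(x)
            return model.get_state()

        class RandomWalk:
            """Trivial stand-in model used only to exercise the interface."""
            def __init__(self, x0): self._x = np.array(x0, dtype=float)
            def get_state(self): return self._x.copy()
            def set_state(self, x): self._x = np.array(x, dtype=float)
            def step(self, dt): self._x += np.random.normal(scale=np.sqrt(dt), size=self._x.shape)

        print(nudge(RandomWalk([0.0, 0.0]), obs=np.array([1.0]),
                    H=np.array([[1.0, 0.0]]), dt=1.0, n_steps=10))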

  2. Erosion and sedimentation models in New Zealand: spanning scales, processes and environments

    NASA Astrophysics Data System (ADS)

    Elliott, Sandy; Oehler, Francois; Derose, Ron

    2010-05-01

    Erosion and sedimentation are of keen interest in New Zealand due to pasture loss in hill areas, damage to infrastructure, loss of stream conveyance, and ecological impacts in estuarine and coastal areas. Management of these impacts requires prediction of the rates, locations, and timing of erosion and transport across a range of scales, and prediction of the response to intervention measures. A range of models has been applied in New Zealand to address these requirements, including: empirical models for the location and probability of occurrence of shallow landslides; empirical national-scale sediment load models with spatial and temporal downscaling; dynamic field-scale sheet erosion models upscaled and linked to estuarine deposition models, including assessment of climate change and effects of urbanisation; detailed (20 m) physically-based distributed dynamic catchment models applied at the catchment scale; and provision of GIS-based decision support tools. Despite these advances, considerable work is required to provide the right information at the right scale. Remaining issues include: linking control measures described at the scale of implementation (parts of hillslopes, reaches) to catchment-scale outcomes, which entails fine spatial resolution and large computational demands; the ability to predict some key processes such as bank and head gully erosion; representation of the remobilisation of sediment stores associated with the response to land clearance; the ability to represent episodic or catastrophic erosion processes along with relatively continuous processes such as sheet flow in a single model; and prediction of sediment concentrations and clarity under normal flow conditions. In this presentation we describe a variety of models and their application in New Zealand, summarise the models in terms of scales, complexity and uses, and outline approaches to resolving the remaining difficulties.

  3. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught students but rather experienced and learned by students. This…

  4. Modeled near-field environment porosity modifications due to coupled thermohydrologic and geochemical processes

    SciTech Connect

    Glassley, W. E.; Nitao, J. J.

    1998-10-30

    Heat deposited by waste packages in nuclear waste repositories can modify rock properties by instigating mineral dissolution and precipitation along hydrothermal flow pathways. Modeling this reactive transport requires coupling fluid flow to permeability changes resulting from dissolution and precipitation. Modification of the NUFT thermohydrologic (TH) code package to account for this coupling in a simplified geochemical system has been used to model the time-dependent change in porosity, permeability, matrix and fracture saturation, and temperature in the vicinity of waste-emplacement drifts, using conditions anticipated for the potential Yucca Mountain repository. The results show, within a few hundred years, dramatic porosity reduction approximately 10 m above emplacement drifts. Most of this reduction is attributed to deposition of solute load at the boiling front, although some of it also results from decreasing temperature along the flow path. The actual distribution of the nearly sealed region is sensitive to the time-dependent characteristics of the thermal load imposed on the environment and suggests that the geometry of the sealed region can be engineered by managing the waste-emplacement strategy.

  5. Marine-hydrokinetic energy and the environment: Observations, modeling, and basic processes

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, Efi; Guala, Michele; Sotiropoulos, Fotis

    2012-03-01

    Research at the Interface of Marine Hydrokinetic Energy and the Environment: A Workshop; Minneapolis, Minnesota, 5-7 October 2011 Marine and hydrokinetic (MHK) energy harvesting technologies convert the kinetic energy of waves and water currents into power to generate electricity. Although these technologies are in early stages of development compared to other renewable technologies, such as solar and wind energy, they offer electricity consumers situated near coastlines or inland rivers an alternative energy technology that can help meet renewable portfolio standards. However, the potential environmental impacts of MHK energy are far from well understood, both in general principles and in site-specific cases. As pressure for new MHK energy licenses builds, accelerated research in providing the scientific understanding of harnessing the natural power of water for renewable energy at a competitive cost and without harming the environment becomes a priority.

  6. Model-based processing for shallow ocean environments: The broadband problem

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-01-31

    Most acoustic sources found in the ocean environment are spatially complex and broadband. When propagating in a shallow ocean these source characteristics complicate the analysis of received acoustic data considerably. The enhancement of broadband acoustic pressure-field measurements using a vertical array is discussed. Here a model-based approach is developed for a broadband source using a normal-mode propagation model.

  7. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  8. An Efficient Simulation Environment for Modeling Large-Scale Cortical Processing

    PubMed Central

    Richert, Micah; Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L.

    2011-01-01

    We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current- or conductance-based Izhikevich neuron networks with spike-timing-dependent plasticity and short-term plasticity. It uses a standard network construction interface. The simulator allows for execution on either GPUs or CPUs. The simulator, which is written in C/C++, allows for both fine-grain and coarse-grain specificity of a host of parameters. We demonstrate the ease of use and computational efficiency of this model by implementing a large-scale model of cortical areas V1, V4, and area MT. The complete model, which has 138,240 neurons and approximately 30 million synapses, runs in real-time on an off-the-shelf GPU. The simulator source code, as well as the source code for the cortical model examples, is publicly available. PMID:22007166
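
    For readers unfamiliar with the neuron model named above, a minimal single-neuron Izhikevich update (the standard published equations, not the authors' C/C++/GPU simulator, and with plasticity omitted) can be sketched as:

        def izhikevich(I, T=1000.0, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
            """Integrate one Izhikevich neuron with constant input current I
            (regular-spiking parameters); returns the spike times in ms."""
            v, u = -65.0, b * -65.0
            spikes = []
            for step in range(int(T / dt)):
                v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
                u += dt * a * (b * v - u)
                if v >= 30.0:                 # spike threshold, then reset
                    spikes.append(step * dt)
                    v, u = c, u + d
            return spikes

        print(len(izhikevich(I=10.0)), "spikes in 1 s of simulated time")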

  9. Arthropod model systems for studying complex biological processes in the space environment

    NASA Astrophysics Data System (ADS)

    Marco, Roberto; de Juan, Emilio; Ushakov, Ilya; Hernandorena, Arantxa; Gonzalez-Jurado, Juan; Calleja, Manuel; Manzanares, Miguel; Maroto, Miguel; Garesse, Rafael; Reitz, Günther; Miquel, Jaime

    1994-08-01

    Three arthropod systems are discussed in relation to their complementary and potential use in Space Biology. In a forthcoming biosatellite flight, Drosophila melanogaster pre-adapted during several months to different g levels will be flown in an automatic device that separates parental from first and second generations. In the same flight, flies will be exposed to microgravity conditions in an automatic unit in which fly motility can be recorded. In the International Microgravity Laboratory-2, several groups of Drosophila embryos will be grown in Space and the motility of a male fly population will be video-recorded. In the Biopan, an ESA exobiology facility that can be flown attached to the exterior of a Russian biosatellite, Artemia dormant gastrulae will be exposed to the space environment in the exterior of the satellite under a normal atmosphere or in the void. Gastrulae will be separated into hit and non-hit populations. The developmental and aging response of these animals will be studied upon recovery. With these experiments we will be able to establish whether exposure to the space environment influences arthropod development and aging, and elaborate on some of the cellular mechanisms involved which should be tested in future experiments.

  10. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part type or individual serial numbers. Relationships between manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining and web based client/server architectures are discussed in the context of composite material manufacturing.
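
    As a schematic illustration of the kind of schema described (process-variable and quality-assurance tables related through a lot number), the snippet below uses Python's built-in sqlite3 module; the table and column names are invented for the example and are not from the paper.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE process_runs (
            lot_number    TEXT PRIMARY KEY,
            part_type     TEXT,
            cure_temp_c   REAL,     -- example process variable
            cure_time_min REAL
        );
        CREATE TABLE qa_measurements (
            serial_number TEXT PRIMARY KEY,
            lot_number    TEXT REFERENCES process_runs(lot_number),
            void_content_pct REAL   -- example quality-assurance measurement
        );
        """)
        con.execute("INSERT INTO process_runs VALUES ('L001', 'spar', 177.0, 120.0)")
        con.execute("INSERT INTO qa_measurements VALUES ('S0001', 'L001', 1.2)")

        # Correlate process variables with QA results across lots via a join.
        for row in con.execute("""
            SELECT p.lot_number, p.cure_temp_c, q.void_content_pct
            FROM process_runs p JOIN qa_measurements q USING (lot_number)"""):
            print(row)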

  11. Condensation Processes in Astrophysical Environments

    NASA Technical Reports Server (NTRS)

    Nuth, Joseph A., III; Rietmeijer, Frans J. M.; Hill, Hugh G. M.

    2002-01-01

    Astrophysical systems present an intriguing set of challenges for laboratory chemists. Chemistry occurs in regions considered an excellent vacuum by laboratory standards and at temperatures that would vaporize laboratory equipment. Outflows around Asymptotic Giant Branch (AGB) stars have timescales ranging from seconds to weeks, depending on the distance of the region of interest from the star and on the way significant changes in the state variables are defined. The atmospheres in normal stars may only change significantly on several billion-year timescales. Most laboratory experiments carried out to understand astrophysical processes are not done at conditions that perfectly match the natural suite of state variables or timescales appropriate for natural conditions. Experimenters must make use of simple analog experiments that place limits on the behavior of natural systems, often extrapolating to lower-pressure and/or higher-temperature environments. Nevertheless, we argue that well-conceived experiments will often provide insights into astrophysical processes that are impossible to obtain through models or observations. This is especially true for complex chemical phenomena such as the formation and metamorphism of refractory grains under a range of astrophysical conditions. Data obtained in our laboratory have been surprising in numerous ways, ranging from the composition of the condensates to the thermal evolution of their spectral properties. None of this information could have been predicted from first principles and would not have been credible even if it had.

  12. Modeling microevolution in a changing environment: the evolving quasispecies and the diluted champion process

    NASA Astrophysics Data System (ADS)

    Bianconi, Ginestra; Fichera, Davide; Franz, Silvio; Peliti, Luca

    2011-08-01

    Several pathogens use evolvability as a survival strategy against acquired immunity of the host. Despite their high variability in time, some of them exhibit quite low variability within the population at any given time, a somewhat paradoxical behavior often called the evolving quasispecies. In this paper we introduce a simplified model of an evolving viral population in which the effects of the acquired immunity of the host are represented by the decrease of the fitness of the corresponding viral strains, depending on the frequency of the strain in the viral population. The model exhibits evolving quasispecies behavior in a certain range of its parameters, and suggests how punctuated evolution can be induced by a simple feedback mechanism.

  13. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  14. Geothermal Systems in Yellowstone National Park are Excellent Model Environments for Linking Microbial Processes and Geochemical Cycling

    NASA Astrophysics Data System (ADS)

    Inskeep, W. P.; Jay, Z.

    2008-12-01

    Geothermal systems in Yellowstone National Park (YNP) are geochemically diverse, span pH values from approximately 2 to 10, and generally contain a plethora of reduced constituents that may serve as electron donors for chemotrophic microorganisms. One of our long-term goals has been to determine linkages between geochemical processes and the distribution of microbial populations in high-temperature environments, where geochemical conditions often constrain microbial community diversity. Although geochemical characteristics vary greatly across the world's largest geothermal basin, there exist key geochemical attributes that are likely most important for defining patterns in microbial distribution. For example, excellent model systems exist in YNP, where the predominant geochemical and microbial processes are focused on either S species or Fe oxidation-reduction. In such cases, we hypothesize that genetic diversity and functional gene content will link directly with habitat parameters. Several case studies will be presented where pilot metagenomic data (random shotgun sequencing of environmental DNA) were used to identify key functional attributes and confirm that specific patterns of microbial distribution are indeed reflected in other gene loci besides the 16S rRNA gene. These model systems are excellent candidates for elucidating definitive linkages between S, As, and/or Fe cycling, genomics and microbial regulation.

  15. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)

  16. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various types of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in exploratory and experimental settings was registered as the shift towards more complex stages of problem representations in genetics. The effect of different types of reasoning could be observed as the divergent development of problem representations within hierarchical categories.

  17. Chandra Radiation Environment Modeling

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Blackwell, W. C.

    2003-01-01

    CRMFLX (Chandra Radiation Model of ion FluX) is a radiation environment risk mitigation tool for use as a decision aid in planning the operations times for Chandra's Advanced CCD Imaging Spectrometer (ACIS) detector. The accurate prediction of the proton flux environment with energies of 100-200 keV is needed in order to protect the ACIS detector against proton degradation. Unfortunately, protons of this energy are abundant in the region of space in which Chandra must operate, and on-board particle detectors do not measure proton flux levels of the required energy range. This presentation will describe the plasma environment data analysis and modeling basis of the CRMFLX engineering environment model developed to predict the proton flux in the solar wind, magnetosheath, and magnetosphere phenomenological regions of geospace. The recently released CRMFLX Version 2 implementation includes an algorithm that propagates flux from an observation location to other regions of the magnetosphere based on convective E×B and gradient-B/curvature particle drift motions. This technique has the advantage of more completely filling out the database and makes maximum use of limited data obtained during high Kp periods or in areas of the magnetosphere with poor satellite flux measurement coverage.
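
    For context, the particle drifts referred to above are the standard guiding-centre expressions (textbook forms, not formulas specific to CRMFLX):

        \mathbf{v}_{E\times B} = \frac{\mathbf{E}\times\mathbf{B}}{B^{2}}, \qquad
        \mathbf{v}_{\nabla B} = \frac{m v_{\perp}^{2}}{2qB}\,\frac{\mathbf{B}\times\nabla B}{B^{2}}, \qquad
        \mathbf{v}_{\mathrm{curv}} = \frac{m v_{\parallel}^{2}}{qB^{2}}\,\frac{\mathbf{R}_{c}\times\mathbf{B}}{R_{c}^{2}} .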

  18. LDEF environment modeling updates

    NASA Technical Reports Server (NTRS)

    Gordon, Tim; Rantanen, Ray; Whitaker, Ann F.

    1995-01-01

    An updated gas dynamics model for gas interactions around the LDEF is presented that includes improved scattering algorithms. The primary improvement is more accurate predictions of surface fluxes in the wake region. The code used is the Integrated Spacecraft Environments Model (ISEM). Additionally, initial results of a detailed ISEM prediction model of the Solar Array Passive LDEF Experiment (SAMPLE), A0171, is presented. This model includes details of the A0171 geometry and outgassing characteristics of the many surfaces on the experiment. The detailed model includes the multiple scattering that exists between the ambient atmosphere, LDEF outgassing, and atomic oxygen erosion products. Predictions are made for gas densities, surface fluxes and deposition at three different time periods of the LDEF mission.

  19. Sensitivity of the Community Land Model (CLM4.0) to Key Modeling Parameters and Modeling of Key Physical Processes with Focus on the Arctic Environment

    NASA Astrophysics Data System (ADS)

    Kalinina, E.; Peplinski, W.; Tidwell, V. C.; Hart, D.

    2012-12-01

    The Community Land Model (CLM) simulates major physical processes at the land surface and in the shallow subsurface and calculates the parameters (including energy components) that are then used as the inputs into the atmospheric model. Our major goal was to identify the parameters that have the greatest impacts on these inputs and thus, the greatest potential to impact the climate in the Arctic environment. Another goal was to identify the limitations in representing different physical processes and to determine whether these limitations restrict the ability of CLM to predict the distribution of energy at the land surface. The focus of our analysis was on the vegetation and soil models. We selected a grid cell near Fairbanks, Alaska. This grid cell does not have lakes, glaciers, and wetlands and the major land unit is the vegetated land. The historical data set for the period of 1948 to 2004 from the National Center for Atmospheric Research (NCAR) was used to generate atmospheric forcing data for this analysis. The CLM 4.0 (Community Land Model) was used for land simulations of the selected point grid cell. A range of hydrogeologic and thermal soil properties and vegetation characteristics was defined for the vegetation and soil data. We modified the subsurface drainage parameters to allow for more realistic water table depths (shallow water table) and fluctuations. We also modified the root distribution parameters hard-wired in CLM to represent their potential variability in the sensitivity runs. Multiple CLM sensitivity runs were compared with regard to their effects on the feedbacks to the atmospheric model. The major conclusions of this analysis are: - The vegetation and soil parameters mostly affect the ground heat component of the energy balance, which in this environment is only about 3%. As a result, these parameters have relatively small impacts on the atmospheric inputs. - The most important parameters are the Leaf Area Index (LAI) and soil moisture. The other

  20. Modeling Multiphase Coastal and Hydraulic Processes in an Interactive Python Environment with the Open Source Proteus Toolkit

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Ahmadia, A. J.; Bakhtyar, R.; Miller, C. T.

    2014-12-01

    Hydrology is dominated by multiphase flow processes, due to the importance of capturing water's interaction with soil and air phases. Unfortunately, many different mathematical model formulations are required to model particular processes and scales of interest, and each formulation often requires specialized numerical methods. The Proteus toolkit is a software package for research on models for coastal and hydraulic processes and improvements in numerics, particularly 3D multiphase processes and parallel numerics. The models considered include multiphase flow, shallow water flow, turbulent free surface flow, and various flow-driven processes. We will discuss the objectives of Proteus and recent evolution of the toolkit's design as well as present examples of how it has been used to construct computational models of multiphase flows for the US Army Corps of Engineers. Proteus is also an open source toolkit authored primarily within the US Army Corps of Engineers, and used, developed, and maintained by a small community of researchers in both theoretical modeling and computational methods research. We will discuss how open source and community development practices have played a role in the creation of Proteus.

  1. Contact processes in crowded environments.

    PubMed

    Xu, S-L-Y; Schwarz, J M

    2013-11-01

    Periodically sheared colloids at low densities demonstrate a dynamical phase transition from an inactive to active phase as the strain amplitude is increased. The inactive phase consists of no collisions (contacts) between particles in the steady state limit, while in the active phase collisions persist. To investigate this system at higher densities, we construct and study a conserved-particle-number contact process with three-body interactions, which are potentially more likely than two-body interactions at higher densities. For example, one active (diffusing) particle colliding with two inactive (nondiffusing) particles renders them active, and particles also inactivate spontaneously. In mean field, this system exhibits a continuous dynamical phase transition. Simulations on square lattices also indicate a continuous transition with exponents similar to those measured for the conserved lattice gas (CLG) model. In contrast, the three-body interaction requiring two active particles to activate one inactive particle exhibits a discontinuous transition. Finally, inspired by kinetically constrained models of the glass transition, we investigate the "caging effect" at even higher particle densities to look for a second dynamical phase transition back to an inactive phase. Square lattice simulations suggest a continuous transition with a new set of exponents differing from both the CLG model and what is known as directed percolation, indicating a potentially new universality class for a contact process with a conserved particle number. PMID:24329237
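
    The rules just described (active particles diffuse, an active particle adjacent to two inactive ones activates them, and activity decays spontaneously) can be explored with a crude Monte Carlo sketch; the lattice size, rates, and sequential update scheme below are illustrative choices, not the authors' simulation protocol:

```python
import random

L = 32                      # lattice side (illustrative)
DENSITY = 0.4               # particle density (conserved)
P_INACTIVATE = 0.05         # spontaneous inactivation probability (illustrative)
STEPS = 20000

# state: 0 = empty, 1 = inactive particle, 2 = active particle
lattice = [[0] * L for _ in range(L)]
sites = [(i, j) for i in range(L) for j in range(L)]
n_particles = int(DENSITY * L * L)
for (i, j) in random.sample(sites, n_particles):
    lattice[i][j] = 2       # start everything active

def neighbors(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

for _ in range(STEPS):
    i, j = random.choice(sites)
    if lattice[i][j] != 2:
        continue
    if random.random() < P_INACTIVATE:
        lattice[i][j] = 1                       # spontaneous inactivation
        continue
    ni, nj = random.choice(neighbors(i, j))     # attempted diffusion step
    if lattice[ni][nj] == 0:
        lattice[ni][nj], lattice[i][j] = 2, 0
        i, j = ni, nj
    # three-body rule: an active particle adjacent to >= 2 inactive ones activates two of them
    inactive = [(a, b) for (a, b) in neighbors(i, j) if lattice[a][b] == 1]
    if len(inactive) >= 2:
        for (a, b) in inactive[:2]:
            lattice[a][b] = 2

active = sum(row.count(2) for row in lattice)
print(f"active fraction after {STEPS} updates: {active / n_particles:.3f}")
```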

  2. An Integrated Vehicle Modeling Environment

    NASA Technical Reports Server (NTRS)

    Totah, Joseph J.; Kinney, David J.; Kaneshige, John T.; Agabon, Shane

    1999-01-01

    This paper describes an Integrated Vehicle Modeling Environment for estimating aircraft geometric, inertial, and aerodynamic characteristics, and for interfacing with a high fidelity, workstation-based flight simulation architecture. The goals in developing this environment are to aid in the design of next-generation intelligent flight control technologies, conduct research in advanced vehicle interface concepts for autonomous and semi-autonomous applications, and provide a value-added capability to the conceptual design and aircraft synthesis process. Results are presented for three aircraft by comparing estimates generated by the Integrated Vehicle Modeling Environment with known characteristics of each vehicle under consideration. The three aircraft are a modified F-15 with moveable canards attached to the airframe, a mid-sized, twin-engine commercial transport concept, and a small, single-engine, uninhabited aerial vehicle. Estimated physical properties and dynamic characteristics are correlated with those known for each aircraft over a large portion of the flight envelope of interest. These results represent the completion of a critical step toward meeting the stated goals for developing this modeling environment.

  3. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  4. Generalized Environment for Modeling Systems

    SciTech Connect

    2012-02-07

    GEMS is an integrated environment that allows technical analysts, modelers, researchers, etc. to integrate and deploy models and/or decision tools with associated data to the internet for direct use by customers. GEMS does not require that the model developer know how to code or script and therefore delivers this capability to a large group of technical specialists. Customers gain the benefit of being able to execute their own scenarios directly without need for technical support. GEMS is a process that leverages commercial software products with specialized codes that add connectivity and unique functions to support the overall capability. Users integrate pre-existing models with a commercial product and store parameters and input trajectories in a companion commercial database. The model is then exposed through a commercial web environment and a graphical user interface (GUI) is applied by the model developer. Users execute the model through the web-based GUI and GEMS manages supply of proper inputs, execution of models, routing of data to models and display of results back to users. GEMS works in layers; the following description is from the bottom up. Modelers create models in the modeling tool of their choice such as Excel, Matlab, or Fortran. They can also use models from a library of previously wrapped legacy codes (models). Modelers integrate the models (or a single model) by wrapping and connecting the models using the Phoenix Integration tool entitled ModelCenter. Using a ModelCenter/SAS plugin (DOE copyright CW-10-08) the modeler gets data from either an SAS or SQL database and sends results back to SAS or SQL. Once the model is working properly, the ModelCenter file is saved and stored in a folder location to which a SharePoint server tool created at INL is pointed. This enables the ModelCenter model to be run from SharePoint. The modeler then goes into Microsoft SharePoint and creates a graphical user interface (GUI) using the ModelCenter WebPart (CW-12
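
    GEMS itself is built on ModelCenter, SAS/SQL, and SharePoint, but the layering it describes (wrap an existing model, expose it through a web endpoint, and let users run their own scenarios) can be sketched with nothing beyond the Python standard library. The fuel_cycle_model function below is a hypothetical stand-in for a wrapped legacy model, not part of GEMS:

```python
# Generic illustration of the "wrap a model and expose it to the web" layering,
# using only the Python standard library. Not GEMS itself.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import json

def fuel_cycle_model(enrichment: float, burnup: float) -> dict:
    """Hypothetical pre-existing model; any wrapped legacy code could sit behind this call."""
    return {"uranium_required_t": 200.0 * enrichment / max(burnup, 1.0)}

class ModelEndpoint(BaseHTTPRequestHandler):
    def do_GET(self):
        # Users supply scenario inputs as query parameters and get results as JSON.
        query = parse_qs(urlparse(self.path).query)
        result = fuel_cycle_model(float(query.get("enrichment", ["4.5"])[0]),
                                  float(query.get("burnup", ["45.0"])[0]))
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g. http://localhost:8000/?enrichment=4.5&burnup=50
    HTTPServer(("localhost", 8000), ModelEndpoint).serve_forever()
```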

  5. Generalized Environment for Modeling Systems

    Energy Science and Technology Software Center (ESTSC)

    2012-02-07

    GEMS is an integrated environment that allows technical analysts, modelers, researchers, etc. to integrate and deploy models and/or decision tools with associated data to the internet for direct use by customers. GEMS does not require that the model developer know how to code or script and therefore delivers this capability to a large group of technical specialists. Customers gain the benefit of being able to execute their own scenarios directly without need for technical support. GEMS is a process that leverages commercial software products with specialized codes that add connectivity and unique functions to support the overall capability. Users integrate pre-existing models with a commercial product and store parameters and input trajectories in a companion commercial database. The model is then exposed through a commercial web environment and a graphical user interface (GUI) is applied by the model developer. Users execute the model through the web-based GUI and GEMS manages supply of proper inputs, execution of models, routing of data to models and display of results back to users. GEMS works in layers; the following description is from the bottom up. Modelers create models in the modeling tool of their choice such as Excel, Matlab, or Fortran. They can also use models from a library of previously wrapped legacy codes (models). Modelers integrate the models (or a single model) by wrapping and connecting the models using the Phoenix Integration tool entitled ModelCenter. Using a ModelCenter/SAS plugin (DOE copyright CW-10-08) the modeler gets data from either an SAS or SQL database and sends results back to SAS or SQL. Once the model is working properly, the ModelCenter file is saved and stored in a folder location to which a SharePoint server tool created at INL is pointed. This enables the ModelCenter model to be run from SharePoint. The modeler then goes into Microsoft SharePoint and creates a graphical user interface (GUI) using the ModelCenter Web

  6. Modeling of LDEF contamination environment

    NASA Technical Reports Server (NTRS)

    Carruth, M. Ralph, Jr.; Rantanen, Ray; Gordon, Tim

    1993-01-01

    The Long Duration Exposure Facility (LDEF) satellite was unique in many ways. It was a large structure that was in space for an extended period of time and was stable in orientation relative to the velocity vector. There are obvious and well-documented contamination and space environment effects on the LDEF satellite. In order to examine the interaction of LDEF with its environment and the resulting effect on the satellite, the Integrated Spacecraft Environments Model (ISEM) was used to model the LDEF-induced neutral environment at several different times and altitudes during the mission.

  7. Space environment model construction technology

    NASA Astrophysics Data System (ADS)

    Nishimoto, Hironobu; Matsumoto, Haruhisa

    1992-08-01

    A space environment model was constructed based on the results of the space environment model reviews conducted in Fiscal Years 1986 and 1987. The model was constructed to collect the theories and data required to characterize physical entities such as radiation, plasma, and spacecraft fragments, and to enable quantitative prediction of their temporal and spatial distributions and of their effects such as electrification and material deterioration; its system structure and functions are described. The Technical Data Acquisition Equipment (TEDA) installed onboard the Engineering Test Satellite-5 (ETS-5) consists of various satellite environment monitors and component and material deterioration monitors for the purpose of acquiring technical data required for design and evaluation in satellite development. A review was conducted to clarify the correlations among the TEDA data sets and to apply the results to the construction of the satellite environment model; these correlations were clarified.

  8. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment under development since 1996 at the LASMEA Laboratory, Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, which highlighted the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities for the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is an appearance-based 3D face-tracking algorithm.
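
    Algorithmic skeletons package recurring parallel structures (farms, pipelines, and so on) so that application code supplies only the sequential workers, and nesting means a skeleton's worker may itself be another skeleton. A rough Python illustration of the idea, not SKiPPER's actual API or its dynamic operating model, with toy stand-ins for vision operators:

```python
from multiprocessing import Pool

class Pipeline:
    """PIPE skeleton: each item flows through the stages in order."""
    def __init__(self, *stages):
        self.stages = stages
    def __call__(self, item):
        for stage in self.stages:
            item = stage(item)
        return item

def farm(worker, items, n_workers=4):
    """FARM skeleton: apply 'worker' to each item in parallel."""
    with Pool(n_workers) as pool:
        return pool.map(worker, items)

# Hypothetical image-processing stages standing in for real vision operators.
def denoise(img):
    return [max(0, p - 1) for p in img]

def threshold(img):
    return [1 if p > 100 else 0 for p in img]

if __name__ == "__main__":
    images = [[120, 80, 200], [40, 160, 90]]   # toy "images"
    # Skeleton nesting: a farm whose worker is itself a pipeline skeleton.
    print(farm(Pipeline(denoise, threshold), images))   # [[1, 0, 1], [0, 1, 0]]
```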

  9. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes. PMID:23039255
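
    The heart of a group-contribution estimate is a sum of regressed contributions over the functional groups present in a molecule, with the parameter errors propagated into a confidence interval on the prediction. A toy sketch of that calculation (the group values, standard errors, and intercept below are invented for illustration and are not the published GC+ parameters):

```python
import math

# Hypothetical first-order group contributions for some environment-related
# property, together with made-up standard errors; NOT the published GC+ values.
GROUPS = {
    "CH3": (0.52, 0.04),
    "CH2": (0.31, 0.03),
    "OH":  (-0.88, 0.07),
}

def estimate(counts, intercept=1.20):
    """Return (property estimate, approximate 95% confidence half-width)."""
    value = intercept
    var = 0.0
    for group, n in counts.items():
        contrib, se = GROUPS[group]
        value += n * contrib
        var += (n * se) ** 2          # assume independent group-parameter errors
    return value, 1.96 * math.sqrt(var)

# 1-propanol: CH3-CH2-CH2-OH -> one CH3, two CH2, one OH
est, ci = estimate({"CH3": 1, "CH2": 2, "OH": 1})
print(f"estimate = {est:.2f} +/- {ci:.2f}")
```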

  10. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  11. Slot Region Radiation Environment Models

    NASA Astrophysics Data System (ADS)

    Sandberg, Ingmar; Daglis, Ioannis; Heynderickx, Daniel; Evans, Hugh; Nieminen, Petteri

    2013-04-01

    Herein we present the main characteristics and first results of the Slot Region Radiation Environment Models (SRREMs) project. The statistical models developed in SRREMs aim to address the variability of trapped electron and proton fluxes in the region between the inner and the outer electron radiation belt. The energetic charged particle fluxes in the slot region are highly dynamic and are known to vary by several orders of magnitude on both short and long timescales. During quiet times, the particle fluxes are much lower than those found at the peak of the inner and outer belts and the region is considered benign. During geospace magnetic storms, though, this region can fill with energetic particles as the peak of the outer belt is pushed Earthwards and the fluxes can increase drastically. There has been a renewed interest in the potential operation of commercial satellites in orbits that are at least partially contained within the Slot Region. Hence, there is a need to improve the current radiation belt models, most of which do not model the extreme variability of the slot region and instead provide long-term averages between the better-known low and medium Earth orbits (LEO and MEO). The statistical models developed in the SRREMs project are based on the analysis of a large volume of available data and on the construction of a virtual database of slot region particle fluxes. The analysis that we have followed retains the long-term temporal, spatial and spectral variations in electron and proton fluxes as well as the short-term enhancement events at altitudes and inclinations relevant for satellites in the slot region. A large number of datasets have been used for the construction, evaluation and inter-calibration of the SRREMs virtual dataset. Special emphasis has been given to the use and analysis of ESA Standard Radiation Environment Monitor (SREM) data from the units on-board PROBA-1, INTEGRAL, and GIOVE-B due to the sufficient spatial and long temporal

  12. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall.

    PubMed

    Bridge, Jack C; Aylott, Jonathan W; Brightling, Christopher E; Ghaemmaghami, Amir M; Knox, Alan J; Lewis, Mark P; Rose, Felicity R A J; Morris, Gavin E

    2015-01-01

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments. PMID:26275100

  13. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall

    PubMed Central

    Bridge, Jack C.; Aylott, Jonathan W.; Brightling, Christopher E.; Ghaemmaghami, Amir M.; Knox, Alan J.; Lewis, Mark P.; Rose, Felicity R.A.J.; Morris, Gavin E.

    2015-01-01

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments. PMID:26275100

  14. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    ERIC Educational Resources Information Center

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  15. Process planning under job shop environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiankui; Li, Zhizhong; Liu, Chengying; Tian, Wensheng

    1995-08-01

    There is a lack of information flow from the job shop environment to the CAPP (computer aided process planning) system, which prevents CAPP systems from being more practical. THCAPP-G (CAPP for garment, developed by Tsinghua University in 1994) is a two-stage, nonlinear, closed-loop and dynamic process planning system. It generates process plans according to the dynamic status of the job shop, taking advantage of the flexibility of both the manufacturing process and the job shop environment. Techniques of ES (expert system), CPA (critical path analysis), and heuristic methods were used in combination. An information flow between the job shop environment and the CAPP system was realized in this system. This paper mainly discusses the developed ideas and the system structure.
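
    Of the techniques named above, critical path analysis reduces to a longest-path computation over the operation precedence graph. A minimal sketch with hypothetical operations and durations, not THCAPP-G's actual planning data:

```python
# Critical path analysis over a small, hypothetical operation precedence graph.
# Durations in minutes; predecessors encode technological ordering constraints.
durations = {"rough_mill": 30, "drill": 15, "finish_mill": 25, "inspect": 10}
preds = {"rough_mill": [], "drill": ["rough_mill"],
         "finish_mill": ["rough_mill"], "inspect": ["drill", "finish_mill"]}

earliest_finish = {}

def finish(op):
    """Earliest finish time of 'op' (memoized longest path from the start)."""
    if op not in earliest_finish:
        start = max((finish(p) for p in preds[op]), default=0)
        earliest_finish[op] = start + durations[op]
    return earliest_finish[op]

makespan = max(finish(op) for op in durations)
print("minimum makespan:", makespan)   # 65: rough_mill -> finish_mill -> inspect
```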

  16. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

    Fagg, G.E.; Moore, K.; Dongarra, J.J. |; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  17. Modeling the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2006-01-01

    There has been a renaissance of interest in space radiation environment modeling. This has been fueled by the growing need to replace the longtime-standard AP-8 and AE-8 trapped particle models, the interplanetary exploration initiative, the modern satellite instrumentation that has led to unprecedented measurement accuracy, and the pervasive use of Commercial Off-The-Shelf (COTS) microelectronics that require more accurate predictive capabilities. The objective of this viewgraph presentation was to provide a basic understanding of the components of the space radiation environment and their variations, review traditional radiation effects application models, and present recent developments.

  18. A Process and Environment Aware Sierra/SolidMechanics Cohesive Zone Modeling Capability for Polymer/Solid Interfaces

    SciTech Connect

    Reedy, E. D.; Chambers, Robert S.; Hughes, Lindsey Gloe; Kropka, Jamie Michael; Stavig, Mark E.; Stevens, Mark J.

    2015-09-01

    The performance and reliability of many mechanical and electrical components depend on the integrity of polymer-to-solid interfaces. Such interfaces are found in adhesively bonded joints, encapsulated or underfilled electronic modules, protective coatings, and laminates. The work described herein was aimed at improving Sandia's finite element-based capability to predict interfacial crack growth by 1) using a high fidelity nonlinear viscoelastic material model for the adhesive in fracture simulations, and 2) developing and implementing a novel cohesive zone fracture model that generates a mode-mixity dependent toughness as a natural consequence of its formulation (i.e., generates the observed increase in interfacial toughness with increasing crack-tip interfacial shear). Furthermore, molecular dynamics simulations were used to study fundamental material/interfacial physics so as to develop a fuller understanding of the connection between molecular structure and failure. Also reported are test results that quantify how joint strength and interfacial toughness vary with temperature.
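
    A cohesive zone model ties the interfacial traction to the opening displacement, with the area under the traction-separation curve equal to the interfacial toughness. The bilinear form below is a generic textbook shape with made-up parameters, not Sandia's mode-mixity-dependent formulation:

```python
def bilinear_traction(delta, delta_peak=0.001, delta_fail=0.01, t_max=50.0):
    """Bilinear cohesive traction (MPa) vs. opening displacement (mm); illustrative values."""
    if delta <= 0.0:
        return 0.0
    if delta <= delta_peak:                     # linear hardening branch
        return t_max * delta / delta_peak
    if delta <= delta_fail:                     # linear softening branch
        return t_max * (delta_fail - delta) / (delta_fail - delta_peak)
    return 0.0                                  # fully debonded

# Work of separation = area under the curve = 0.5 * t_max * delta_fail
toughness = 0.5 * 50.0 * 0.01                   # 0.25 N/mm (illustrative)
print(toughness, bilinear_traction(0.0005), bilinear_traction(0.005))
```

    A mode-mixity-dependent version would make t_max and delta_fail functions of the local ratio of shear to normal opening, so that the integrated toughness rises with crack-tip shear.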

  19. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Allen, Christopher; Chu, S. Reynold

    2008-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements and thus provide a safe and habitable acoustic environment for the crews, and to validate developed models via building physical mockups and conducting acoustic measurements.

  20. Modeling the growth and constraints of thermophiles and biogeochemical processes in deep-sea hydrothermal environments (Invited)

    NASA Astrophysics Data System (ADS)

    Holden, J. F.; Ver Eecke, H. C.; Lin, T. J.; Butterfield, D. A.; Olson, E. J.; Jamieson, J.; Knutson, J. K.; Dyar, M. D.

    2010-12-01

    and contain an abundance of Fe(III) oxide and sulfate minerals, especially on surfaces of pore spaces. Hyperthermophilic iron reducers attach to iron oxide particles via cell wall invaginations and pili and reduce the iron through direct contact. The iron is reduced to magnetite, possibly with a maghemite intermediate. Thus iron reducers could outcompete methanogens in low H2, mildly reducing habitats such as Endeavour. Unlike strain JH146, respiration rates per cell were highest near the optimal growth temperature for the iron reducer Hyperthermus strain Ro04 and decreased near the temperature limits for growth. This study highlights the need to model microbe-metal interactions and improve respiration estimates from pure cultures to refine our in situ bioenergetic and habitat models.

  1. Electronic materials processing and the microgravity environment

    NASA Technical Reports Server (NTRS)

    Witt, A. F.

    1988-01-01

    The nature and origin of deficiencies in bulk electronic materials for device fabrication are analyzed. It is found that gravity generated perturbations during their formation account largely for the introduction of critical chemical and crystalline defects and, moreover, are responsible for the still existing gap between theory and experiment and thus for excessive reliance on proprietary empiricism in processing technology. Exploration of the potential of reduced gravity environment for electronic materials processing is found to be not only desirable but mandatory.

  2. Teaching Process Writing in an Online Environment

    ERIC Educational Resources Information Center

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  3. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
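
    The point of a propagation-time model is that the cabin's free volume buys time between an ECLSS failure and a hazardous atmosphere. A back-of-the-envelope stand-in for that idea, here for CO2 buildup after a total loss of scrubbing, with every number a hypothetical placeholder rather than a CEPR input:

```python
# Time for cabin CO2 to rise from a nominal to a hazardous partial pressure after
# total loss of CO2 removal; a toy stand-in for the CEPR propagation-time concept.
cabin_volume_m3 = 10.0              # hypothetical free cabin volume
crew = 4
co2_per_person_m3_per_hr = 0.021    # rough metabolic CO2 production at rest
p_cabin_kpa = 101.3
ppco2_start_kpa = 0.4               # nominal partial pressure
ppco2_hazard_kpa = 2.0              # assumed hazard threshold for this sketch

# Volume of CO2 (at cabin pressure) needed to raise the partial pressure:
delta_v = cabin_volume_m3 * (ppco2_hazard_kpa - ppco2_start_kpa) / p_cabin_kpa
hours_to_hazard = delta_v / (crew * co2_per_person_m3_per_hr)
print(f"time to hazardous ppCO2: {hours_to_hazard:.1f} h")   # ~1.9 h for these inputs
```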

  4. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints imposed on a production process by the lunar or Martian environment and by the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.
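
    The oxygen route mentioned above is most often written as hydrogen reduction of ilmenite followed by electrolysis of the product water; in schematic, textbook form (not the specific flowsheet of this paper):

```latex
\mathrm{FeTiO_3} + \mathrm{H_2} \;\xrightarrow{\ \sim 900\,^{\circ}\mathrm{C}\ }\;
\mathrm{Fe} + \mathrm{TiO_2} + \mathrm{H_2O},
\qquad
2\,\mathrm{H_2O} \;\xrightarrow{\ \text{electrolysis}\ }\; 2\,\mathrm{H_2} + \mathrm{O_2}
```

    The electrolysis step recovers the oxygen product and recycles the hydrogen back to the reduction step.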

  5. The process-based stand growth model Formix 3-Q applied in a GIS environment for growth and yield analysis in a tropical rain forest.

    PubMed

    Ditzer, T.; Glauner, R.; Förster, M.; Köhler, P.; Huth, A.

    2000-03-01

    Managing tropical rain forests is difficult because few long-term field data on forest growth and the impact of harvesting disturbance are available. Growth models may provide a valuable tool for managers of tropical forests, particularly if applied to the extended forest areas of up to 100,000 ha that typically constitute the so-called forest management units (FMUs). We used a stand growth model in a geographic information system (GIS) environment to simulate tropical rain forest growth at the FMU level. We applied the process-based rain forest growth model Formix 3-Q to the 55,000 ha Deramakot Forest Reserve (DFR) in Sabah, Malaysia. The FMU was considered to be composed of single and independent small-scale stands differing in site conditions and forest structure. Field data, which were analyzed with a GIS, comprised a terrestrial forest inventory, site and soil analyses (water, nutrients, slope), the interpretation of aerial photographs of the present vegetation and topographic maps. Different stand types were determined based on a classification of site quality (three classes), slopes (four classes), and present forest structure (four strata). The effects of site quality on tree allometry (height-diameter curve, biomass allometry, leaf area) and growth (increment size) are incorporated into Formix 3-Q. We derived allometric relations and growth factors for different site conditions from the field data. Climax forest structure at the stand level was shown to depend strongly on site conditions. Simulated successional pattern and climax structure were compared with field observations. Based on the current management plan for the DFR, harvesting scenarios were simulated for stands on different sites. The effects of harvesting guidelines on forest structure and the implications for sustainable forest management at Deramakot were analyzed. Based on the stand types and GIS analysis, we also simulated undisturbed regeneration of the logged-over forest in the DFR at

  6. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons. PMID:21064164
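
    The Hodgkin-Huxley-type models mentioned above all reduce to a current-balance equation on the membrane capacitance, with voltage-dependent gating variables controlling the sodium and potassium conductances; in its standard textbook form (included here only as a reminder):

```latex
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^{3} h\,(V - E_{\mathrm{Na}})
                    - \bar{g}_{\mathrm{K}}\, n^{4}\,(V - E_{\mathrm{K}})
                    - \bar{g}_{L}\,(V - E_{L}) + I_{\mathrm{ext}},
\qquad
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x, \quad x \in \{m, h, n\}
```

    Compartmental and network models couple many such equations together across a neuron's morphology and across synaptically connected cells.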

  7. Microbial processes in fractured rock environments

    NASA Astrophysics Data System (ADS)

    Kinner, Nancy E.; Eighmy, T. Taylor; Mills, M.; Coulburn, J.; Tisa, L.

    Little is known about the types and activities of microbes in fractured rock environments, but recent studies in a variety of bedrock formations have documented the presence of a diverse array of prokaryotes (Eubacteria and Archaea) and some protists. The prokaryotes appear to live in both diffusion-dominated microfractures and larger, more conductive open fractures. Some of the prokaryotes are associated with the surfaces of the host rock and mineral precipitates, while other planktonic forms are floating/moving in the groundwater filling the fractures. Studies indicate that the surface-associated and planktonic communities are distinct, and their importance in microbially mediated processes occurring in the bedrock environment may vary, depending on the availability of electron donors/acceptors and nutrients needed by the cells. In general, abundances of microbes are low compared with other environments, because of the paucity of these substances that are transported into the deeper subsurface where most bedrock occurs, unless there is significant pollution with an electron donor. To obtain a complete picture of the microbes present and their metabolic activity, it is usually necessary to sample formation water from specific fractures (versus open boreholes), and fracture surfaces (i.e., cores). Transport of the microbes through the major fracture pathways can be rapid, but may be quite limited in the microfractures. Very low abundances of small (2-3 μm) flagellated protists, which appear to prey upon planktonic bacteria, have been found in a bedrock aquifer. Much more research is needed to expand the understanding of all microbial processes in fractured rock environments.

  8. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  9. Space environment and lunar surface processes, 2

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1982-01-01

    The top few millimeters of a surface exposed to space represents a physically and chemically active zone with properties different from those of a surface in the environment of a planetary atmosphere. To meet the need for a quantitative synthesis of the various processes contributing to the evolution of surfaces of the Moon, Mercury, the asteroids, and similar bodies (exposure to solar wind, solar flare particles, galactic cosmic rays, heating from solar radiation, and meteoroid bombardment), the MESS 2 computer program was developed. This program differs from earlier work in that the surface processes are broken down as a function of size scale and treated in three dimensions with good resolution on each scale. The results obtained apply to the development of soil near the surface and are based on lunar conditions. Parameters can be adjusted to describe asteroid regoliths and other space-related bodies.

  10. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, SShao-sheng R.; Allen, Christopher S.

    2009-01-01

    carried out by acquiring octave band microphone data simultaneously at ten fixed locations throughout the mockup. SPLs (Sound Pressure Levels) predicted by our SEA model match well with measurements for our CM mockup, with a more complicated shape. Additionally in FY09, background NC noise (Noise Criterion) simulation and MRT (Modified Rhyme Test) were developed and performed in the mockup to determine the maximum noise level in CM habitable volume for fair crew voice communications. Numerous demonstrations of simulated noise environment in the mockup and associated SIL (Speech Interference Level) via MRT were performed for various communities, including members from NASA and Orion prime-/sub-contractors. Also, a new HSIR (Human-Systems Integration Requirement) for limiting pre- and post-landing SIL was proposed.

  11. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

  12. Gene-Environment Interplay in Twin Models.

    PubMed

    Verhulst, Brad; Hatemi, Peter K

    2013-07-01

    In this article, we respond to Shultziner's critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism's mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718
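
    For readers unfamiliar with the classical twin design invoked above: the additive genetic (a2), shared environment (c2), and unique environment (e2) components follow directly from the MZ and DZ twin correlations. The standard decomposition (a textbook identity, not a result of this article) is:

```latex
r_{MZ} = a^{2} + c^{2}, \qquad r_{DZ} = \tfrac{1}{2}\,a^{2} + c^{2}
\;\;\Longrightarrow\;\;
a^{2} = 2\,(r_{MZ} - r_{DZ}), \qquad c^{2} = 2\,r_{DZ} - r_{MZ}, \qquad e^{2} = 1 - r_{MZ}
```

    Extensions of the CTD relax these baseline assumptions and add explicit gene-environment correlation and interaction terms, which is how the interplay discussed here is estimated in practice.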

  13. Gene-Environment Interplay in Twin Models

    PubMed Central

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718

  14. Near-field environment/processes working group summary

    SciTech Connect

    Murphy, W.M.

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.

  15. MODELING TREE LEVEL PROCESSES

    EPA Science Inventory

    An overview of three main types of simulation approach (explanatory, abstraction, and estimation) is presented, along with a discussion of their capabilities, limitations, and the steps required for their validation. A process model being developed through the Forest Response Prog...

  16. Modeling of Flow and Water Quality Processes with Finite Volume Method due to Spreading and Dispersion of Petrochemical Pollution in the Hydro-Environments

    NASA Astrophysics Data System (ADS)

    Sarhadi Zadeh, Ehsan; Hejazi, Kourosh

    2009-11-01

    With two marine frontiers, the Persian Gulf and the Oman Sea in the south and the Caspian Sea in the north, an intense dependence on extracting and exporting oil, especially via marine fleets, and an ever-increasing petrochemical industry, Iran is exposed to severe environmental damage from the oil and petrochemical industries. This paper investigates how an oil spill diffuses and how its environmental pollution spreads. The movement of an oil spill, its diffusion in water, and its effects on water and the environment have been simulated by developing a depth-averaged numerical model using the finite volume method. The existing models are not efficient enough to fulfill current modeling needs. The developed model uses the parameters governing the advection and diffusion of oil pollution in a form appropriate for predicting the transport of an oil spill. Since the Navier-Stokes equations play an important role in the advection and diffusion of oil pollution, it is highly important to choose an appropriate numerical method for the advection and diffusion terms. Particular emphasis has therefore been placed on the choice of these methods, and highly accurate algorithms, not present in similar models, have been used for the advection terms. The resulting equations have been solved using the ADI (alternating direction implicit) method, which obtains the unknowns by solving a penta-diagonal matrix system at each time step without sacrificing the desired precision.
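
    In its simplest depth-averaged form the transport part of such a model is an advection-diffusion equation for the oil concentration. A minimal explicit finite-volume sketch in one dimension, with uniform velocity and diffusivity and a simple upwind flux (illustrative only, and far simpler than the depth-averaged ADI solver described above):

```python
# 1D explicit finite-volume advection-diffusion of a depth-averaged concentration.
# Uniform velocity/diffusivity and an upwind advective flux; illustrative only.
nx, dx, dt = 100, 10.0, 1.0          # cells, cell size (m), time step (s)
u, D = 0.5, 5.0                      # velocity (m/s), diffusivity (m^2/s)

c = [0.0] * nx
c[10] = 100.0                        # initial oil "spill" in one cell

for _ in range(200):
    new = c[:]
    for i in range(1, nx - 1):
        adv = u * (c[i] - c[i - 1]) / dx                    # upwind advection (u > 0)
        dif = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2  # central diffusion
        new[i] = c[i] + dt * (dif - adv)
    c = new

peak = max(range(nx), key=lambda i: c[i])
print(f"plume peak moved to cell {peak}, value {c[peak]:.2f}")  # ~cell 20 after 200 s
```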

  17. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  18. Processing Conditions, Rice Properties, Health and Environment

    PubMed Central

    Roy, Poritosh; Orikasa, Takahiro; Okadome, Hiroshi; Nakamura, Nobutaka; Shiina, Takeo

    2011-01-01

    Rice is the staple food for nearly two-thirds of the world’s population. The food components and environmental load of rice depend on the rice form, which results from different processing conditions. Brown rice (BR), germinated brown rice (GBR) and partially-milled rice (PMR) contain more health-beneficial food components than well-milled rice (WMR). Although the arsenic concentration in cooked rice depends on the cooking method, parboiled rice (PBR) seems to be relatively prone to arsenic contamination compared with untreated rice if contaminated water is used for parboiling and cooking. A change in consumption patterns from PBR to untreated (non-parboiled) rice, and from WMR to PMR or BR, may conserve about 43–54 million tons of rice and reduce the risk of arsenic contamination in arsenic-prone areas. This study also reveals that a change in rice consumption patterns would not only supply more food components but also reduce environmental loads. A switch in production and consumption patterns would improve food security where food grains are scarce, provide more health-beneficial food components, possibly prevent some diseases, and ease the burden on the Earth. However, motivation and awareness of the environment and health, and even a nominal incentive, may be required for such a switch, which may help in building a sustainable society. PMID:21776212

  19. Dynamic Radiation Environment Assimilation Model: DREAM

    NASA Astrophysics Data System (ADS)

    Reeves, G. D.; Chen, Y.; Cunningham, G. S.; Friedel, R. W. H.; Henderson, M. G.; Jordanova, V. K.; Koller, J.; Morley, S. K.; Thomsen, M. F.; Zaharia, S.

    2012-03-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) was developed to provide accurate, global specification of the Earth's radiation belts and to better understand the physical processes that control radiation belt structure and dynamics. DREAM is designed using a modular software approach in order to provide a computational framework that makes it easy to change components such as the global magnetic field model, radiation belt dynamics model, boundary conditions, etc. This paper provides a broad overview of the DREAM model and a summary of some of the principal results to date. We describe the structure of the DREAM model, describe the five major components, and illustrate the various options that are available for each component. We discuss how the data assimilation is performed and the data preprocessing and postprocessing that are required for producing the final DREAM outputs. We describe how we apply global magnetic field models for conversion between flux and phase space density and, in particular, the benefits of using a self-consistent, coupled ring current-magnetic field model. We discuss some of the results from DREAM including testing of boundary condition assumptions and effects of adding a source term to radial diffusion models. We also describe some of the testing and validation of DREAM and prospects for future development.
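
    At its core, an assimilation step blends a model forecast of phase space density with an observation according to their respective uncertainties. The scalar update below shows that blending in its simplest Kalman-style form; it is an illustration of the concept, not DREAM's actual assimilation machinery:

```python
def assimilate(forecast, f_var, obs, o_var):
    """Variance-weighted blend of a model forecast with an observation
    (the scalar Kalman update); returns the analysis value and its variance."""
    gain = f_var / (f_var + o_var)
    analysis = forecast + gain * (obs - forecast)
    return analysis, (1.0 - gain) * f_var

# Hypothetical phase space density values in arbitrary units.
psd_analysis, psd_var = assimilate(forecast=2.0e-7, f_var=1.0e-14,
                                   obs=3.5e-7, o_var=4.0e-15)
print(psd_analysis, psd_var)   # analysis is pulled toward the more certain observation
```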

  20. Modeling robot contour processes

    NASA Astrophysics Data System (ADS)

    Whitney, D. E.; Edsall, A. C.

    Robot contour processes include those with contact force like car body grinding or deburring of complex castings, as well as those with little or no contact force like inspection. This paper describes ways of characterizing, identifying, and estimating contours and robot trajectories. Contour and robot are modeled as stochastic processes in order to emphasize that both successive robot cycles and successive industrial workpieces are similar but not exactly the same. The stochastic models can be used to identify the state of a workpiece or process, or to design a filter to estimate workpiece shape and robot position from robot-based measurements.
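
    One simple way to capture the idea that successive workpieces are "similar but not exactly the same" is to model each measured contour as a nominal profile plus a correlated random deviation; the AR(1) deviation model and flat nominal profile below are purely illustrative, not the paper's formulation:

```python
import random

def contour_realization(nominal, rho=0.9, sigma=0.05):
    """One workpiece: the nominal profile plus an AR(1) deviation process."""
    dev, profile = 0.0, []
    for z in nominal:
        dev = rho * dev + random.gauss(0.0, sigma)
        profile.append(z + dev)
    return profile

nominal = [1.0] * 50                       # flat nominal profile (illustrative)
parts = [contour_realization(nominal) for _ in range(20)]

# A crude estimator: the pointwise mean over measured parts approximates the
# nominal contour, and the pointwise spread characterizes the process variation.
mean_profile = [sum(p[i] for p in parts) / len(parts) for i in range(len(nominal))]
print(max(abs(m - 1.0) for m in mean_profile))
```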

  1. Model for a Healthy Work Environment.

    PubMed

    Blevins, Jamie

    2016-01-01

    The Healthy Work Environment (HWE) Model, considered a model of standards of professional behaviors, was created to help foster an environment that is happy, healthy, realistic, and feasible. The model focuses on areas of PEOPLE and PRACTICE, where each letter of these words identifies core, professional qualities and behaviors to foster an environment amenable and conducive to accountability for one's behavior and action. Each of these characteristics is supported from a Christian, biblical perspective. The HWE Model provides a mental and physical checklist of what is important in creating and sustaining a healthy work environment in education and practice. PMID:27610916

  2. The bisexual branching process with population-size dependent mating as a mathematical model to describe phenomena concerning to inhabit or re-inhabit environments with animal species.

    PubMed

    Mota, M; del Puerto, I; Ramos, A

    2007-03-01

    We consider the bisexual Galton-Watson branching process with population-size dependent mating as a mathematical model adequate for the description of some natural phenomena. More specifically, we are interested in studying some questions about the problem of populating an environment with new animal species or re-populating it with species which have previously disappeared. PMID:16197966
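
    Such a process is easy to simulate directly: each generation, a mating function that depends on the current numbers of females and males determines how many mating units form, and each unit produces a random number of offspring of random sex. The mating rule, offspring distribution, and parameter values below are illustrative choices, not those analyzed in the paper:

```python
import math
import random

def poisson(lam):
    """Poisson sample via Knuth's method (fine for small lambda)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def mating_units(females, males, carrying_capacity=50):
    """Population-size dependent mating: monogamous pairs, capped by an
    illustrative carrying capacity."""
    return min(females, males, carrying_capacity)

def next_generation(units, mean_offspring=2.4, p_female=0.5):
    """Each mating unit produces a Poisson number of offspring, sexed at random."""
    females = males = 0
    for _ in range(units):
        for _ in range(poisson(mean_offspring)):
            if random.random() < p_female:
                females += 1
            else:
                males += 1
    return females, males

# Re-population scenario: a handful of animals introduced into an empty habitat.
females, males = 3, 2
for gen in range(1, 11):
    units = mating_units(females, males)
    if units == 0:
        print(f"generation {gen}: extinct")
        break
    females, males = next_generation(units)
    print(f"generation {gen}: {females} females, {males} males")
```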

  3. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  4. Students' mental models of the environment

    NASA Astrophysics Data System (ADS)

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-02-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively analyzed in order to identify students' mental models of the environment. The second phase of analysis involved the statistical testing of the identified mental models. From this analysis four mental models emerged: Model 1, the environment as a place where animals/plants live - a natural place; Model 2, the environment as a place that supports life; Model 3, the environment as a place impacted or modified by human activity; and Model 4, the environment as a place where animals, plants, and humans live. The dominant mental model was Mental Model 1. Yet, a greater frequency of urban students than suburban and rural students held Mental Model 3. The implications to environmental science education are explored.

  5. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering process for silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive, graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process. PMID:15323110
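
    The plane-wave treatment described above amounts to applying, at each material interface, a reflection coefficient set by the mismatch in complex refractive index n = n' - iκ, and within each layer an exponential attenuation of the transmitted power. Schematically (the standard normal-incidence plane-wave relations, not the authors' specific furnace model):

```latex
R = \left|\frac{n_1 - n_2}{n_1 + n_2}\right|^{2},
\qquad
P(x) = P_0\,(1 - R)\, e^{-2\alpha x},
\qquad
\alpha = \frac{2\pi f}{c}\,\kappa
```

    where f is the microwave frequency, κ the extinction coefficient of the layer, and x the depth into the layer; the absorbed power per unit volume then drives the coupled heat transfer calculation.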

  6. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  7. Students' Mental Models of the Environment

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-01-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively…

  8. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time-scale up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment; the most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. As compared with the full model with about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
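
    The reduction result above (a few dozen reactions sufficing to pin down [H2O2]) can be appreciated even with a toy kinetics system in which a constant radiolytic production term is balanced against first-order decomposition and a second-order recombination sink. The species set and rate constants below are hypothetical placeholders, not the report's reaction network:

```python
from scipy.integrate import solve_ivp

# Toy radiolysis kinetics: y = [H2O2, H2] in mol/L. Rate constants are
# hypothetical placeholders, not values from the full ~100-reaction model.
G_PROD = 1.0e-9      # radiolytic production rate, mol/(L*s)
K_DECOMP = 2.0e-4    # first-order H2O2 loss, 1/s
K_RECOMB = 5.0e2     # H2O2 + H2 recombination, L/(mol*s)

def rhs(t, y):
    h2o2, h2 = y
    recomb = K_RECOMB * h2o2 * h2
    return [G_PROD - K_DECOMP * h2o2 - recomb,    # d[H2O2]/dt
            G_PROD - recomb]                      # d[H2]/dt

sol = solve_ivp(rhs, (0.0, 1.0e5), [0.0, 0.0], method="LSODA")
print(f"[H2O2] after {sol.t[-1]:.0e} s: {sol.y[0, -1]:.3e} mol/L")
```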

  9. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, S. Reynold; Allen, Chris

    2009-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.

  10. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  11. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  12. Listeria monocytogenes in Irish Farmhouse cheese processing environments.

    PubMed

    Fox, Edward; Hunt, Karen; O'Brien, Martina; Jordan, Kieran

    2011-03-01

    Sixteen cheesemaking facilities were sampled during the production season at monthly intervals over a two-year period. Thirteen facilities were found to have samples positive for Listeria monocytogenes. Samples were divided into 4 categories: cheese, raw milk, processing environment, and external to the processing environment (samples from the farm such as silage, bedding, and pooled water). To identify the source, persistence, and putative transfer routes of L. monocytogenes contamination, the isolates were differentiated using PFGE and serotyping. Of the 250 isolates, there were 52 different pulsotypes. No pulsotype was found at more than one facility. Two facilities had persistent pulsotypes that were isolated on sampling occasions at least 6 months apart. Of the samples tested, 6.3% of milk samples, 13.1% of processing environment samples, and 12.3% of samples external to the processing environment were positive for L. monocytogenes. Pulsotypes found in raw milk were also found in the processing environment; however, one of the pulsotypes from raw milk was found in cheese on only one occasion. One of the pulsotypes isolated from the environment external to the processing facility was found on the surface of cheese, and a number of them were found in the processing environment. The results suggest that the farm environment external to the processing environment may in some cases be the source of processing environment contamination with L. monocytogenes. PMID:21087802

  13. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  14. Combustion Processes in the Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Huggett, Clayton

    1969-01-01

    The aerospace environment introduces new and enhanced fire hazards because the special atmosphere employed may increase the frequency and intensity of fires, because the confinement associated with aerospace systems adversely affects the dynamics of fire development and control, and because the hostile external environments limit fire control and rescue operations. Oxygen-enriched atmospheres contribute to the fire hazard in aerospace systems by extending the list of combustible fuels, increasing the probability of ignition, and increasing the rates of fire spread and energy release. A system for classifying atmospheres according to the degree of fire hazard, based on the heat capacity of the atmosphere per mole of oxygen, is suggested. A brief exploration of the dynamics of chamber fires shows that such fires will exhibit an exponential growth rate and may grow to dangerous size in a very short time. Relatively small quantities of fuel and oxygen can produce a catastrophic fire in a closed chamber.
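
    The two quantitative ideas above lend themselves to a short numerical sketch: a hazard index computed as the heat capacity of the atmosphere per mole of oxygen (lower values mean less thermal ballast per mole of oxidizer and therefore greater hazard), and an exponential growth law for the early phase of a chamber fire. All numbers below are illustrative assumptions, not values from the paper.

        # Sketch: atmosphere hazard index and exponential chamber-fire growth.
        import math

        def heat_capacity_per_mole_o2(components):
            """components: (mole fraction, molar heat capacity [J/(mol*K)], is_oxygen)."""
            cp_total = sum(x * cp for x, cp, _ in components)
            x_o2 = sum(x for x, _, is_o2 in components if is_o2)
            return cp_total / x_o2

        air = [(0.21, 29.4, True), (0.79, 29.1, False)]   # O2 / N2
        pure_o2 = [(1.00, 29.4, True)]                    # oxygen-enriched limit
        print(heat_capacity_per_mole_o2(air), heat_capacity_per_mole_o2(pure_o2))

        def fire_power(t, q0=100.0, growth_rate=0.5):
            """Exponential growth phase of a chamber fire: q(t) = q0 * exp(r*t)."""
            return q0 * math.exp(growth_rate * t)

        print(fire_power(10.0))   # watts after 10 s with the assumed parameters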

  15. Electron environment specification models for Galileo

    NASA Astrophysics Data System (ADS)

    Lazaro, Didier; Bourdarie, Sebastien; Hands, Alex; Ryden, Keith; Nieminen, Petteri

    The MEO radiation hazard is becoming an increasingly important consideration, with an ever-rising number of satellite missions spending most of their time in this environment. This region lies in the heart of the highly dynamic electron radiation belt, where very large radiation doses can be encountered unless proper shielding of critical systems and components is applied. Significant internal charging hazards also arise in the MEO regime. For electron environment specification at Galileo altitude, new models have been developed and implemented: a long-term effects model for dose evaluation, a statistical model for internal charging analysis, and a latitudinal model for ELDRS analysis. Model outputs, tools, and validation against observations (Giove-A data) and existing models (such as FLUMIC) are presented. "Energetic Electron Environment Models for MEO" Co 21403/08/NL/JD in consortium with ONERA, QinetiQ, SSTL and CNES.

  16. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  17. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  18. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  19. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
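
    A minimal sketch of how the three components combine into an absorbed environmental flux on a flat surface is given below. The solar constant, albedo, OLR, surface properties, and view factor are illustrative assumptions, not STEM's recommended design points.

        # Sketch: absorbed environmental heat flux on a flat spacecraft surface.
        SOLAR_CONSTANT = 1367.0   # W/m^2 (assumed design value)
        ALBEDO = 0.30             # Earth-average shortwave albedo (assumed)
        OLR = 237.0               # W/m^2, outgoing longwave radiation (assumed)

        def absorbed_flux(alpha_solar, eps_ir, view_factor_earth, in_sunlight=True):
            """Absorbed flux [W/m^2] = direct solar + reflected (albedo) + OLR."""
            q_solar = alpha_solar * SOLAR_CONSTANT if in_sunlight else 0.0
            q_albedo = (alpha_solar * ALBEDO * SOLAR_CONSTANT * view_factor_earth
                        if in_sunlight else 0.0)
            q_olr = eps_ir * OLR * view_factor_earth
            return q_solar + q_albedo + q_olr

        print(absorbed_flux(alpha_solar=0.25, eps_ir=0.85, view_factor_earth=0.3))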

  20. Mountains and man. A study of process and environment

    SciTech Connect

    Price, L.W.

    1986-01-01

    This book explores the processes and features of mountain environments: glaciers, snow and avalanches, landforms, weather and climate, vegetation, soils, and wildlife. The effects of latitudinal position on these processes and features are analyzed.

  1. Sanitation in the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hazard analysis and critical control point (HACCP) programs will eventually be required for commercial shell egg processing plants. Sanitation is an essential prerequisite program for HACCP and is based upon current Good Manufacturing Practices (cGMPs) as listed in the Code of Federal Regulations. Good ...

  2. Sanitation in the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the past, most of the regulations regarding egg processing were concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) programs will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...

  3. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning, or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or in the case of telerobots serve as interfaces between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with changes in the environment and to allow exploring entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine, update, or generate a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results shows great promise for dealing with noisy input data. The performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
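
    A minimal sketch of the voxel bookkeeping described above is shown below: a 3-D grid whose cells carry the attributes Unknown, Void, or Full, updated from a single simulated range return along a ray. The grid size, resolution, and update rule are illustrative assumptions rather than the paper's algorithm.

        # Sketch: updating a Void/Full/Unknown voxel grid from one range return.
        import numpy as np

        UNKNOWN, VOID, FULL = 0, 1, 2
        RES = 0.1                                        # meters per voxel (assumed)
        grid = np.full((50, 50, 50), UNKNOWN, dtype=np.uint8)

        def integrate_range_return(sensor, direction, measured_range):
            """Mark voxels along the ray as VOID up to the hit and FULL at the hit."""
            direction = np.asarray(direction, dtype=float)
            direction /= np.linalg.norm(direction)
            n_steps = int(measured_range / RES)
            for i in range(n_steps + 1):
                p = np.asarray(sensor) + i * RES * direction
                idx = tuple((p / RES).astype(int))
                if any(j < 0 or j >= s for j, s in zip(idx, grid.shape)):
                    return
                grid[idx] = FULL if i == n_steps else VOID

        integrate_range_return(sensor=(0.0, 2.5, 2.5), direction=(1.0, 0.0, 0.0),
                               measured_range=3.0)
        print((grid == VOID).sum(), "void voxels,", (grid == FULL).sum(), "full voxel")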

  4. Float-zone processing in a weightless environment

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.

    1976-01-01

    The results are reported of investigations to: (1) test the validity of analyses which set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near-weightless environment, (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow visualization techniques applied to a dimensionally scaled model of the Si melt, (3) revise the estimates of the economic impact of space-produced Si crystals grown by the float-zone method on the U.S. electronics industry, and (4) devise a rational plan for future work related to crystal growth phenomena wherein low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.

  5. Understanding the Impact of Virtual World Environments on Social and Cognitive Processes in Learning

    ERIC Educational Resources Information Center

    Zhang, Chi

    2009-01-01

    Researchers in information systems and technology-mediated learning have begun to examine how virtual world environments can be used in learning and how they enable learning processes and enhance learning outcomes. This research examined learning processes in a virtual world learning environment (VWLE). A research model of VWLE effects on learning…

  6. Processing Motion Signals in Complex Environments

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti

    2000-01-01

    Motion information is critical for human locomotion and scene segmentation. Currently we have excellent neurophysiological models that are able to predict human detection and discrimination of local signals. Local motion signals are insufficient by themselves to guide human locomotion and to provide information about depth, object boundaries and surface structure. My research is aimed at understanding the mechanisms underlying the combination of motion signals across space and time. A target moving on an extended trajectory amidst noise dots in Brownian motion is much more detectable than the sum of signals generated by independent motion energy units responding to the trajectory segments. This result suggests that facilitation occurs between motion units tuned to similar directions, lying along the trajectory path. We investigated whether the interaction between local motion units along the motion direction is mediated by contrast. One possibility is that contrast-driven signals from motion units early in the trajectory sequence are added to signals in subsequent units. If this were the case, then units later in the sequence would have a larger signal than those earlier in the sequence. To test this possibility, we compared contrast discrimination thresholds for the first and third patches of a triplet of sequentially presented Gabor patches, aligned along the motion direction. According to this simple additive model, contrast increment thresholds for the third patch should be higher than thresholds for the first patch. The lack of a measurable effect on contrast thresholds for these various manipulations suggests that the pooling of signals along a trajectory is not mediated by contrast-driven signals. Instead, these results are consistent with models that propose that the facilitation of trajectory signals is achieved by a second-level network that chooses the strongest local motion signals and combines them if they occur in a spatio-temporal sequence consistent

  7. Broadband acoustic source processing in a noisy shallow ocean environment

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-07-18

    Acoustic sources found in the ocean environment are spatially complex and broadband, complicating the analysis of received acoustic data considerably. A model-based approach is developed for a broadband source in a shallow ocean environment characterized by a normal-mode propagation model. Here we develop the optimal Bayesian solution to the broadband pressure-field enhancement and modal function extraction problem.

  8. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in a machine sensible form. The purpose of this report is to complete the documentation, finally, for AE-8 so that users can understand its construction and see the comparison of the model with the new data used, as well as with the AE-4 model.

  9. Space Station Freedom natural environment design models

    NASA Technical Reports Server (NTRS)

    Suggs, Robert M.

    1993-01-01

    The Space Station Freedom program has established a series of natural environment models and databases for utilization in design and operations planning activities. The suite of models and databases that have either been selected from among internationally recognized standards or developed specifically for spacecraft design applications are presented. The models have been integrated with an orbit propagator and employed to compute environmental conditions for planned operations altitudes of Space Station Freedom.

  10. Radiation environment models and the atmospheric cutoff

    NASA Technical Reports Server (NTRS)

    Konradi, Andrei; Hardy, Alva C.; Atwell, William

    1987-01-01

    The limitations of radiation environment models are examined by applying the model to the South Atlantic anomaly (SAA). The local magnetic-field-intensity (in gauss) and McIlwain (1961) drift-shell-parameter contours in the SAA are analyzed. It is noted that it is necessary to decouple the atmospheric absorption effects from the trapped radiation models in order to obtain accurate radiation dose predictions. Two methods for obtaining more accurate results are proposed.

  11. r-process nucleosynthesis in dynamic helium-burning environments

    NASA Technical Reports Server (NTRS)

    Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.

    1985-01-01

    The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be 10^20-10^21 neutrons per cubic centimeter for times of 0.01-0.1 s and neutron number densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high - larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.

  12. Liberty High School Transition Project: Model Process for Assimilating School, Community, Business, Government and Service Groups of the Least Restrictive Environment for Nondisabled and Disabled.

    ERIC Educational Resources Information Center

    Grimes, Michael K.

    The panel presentation traces the development of and describes the operation of a Brentwood (California) project to prepare approximately 75 severely disabled individuals, ages 12-22, to function in the least restrictive recreation/leisure, vocational, and general community environments. Transition Steering Committee developed such project…

  13. The Educational Process in the Emerging Information Society: Conditions for the Reversal of the Linear Model of Education and the Development of an Open Type Hybrid Learning Environment.

    ERIC Educational Resources Information Center

    Anastasiades, Panagiotes S.; Retalis, Simos

    The introduction of communications and information technologies in the area of education tends to create a totally different environment, which is marked by a change of the teacher's role and a transformation of the basic components that make up the meaning and content of the learning procedure as a whole. It could be said that, despite any…

  14. The national operational environment model (NOEM)

    NASA Astrophysics Data System (ADS)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

    The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying possible pressure (leverage) points, in support of the Commander, that resolve forecasted instabilities, and by ranking sensitivities in a list for each leverage point and response. The NOEM can be used to assist Decision Makers, Analysts and Researchers with understanding the inner workings of a region or nation state, the consequences of implementing specific policies, and the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of modules (e.g. economic, security and social well-being pieces such as critical infrastructure) completed, along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment, primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in the pursuit of eventual integration and balance of populace needs/demands within their respective operational environment and the capacity to meet those demands. In this paper we provide an overview of the NOEM, the need for it, and a description of its main components.

  15. The dynamic radiation environment assimilation model (DREAM)

    SciTech Connect

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L; Chen, Yue; Henderson, Michael G; Friedel, Reiner H

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
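
    A minimal sketch of the assimilation idea, assuming a scalar state and a trivial persistence forecast in place of DREAM's radiation-belt physics, is a Kalman update that weights the model forecast against each satellite observation by their respective uncertainties. The noise statistics below are illustrative assumptions.

        # Sketch: scalar Kalman filter blending a model forecast with observations.
        def kalman_step(x_prev, p_prev, obs, q=0.04, r=0.25):
            x_fcst, p_fcst = x_prev, p_prev + q    # forecast step (persistence model)
            gain = p_fcst / (p_fcst + r)           # weight forecast vs. observation
            x_new = x_fcst + gain * (obs - x_fcst)
            p_new = (1.0 - gain) * p_fcst
            return x_new, p_new

        x, p = 5.0, 1.0                     # e.g., log10 electron flux and its variance (assumed)
        for obs in [5.3, 5.1, 4.8, 5.0]:    # synthetic observations
            x, p = kalman_step(x, p, obs)
        print(x, p)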

  16. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  17. Fuzzy control of the production environment process parameters

    NASA Astrophysics Data System (ADS)

    Izvekov, V. N.

    2015-04-01

    A fuzzy control process is presented for maintaining specified microclimatic parameters of the production environment when one of the values regulating the process regime is lost. Structural schematic solutions, together with an algorithm of operation oriented toward existing equipment (the means of realization), are presented.

  18. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios. PMID:21833092

  19. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human factors studies and the design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE (trademark) virtual environment and the methodology of Neuro Linguistic Programming were employed in this study.

  20. Monitoring and Modelling Lakes and Coastal Environments

    NASA Astrophysics Data System (ADS)

    Odada, Eric

    2009-01-01

    The monitoring and modeling of lakes and coastal environments is becoming ever more important, particularly because these environments bear heavy loads in terms of human population, and their resources are critical to the livelihoods and well-being of coastal inhabitants and ecosystems. Monitoring and Modelling Lakes and Coastal Environments is a collection of 18 papers arising from the Lake 2004 International Conference on Conservation, Restoration and Management of Lakes and Coastal Wetlands, held in Bhubaneswar, Orissa, India, 9-13 December 2004. Consequently, 15 of the papers are concerned with studies on the Indian subcontinent, and many of the papers focus on India's Lake Chilika, the site of a special session during the conference. Two papers concern Japan, and one focuses on North America's Great Lakes region. Although the book has a regional bias, the replication of best practices that can be drawn from these studies may be useful for an international audience.

  1. Biofilms in the poultry production and processing environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The chapter conveys the importance of biofilm study in the environment of the poultry production and processing industries. Implications for food safety and security are established for sites of occurrence and causes of biofilm formation in poultry environments. Regulations and testing methods th...

  2. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, a model can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single models and integrated models, the development trend and application prospects of the watershed water environment pollution models were discussed. PMID:24483100

  3. Metal Catalyzed Fusion: Nuclear Active Environment vs. Process

    NASA Astrophysics Data System (ADS)

    Chubb, Talbot

    2009-03-01

    To achieve radiationless dd fusion and/or other LENR reactions via chemistry, some focus on the environment of the interior or altered near-surface volume of bulk metal; some on the environment inside metal nanocrystals or on their surface; some on the interface between nanometal crystals and ionic crystals; and some on a momentum shock-stimulation reaction process. Experiment says there is also a spontaneous reaction process.

  4. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  5. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    NASA Astrophysics Data System (ADS)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. This model is based on the continuous improvement Plan-Do-Check-Act cycle and it intends to integrate the environmental, risk prevention, and ethical aspects, as well as research, development and innovation project management, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21, and 166002.

  6. Securing Provenance of Distributed Processes in an Untrusted Environment

    NASA Astrophysics Data System (ADS)

    Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi

    Recently, there has been much concern about the provenance of distributed processes, that is, about the documentation of the origin of an object in a distributed system and of the processes used to produce it. Provenance has many applications in the form of medical records, documentation of processes in computer systems, recording of the origin of data in the cloud, and documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of the correct nodes, additions of fake nodes and edges, and unauthorized access to the sensitive nodes and edges. In this paper, we propose an integrity mechanism for the provenance graph using digital signatures involving three parties: the process executors who are responsible for the nodes' creation, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS), which records the number of nodes stored by the provenance owner. We show that the mechanism can detect integrity problems in the provenance graph, namely unauthorized and malicious “authorized” updates, even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS needs only very minimal storage (linear in the number of provenance owners). To protect confidentiality and for efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure
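
    A minimal sketch of integrity chaining for a provenance DAG node is shown below: each node's digest covers its content and its parents' digests, and the executor authenticates the digest (an HMAC stands in for a real digital signature). This is a simplification for illustration, not the paper's protocol, and it omits the Trusted Counter Server.

        # Sketch: digest-chained, executor-authenticated provenance nodes.
        import hashlib, hmac, json

        def node_digest(content, parent_digests):
            payload = json.dumps({"content": content, "parents": sorted(parent_digests)},
                                 sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

        def authenticate(digest, executor_key):
            # HMAC used here as a stand-in for the executor's digital signature.
            return hmac.new(executor_key, digest.encode(), hashlib.sha256).hexdigest()

        key = b"executor-secret-key"                     # placeholder key material
        d1 = node_digest("raw sample collected", [])
        d2 = node_digest("sample processed", [d1])       # edge back to the originating node
        print({"digest": d2, "signature": authenticate(d2, key)})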

  7. A Modeling Environment for Patient Portals

    PubMed Central

    Duncavage, Sean; Mathe, Janos; Werner, Jan; Malin, Bradley A.; Ledeczi, Akos; Sztipanovits, Janos

    2007-01-01

    Clinical Information Systems (CIS) are complex environments that integrate information technologies, humans, and patient data. Given the sensitivity of patient data, federal regulations require health care providers to define privacy and security policies and to deploy enforcement technologies. The introduction of model-based design techniques, combined with the development of high-level modeling abstractions and analysis methods, provide a mechanism to investigate these concerns by conceptually simplifying CIS without sacrificing expressive power. This work introduces the Model-based Design Environment for Clinical Information Systems (MODECIS), which is a graphical design environment that assists CIS architects in formalizing systems and services. MODECIS leverages Service-Oriented Architectures to create realistic system models as abstractions. MODECIS enables the analysis of legacy architectures and the design and simulation of future CIS. We present the feasibility of MODECIS by modeling operations, such as user authentication, of MyHealth@Vanderbilt, a real world patient portal in use at the Vanderbilt University Medical Center. PMID:18693826

  8. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  9. Model-Based Detection in a Shallow Water Ocean Environment

    SciTech Connect

    Candy, J V

    2001-07-30

    A model-based detector is developed to process shallow water ocean acoustic data. The function of the detector is to adaptively monitor the environment and decide whether or not a change from normal has occurred. Here we develop a processor incorporating both a normal-mode ocean acoustic model and a vertical hydrophone array. The detector is applied to data acquired from the Hudson Canyon experiments at various ranges and its performance is evaluated.
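
    A minimal sketch of the decision function such a detector needs is shown below: monitor the innovations (measurement minus model prediction) and declare a change from normal when their windowed, variance-normalized sum of squares exceeds a threshold. The window length, threshold, and synthetic data are illustrative assumptions, not the Hudson Canyon settings.

        # Sketch: innovations-based change detection via a windowed WSSR test.
        import numpy as np

        def change_detected(innovations, noise_var, window=32, threshold=55.0):
            """Weighted sum of squared residuals over the most recent window."""
            recent = np.asarray(innovations[-window:])
            wssr = np.sum(recent ** 2) / noise_var
            return wssr > threshold, wssr

        rng = np.random.default_rng(0)
        normal = rng.normal(0.0, 1.0, 32)       # innovations when the model matches the data
        anomalous = rng.normal(1.5, 1.0, 32)    # mean shift after an environmental change
        print(change_detected(list(normal), 1.0))
        print(change_detected(list(anomalous), 1.0))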

  10. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of the newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in work practices studied by the design team played a significant role in how work actually got done - actual lived work. Multi-tasking, informal assistance, and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  11. Modelling the martian cosmic radiation environment

    NASA Astrophysics Data System (ADS)

    Dartnell, L. R.; Desorgher, L.; Ward, J. M.; Coates, A. J.

    2013-09-01

    The martian surface is no longer protected by a global magnetic field or substantial atmosphere and so is essentially unshielded to the flux of cosmic rays. This creates an ionising radiation field on the surface and subsurface that is hazardous to life and the operation of spacecraft instruments. Here we report the modelling approach used to characterise this complex and time-variable radiation environment and discuss the wider applications of the results generated.

  12. A model environment for outer zone electrons

    NASA Technical Reports Server (NTRS)

    Singley, G. W.; Vette, J. I.

    1972-01-01

    A brief morphology of outer zone electrons is given to illustrate the nature of the phenomena that we are attempting to model. This is followed by a discussion of the data processing that was done with the various data received from the experimenters before incorporating it into the data base from which this model was ultimately derived. The details of the derivation are given, and several comparisons of the final model with the various experimental measurements are presented.

  13. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factory and enterprises in one seamless simulation environment.

  14. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; NeegaardParker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose rate dependent conductivity processes required to evaluate radiation dose and internal (bulk) charging processes in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft in a solar, near-polar orbit provide the particle data over a range of heliospheric latitudes used to derive the environment that can be used for radiation and charging environments for both high inclination 0.5 AU Solar Polar Imager mission and the 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  15. Tracer modeling in an urban environment

    SciTech Connect

    Reisner, J.M.; Smith, W.S.; Bossert, J.E.; Winterkamp, J.L.

    1998-12-31

    The accurate simulation of the transport of a tracer released into an urban area requires sufficiently high model resolution to resolve buildings and urban street canyons. Within the authors' group a modeling effort has been underway to develop a model -- termed HIGRAD -- capable of simulating flow at the high spatial resolution required within the urban environment. HIGRAD uses state-of-the-art numerical techniques to accurately simulate the regions of strong shear found near edges of buildings. HIGRAD also employs a newly developed radiation package which in addition to standard shortwave and longwave heating/cooling effects can account for the shadowing effects of building complexes on the urban flow field. Idealized simulations have been conducted which clearly illustrate the role radiation plays in transport and dispersion in an urban setting. The authors have also modeled the flow of an inert tracer in a realistic, complex urban environment. Complex flow/building interactions were produced during the simulation and these interactions had a significant impact on the transport of the tracer.

  16. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-01

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand. PMID:26927661

  17. The Khoros software development environment for image and signal processing.

    PubMed

    Konstantinides, K; Rasure, J R

    1994-01-01

    Data flow visual language systems allow users to graphically create a block diagram of their applications and interactively control input, output, and system variables. Khoros is an integrated software development environment for information processing and visualization. It is particularly attractive for image processing because of its rich collection of tools for image and digital signal processing. This paper presents a general overview of Khoros with emphasis on its image processing and DSP tools. Various examples are presented and the future direction of Khoros is discussed. PMID:18291923

  18. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed. PMID:21166326

  19. FAME, the Flux Analysis and Modeling Environment

    PubMed Central

    2012-01-01

    Background: The creation and modification of genome-scale metabolic models is a task that requires specialized software tools. While these are available, subsequently running or visualizing a model often relies on disjoint code, which adds additional actions to the analysis routine and, in our experience, renders these applications suboptimal for routine use by (systems) biologists. Results: The Flux Analysis and Modeling Environment (FAME) is the first web-based modeling tool that combines the tasks of creating, editing, running, and analyzing/visualizing stoichiometric models into a single program. Analysis results can be automatically superimposed on familiar KEGG-like maps. FAME is written in PHP and uses the Python-based PySCeS-CBM for its linear solving capabilities. It comes with a comprehensive manual and a quick-start tutorial, and can be accessed online at http://f-a-m-e.org/. Conclusions: With FAME, we present the community with an open source, user-friendly, web-based "one stop shop" for stoichiometric modeling. We expect the application will be of substantial use to investigators and educators alike. PMID:22289213
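
    A minimal sketch of the stoichiometric (flux balance) analysis that FAME automates is shown below: maximize a target flux subject to the steady-state mass balance S v = 0 and flux bounds. The toy three-reaction network and bounds are illustrative assumptions, not a genome-scale model, and scipy's linear solver stands in for PySCeS-CBM.

        # Sketch: flux balance analysis on a toy three-reaction network.
        import numpy as np
        from scipy.optimize import linprog

        # Reactions: R1 uptake -> A, R2 A -> B, R3 B -> biomass (objective)
        S = np.array([[1, -1,  0],    # metabolite A balance
                      [0,  1, -1]])   # metabolite B balance
        c = [0, 0, -1]                # linprog minimizes, so negate the objective flux
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 (assumed)

        res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
        print("optimal fluxes:", res.x, "objective flux:", -res.fun)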

  20. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  1. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  2. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  3. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  4. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare. PMID:22925789

  5. Modeling nuclear processes by Simulink

    NASA Astrophysics Data System (ADS)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  6. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
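    The reactor point kinetic equations with delayed neutron groups mentioned above form a small coupled ODE system, so the same kind of model can be sketched outside Simulink. The following is a minimal, hedged Python/SciPy version with six delayed neutron groups; the group constants, generation time, and reactivity step are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: reactor point kinetics with six delayed neutron groups,
# integrated with SciPy instead of Simulink. Parameter values (beta_i,
# lambda_i, Lambda, and the reactivity step) are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([2.11e-4, 1.40e-3, 1.26e-3, 2.53e-3, 7.40e-4, 2.70e-4])
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay constants, 1/s
beta = beta_i.sum()
Lambda = 2.0e-5      # prompt neutron generation time, s (assumed)
rho = 0.1 * beta     # small positive reactivity step (assumed)

def point_kinetics(t, y):
    n, C = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.dot(lam_i, C)
    dC = beta_i / Lambda * n - lam_i * C
    return np.concatenate(([dn], dC))

# Start from equilibrium precursor concentrations for n(0) = 1.
n0 = 1.0
C0 = beta_i / (lam_i * Lambda) * n0
sol = solve_ivp(point_kinetics, (0.0, 10.0), np.concatenate(([n0], C0)),
                method="LSODA", max_step=0.01)
print("relative power after 10 s:", sol.y[0, -1])
```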

  7. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  8. Models of the Reading Process

    PubMed Central

    Rayner, Keith; Reichle, Erik D.

    2010-01-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a “model of reading” when talking about only one aspect of the reading process (for example, models of word identification are often referred to as “models of reading”). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers’ eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized. PMID:21170142

  9. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, Technical Specification (TS) 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods of space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and in automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is high. Most such standards are related to production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is the general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under the space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate methods (or a group of methods) for computer modeling of various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, TS includes the description of possible

  10. Combustion modeling for experimentation in a space environment

    NASA Technical Reports Server (NTRS)

    Berlad, A. L.

    1974-01-01

    The merits of combustion experimentation in a space environment are assessed, and the impact of such experimentation on current theoretical models is considered. It is noted that combustion theory and experimentation for less than normal gravitational conditions are incomplete, inadequate, or nonexistent. Extensive and systematic experimentation in a space environment is viewed as essential for more adequate and complete theoretical models of such processes as premixed flame propagation and extinction limits, premixed flame propagation in droplet and particle clouds, ignition and autoignition in premixed combustible media, and gas jet combustion of unpremixed reactants. Current theories and models in these areas are described, and some combustion studies that can be undertaken in the Space Shuttle Program are proposed, including crossed molecular beam, turbulence, and upper pressure limit (of gases) studies.

  11. MODELING WIND TURBINES IN THE GRIDLAB-D SOFTWARE ENVIRONMENT

    SciTech Connect

    Fuller, J.C.; Schneider, K.P.

    2009-01-01

    In recent years, the rapid expansion of wind power has resulted in a need to more accurately model the effects of wind penetration on the electricity infrastructure. GridLAB-D is a new simulation environment developed for the U.S. Department of Energy (DOE) by the Pacific Northwest National Laboratory (PNNL), in cooperation with academic and industrial partners. GridLAB-D was originally written and designed to help integrate end-use smart grid technologies, and it is currently being expanded to include a number of other technologies, including distributed energy resources (DER). The specific goal of this project is to create a preliminary wind turbine generator (WTG) model for integration into GridLAB-D. As wind power penetration increases, models are needed to accurately study the effects of increased penetration; this project is a beginning step at examining these effects within the GridLAB-D environment. Aerodynamic, mechanical and electrical power models were designed to simulate the process by which mechanical power is extracted by a wind turbine and converted into electrical energy. The process was modeled using historic atmospheric data, collected over a period of 30 years as the primary energy input. This input was then combined with preliminary models for synchronous and induction generators. Additionally, basic control methods were implemented, using either constant power factor or constant power modes. The model was then compiled into the GridLAB-D simulation environment, and the power outputs were compared against manufacturers’ data and then a variation of the IEEE 4 node test feeder was used to examine the model’s behavior. Results showed the designs were sufficient for a prototype model and provided output power similar to the available manufacturers’ data. The prototype model is designed as a template for the creation of new modules, with turbine-specific parameters to be added by the user.
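    The aerodynamic stage of such a wind turbine generator model reduces to the familiar relation P = 0.5 * rho * A * Cp * v^3, limited by cut-in/cut-out speeds and rated power. The sketch below illustrates that conversion step only; it is not the GridLAB-D WTG module, and all turbine parameters (rotor radius, Cp, drivetrain efficiency, rated power) are assumptions chosen for illustration.

```python
# Hedged sketch of the aerodynamic power-extraction step of a wind turbine
# model, clipped by cut-in/cut-out speeds and rated power. All turbine
# parameters are illustrative assumptions, not GridLAB-D values.
import math

RHO = 1.225          # air density, kg/m^3
ROTOR_RADIUS = 40.0  # m (assumed)
CP = 0.42            # assumed constant power coefficient
CUT_IN, CUT_OUT = 3.0, 25.0   # m/s (assumed)
RATED_POWER = 1.5e6  # W (assumed)

def electrical_power(wind_speed_ms: float, drivetrain_eff: float = 0.95) -> float:
    """Electrical power output (W) for a given hub-height wind speed."""
    if wind_speed_ms < CUT_IN or wind_speed_ms > CUT_OUT:
        return 0.0
    area = math.pi * ROTOR_RADIUS ** 2
    p_mech = 0.5 * RHO * area * CP * wind_speed_ms ** 3
    return min(p_mech * drivetrain_eff, RATED_POWER)

if __name__ == "__main__":
    for v in (4.0, 8.0, 12.0, 20.0):
        print(f"{v:5.1f} m/s -> {electrical_power(v) / 1e3:8.1f} kW")
```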

  12. Integrated numeric and symbolic signal processing using a heterogeneous design environment

    NASA Astrophysics Data System (ADS)

    Mani, Ramamurthy; Nawab, S. Hamid; Winograd, Joseph M.; Evans, Brian L.

    1996-10-01

    We present a solution to a complex multi-tone transient detection problem to illustrate the integrated use of symbolic and numeric processing techniques which are supported by well-established underlying models. Examples of such models include synchronous dataflow for numeric processing and the blackboard paradigm for symbolic heuristic search. Our transient detection solution serves to emphasize the importance of developing system design methods and tools which can support the integrated use of well- established symbolic and numerical models of computation. Recently, we incorporated a blackboard-based model of computation underlying the Integrated Processing and Understanding of Signals (IPUS) paradigm into a system-level design environment for numeric processing called Ptolemy. Using the IPUS/Ptolemy environment, we are implementing our solution to the multi-tone transient detection problem.

  13. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as waste water treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research best suited to petroleum and environmental biotechnologies.
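    As a concrete illustration of the traditional, Monod-type end of the spectrum discussed in the editorial, the following hedged sketch integrates a single-substrate Monod growth law, mu = mu_max * S / (Ks + S), for one biomass pool. The parameter values are assumptions for illustration only.

```python
# Hedged sketch of traditional Monod-type kinetics for one substrate and
# one biomass pool. Parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX = 0.4    # 1/h, maximum specific growth rate (assumed)
KS = 0.5        # g/L, half-saturation constant (assumed)
YIELD = 0.6     # g biomass per g substrate (assumed)

def monod(t, y):
    substrate, biomass = y
    mu = MU_MAX * substrate / (KS + substrate)   # specific growth rate
    d_biomass = mu * biomass
    d_substrate = -d_biomass / YIELD
    return [d_substrate, d_biomass]

sol = solve_ivp(monod, (0.0, 48.0), [10.0, 0.1], max_step=0.1)
print("final substrate, biomass:", sol.y[0, -1], sol.y[1, -1])
```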

  14. Open environment for image processing and software development

    NASA Astrophysics Data System (ADS)

    Rasure, John R.; Young, Mark

    1992-04-01

    The main goal of the Khoros software project is to create and provide an integrated software development environment for information processing and data visualization. The Khoros software system is now being used as a foundation to improve productivity and promote software reuse in a wide variety of application domains. A powerful feature of the Khoros system is the high-level, abstract visual language that can be employed to significantly boost the productivity of the researcher. Central to the Khoros system is the need for a consistent yet flexible user interface development system that provides cohesiveness to the vast number of programs that make up the Khoros system. Automated tools assist in maintenance as well as development of programs. The software structure that embodies this system provides for extensibility and portability, and allows for easy tailoring to target specific application domains and processing environments. First, an overview of the Khoros software environment is given. Then this paper presents the abstract applications programmer interface, API, the data services that are provided in Khoros to support it, and the Khoros visualization and image file format. The authors contend that Khoros is an excellent environment for the exploration and implementation of imaging standards.

  15. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  16. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    SciTech Connect

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the "life cycle" of a regulatory model with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than are used for non-regulatory models.

  17. Critical processes affecting Cryptosporidium oocyst survival in the environment.

    PubMed

    King, B J; Monis, P T

    2007-03-01

    Cryptosporidium are parasitic protozoans that cause gastrointestinal disease and represent a significant risk to public health. Cryptosporidium oocysts are prevalent in surface waters as a result of human, livestock and native animal faecal contamination. The resistance of oocysts to the concentrations of chlorine and monochloramine used to disinfect potable water increases the risk of waterborne transmission via drinking water. In addition to being resistant to commonly used disinfectants, it is thought that oocysts can persist in the environment and be readily mobilized by precipitation events. This paper will review the critical processes involved in the inactivation or removal of oocysts in the terrestrial and aquatic environments and consider how these processes will respond in the context of climate change. PMID:17096874

  18. A network-oriented business modeling environment

    NASA Astrophysics Data System (ADS)

    Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia

    The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.

  19. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on the interpersonal communication process and gives an exemplary description of it. Following a brief description of the basic theoretical concepts of the digital factory, the communicative features of the digital factory are illustrated. Practical interrelations of interpersonal communication were analyzed from a human-oriented view in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed within the process analysis. This method makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of an inquiry on communication analysis and on the process models of existing modeling methods, it was possible to structure the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  20. Performance of redundant disk array organizations in transaction processing environments

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. K.; Saab, Daniel G.

    1993-01-01

    A performance evaluation is conducted for two redundant disk-array organizations in a transaction-processing environment, relative to the performance of both mirrored disk organizations and organizations using neither striping nor redundancy. The proposed parity-striping alternative to striping with rotated parity is shown to furnish rapid recovery from failure at the same low storage cost without interleaving the data over multiple disks. Both noncached systems and systems using a nonvolatile cache as the controller are considered.
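    Both redundant organizations compared above rely on parity rather than full duplication. The hedged sketch below shows only that shared principle: the parity block is the bytewise XOR of the data blocks in a stripe, so a single lost block can be rebuilt from the survivors. Block size and stripe width are arbitrary assumptions, and no particular parity placement (parity striping or rotated parity) is modelled.

```python
# Hedged sketch of the parity principle common to redundant disk-array
# organizations: parity = bytewise XOR of the data blocks in a stripe.
from functools import reduce

def xor_blocks(blocks):
    """Bytewise XOR of equal-length byte blocks."""
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

stripe = [bytes([i] * 8) for i in (1, 2, 3, 4)]   # four toy data blocks
parity = xor_blocks(stripe)

# Simulate losing block 2 and rebuilding it from the survivors plus parity.
survivors = [b for i, b in enumerate(stripe) if i != 2] + [parity]
rebuilt = xor_blocks(survivors)
print("rebuilt block matches original:", rebuilt == stripe[2])
```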

  1. NG6: Integrated next generation sequencing storage and processing environment

    PubMed Central

    2012-01-01

    Background Next generation sequencing platforms are now well established in sequencing centres and some laboratories. Upcoming smaller scale machines such as the 454 junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In such a context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. Results We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies,…) and, on the other hand, a secured web site giving access to the results. The connected user will be able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended and new ones can be added. Ergatis is used as a workflow building, running and monitoring system. The analyses can be run locally or in a cluster environment using Sun Grid Engine. Conclusions NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data. PMID:22958229

  2. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials and effects to fabricate the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g. simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral 2-dim shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data. A broader interface between process configuration on the one side and application design on the other side seems to be needed. This paper proposes a novel approach. A process management system is introduced. It allows the specification of processes for specific applications. The system is based on a dedicated database environment that is able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow will be discussed, and the complete software system PRINCE will be introduced, meeting the requirements of this new approach. Based on a concurrent design methodology presented in the beginning of this paper, a system is presented that supports application-specific process design. The paper will highlight the incorporated tools and the present status of the software system. A complete configuration of an Si-thin film process example will demonstrate the usage of PRINCE.

  3. Model-based description of environment interaction for mobile robots

    NASA Astrophysics Data System (ADS)

    Borghi, Giuseppe; Ferrari, Carlo; Pagello, Enrico; Vianello, Marco

    1999-01-01

    We consider a mobile robot that attempts to accomplish a task by reaching a given goal, and interacts with its environment through a finite set of actions and observations. The interaction between robot and environment is modeled by Partially Observable Markov Decision Processes (POMDP). The robot takes its decisions in the presence of uncertainty about the current state, by maximizing its reward gained during interactions with the environment. It is able to self-locate in the environment by collecting action and perception histories during navigation. To make the state estimation more reliable, we introduce additional information into the model without adding new states and without discretizing the considered measures. Thus, we also associate with the state transition probabilities a continuous metric given by the mean and variance of some significant sensor measurements that are suitable to be kept in continuous form, such as odometric measurements, showing that even such unreliable data can supply a great deal of information to the robot. The overall control system of the robot is structured as a two-level layered architecture, where the low level implements several collision avoidance algorithms, while the upper level takes care of the navigation problem. In this paper, we concentrate on how to use POMDP models at the upper level.
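    The core of the POMDP-based localization described above is the Bayesian belief update b'(s') ∝ O(o | s') Σ_s T(s' | s, a) b(s). The sketch below shows that single step on a toy two-state model; the transition and observation matrices are illustrative assumptions, not the robot's actual model.

```python
# Hedged sketch of the POMDP belief update used for localization.
# The two-state transition and observation matrices are toy assumptions.
import numpy as np

# T[a][s, s'] : probability of moving from state s to s' under action a
T = {"forward": np.array([[0.2, 0.8],
                          [0.1, 0.9]])}
# O[s', o] : probability of observing o in state s'
O = np.array([[0.7, 0.3],
              [0.2, 0.8]])

def belief_update(belief, action, observation):
    predicted = T[action].T @ belief          # prediction step
    updated = O[:, observation] * predicted   # correction step
    return updated / updated.sum()            # normalize

b = np.array([0.5, 0.5])
b = belief_update(b, "forward", observation=1)
print("posterior belief over states:", b)
```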

  4. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level and methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure which is itself the output of the model.
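    To make the structure of such a model concrete, the hedged sketch below reduces it to a single soluble-carbohydrate pool fed by radiation-driven photosynthesis and drained by respiration and growth, with the growth flux partitioned among leaves, stems and roots. It is a simplification of the multi-pool model described above, and every coefficient and the radiation forcing are invented for illustration.

```python
# Hedged sketch of a reduced carbon-balance model: absorbed radiation feeds a
# soluble-carbohydrate pool, which is drained by respiration and growth and
# partitioned to leaves, stems and roots. All values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

EPSILON = 0.002     # g carbohydrate per (mol photons m^-2), assumed
K_RESP = 0.02       # 1/h, maintenance respiration coefficient, assumed
K_GROWTH = 0.05     # 1/h, growth sink strength, assumed
PARTITION = {"leaf": 0.5, "stem": 0.2, "root": 0.3}

def par(t_hours):
    """Assumed diurnal photosynthetically active radiation (mol m^-2 h^-1)."""
    return max(0.0, 3.0 * np.sin(np.pi * (t_hours % 24) / 12.0))

def carbon_balance(t, y):
    pool, leaf, stem, root = y
    supply = EPSILON * par(t) * leaf          # photosynthesis scales with leaf mass
    growth = K_GROWTH * pool
    d_pool = supply - K_RESP * pool - growth
    return [d_pool,
            PARTITION["leaf"] * growth,
            PARTITION["stem"] * growth,
            PARTITION["root"] * growth]

sol = solve_ivp(carbon_balance, (0.0, 240.0), [0.1, 1.0, 0.5, 0.5], max_step=0.5)
print("dry matter after 10 days (leaf, stem, root):", sol.y[1:, -1])
```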

  5. Development of the Delta Shell as an integrated modeling environment

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Jagers, Bert

    2010-05-01

    Many engineering problems require the use of multiple numerical models from multiple disciplines, for example a river flow model coupled with a groundwater model and a rainfall-runoff model. These models need to be set up, coupled and run; results need to be visualized; and input and output data need to be stored. For some of these steps software or standards already exist, but there is a need for an environment allowing all of these steps to be performed. The goal of the present work is to create a modeling environment where models from different domains can go through all of these steps: setup, coupling, running, visualization, and storage. This presentation deals with the different problems which arise when setting up a modelling framework, such as terminology and numerical aspects, as well as the software development issues. In order to solve these issues we use Domain Driven Design methods, available open standards and open source components. While creating an integrated modeling environment we have identified that a separation of the following domains is essential: a framework for linking and exchanging data between models; a framework for integrating the different components of the environment; the graphical user interface; GIS; a hybrid relational and multi-dimensional data store; discipline-specific libraries (river hydrology, morphology, water quality, statistics); and model-specific components. The Delta Shell environment, which is the basis for several products such as HABITAT, SOBEK and the future Delft3D interface, implements and integrates components covering the above mentioned domains by making use of open standards and open source components. Different components have been developed to fill in gaps. For exchanging data with the GUI, an object-oriented scientific framework in .NET, somewhat similar to JSR-275, was developed within Delta Shell. For the GIS domain several OGC standards were used, such as SFS, WCS and WFS. For storage the CF standard together with

  6. Metrics for Business Process Models

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
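    To make "complexity metrics of business process models" tangible, the hedged sketch below computes a few simple structural measures (size, coefficient of connectivity, average connector degree) for a toy process graph. The example model and the particular metric choices are assumptions for illustration, not the metric set of the cited study.

```python
# Hedged sketch of simple structural complexity metrics for a process model
# represented as a set of arcs. The tiny example model is an assumption.
from collections import defaultdict

# (source, target) arcs of a toy process model; node names are illustrative.
arcs = [("start", "check order"), ("check order", "xor split"),
        ("xor split", "reject"), ("xor split", "approve"),
        ("approve", "xor join"), ("reject", "xor join"),
        ("xor join", "end")]

nodes = {n for arc in arcs for n in arc}
degree = defaultdict(int)
for src, dst in arcs:
    degree[src] += 1
    degree[dst] += 1

size = len(nodes)                                   # number of nodes
connectivity = len(arcs) / size                     # arcs per node
connectors = [n for n in nodes if "xor" in n]       # routing elements
avg_connector_degree = sum(degree[c] for c in connectors) / len(connectors)

print(f"size={size}  connectivity={connectivity:.2f}  "
      f"avg connector degree={avg_connector_degree:.2f}")
```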

  7. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  8. Hybrid Models for Trajectory Error Modelling in Urban Environments

    NASA Astrophysics Data System (ADS)

    Angelatsa, E.; Parés, M. E.; Colomina, I.

    2016-06-01

    This paper tackles the first step of any strategy aiming to improve the trajectory of terrestrial mobile mapping systems in urban environments. We present an approach to model the error of terrestrial mobile mapping trajectories, combining deterministic and stochastic models. Because of the specifics of the urban environment, the deterministic component is modelled with non-continuous functions composed of linear shifts, drifts or polynomial functions. In addition, we introduce a stochastic error component for modelling the residual noise of the trajectory error function. The first step in error modelling is to know the actual trajectory error values for several representative environments. In order to determine the trajectory errors as accurately as possible, (almost) error-free reference trajectories should be estimated using non-semantic features extracted from a sequence of images collected with the terrestrial mobile mapping system and from a full set of ground control points. Once the references are estimated, they are used to determine the actual errors in the terrestrial mobile mapping trajectory. The rigorous analysis of these data sets allows us to characterize the errors of a terrestrial mobile mapping system for a wide range of environments. This information will be of great use in future campaigns to improve the results of 3D point cloud generation. The proposed approach has been evaluated using real data. The data originate from a mobile mapping campaign over an urban, controlled area of Dortmund (Germany), with adverse GNSS conditions. The mobile mapping system, which includes two laser scanners and two cameras, was mounted on a van and driven over the controlled area for around three hours. The results show the suitability of decomposing the trajectory error into non-continuous deterministic and stochastic components.
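    A hedged sketch of the hybrid error model described above: a non-continuous deterministic part (a per-segment shift plus linear drift) superimposed with a Gaussian stochastic residual. The segment boundaries, shift and drift magnitudes, and noise level are invented for illustration; in practice these would be estimated against the reference trajectories.

```python
# Non-continuous deterministic error (per-segment shift + drift) plus Gaussian
# residual noise. All values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 300.0, 1.0)                    # seconds along the trajectory

# (t_start, t_end, shift_m, drift_m_per_s) for each GNSS-degraded segment
segments = [(0, 100, 0.05, 0.001),
            (100, 220, 0.30, -0.002),             # e.g. entering an urban canyon
            (220, 300, 0.10, 0.0005)]

def deterministic_error(times):
    err = np.zeros_like(times)
    for t0, t1, shift, drift in segments:
        mask = (times >= t0) & (times < t1)
        err[mask] = shift + drift * (times[mask] - t0)
    return err

sigma_noise = 0.02                                # m, residual noise level
trajectory_error = deterministic_error(t) + rng.normal(0.0, sigma_noise, t.size)
print("mean / max absolute error [m]:",
      trajectory_error.mean(), np.abs(trajectory_error).max())
```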

  9. An Overview of NASA's Orbital Debris Environment Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    Using updated measurement data, analysis tools, and modeling techniques, the NASA Orbital Debris Program Office has created a new Orbital Debris Environment Model. This model extends the coverage of orbital debris flux throughout the Earth orbit environment, and includes information on the mass density of the debris as well as the uncertainties in the model environment. This paper will give an overview of this model and its implications for spacecraft risk analysis.

  10. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  11. The formation of adipocere in model aquatic environments.

    PubMed

    Stuart, B H; Notter, S J; Dent, B; Selvalatchmanan, J; Fu, S

    2016-01-01

    An examination of the chemistry of adipocere formation in aquatic systems provides insight into how environmental factors affect the decomposition processes of human remains. Gas chromatography–mass spectrometry (GC-MS) and inductively coupled plasma–mass spectrometry (ICPMS) have been employed to monitor the changes to the chemistry of adipocere formed in aquatic environments used to model seawater, river and chlorinated water systems. Seawater was shown to inhibit adipocere formation, and a distinctively different elemental composition was produced in this environment due to the high concentrations of salts. By comparison, river water has been shown to accelerate the formation of adipocere. Chlorinated water appears to significantly enhance adipocere formation, based on a comparison with established fatty acid concentration values. However, a competing reaction to form chlorohydrins in chlorinated water is believed to be responsible for the unusual findings in this environment. The application of the chemical characterization of adipocere to an understanding of how this particular decomposition product forms in different water environments has been demonstrated, and there is potential to utilise this approach to identify the environment in which a body has been immersed. PMID:26493693

  12. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given in the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulations is found to reduce costs and time associated with technological development when incorporated judiciously.

  13. Physical processes affecting the sedimentary environments of Long Island Sound

    USGS Publications Warehouse

    Signell, R.P.; Knebel, H. J.; List, J.H.; Farris, A.S.

    1997-01-01

    A modeling study was undertaken to simulate the bottom tidal-, wave-, and wind-driven currents in Long Island Sound in order to provide a general physical oceanographic framework for understanding the characteristics and distribution of seafloor sedimentary environments. Tidal currents are important in the funnel-shaped eastern part of the Sound, where a strong gradient of tidal-current speed was found. This current gradient parallels the general westward progression of sedimentary environments from erosion or non-deposition, through bedload transport and sediment sorting, to fine-grained deposition. Wave-driven currents, meanwhile, appear to be important along the shallow margins of the basin, explaining the occurrence of relatively coarse sediments in regions where tidal currents alone are not strong enough to move sediment. Finally, westerly wind events are shown to locally enhance bottom currents along the axial depression of the sound, providing a possible explanation for the relatively coarse sediments found in the depression despite tide- and wave-induced currents below the threshold of sediment movement. The strong correlation between the near-bottom current intensity based on the model results and the sediment response as indicated by the distribution of sedimentary environments provides a framework for predicting the long-term effects of anthropogenic activities.

  14. Modeling Production Plant Forming Processes

    SciTech Connect

    Rhee, M; Becker, R; Couch, R; Li, M

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging etc. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. It is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in design of forming processes can: decrease time to production; reduce forming trials and associated expenses; and guide development of products with greater uniformity and less scrap.

  15. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise; that is to say directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present Mascaret, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  16. An Instructional Method for the AutoCAD Modeling Environment.

    ERIC Educational Resources Information Center

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  17. Learning Environment, Learning Process, Academic Outcomes and Career Success of University Graduates

    ERIC Educational Resources Information Center

    Vermeulen, Lyanda; Schmidt, Henk G.

    2008-01-01

    This study expands on literature covering models on educational productivity, student integration and effectiveness of instruction. An expansion of the literature concerning the impact of higher education on workplace performance is also covered. Relationships were examined between the quality of the academic learning environment, the process of…

  18. Mathematical modeling of biomass fuels formation process

    SciTech Connect

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-07-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels causes significant savings resulting from partial replacement of fossil fuels, and reduction of environmental pollution resulting directly from the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures that uniquely identify a real process, together with algorithms that convert these data by solving a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a reference point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, given the assumed constraints and decision variables of the task.
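    The linear-programming formulation suggested above can be sketched with scipy.optimize.linprog: choose mass fractions of fuel components that minimize cost while the blend meets a minimum calorific value and a maximum chlorine content. The component properties, prices, and limits below are invented for illustration and do not come from the paper.

```python
# Hedged sketch of a fuel-blending linear program: minimize blend cost subject
# to a calorific-value floor and a chlorine ceiling. All data are assumptions.
import numpy as np
from scipy.optimize import linprog

cost = np.array([12.0, 25.0, 5.0])          # currency per tonne of each component
calorific = np.array([18.0, 28.0, 11.0])    # MJ/kg
chlorine = np.array([0.8, 0.1, 0.4])        # weight %

MIN_CALORIFIC = 16.0                        # MJ/kg required for the blend
MAX_CHLORINE = 0.5                          # weight % allowed in the blend

# linprog minimizes c @ x subject to A_ub @ x <= b_ub and A_eq @ x == b_eq.
A_ub = np.vstack([-calorific, chlorine])    # -cal @ x <= -MIN, cl @ x <= MAX
b_ub = np.array([-MIN_CALORIFIC, MAX_CHLORINE])
A_eq = np.ones((1, 3))                      # mass fractions sum to 1
b_eq = np.array([1.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * 3)
print("optimal mass fractions:", res.x, "blend cost:", res.fun)
```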

  19. Group Modeling in Social Learning Environments

    ERIC Educational Resources Information Center

    Stankov, Slavomir; Glavinic, Vlado; Krpan, Divna

    2012-01-01

    Students' collaboration while learning could provide better learning environments. Collaboration assumes social interactions which occur in student groups. Social theories emphasize positive influence of such interactions on learning. In order to create an appropriate learning environment that enables social interactions, it is important to…

  20. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health. PMID:24792566

  1. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. The Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies. It is considered the Best Available Control Technology (BACT) for NOx reduction. The solution of the NOx emissions problem is either through modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  2. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to be operated and to produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of the machine. The reliable result produced from OEE can then be used to propose a suitable corrective action. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors. However, the how factor has rarely been addressed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring machine performance and later improve it.
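    The measurement step that the framework builds on is the conventional OEE calculation, OEE = Availability x Performance x Quality. The hedged sketch below computes it from shift-level counts; the input figures are assumptions, not data from the case study.

```python
# Hedged sketch of the conventional OEE calculation. Shift figures are
# illustrative assumptions, not data from the case study.
def oee(planned_time_min, downtime_min, ideal_cycle_time_min,
        total_count, defect_count):
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / run_time
    quality = (total_count - defect_count) / total_count
    return availability * performance * quality, (availability, performance, quality)

value, (a, p, q) = oee(planned_time_min=480, downtime_min=47,
                       ideal_cycle_time_min=1.0, total_count=380, defect_count=9)
print(f"Availability={a:.2f}  Performance={p:.2f}  Quality={q:.2f}  OEE={value:.2f}")
```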

  3. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects that the processing parameters pressure, temperature, and time have on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism explaining the phenomenon by which the plies bond to themselves. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
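    A hedged sketch of the reptation-theory scaling invoked above: the degree of autohesive bonding grows with contact time roughly as (t / t_full)^(1/4) until full cohesive strength is reached, and a time-temperature shift can move t_full with temperature. The Arrhenius form and all parameter values below are assumptions for illustration, not the models of the paper.

```python
# Hedged sketch of a reptation-type autohesion scaling with an assumed
# Arrhenius time-temperature shift. All parameter values are assumptions.
import math

def degree_of_autohesion(contact_time_s, temp_K,
                         t_full_ref_s=600.0, temp_ref_K=620.0,
                         activation_J_mol=5.0e4):
    """Fraction of full cohesive strength reached after contact_time_s."""
    R = 8.314
    # Assumed Arrhenius shift of the time to full healing with temperature.
    t_full = t_full_ref_s * math.exp(activation_J_mol / R *
                                     (1.0 / temp_K - 1.0 / temp_ref_K))
    return min(1.0, (contact_time_s / t_full) ** 0.25)

for t in (10, 60, 300, 1200):
    print(t, "s ->", round(degree_of_autohesion(t, temp_K=640.0), 2))
```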

  4. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall line or other convective systems that are strongly forced by fronts or other lifting mechanisms. Less highly forced systems are difficult to model. The next topic in this colloquium was measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman Lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The subject of using satellite data to provide mesoscale moisture and wind analyses was next. Recent activities in modeling of moist processes in mesoscale systems were reported on. These modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next the spatial resolution effects of moisture budgets were discussed; in particular, the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The conclusion of this colloquium was on modeling scale interaction processes.

  5. Parallel processing environment for multi-flexible body dynamics

    NASA Technical Reports Server (NTRS)

    Venugopal, Ravi; Kumar, Manoj N.; Singh, Ramen P.; Taylor, Lawrence W., Jr.

    1989-01-01

    The implementation of a dynamics solution algorithm with inherent parallelism which is applicable to the dynamics of large flexible space structures is described. The algorithm is unique in that parts of the solution can be computed simultaneously by working with different branches of its tree topology. The algorithm exhibits close to O(n) behavior. The data flow within the solution algorithm is discussed along with results from its implementation in a multiprocessing environment. A model of the United States Space Station is used as an example. The results show that, with fast multiple scalar processors, an efficient algorithm, and symbolically generated equations of motion, real-time performance can be achieved with present-day hardware technology, even with complex dynamical models.

  6. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended to be used for welding process control.

  7. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management. PMID:25500468

  8. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter

  9. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental amorphous aggregation data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.
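
    To illustrate the general idea of extracting rate constants from turbidimetry data (this is not the specific model proposed in the record above), the hedged sketch below fits a generic sigmoidal kinetic curve to synthetic turbidity measurements by least squares.

```python
# Hedged sketch: fitting a generic sigmoidal kinetic curve to turbidity data.
# This is NOT the model of Stranks et al.; it only illustrates how rate
# constants can be extracted from turbidimetry measurements by least squares.
import numpy as np
from scipy.optimize import curve_fit

def turbidity(t, a_max, k, t_half):
    """Logistic growth of turbidity: plateau a_max, rate k, midpoint t_half."""
    return a_max / (1.0 + np.exp(-k * (t - t_half)))

# Synthetic "measurements" standing in for real absorbance-vs-time data.
np.random.seed(0)
t = np.linspace(0, 60, 61)                      # minutes
data = turbidity(t, 1.2, 0.25, 20.0) + np.random.normal(0, 0.02, t.size)

popt, pcov = curve_fit(turbidity, t, data, p0=[1.0, 0.1, 15.0])
a_max, k, t_half = popt
print(f"fitted plateau={a_max:.2f}, rate constant={k:.3f} 1/min, midpoint={t_half:.1f} min")
```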

  10. A Process for Technology Prioritization in a Competitive Environment

    NASA Technical Reports Server (NTRS)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment. The In-Space Propulsion Technology (ISPT) project is used to exemplify the process. The ISPT project focuses on mid-level Technology Readiness Levels (TRLs) for development, i.e. TRLs 4 through 6 (Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and to create ISPT technology development schedules based on these dates. There is a minimum of 4 years between flight and the pacing mission. The ISPT Project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded the total implementation of the plan.

  11. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Meroni, A.; Bahr, T.

    2013-05-01

    Access to SAR data can be highly important, even critical, for disaster mapping. Updating a GIS with contemporary information from SAR data makes it possible to deliver a reliable set of geospatial information to support civilian operations, e.g. search and rescue missions. In this paper we therefore present the operational processing of SAR data within a GIS environment for rapid disaster mapping, exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.
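
    The core change-detection step behind SAR flood mapping can be sketched generically as thresholding the backscatter drop between pre- and post-event images; the NumPy example below is a stand-in illustration with hypothetical thresholds, not the ArcGIS ModelBuilder workflow described in the record.

```python
# Illustrative sketch of one core step in SAR-based flood mapping: flagging
# pixels whose backscatter dropped sharply between a pre-event and a
# post-event acquisition (open water reflects specularly and appears dark).
# Generic NumPy stand-in; thresholds are assumed, not taken from the paper.
import numpy as np

def flood_mask(pre_db, post_db, drop_threshold_db=6.0, water_ceiling_db=-15.0):
    """Return a boolean mask of likely flooded pixels from two dB images."""
    change = post_db - pre_db
    return (change < -drop_threshold_db) & (post_db < water_ceiling_db)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pre = rng.normal(-8.0, 1.5, (100, 100))                 # typical land backscatter (dB)
    post = pre.copy()
    post[40:70, 20:60] = rng.normal(-20.0, 1.0, (30, 40))   # simulated flooded patch
    mask = flood_mask(pre, post)
    print("flooded pixels:", int(mask.sum()))
```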

  12. Self-assembly processes in the prebiotic environment

    PubMed Central

    Deamer, David; Singaram, Sara; Rajamani, Sudha; Kompanichenko, Vladimir; Guggenheim, Stephen

    2006-01-01

    An important question guiding research on the origin of life concerns the environmental conditions where molecular systems with the properties of life first appeared on the early Earth. An appropriate site would require liquid water, a source of organic compounds, a source of energy to drive polymerization reactions and a process by which the compounds were sufficiently concentrated to undergo physical and chemical interactions. One such site is a geothermal setting, in which organic compounds interact with mineral surfaces to promote self-assembly and polymerization reactions. Here, we report an initial study of two geothermal sites where mixtures of representative organic solutes (amino acids, nucleobases, a fatty acid and glycerol) and phosphate were mixed with high-temperature water in clay-lined pools. Most of the added organics and phosphate were removed from solution with half-times measured in minutes to a few hours. Analysis of the clay, primarily smectite and kaolin, showed that the organics were adsorbed to the mineral surfaces at the acidic pH of the pools, but could subsequently be released in basic solutions. These results help to constrain the range of possible environments for the origin of life. A site conducive to self-assembly of organic solutes would be an aqueous environment relatively low in ionic solutes, at intermediate temperatures and neutral pH, in which cyclic concentration of the solutes can occur by transient dry intervals. PMID:17008220

  13. Construction material processed using lunar simulant in various environments

    NASA Technical Reports Server (NTRS)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  14. Propagation modeling in a manufacturing environment

    SciTech Connect

    Birdwell, J.D.; Horn, R.D.; Rader, M.S.; Shourbaji, A.A.

    1995-12-31

    Wireless sensors which utilize low power spread spectrum data transmission have significant potential in industrial environments due to low cabling and installation costs. In addition, this technology imposes fewer constraints upon placement due to cable routing, allowing sensors to be installed in areas with poor access. Limitations are imposed on sensor and receiver placement by electromagnetic propagation effects in the industrial environment, including multipath and the presence of absorbing media. This paper explores the electromagnetic analysis of potential wireless sensor applications using commercially available finite element software. In addition, since the applications environment is often at least partially specified in electronic form using computer-aided drafting software, the importation of information from this software is discussed. Both three-dimensional and two-dimensional examples are presented which demonstrate the utility and limitations of the method.

  15. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty of calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added. These services include such things as coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  16. Information Network Model Query Processing

    NASA Astrophysics Data System (ADS)

    Song, Xiaopu

    Information Networking Model (INM) [31] is a novel database model for the management of real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects. In INM, objects are networked through various natural and complex relationships. The INM Query Language (INM-QL) [30] is designed to explore such an information network, retrieve information about schemas, instances, their attributes, relationships, and context-dependent information, and process query results in the user-specified form. An INM database management system has been implemented using Berkeley DB, and it supports INM-QL. This thesis is mainly focused on the implementation of the subsystem that effectively and efficiently processes INM-QL. The subsystem provides a lexical and syntactical analyzer of INM-QL, and it is able to choose appropriate evaluation strategies and index mechanisms to process queries in INM-QL without the user's intervention. It also uses an intermediate result structure to hold intermediate query results and other helper structures to reduce the complexity of query processing.

  17. Drought processes, modeling, and mitigation

    NASA Astrophysics Data System (ADS)

    Mishra, Ashok K.; Sivakumar, Bellie; Singh, Vijay P.

    2015-07-01

    Accurate assessment of droughts is crucial for proper planning and management of our water resources, environment, and ecosystems. The combined influence of increasing water demands and the anticipated impacts of global climate change has already raised serious concerns about worsening drought conditions in the future and their social, economic, and environmental impacts. As a result, studies on droughts are currently a major focal point for a broad range of research communities, including civil engineers, hydrologists, environmentalists, ecologists, meteorologists, geologists, agricultural scientists, economists, policy makers, and water managers. There is, therefore, an urgent need for enhancing our understanding of droughts (e.g. occurrence, modeling), making more reliable assessments of their impacts on various sectors of our society (e.g. domestic, agricultural, industrial), and undertaking appropriate adaptation and mitigation measures, especially in the face of global climate change.

  18. Model test optimization using the virtual environment for test optimization

    SciTech Connect

    Klenke, S.E.; Reese, G.M.; Schoof, L.A.; Shierling, C.

    1995-11-01

    We present a software environment integrating analysis and test-based models to support optimal modal test design through a Virtual Environment for Test Optimization (VETO). The VETO assists analysis and test engineers to maximize the value of each modal test. It is particularly advantageous for structural dynamics model reconciliation applications. The VETO enables an engineer to interact with a finite element model of a test object to optimally place sensors and exciters and to investigate the selection of data acquisition parameters needed to conduct a complete modal survey. Additionally, the user can evaluate the use of different types of instrumentation such as filters, amplifiers and transducers for which models are available in the VETO. The dynamic response of most of the virtual instruments (including the device under test) is modeled in the state space domain. Design of modal excitation levels and appropriate test instrumentation is facilitated by the VETO's ability to simulate such features as unmeasured external inputs, A/D quantization effects, and electronic noise. Measures of the quality of the experimental design, including the Modal Assurance Criterion and the Normal Mode Indicator Function, are available. The VETO also integrates tools such as Effective Independence and minamac to assist in the selection of optimal sensor locations. The software is designed around three distinct modules: (1) a main controller and GUI written in C++, (2) a visualization model, taken from FEAVR, running under AVS, and (3) a state space model and time integration module built in SIMULINK. These modules are designed to run as separate processes on interconnected machines.
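
    The state-space instrument simulation described above can be illustrated with a minimal, hypothetical sketch of a single vibration mode sampled through a noisy sensor and an A/D converter; the parameter values are invented and the Sandia implementation is not reproduced.

```python
# Minimal sketch of simulating a measurement chain in state space with
# additive electronic noise and A/D quantization, in the spirit of the VETO
# description (hypothetical parameters; not the Sandia implementation).
import numpy as np
from scipy.linalg import expm

wn, zeta, dt = 2*np.pi*10.0, 0.02, 1e-3          # 10 Hz mode, 2% damping, 1 kHz sampling
A = np.array([[0.0, 1.0], [-wn**2, -2*zeta*wn]]) # continuous-time state matrix
Ad = expm(A*dt)                                  # exact discrete-time transition matrix
x = np.array([1e-3, 0.0])                        # initial displacement (m) and velocity

def quantize(v, full_scale=5e-3, bits=12):
    """Model an A/D converter: round to the nearest code and clip to range."""
    lsb = 2*full_scale / 2**bits
    return float(np.clip(np.round(v/lsb)*lsb, -full_scale, full_scale))

rng = np.random.default_rng(0)
samples = []
for _ in range(2000):                            # 2 s of simulated data
    x = Ad @ x                                   # advance the mode by one sample
    noisy = x[0] + rng.normal(0.0, 2e-6)         # additive electronic noise
    samples.append(quantize(noisy))

print("peak quantized response (m):", max(abs(s) for s in samples))
```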

  19. A Collaborative Model for Ubiquitous Learning Environments

    ERIC Educational Resources Information Center

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  20. Structural study of the Eu3+ environments in fluorozirconate glasses: Role of the temperature-induced and the pressure-induced phase transition processes in the development of a rare earth's local structure model

    NASA Astrophysics Data System (ADS)

    Muñoz-Santiuste, Juan E.; Rodríguez-Mendoza, Ulises R.; González-Platas, Javier; Lavín, Víctor

    2009-04-01

    The correlation between the optical properties of the Eu3+ ions and their local structures in fluorozirconate glasses and glass-ceramics has been analyzed by means of steady-state and time-resolved site-selective laser spectroscopies. Changes in the crystal-field interaction, ranging from weak to medium strength values, are observed by monitoring the luminescence and the lifetime of the Eu3+ ions in different local environments in the glass. Playing a key role in this study, the Eu3+ luminescence observed during the thermally-induced crystallization of the glass and the pressure-induced amorphization of the crystalline phase of the glass-ceramic experimentally establishes the existence of a parent local structure for the Eu3+ ions in the glass, identified as the EuZrF7 crystalline phase. Starting from the ab initio single overlap model, crystal-field calculations have been performed for the glass and the glass-ceramic. From the site-selective measurements, the crystal-field parameter sets are obtained, giving a suitable simulation of the 7FJ (J = 0-6) Stark energy level diagram for the Eu3+ ions in the different environments present in the fluorozirconate glass. A simple geometrical model based on a continuous distortion of the parent structure is proposed for the distribution of local environments of the Eu3+ ions in the fluorozirconate glass.

  1. Dynamic displays of chemical process flowsheet models

    SciTech Connect

    Aull, J.E.

    1996-11-01

    This paper describes the algorithms used in constructing dynamic graphical displays of a process flowsheet. Movies are created which portray changes in the process over time using animation in the flowsheet such as individual streams that take on a color keyed to the current flow rate, tank levels that visibly rise and fall and "gauges" that move to display parameter values. Movies of this type can be a valuable tool for visualizing, analyzing, and communicating the behavior of a process model. This paper describes the algorithms used in constructing displays of this kind for dynamic models using the SPEEDUP™ modeling package and the GMS™ graphics package. It also tells how data is exported from the SPEEDUP™ package to GMS™ and describes how a user environment for running movies and editing flowsheets is set up. The algorithms are general enough to be applied to other processes and graphics packages. In fact the techniques described here can be used to create movies of any time-dependent data.
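
    The color-keying idea can be sketched in a few lines: map each stream's flow rate to a display color for every animation frame. The example below is generic and hypothetical; it does not reproduce the SPEEDUP/GMS coupling described in the record.

```python
# Generic sketch of keying a display color to a flow rate: each animation
# frame recolors every stream from its current value.  Ranges and the
# blue-to-red mapping are hypothetical.
def flow_to_rgb(flow, lo=0.0, hi=100.0):
    """Linearly map a flow rate to a blue (low) -> red (high) RGB color."""
    f = min(max((flow - lo) / (hi - lo), 0.0), 1.0)
    return (int(255*f), 0, int(255*(1.0 - f)))

# One "movie": a frame per time step, each stream recolored from its flow rate.
flows_over_time = {"feed": [10, 35, 80, 95], "recycle": [60, 55, 40, 20]}
for step in range(4):
    frame = {name: flow_to_rgb(series[step]) for name, series in flows_over_time.items()}
    print(f"frame {step}: {frame}")
```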

  2. Influence of global climatic processes on environment The Arctic seas

    NASA Astrophysics Data System (ADS)

    Kholmyansky, Mikhael; Anokhin, Vladimir; Kartashov, Alexandr

    2016-04-01

    Changes in the environment of the Arctic regions under the influence of global climatic processes are among the most pressing problems of the present. As a result of work carried out in different areas of the Russian Arctic, the authors have obtained materials characterizing the intensity of these processes. Complex investigations have been carried out since 1972, and continue at present, on the water areas and in the coastal zones of the White, Barents, Kara and East Siberian Seas, and on lake water areas of the subarctic region. The investigations include hydrophysical and cryological observations, direct temperature measurements, analysis of drill data, electrometric determination of the parameters of the frozen zone, lithodynamic and geochemical determinations, geophysical logging of boreholes, and the study of glaciers on the basis of visual observations and analysis of photographs. The data obtained allow the changes in temperature of the water layer, the deposits and the lower layer of the atmosphere over the last 25 years to be estimated; on average they amount to 0.38°C for sea water, 0.23°C for unconsolidated deposits and 0.72°C for the atmosphere. Under the influence of temperature changes in the hydrosphere and lithosphere of the shelf, the cryolithic zone changes its characteristics: an increase in the depth of the cryolithic zone's upper boundary can be noted over most of the studied water area. The recent rapid rise in temperature of the ice-rich rocks composing the coast has led to avalanche-like thermo-denudation and to delivery of material to the sea at three times the 1978 level. The rise in temperature also produces an appreciable retreat of the boundaries of the Arctic glacial covers. Our monitoring measurements show an increase in the oxygen content of the benthic zone, which is connected with a reduction in the overall salinity of the water due to fresh water arriving from melting ice; this, in turn, leads to changes in the biogenic part of the ecosystem. The executed

  3. Radiation resistance of microorganisms from radiation sterilization processing environments

    NASA Astrophysics Data System (ADS)

    Sabovljev, Svetlana A.; Žunić, Zora S.

    The radiation resistance of microorganisms was examined on samples of dust collected from radiation sterilization processing environments, including assembly, storage, and sterilization plant areas. The isolation of radiation resistant strains was performed by irradiation with screening doses ranging from 10 to 35 kGy and test pieces containing 10^6 to 10^8 CFU in dried serum-broth, representing 100 to 5000 colonies of primary cultures of microorganisms from 7 different sites. In an examination of 16900 colonies of aerobic microorganisms from 3 hygienically controlled production sites and 4 uncontrolled ones, 30 strains of bacteria were isolated. Of those, 15 were classified as genus Bacillus, 9 as Micrococcus and 6 as Sarcina. All of the 15 strains of Gram positive sporeforming aerobic rods exhibited an exponential decrease in the surviving fraction as a function of dose, indicating that the inactivation of spores of aerobic rods is a consequence of a single energy deposition into the target. All strains were found to be moderately resistant to radiation, with D-6 values (the dose required to reduce survival by 6 log cycles) between 18 and 26 kGy. All of the isolated Gram positive cocci showed inactivation curves having a shoulder, indicating that different processes are involved in the inactivation of these cells, e.g. accumulation of sublethal lesions, or a finite repair capacity for potentially lethal lesions. Moderate radiation resistance was observed in 13 strains, with D-6 values between 16 and 30 kGy. Two slow-growing, red pigmented strains tentatively classified as genus Micrococcus, isolated from uncontrolled sites (human dwellings), were exceptionally resistant, with D-6 values of more than 45 kGy. For hygienically controlled sites, Gram positive sporeforming rods made up two thirds of the resistant microflora, while Gram positive cocci comprised one third. For hygienically uncontrolled sites this ratio was reversed. An assumption is made that one isolated strain has grown
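
    The dose-survival arithmetic implied by the exponential inactivation curves can be made explicit: with S(D) = 10^(-D/D10), a D-6 value (six log cycles of reduction) equals 6 × D10. The short example below works through this for the reported D-6 range; the 25 kGy dose is only a convenient illustration.

```python
# Worked example of the dose-survival arithmetic implied by the abstract:
# exponential inactivation S(D) = 10**(-D / D10), where D10 is the dose for a
# one-log reduction, so a "D-6 value" (six-log reduction) equals 6 * D10.

def surviving_fraction(dose_kGy, d6_kGy):
    """Exponential inactivation: S(D) = 10**(-D/D10), with D10 = D-6 / 6."""
    d10 = d6_kGy / 6.0
    return 10.0 ** (-dose_kGy / d10)

for d6 in (18.0, 26.0):                    # reported D-6 range for the aerobic rods
    s = surviving_fraction(25.0, d6)       # surviving fraction at an illustrative 25 kGy dose
    print(f"D-6 = {d6} kGy -> D10 = {d6/6:.1f} kGy, survival at 25 kGy = {s:.1e}")
```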

  4. Modeling of Plasma Spray Processes

    NASA Astrophysics Data System (ADS)

    Chang, Chong H.

    1996-10-01

    A comprehensive computational model for thermal plasma processes is being developed with sufficient generality and flexibility to apply to a wide variety of present and proposed plasma processing concepts and devices. In our model for gas-particle flows, the gas is represented as a continuous multicomponent chemically reacting gas with temperature-dependent thermodynamic and transport properties. Ions and electrons are considered as separate components or species of the mixture, while ionization and dissociation reactions are treated as chemical reactions. Entrained particles interacting with the plasma are represented by a stochastic particle model in which the velocities, temperatures, sizes, and other characteristics of typical particles are computed simultaneously with the plasma flow. The model in its present form can simulate particle injection, heating, and melting, but not evaporation and condensation. This model is embodied in the LAVA computer code, which has previously been applied to simulate plasma spraying, mixing and demixing of plasma gases, and departures from chemical (ionization/dissociation), thermal, and excitation equilibrium in plasmas. A transient simulation has been performed of stainless steel particles injected into a swirling high-velocity nitrogen-hydrogen plasma jet in air under typical operating conditions for a newly developed high-velocity high-power (HVHP) torch, which produces plasma jets with peak velocities in excess of 3000 m/s. The calculational results show that strong departures from ionization and dissociation equilibrium develop in the downstream region as the chemical reactions freeze out at lower temperatures. The calculational results also show good agreement with experimental data on particle temperature, velocity, and spray pattern, together with important statistical effects associated with distributions in particle properties and injection conditions. This work was performed under the auspices of the U. S

  5. Challenging the Expanding Environment Model of Teaching Elementary Social Studies.

    ERIC Educational Resources Information Center

    Palmer, Jesse

    1989-01-01

    Looks at criticism of the Expanding Environments Model in the elementary school social studies curriculum. Cites recent reports that recommend a history-centered elementary curriculum. States that teaching methods may be the cause of historical, civic, and geographic illiteracy rather than the Expanding Environments Model. (LS)

  6. Cosmic ray environment model for Earth orbit

    NASA Technical Reports Server (NTRS)

    Edmonds, L.

    1985-01-01

    A set of computer codes, which include the effects of the Earth's magnetic field, used to predict the cosmic ray environment (atomic numbers 1 through 28) for a spacecraft in a near-Earth orbit is described. A simple transport analysis is used to approximate the environment at the center of a spherical shield of arbitrary thickness. The final output is in a form (a Heinrich curve) which has immediate applications for single event upset rate predictions. The codes will calculate the time-averaged environment for an arbitrary number (fractional or whole) of circular orbits. The computer codes were run for some selected orbits and the results, which can be useful for quick estimates of single event upset rates, are given. The codes were listed in the language HPL, which is appropriate for a Hewlett Packard 9825B desktop computer. Extensive documentation of the codes is available from COSMIC, except where explanations have been deferred to references where extensive documentation can be found. Some qualitative aspects of the effects of mass and magnetic shielding are also discussed.

  7. Gravity Modeling for Variable Fidelity Environments

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2006-01-01

    Aerospace simulations can model worlds, such as the Earth, with differing levels of fidelity. The simulation may represent the world as a plane, a sphere, an ellipsoid, or a high-order closed surface. The world may or may not rotate. The user may select lower fidelity models based on computational limits, a need for simplified analysis, or comparison to other data. However, the user will also wish to retain a close semblance of behavior to the real world. The effects of gravity on objects are an important component of modeling real-world behavior. Engineers generally equate the term gravity with the observed free-fall acceleration. However, free-fall acceleration is not the same for all observers. To observers on the surface of a rotating world, free-fall acceleration is the sum of gravitational attraction and the centrifugal acceleration due to the world's rotation. On the other hand, free-fall acceleration equals gravitational attraction for an observer in inertial space. Surface-observed simulations (e.g. aircraft), which use non-rotating world models, may choose to model observed free-fall acceleration as the gravity term; such a model actually combines gravitational attraction with centrifugal acceleration due to the Earth's rotation. However, this modeling choice invites confusion as one evolves the simulation to higher fidelity world models or adds inertial observers. Care must be taken to model gravity in concert with the world model to avoid degrading the fidelity of modeling observed free fall. The paper will go into greater depth on gravity modeling and the physical disparities and synergies that arise when coupling specific gravity models with world models.
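
    The distinction between gravitational attraction and observed free-fall acceleration can be made concrete with a short calculation for a spherical, uniformly rotating Earth; the constants are standard textbook values and this is only an illustration, not the paper's models.

```python
# Worked example of the point that observed free-fall acceleration differs
# from gravitational attraction on a rotating world.  A spherical, uniformly
# rotating Earth is assumed (illustrative values only).
import math

GM = 3.986004418e14        # m^3/s^2, Earth's gravitational parameter
R  = 6.371e6               # m, mean Earth radius
omega = 7.2921159e-5       # rad/s, Earth's rotation rate

def free_fall_accel(lat_deg):
    """Magnitude of free fall seen by a surface observer at geocentric latitude."""
    lat = math.radians(lat_deg)
    g_grav = GM / R**2                         # pure gravitational attraction
    a_cf = omega**2 * R * math.cos(lat)        # centrifugal acceleration, away from axis
    g_vert = g_grav - a_cf * math.cos(lat)     # component along the local vertical
    g_horiz = a_cf * math.sin(lat)             # small horizontal (equatorward) tilt
    return math.hypot(g_vert, g_horiz)

for lat in (0, 45, 90):
    print(f"latitude {lat:2d} deg: observed free fall = {free_fall_accel(lat):.4f} m/s^2")
```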

  8. Shuttle measured contaminant environment and modeling for payloads. Preliminary assessment of the space telescope environment in the shuttle bay

    NASA Technical Reports Server (NTRS)

    Scialdone, J. J.

    1983-01-01

    A baseline gaseous and particulate environment of the Shuttle bay was developed based on the various measurements made during the first four flights of the Shuttle. The environment is described by the time-dependent pressure, density, scattered molecular fluxes and column densities, including the transient effects of water dumps, engine firings, and opening and closing of the bay doors. The particulate conditions in the ambient environment and on surfaces were predicted as a function of mission time based on the available data. This basic Shuttle environment, when combined with the outgassing and particulate contributions of the payloads, can provide a description of the environment of a payload in the Shuttle bay. As an example of this application, the environment of the Space Telescope in the bay, which may be representative of the environment of several payloads, was derived. Among the many findings obtained in the process of modeling the environment, one is that the payload's environment in the bay is not substantially different or more objectionable than the self-generated environment of a large payload or spacecraft. It is, however, more severe during ground facility operations, during the first 15 to 20 hours of the flight, during and for a short period after water is dumped overboard, and while the reaction control engines are being fired.

  9. Development, validation and application of numerical space environment models

    NASA Astrophysics Data System (ADS)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction and four peer-reviewed articles, and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  10. Periglacial process and Pleistocene environment in northern China

    SciTech Connect

    Guo Xudong; Liu Dongsheng ); Yan Fuhua )

    1991-03-01

    At the present time, five kinds of periglacial phenomena have been defined: ice wedges, periglacial involutions, congelifolds, congeliturbations, and loess dunes. From the stratigraphical and geochronological data, the periglacial process is divided into six stages. (1) Guanting periglacial stage, characterized by the congeliturbative deposits that have developed in early Pleistocene Guanting loess-like formation. Paleomagnetic dating gives 2.43 Ma B.P. (2) Yanchi periglacial stage, characterized by the congelifold that has developed in middle Pleistocene Yanchi Lishi loess formation. Paleomagnetic dating gives 0.50 Ma B.P. (3) Zhaitang periglacial stage (II), characterized by the periglacial involutions that have developed in lower middle Pleistocene Lishi loess formation. Paleomagnetic dating gives 0.30 Ma B.P. (4) Zhaitang periglacial stage (I), characterized by the ice (soil) wedge that has developed in upper-middle Pleistocene Lishi loess formation. Paleomagnetic dating gives 0.20 Ma B.P. (5) Qiansangyu periglacial stage (II), characterized by the ice (sand) wedges that have developed in late Pleistocene Malan loess formation. Paleomagnetic dating gives 0.13 Ma B.P. (6) Qiansangyu periglacial stage (I), characterized by the ice (soil) wedge that has developed in late Pleistocene Malan loess-like formation. Thermoluminescent dating gives 0.018 Ma B.P. Spore-pollen composition analysis shows that the savannah steppe environment prevailed in northern China during Pleistocene periglacial periods. These fossilized periglacial phenomena indicate a rather arid and windy periglacial environment with a mean annual temperature estimated some 12-15°C colder than that of the present.

  11. Indoor environment modeling for interactive robot security application

    NASA Astrophysics Data System (ADS)

    Jo, Sangwoo; Shahab, Qonita M.; Kwon, Yong-Moo; Ahn, Sang Chul

    2006-10-01

    This paper presents our simple and easy-to-use method for obtaining a 3D textured model. To express reality, the 3D models and real scenes need to be integrated. Most other 3D modeling methods rely on two data acquisition devices: one for acquiring the 3D model and another for obtaining realistic textures. In such cases, the former device is typically a 2D laser range-finder and the latter a common camera. Our algorithm consists of building a measurement-based 2D metric map acquired by the laser range-finder, texture acquisition/stitching, and texture-mapping onto the corresponding 3D model. The algorithm is implemented with a laser sensor for obtaining the 2D/3D metric map and two cameras for gathering texture. Our geometric 3D model consists of planes that model the floor and walls. The geometry of the planes is extracted from the 2D metric map data. Textures for the floor and walls are generated from the images captured by two 1394 cameras with wide field-of-view angles. Image stitching and image cutting are used to generate textured images corresponding to the 3D model. The algorithm is applied to two cases: a corridor, and a four-walled space such as a room of a building. The generated 3D map model of the indoor environment is exported in VRML format and can be viewed in a web browser with a VRML plug-in. The proposed algorithm can be applied to a 3D model-based remote surveillance system through the WWW.
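
    The geometry step described above, extracting floor and wall planes from a 2D metric map, can be sketched as extruding 2D wall segments into vertical quads; the data structures below are hypothetical and do not reproduce the authors' code.

```python
# Minimal sketch of the geometry step: wall segments taken from a 2D metric
# map are extruded into vertical rectangles (quads), and the floor is a
# polygon at z = 0.  Room dimensions and wall height are hypothetical.

def extrude_wall(p0, p1, height=2.5):
    """Turn a 2D wall segment (x0,y0)->(x1,y1) into four 3D corner points."""
    (x0, y0), (x1, y1) = p0, p1
    return [(x0, y0, 0.0), (x1, y1, 0.0), (x1, y1, height), (x0, y0, height)]

# A rectangular room taken from the 2D metric map (metres).
corners = [(0, 0), (4, 0), (4, 3), (0, 3)]
walls = [extrude_wall(corners[i], corners[(i + 1) % 4]) for i in range(4)]
floor = [(x, y, 0.0) for x, y in corners]

for i, quad in enumerate(walls):
    print(f"wall {i}: {quad}")
print("floor:", floor)
```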

  12. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high-velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling of the pellet impact drilling process, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  13. Modeling Environment for Total Risk-2E

    EPA Science Inventory

    MENTOR-2E uses an integrated, mechanistically consistent source-to-dose-to-response modeling framework to quantify inhalation exposure and doses resulting from emergency events. It is an implementation of the MENTOR system that is focused towards modeling of the impacts of rele...

  14. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    forging effect of the shoulder. The energy balance at the boundary of the plastic region with the environment required that energy flow away from the boundary in both radial directions. One resolution to this problem may be to introduce a time dependency into the process model, allowing the energy flow to oscillate across this boundary. Finally, experimental measurements are needed to verify the concepts used here and to aid in improving the model.

  15. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  16. Collapse models and perceptual processes

    NASA Astrophysics Data System (ADS)

    Carlo Ghirardi, Gian; Romano, Raffaele

    2014-04-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make it plausible, by discussing a toy model in detail, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  17. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  18. THE RHIC/AGS ONLINE MODEL ENVIRONMENT: DESIGN AND OVERVIEW.

    SciTech Connect

    SATOGATA,T.; BROWN,K.; PILAT,F.; TAFTI,A.A.; TEPIKIAN,S.; VAN ZEIJTS,J.

    1999-03-29

    An integrated online modeling environment is currently under development for use by AGS and RHIC physicists and commissioners. This environment combines the modeling efforts of both groups in a CDEV [1] client-server design, providing access to expected machine optics and physics parameters based on live and design machine settings. An abstract modeling interface has been designed as a set of adapters [2] around core computational modeling engines such as MAD and UAL/Teapot++ [3]. This approach allows us to leverage existing survey, lattice, and magnet infrastructure, as well as easily incorporate new model engine developments. This paper describes the architecture of the RHIC/AGS modeling environment, including the application interface through CDEV and general tools for graphical interaction with the model using Tcl/Tk. Separate papers at this conference address the specifics of implementation and modeling experience for AGS and RHIC.
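
    The "adapters around core modeling engines" design can be sketched as a thin common interface that client code programs against, while engine-specific adapters hide the details; the class and method names below are hypothetical and do not reproduce the actual CDEV/MAD/UAL interfaces.

```python
# Hedged sketch of the adapter idea described in the record: one abstract
# interface used by clients, with engine-specific adapters behind it.
# Names are hypothetical; engine calls are stubbed.
from abc import ABC, abstractmethod

class OpticsModel(ABC):
    """What online-model clients are allowed to ask for."""
    @abstractmethod
    def twiss(self, lattice_name: str) -> dict: ...

class MadAdapter(OpticsModel):
    def twiss(self, lattice_name: str) -> dict:
        # Would drive a MAD run and parse its output; stubbed here.
        return {"engine": "MAD", "lattice": lattice_name, "beta_x": 10.0}

class TeapotAdapter(OpticsModel):
    def twiss(self, lattice_name: str) -> dict:
        # Would call into UAL/Teapot++; stubbed here.
        return {"engine": "UAL/Teapot++", "lattice": lattice_name, "beta_x": 10.2}

def report(model: OpticsModel, lattice: str) -> None:
    print(model.twiss(lattice))        # client code never sees which engine ran

for engine in (MadAdapter(), TeapotAdapter()):
    report(engine, "RHIC injection")
```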

  19. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    SciTech Connect

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next–Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  20. Modeling Environment for Total Risk-1A

    EPA Science Inventory

    MENTOR-1A uses an integrated, mechanistically consistent source-to-dose modeling framework to quantify inhalation exposure and dose for individuals and/or populations due to co-occurring air pollutants. It uses the "One Atmosphere" concept to characterize simultaneous exposures t...

  1. Modeling Environment for Total Risk-4M

    EPA Science Inventory

    MENTOR-4M uses an integrated, mechanistically consistent, source-to-dose modeling framework to quantify simultaneous exposures and doses of individuals and populations to multiple contaminants. It is an implementation of the MENTOR system for exposures to Multiple contaminants fr...

  2. A new security model for collaborative environments

    SciTech Connect

    Agarwal, Deborah; Lorch, Markus; Thompson, Mary; Perry, Marcia

    2003-06-06

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity, and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans naturally build trust, the model can support a wide variety of usage scenarios. Its particular strength lies in the support for ad-hoc and dynamic collaborations and the ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.
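
    A minimal sketch of the incremental trust-through-endorsements idea follows; the scoring rule, weights and authentication-strength values are hypothetical and are not taken from the paper.

```python
# Minimal sketch of incremental trust building through endorsements, in the
# spirit of the record above.  The scoring rule and weights are invented.
class Collaborator:
    def __init__(self, name, auth_strength=0.3):
        self.name = name
        self.auth_strength = auth_strength   # e.g. password ~0.3, PKI certificate ~0.9
        self.endorsements = []               # (endorser_name, endorser_trust) pairs

    def trust(self):
        """Trust grows with endorsements from already-trusted collaborators."""
        endorsed = sum(0.2 * t for _, t in self.endorsements)
        return min(1.0, 0.1 + 0.4 * self.auth_strength + endorsed)

alice = Collaborator("alice", auth_strength=0.9)
alice.endorsements += [("founder1", 1.0), ("founder2", 1.0)]
newcomer = Collaborator("bob", auth_strength=0.3)     # very little initial trust
print("alice:", round(alice.trust(), 2))
newcomer.endorsements.append(("alice", alice.trust()))
print("bob after alice's endorsement:", round(newcomer.trust(), 2))
```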

  3. A model of the energetic ion environment of Mars

    NASA Technical Reports Server (NTRS)

    Luhmann, J. G.; Schwingenschuh, K.

    1990-01-01

    Because Mars has a weak intrinsic magnetic field and a substantial atmosphere, instruments on orbiting spacecraft should detect a population of energetic heavy planetary ions which result from comet-like ion pickup in the solar wind and magnetosheath convection electric fields, in addition to those that might result from processes internal to a Martian 'magnetosphere.' Although this ion exosphere has been previously discussed in the literature, detailed predictions that might be directly applied to the interpretation of data are not available. Here a test particle model is used to construct a global picture of Martian pickup ions in the Mars environment. The model makes use of the recent Nagy and Cravens (1988) model of the Martian exosphere and Spreiter and Stahara's (1980) gas dynamic model of the magnetosheath. The pickup of ions originating at Phobos is also considered. Notable properties of the resulting ion distributions include their near-monoenergetic spectra, pancake pitch angle distributions, and large gyroradii compared to the planetary scale.
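
    The remark that pickup-ion gyroradii are large compared with the planetary scale can be checked with a one-line estimate, r_g = m·v/(q·B), using typical (assumed) solar-wind values rather than numbers from the paper.

```python
# Back-of-the-envelope check that pickup-ion gyroradii are large compared
# with Mars.  The solar-wind speed and field strength are typical assumed
# values, not taken from the paper.
m_p, q_e = 1.672e-27, 1.602e-19        # proton mass (kg), elementary charge (C)

def gyroradius_km(mass_amu, speed_km_s, b_nT, charge=1):
    """Gyroradius r_g = m*v / (q*B), returned in kilometres."""
    return mass_amu*m_p * speed_km_s*1e3 / (charge*q_e * b_nT*1e-9) / 1e3

r_O = gyroradius_km(16, 400, 3)        # O+ picked up at solar-wind speed in a 3 nT field
print(f"O+ pickup gyroradius ~ {r_O:,.0f} km vs Mars radius ~ 3,390 km")
```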

  4. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources, and their activities, play an important role that, in practice, is not usually taken into account by Internet resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure allows users to locate pages of interest, which are the goals of their informational search, more quickly. The paper presents a model that analyzes the behavior of the user audience in order to determine their goals in a particular hypertext segment, and finds optimal routes for reaching those goals in terms of route length and informational value. A potential application of the proposed model is mainly in systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.
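
    The idea of scoring routes by both length and informational value can be sketched with a small weighted shortest-path example; the graph, the informational values and the cost function are hypothetical, not the paper's model.

```python
# Illustrative sketch of finding a "best" route through a hypertext graph by
# weighting each link with 1 minus an (assumed) informational value of the
# target page, so short, informative routes win.  All data are hypothetical.
import heapq

links = {                       # page -> list of linked pages
    "home": ["news", "products"],
    "news": ["archive"],
    "products": ["specs", "archive"],
    "specs": ["archive"],
    "archive": [],
}
info_value = {"home": 0.1, "news": 0.3, "products": 0.5, "specs": 0.9, "archive": 0.2}

def best_route(start, goal):
    """Dijkstra with edge cost = 1 - info_value(target page)."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, page, path = heapq.heappop(queue)
        if page == goal:
            return cost, path
        if page in seen:
            continue
        seen.add(page)
        for nxt in links[page]:
            heapq.heappush(queue, (cost + 1.0 - info_value[nxt], nxt, path + [nxt]))
    return None

print(best_route("home", "archive"))
```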

  5. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for the processing of spatial data which integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize research and the representation of results at a new technological level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatially distributed information, for which we suggest implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  6. Mars environment and magnetic orbiter model payload

    NASA Astrophysics Data System (ADS)

    Langlais, B.; Leblanc, F.; Fouchet, T.; Barabash, S.; Breuer, D.; Chassefière, E.; Coates, A.; Dehant, V.; Forget, F.; Lammer, H.; Lewis, S.; Lopez-Valverde, M.; Mandea, M.; Menvielle, M.; Pais, A.; Paetzold, M.; Read, P.; Sotin, C.; Tarits, P.; Vennerstrom, S.; Branduardi-Raymont, G.; Cremonese, G.; Merayo, J. G. M.; Ott, T.; Rème, H.; Trotignon, J. G.; Walhund, J. E.

    2009-03-01

    Mars Environment and Magnetic Orbiter (MEMO) was proposed in answer to the Cosmic Vision Call of Opportunity as an M-class mission. The MEMO mission is designed to study the strong interconnections between the planetary interior, the atmosphere and the solar conditions, which are essential to understanding planetary evolution, the appearance of life and its sustainability. MEMO provides a high-resolution, complete mapping of the magnetic field (below an altitude of about 250 km), with a full global coverage not yet achieved. This is combined with in situ characterization of the upper atmosphere and remote sensing of the middle and lower atmospheres, with an unmatched accuracy. These measurements are complemented by improved detection of the gravity field signatures associated with the carbon dioxide cycle and with tidal deformation. In addition, the solar wind, solar EUV/UV and energetic particle fluxes are simultaneously and continuously monitored. The challenging scientific objectives of the MEMO mission proposal are fulfilled with the appropriate scientific instruments and orbit strategy. MEMO is composed of a main platform, placed on an elliptical (130 × 1,000 km), non-polar (77° inclination) orbit, and of an independent micro-satellite with a higher apoapsis (10,000 km) and a low periapsis (300 km). These orbital parameters are designed so that the scientific return of MEMO is maximized in terms of measurement altitude, local time, season and geographical coverage. MEMO carries several suites of instruments, made up of an ‘exospheric-upper atmosphere’ package, a ‘magnetic field’ package, and a ‘low-middle atmosphere’ package. The nominal mission duration is one Martian year.

  7. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
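
    The family of algorithms referred to above can be illustrated with a minimal predictor-corrector sketch (a two-step Adams-Bashforth predictor with a trapezoidal corrector) applied to a scalar test equation; the turbofan engine model itself is not reproduced.

```python
# Minimal predictor-corrector sketch (AB2 predictor, trapezoidal corrector)
# on the scalar test equation y' = -2*y.  Illustrative only.
import math

def f(t, y):
    """Scalar test equation y' = -2*y (exact solution: exp(-2*t))."""
    return -2.0 * y

def pc_integrate(y0, dt, n_steps):
    # Bootstrap the two-step method with a single explicit Euler step.
    t, y = 0.0, y0
    f_prev = f(t, y)
    y1 = y + dt * f_prev
    history = [(t, y), (t + dt, y1)]
    t, y = t + dt, y1
    f_curr = f(t, y)
    for _ in range(n_steps - 1):
        y_pred = y + dt * (1.5 * f_curr - 0.5 * f_prev)   # Adams-Bashforth-2 predictor
        f_pred = f(t + dt, y_pred)
        y = y + 0.5 * dt * (f_curr + f_pred)              # trapezoidal corrector
        t += dt
        f_prev, f_curr = f_curr, f(t, y)
        history.append((t, y))
    return history

for t, y in pc_integrate(1.0, 0.1, 10)[-3:]:
    print(f"t={t:.1f}  predictor-corrector={y:.5f}  exact={math.exp(-2*t):.5f}")
```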

  8. Causal Model Progressions as a Foundation for Intelligent Learning Environments.

    ERIC Educational Resources Information Center

    White, Barbara Y.; Frederiksen, John R.

    This paper describes the theoretical underpinnings and architecture of a new type of learning environment that incorporates features of microworlds and of intelligent tutoring systems. The environment is based on a progression of increasingly sophisticated causal models that simulate domain phenomena, generate explanations, and serve as student…

  9. The Quality of Home Environment in Brazil: An Ecological Model

    ERIC Educational Resources Information Center

    de Oliveira, Ebenezer A.; Barros, Fernando C.; Anselmi, Luciana D. da Silva; Piccinini, Cesar A.

    2006-01-01

    Based on Bronfenbrenner's (1999) ecological perspective, a longitudinal, prospective model of individual differences in the quality of home environment (Home Observation for Measurement of the Environment--HOME) was tested in a sample of 179 Brazilian children and their families. Perinatal measures of family socioeconomic status (SES) and child…

  10. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  11. Exascale Co-design for Modeling Materials in Extreme Environments

    SciTech Connect

    Germann, Timothy C.

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  12. LIGHT-INDUCED PROCESSES AFFECTING ENTEROCOCCI IN AQUATIC ENVIRONMENTS

    EPA Science Inventory

    Fecal indicator bacteria such as enterococci have been used to assess contamination of freshwater and marine environments by pathogenic microorganisms. Various past studies have shown that sunlight plays an important role in reducing concentrations of culturable enterococci and ...

  13. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers greater possibilities for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data are captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.
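
    The damage accumulation assessment mentioned here can be illustrated, under simple assumptions, as linear (Miner's-rule) accumulation against a field- and temperature-accelerated time-to-failure. The thermochemical E-model form and all constants below are purely illustrative and are not taken from the paper.

```python
import math

# Illustrative constants (NOT from the paper): thermochemical E-model
# TF(E, T) = A * exp(-gamma * E) * exp(Ea / (k * T))
K_BOLTZMANN = 8.617e-5   # eV/K
A_PREFACTOR = 1.0e11     # hours, hypothetical scale factor
GAMMA = 1.1              # cm/MV, hypothetical field-acceleration factor
EA = 0.75                # eV, hypothetical activation energy

def time_to_failure(e_ox_mv_cm: float, temp_k: float) -> float:
    """Hypothetical gate-oxide time-to-failure (hours) at a fixed stress level."""
    return A_PREFACTOR * math.exp(-GAMMA * e_ox_mv_cm) * math.exp(EA / (K_BOLTZMANN * temp_k))

def accumulated_damage(mission_profile):
    """Miner's-rule damage: sum of (time spent / time-to-failure) over stress phases.
    A damage fraction of 1.0 or more indicates predicted end of life."""
    return sum(hours / time_to_failure(e_ox, temp) for hours, e_ox, temp in mission_profile)

# Hypothetical mission profile: (hours, oxide field in MV/cm, junction temperature in K)
profile = [(5000, 4.0, 400.0), (2000, 4.5, 450.0), (500, 5.0, 500.0)]
print(f"accumulated damage fraction: {accumulated_damage(profile):.3e}")
```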

  14. Sensitivity of UO2 Stability in a Reducing Environment on Radiolysis Model Parameters

    SciTech Connect

    Wittman, Richard S.; Buck, Edgar C.

    2012-09-01

    Results for a radiolysis model sensitivity study of radiolytically produced H2O2 are presented as they relate to Spent (or Used) Light Water Reactor uranium oxide (UO2) nuclear fuel (UNF) oxidation in a low oxygen environment. The model builds on previous reaction kinetic studies to represent the radiolytic processes occurring at the nuclear fuel surface. Hydrogen peroxide (H2O2) is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment.

  15. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society, and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near-optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time. The program has also been combined with ''neural network'' programs to enable easy scanning of a wide range of furnace operations. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the ''Cupola Handbook'', Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  16. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
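
    The pair-wise collision probability evaluation by "random sampling in time" can be illustrated with a cube-binning, kinetic-theory style estimate: at randomly sampled epochs, objects sharing a spatial cell contribute a collision rate proportional to relative velocity and combined cross-section. The sketch below is a hypothetical simplification, not the LEGEND implementation; the propagate function, object fields, and cube size are placeholders.

```python
import math
import random
from collections import defaultdict

def collision_probability(objects, propagate, t_start, t_end, n_samples=100,
                          cube_size_km=10.0):
    """Accumulate pair-wise collision probabilities at randomly sampled times.

    objects   : list of dicts with hypothetical fields 'id' and 'radius_km'
    propagate : user-supplied function (obj, t) -> (x, y, z, vx, vy, vz) in km, km/s
    """
    dt = (t_end - t_start) / n_samples            # time represented by each sample
    cube_volume = cube_size_km ** 3
    prob = defaultdict(float)                     # (id_i, id_j) -> accumulated probability

    for _ in range(n_samples):
        t = random.uniform(t_start, t_end)        # random sampling in time
        cubes = defaultdict(list)
        for obj in objects:
            x, y, z, vx, vy, vz = propagate(obj, t)
            key = (int(x // cube_size_km), int(y // cube_size_km), int(z // cube_size_km))
            cubes[key].append((obj, (vx, vy, vz)))

        # Only objects sharing a cube contribute at this sampled epoch
        for members in cubes.values():
            for i in range(len(members)):
                for j in range(i + 1, len(members)):
                    (oi, vi), (oj, vj) = members[i], members[j]
                    v_rel = sum((a - b) ** 2 for a, b in zip(vi, vj)) ** 0.5
                    sigma = math.pi * (oi['radius_km'] + oj['radius_km']) ** 2  # cross-section
                    rate = v_rel * sigma / cube_volume   # kinetic-theory pair rate
                    prob[(oi['id'], oj['id'])] += rate * dt
    return prob
```

    Sampling epochs at random, rather than on a fixed grid, is what lets such an estimate track a rapidly changing debris environment without biasing the result toward any particular orbital geometry.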

  17. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  18. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings, conferences, etc., have centred around the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a ‘process-based’ standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which in themselves could have quite distinct modelling approaches and could also be based on different data sets. In essence, a process-based standard approach overcomes these issues by allowing there to be more than one model rather than necessarily a single standard model; however, any such model has to be completely transparent, in that the data set and the modelling techniques used not only have to be clearly and unambiguously defined but must also be subject to peer review. If the model meets all of these requirements then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; however, most importantly it will allow something which so far has been impossible without ambiguities and disagreement, and that is a comparison of the results of the various models. To date one of the problems (if not the major one) in comparing the results of the various different SEPE

  19. A cellular automaton model for tumor growth in heterogeneous environment

    NASA Astrophysics Data System (ADS)

    Jiao, Yang; Torquato, Sal

    2011-03-01

    Cancer is not a single disease: it exhibits heterogeneity on different spatial and temporal scales and strongly interacts with its host environment. Most mathematical modeling of malignant tumor growth has assumed a homogeneous host environment. We have developed a cellular automaton model for tumor growth that explicitly incorporates the structural heterogeneity of the host environment such as tumor stroma. We show that these structural heterogeneities have non-trivial effects on the tumor growth dynamics and prognosis. Y. J. is supported by PSOC, NCI.
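
    The record describes a lattice cellular automaton whose growth rule is modulated by a heterogeneous host environment. A minimal sketch under simple assumptions (von Neumann neighbourhood, a random "stroma" obstacle map, a fixed division probability) is given below; it is an illustration of the technique, not the authors' rule set.

```python
import random

def grow_tumor(size=50, steps=100, p_divide=0.4, stroma_fraction=0.3, seed=1):
    """Minimal 2D cellular automaton: 0 = empty, 1 = tumor cell, 2 = stroma obstacle."""
    random.seed(seed)
    grid = [[2 if random.random() < stroma_fraction else 0 for _ in range(size)]
            for _ in range(size)]
    grid[size // 2][size // 2] = 1                       # single initial tumor cell

    for _ in range(steps):
        tumor = [(i, j) for i in range(size) for j in range(size) if grid[i][j] == 1]
        random.shuffle(tumor)                            # avoid directional bias
        for i, j in tumor:
            if random.random() > p_divide:
                continue
            empty = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= i + di < size and 0 <= j + dj < size
                     and grid[i + di][j + dj] == 0]
            if empty:                                    # divide into a random free neighbour
                ni, nj = random.choice(empty)
                grid[ni][nj] = 1
    return grid

grid = grow_tumor()
print("tumor cells after growth:", sum(row.count(1) for row in grid))
```

    Because the stroma sites never become available for division, the obstacle map alone is enough to change both the growth rate and the final tumor shape, which is the qualitative effect the abstract emphasises.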

  20. Use of terrestrial laser scanning (TLS) for monitoring and modelling of geomorphic processes and phenomena at a small and medium spatial scale in Polar environment (Scott River — Spitsbergen)

    NASA Astrophysics Data System (ADS)

    Kociuba, Waldemar; Kubisz, Waldemar; Zagórski, Piotr

    2014-05-01

    The application of Terrestrial Laser Scanning (TLS) for precise modelling of land relief and quantitative estimation of spatial and temporal transformations can contribute to better understanding of catchment-forming processes. Experimental field measurements utilising the 3D laser scanning technology were carried out within the Scott River catchment located in the NW part of the Wedel Jarlsberg Land (Spitsbergen). The measurements concerned the glacier-free part of the Scott River valley floor with a length of 3.5 km and width from 0.3 to 1.5 km and were conducted with a state-of-the-art medium-range stationary laser scanner, a Leica Scan Station C10. A complex set of measurements of the valley floor were carried out from 86 measurement sites interrelated by the application of 82 common 'target points'. During scanning, from 5 to 19 million measurements were performed at each of the sites, and a point-cloud constituting a 'model space' was obtained. By merging individual 'model spaces', a Digital Surface Model (DSM) of the Scott River valley was obtained, with a co-registration error not exceeding ± 9 mm. The accuracy of the model permitted precise measurements of dimensions of landforms of varied scales on the main valley floor and slopes and in selected sub-catchments. The analyses verified the efficiency of the measurement system in Polar meteorological conditions of Spitsbergen in mid-summer.

  1. Integrated approaches to the application of advanced modeling technology in process development and optimization

    SciTech Connect

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  2. Analog modelling of obduction processes

    NASA Astrophysics Data System (ADS)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics' oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  3. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  4. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2007-01-01

    This report presents a pilot study of an integration of particle swarm algorithm, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups as a whole to achieve efficient collective searching behavior in the adaptive environment.
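
    The collective searching behaviour described here builds on the particle swarm metaphor. A minimal sketch of a standard particle swarm update tracking a drifting target (standing in for the "adaptive environment") is shown below; the parameters, fitness, and drift rule are illustrative and are not the report's agent model.

```python
import random

def dist2(a, b):
    """Squared Euclidean distance (lower is better as a fitness)."""
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def particle_swarm_search(n_particles=30, steps=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal 2D particle swarm tracking a drifting target."""
    random.seed(seed)
    target = [50.0, 50.0]
    pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                             # personal best positions
    gbest = min(pbest, key=lambda p: dist2(p, target))[:]   # group (social) best

    for _ in range(steps):
        # The environment adapts: the target drifts a little every step
        target[0] += random.uniform(-1.0, 1.0)
        target[1] += random.uniform(-1.0, 1.0)
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            if dist2(pos[i], target) < dist2(pbest[i], target):
                pbest[i] = pos[i][:]
            if dist2(pbest[i], target) < dist2(gbest, target):
                gbest = pbest[i][:]
    return gbest, target

best, target = particle_swarm_search()
print("best position found:", [round(v, 1) for v in best],
      "current target:", [round(v, 1) for v in target])
```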

  5. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2008-01-01

    This report presents a pilot study of an integration of particle swarm algorithm, social knowledge adaptation and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups as a whole to achieve efficient collective searching behavior in the adaptive environment.

  6. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  7. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  8. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and of a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  9. A journey to statistical process control in the development environment

    SciTech Connect

    Hanna, M.; Langston, D.

    1996-12-31

    Over the past 10 years many organizations have undertaken ''process reengineering'' activities in an attempt to increase their productivity and quality. Unfortunately, the launching point for these reengineering efforts has been based upon the belief that organizational processes either do not exist or are grossly inefficient. It is the position of the authors that these beliefs are typically unfounded. All ongoing organizations have processes. These processes are effective, based upon the fact that they are producing products (or services) that are being purchased. Therefore, the issue is not to invent or reengineer new processes; rather, it is to increase the efficiency of the existing ones. This paper outlines a process (or organizational journey) for continually improving processes based upon quantitative management techniques and statistical process control methods.

  10. Supporting Inquiry Processes with an Interactive Learning Environment: Inquiry Island

    ERIC Educational Resources Information Center

    Eslinger, Eric; White, Barbara; Frederiksen, John; Brobst, Joseph

    2008-01-01

    This research addresses the effectiveness of an interactive learning environment, Inquiry Island, as a general-purpose framework for the design of inquiry-based science curricula. We introduce the software as a scaffold designed to support the creation and assessment of inquiry projects, and describe its use in a middle-school genetics unit.…

  11. NoteCards: A Multimedia Idea Processing Environment.

    ERIC Educational Resources Information Center

    Halasz, Frank G.

    1986-01-01

    Notecards is a computer environment designed to help people work with ideas by providing a set of tools for a variety of specific activities, which can range from sketching on the back of an envelope to formally representing knowledge. The basic framework of this hypermedia system is a semantic network of electronic notecards connected by…

  12. Active microrheology of a model of the nuclear micromechanical environment

    NASA Astrophysics Data System (ADS)

    Byrd, Henry; Kilfoil, Maria

    2014-03-01

    In order to successfully complete the final stages of chromosome segregation, eukaryotic cells require the motor enzyme topoisomerase II, which can resolve topological constraints between entangled strands of duplex DNA. We created an in vitro model that closely approximates the nuclear micromechanical environment in terms of DNA mass and entanglement density, and investigated the influence of this motor enzyme on the DNA mechanics. Topoisomerase II is a non-processive ATPase which we found significantly increases the motions of embedded microspheres in the DNA network. Because of this activity, we study the mechanical properties of our model system by active microrheology using optical trapping. We test the limits of the fluctuation-dissipation theorem (FDT) under this type of activity by comparing the active microrheology to passive measurements, where thermal motion alone drives the beads. We can relate any departure from FDT to the timescale of topoisomerase II activity in the DNA network. These experiments provide insight into the physical necessity of this motor enzyme in the cell.
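
    A standard way to phrase the active-versus-passive comparison (not necessarily the authors' exact formulation) is through the equilibrium fluctuation-dissipation theorem, which links the passively measured position power spectrum to the dissipative part of the response function obtained from active optical-trap driving:

```latex
% Equilibrium FDT linking passive fluctuations to the actively measured response
\[
  C(\omega) \;=\; \frac{2 k_B T}{\omega}\,\chi''(\omega),
\]
% where C(\omega) is the power spectral density of bead displacement from passive
% tracking and \chi''(\omega) is the imaginary part of the response function from
% active driving. A convenient measure of departure is an effective energy,
\[
  E_{\mathrm{eff}}(\omega) \;=\; \frac{\omega\, C(\omega)}{2\,\chi''(\omega)},
\]
% which equals k_B T at equilibrium; the frequency range over which it deviates can
% be compared with the timescale of topoisomerase II activity in the network.
```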

  13. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in

  14. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal

    2001-04-16

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required

  15. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  16. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools, are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this attempt.

  17. Supporting Inquiry Processes with an Interactive Learning Environment: Inquiry Island

    NASA Astrophysics Data System (ADS)

    Eslinger, Eric; White, Barbara; Frederiksen, John; Brobst, Joseph

    2008-12-01

    This research addresses the effectiveness of an interactive learning environment, Inquiry Island, as a general-purpose framework for the design of inquiry-based science curricula. We introduce the software as a scaffold designed to support the creation and assessment of inquiry projects, and describe its use in a middle-school genetics unit. Students in the intervention showed significant gains in inquiry skills. We also illustrate the power of the software to gather and analyze qualitative data about student learning.

  18. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  19. Health, Supportive Environments, and the Reasonable Person Model

    PubMed Central

    Kaplan, Stephen; Kaplan, Rachel

    2003-01-01

    The Reasonable Person Model is a conceptual framework that links environmental factors with human behavior. People are more reasonable, cooperative, helpful, and satisfied when the environment supports their basic informational needs. The same environmental supports are important factors in enhancing human health. We use this framework to identify the informational requirements common to various health-promoting factors that are realizable through well-designed physical environments. Environmental attractors, support of way-finding, and facilitation of social interaction all contribute to the health-relevant themes of community, crime, and mode of transportation. In addition, the nearby natural environment, although often neglected, can serve as a remarkably effective resource. PMID:12948967

  20. Inquiry, play, and problem solving in a process learning environment

    NASA Astrophysics Data System (ADS)

    Thwaits, Anne Y.

    What is the nature of art/science collaborations in museums? How do art objects and activities contribute to the successes of science centers? Based on the premise that art exhibitions and art-based activities engage museum visitors in different ways than do strictly factual, information-based displays, I address these questions in a case study that examines the roles of visual art and artists in the Exploratorium, a museum that has influenced exhibit design and professional practice in many of the hands-on science centers in the United States and around the world. The marriage of art and science in education is not a new idea---Leonardo da Vinci and other early polymaths surely understood how their various endeavors informed one another, and some 20th century educators understood the value of the arts and creativity in the learning and practice of other disciplines. When, in 2010, the National Science Teachers Association added an A to the federal government's ubiquitous STEM initiative and turned it into STEAM, art educators nationwide took notice. With a heightened interest in the integration of and collaboration between disciplines comes an increased need for models of best practice for educators and educational institutions. With the intention of understanding the nature of such collaborations and the potential they hold, I undertook this study. I made three site visits to the Exploratorium, where I took photos, recorded notes in a journal, interacted with exhibits, and observed museum visitors. I collected other data by examining the institution's website, press releases, annual reports, and fact sheets; and by reading popular and scholarly articles written by museum staff members and by independent journalists. I quickly realized that the Exploratorium was not created in the way that most museums are, and the history of its founding and the ideals of its founder illuminate what was then and continues now to be different about this museum from most others in the

  1. Charged Particle Environment Definition for NGST: Model Development

    NASA Technical Reports Server (NTRS)

    Blackwell, William C.; Minow, Joseph I.; Evans, Steven W.; Hardage, Donna M.; Suggs, Robert M.

    2000-01-01

    NGST will operate in a halo orbit about the L2 point, 1.5 million km from the Earth, where the spacecraft will periodically travel through the magnetotail region. There are a number of tools available to calculate the high energy, ionizing radiation particle environment from galactic cosmic rays and from solar disturbances. However, space environment tools are not generally available to provide assessments of charged particle environment and its variations in the solar wind, magnetosheath, and magnetotail at L2 distances. An engineering-level phenomenology code (LRAD) was therefore developed to facilitate the definition of charged particle environments in the vicinity of the L2 point in support of the NGST program. LRAD contains models tied to satellite measurement data of the solar wind and magnetotail regions. The model provides particle flux and fluence calculations necessary to predict spacecraft charging conditions and the degradation of materials used in the construction of NGST. This paper describes the LRAD environment models for the deep magnetotail (XGSE < -100 Re) and solar wind, and presents predictions of the charged particle environment for NGST.

  2. Modeling and Performance Simulation of the Mass Storage Network Environment

    NASA Technical Reports Server (NTRS)

    Kim, Chan M.; Sang, Janche

    2000-01-01

    This paper describes the application of modeling and simulation in evaluating and predicting the performance of the mass storage network environment. Network traffic is generated to mimic the realistic pattern of file transfer, electronic mail, and web browsing. The behavior and performance of the mass storage network and a typical client-server Local Area Network (LAN) are investigated by modeling and simulation. Performance characteristics in throughput and delay demonstrate the important role of modeling and simulation in network engineering and capacity planning.

  3. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  4. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
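
    As an illustration of the kind of first-order Markov birth/death dynamics this record describes, here is a minimal Gillespie-style simulation; the per-capita rates are illustrative and are not parameters from the NASA report.

```python
import random

def simulate_birth_death(n0=50, birth_rate=0.10, death_rate=0.12, t_max=100.0, seed=2):
    """Gillespie simulation of a first-order (Markov) linear birth-death process.

    Each individual gives birth at rate `birth_rate` and dies at rate `death_rate`
    per unit time, so the total event rate scales with the current population size.
    """
    random.seed(seed)
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while t < t_max and n > 0:
        total_rate = n * (birth_rate + death_rate)
        t += random.expovariate(total_rate)              # exponential waiting time
        if random.random() < birth_rate / (birth_rate + death_rate):
            n += 1                                       # birth event
        else:
            n -= 1                                       # death event
        trajectory.append((t, n))
    return trajectory

traj = simulate_birth_death()
print(f"events simulated: {len(traj) - 1}, final population: {traj[-1][1]} at t = {traj[-1][0]:.1f}")
```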

  5. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures by the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling character's perceptions, modeling character's decision making, modeling character's movements, modeling character's interaction with the environment and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence; the accurate modeling of human vision, smell, touch and hearing; and the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed. PMID:15850116

  6. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article presents research on the differences between business process modeling techniques. For each technique, the definition and the structure are explained. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modeling techniques that are easy to use and serve as the basis for evaluating further modelling techniques.

  7. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:26353243
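
    The core marginalization the abstract relies on can be written, for a single count, as a gamma-mixed Poisson yielding a negative binomial. This is a standard identity; the notation below is generic rather than the paper's.

```latex
% Gamma-Poisson mixture marginalizing to a negative binomial count
% (single-count version of the gamma process / Poisson process construction)
\[
  \lambda \sim \mathrm{Gamma}\!\left(r,\ \tfrac{p}{1-p}\right), \qquad
  n \mid \lambda \sim \mathrm{Poisson}(\lambda)
  \;\;\Longrightarrow\;\;
  n \sim \mathrm{NB}(r, p),
\]
\[
  \Pr(n) \;=\; \frac{\Gamma(n+r)}{n!\,\Gamma(r)}\,(1-p)^{r}\, p^{\,n},
  \qquad n = 0, 1, 2, \ldots
\]
```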

  8. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2013-10-17

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions, and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters. PMID:24144977

  9. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  10. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  11. Large urban fire environment. Trends and model city predictions

    SciTech Connect

    Larson, D.A.; Small, R.D.

    1982-01-01

    The urban fire environment that would result from a megaton-yield nuclear weapon burst is considered. The dependence of temperatures and velocities on fire size, burning intensity, turbulence, and radiation is explored, and specific calculations for three model urban areas are presented. In all cases, high velocity fire winds are predicted. The model-city results show the influence of building density and urban sprawl on the fire environment. Additional calculations consider large-area fires with the burning intensity reduced in a blast-damaged urban center.

  12. Chromate reduction and retention processes within arid subsurface environments.

    PubMed

    Ginder-Vogel, Matthew; Borch, Thomas; Mayes, Melanie A; Jardine, Phillip M; Fendorf, Scott

    2005-10-15

    Chromate is a widespread contaminant that has deleterious impacts on human health, the mobility and toxicity of which are diminished by reduction to Cr(III). While biological and chemical reduction reactions of Cr(VI) are well resolved, reduction within natural sediments, particularly of arid environments, remains poorly described. Here, we examine chromate reduction within arid sediments from the Hanford, WA site, where Fe(III) (hydr)oxide and carbonate coatings limit mineral reactivity. Chromium(VI) reduction by Hanford sediments is negligible unless pretreated with acid; acidic pretreatment of packed mineral beds having a Cr(VI) feed solution results in Cr(III) associating with the minerals antigorite and lizardite in addition to magnetite and Fe(II)-bearing clay minerals. Highly alkaline conditions (pH > 14), representative of conditions near high-level nuclear waste tanks, result in Fe(II) dissolution and concurrent Cr(VI) reduction. Additionally, Cr(III) and Cr(VI) are found associated with portlandite, suggesting a secondary mechanism for chromium retention at high pH. Thus, mineral reactivity is limited within this arid environment and appreciable reduction of Cr(VI) is restricted to highly alkaline conditions resulting near leaking radioactive waste disposal tanks. PMID:16295844

  13. Chromate Reduction and Retention Processes within Arid Subsurface Environments

    SciTech Connect

    Ginder-Vogel,M.; Borch, T.; Mayes, M.; Jardine, P.; Fendorf, S.

    2005-01-01

    Chromate is a widespread contaminant that has deleterious impacts on human health, the mobility and toxicity of which are diminished by reduction to Cr(III). While biological and chemical reduction reactions of Cr(VI) are well resolved, reduction within natural sediments, particularly of arid environments, remains poorly described. Here, we examine chromate reduction within arid sediments from the Hanford, WA site, where Fe(III) (hydr)oxide and carbonate coatings limit mineral reactivity. Chromium(VI) reduction by Hanford sediments is negligible unless pretreated with acid; acidic pretreatment of packed mineral beds having a Cr(VI) feed solution results in Cr(III) associating with the minerals antigorite and lizardite in addition to magnetite and Fe(II)-bearing clay minerals. Highly alkaline conditions (pH > 14), representative of conditions near high-level nuclear waste tanks, result in Fe(II) dissolution and concurrent Cr(VI) reduction. Additionally, Cr(III) and Cr(VI) are found associated with portlandite, suggesting a secondary mechanism for chromium retention at high pH. Thus, mineral reactivity is limited within this arid environment and appreciable reduction of Cr(VI) is restricted to highly alkaline conditions resulting near leaking radioactive waste disposal tanks.

  14. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…
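
    For context, the non-hierarchical process-dissociation equations of Jacoby (1991), over which the hierarchical model places participant- and item-level distributions, express inclusion and exclusion performance in terms of recollection R and automatic influence A:

```latex
% Process-dissociation equations (Jacoby, 1991)
\[
  P(\text{respond} \mid \text{inclusion}) \;=\; R + (1 - R)\,A, \qquad
  P(\text{respond} \mid \text{exclusion}) \;=\; (1 - R)\,A,
\]
% so that, at the aggregate level,
\[
  R \;=\; P_{\text{inclusion}} - P_{\text{exclusion}}, \qquad
  A \;=\; \frac{P_{\text{exclusion}}}{1 - R}.
\]
% Aggregating these nonlinear estimators across participants or items is what
% introduces the biases the hierarchical model is designed to avoid.
```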

  15. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed. PMID:27280903

  16. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  17. Spectroscopic analyses of chemical adaptation processes within microalgal biomass in response to changing environments.

    PubMed

    Vogt, Frank; White, Lauren

    2015-03-31

    Via photosynthesis, marine phytoplankton transforms large quantities of inorganic compounds into biomass. This has considerable environmental impacts as microalgae contribute, for instance, to counter-balancing anthropogenic releases of the greenhouse gas CO2. On the other hand, high concentrations of nitrogen compounds in an ecosystem can lead to harmful algae blooms. In previous investigations it was found that the chemical composition of microalgal biomass is strongly dependent on nutrient availability. Therefore, it is expected that algae's sequestration capabilities and productivity are also determined by the cells' chemical environments. For investigating this hypothesis, novel analytical methodologies are required which are capable of monitoring live cells exposed to chemically shifting environments, followed by chemometric modeling of their chemical adaptation dynamics. FTIR-ATR experiments have been developed for acquiring spectroscopic time series of live Dunaliella parva cultures adapting to different nutrient situations. Comparing experimental data from acclimated cultures to those exposed to a chemically shifted nutrient situation reveals insights into which analyte groups participate in modifications of microalgal biomass and on what time scales. For a chemometric description of these processes, a data model has been deduced which explains the chemical adaptation dynamics explicitly rather than empirically. First results show that this approach is feasible and derives information about the chemical biomass adaptations. Future investigations will utilize these instrumental and chemometric methodologies for quantitative investigations of the relation between chemical environments and microalgal sequestration capabilities. PMID:25813024

  18. Commercial applications in biomedical processing in the microgravity environment

    NASA Astrophysics Data System (ADS)

    Johnson, Terry C.; Taub, Floyd

    1995-01-01

    A series of studies have shown that a purified cell regulatory sialoglycopeptide (CeReS) that arrests cell division and induces cellular differentiation is fully capable of functionally interacting with target insect and mammalian cells in the microgravity environment. Data from several shuttle missions suggest that the signal transduction events that are known to be associated with CeReS action function as well in microgravity as in ground-based experiments. The molecular events known to be associated with CeReS include an ability to interfere with Ca2+ metabolism, the subsequent alkalinization of cell cytosol, and the inhibition of the phosphorylation of the nuclear protein product encoded by the retinoblastoma (RB) gene. The ability of CeReS to function in microgravity opens a wide variety of applications in space life sciences.

  19. Mathematical and physical modelling of materials processing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Mathematical and physical modeling of turbulence phenomena in metals processing, electromagnetically driven flows in materials processing, gas-solid reactions, rapid solidification processes, the electroslag casting process, the role of cathodic depolarizers in the corrosion of aluminum in sea water, and predicting viscoelastic flows are described.

  20. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, and the perceived risk in the UK appears to have increased in recent years as surface water flood events have become seemingly more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data which numerical models are based upon are often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where independent variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9m2, two-tiered 1:100 physical model consisting of (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled
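
    The rainfall simulator's uniformity is quoted as a Christiansen Uniformity Coefficient (>75% CUC). The standard CUC calculation from catch-can depths can be sketched as follows; the depth values here are hypothetical, not measurements from the study.

```python
def christiansen_uniformity(depths):
    """Christiansen Uniformity Coefficient (CUC, %) from catch-can rainfall depths.

    CUC = 100 * (1 - sum(|x_i - mean|) / (n * mean))
    """
    n = len(depths)
    mean = sum(depths) / n
    abs_dev = sum(abs(x - mean) for x in depths)
    return 100.0 * (1.0 - abs_dev / (n * mean))

# Hypothetical catch-can depths (mm) collected under the simulator
depths_mm = [10.2, 9.8, 11.0, 10.5, 9.4, 10.1, 9.9, 10.8, 10.3]
print(f"CUC = {christiansen_uniformity(depths_mm):.1f}%")
```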

  1. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  2. Processes for Developing Scaffolding in a Computer Mediated Learning Environment.

    ERIC Educational Resources Information Center

    Bull, Kay S.; Shuler, Paul; Overton, Robert; Kimball, Sarah; Boykin, Cynthia; Griffin, John

    When in the "zone of proximal development" for a particular skill or piece of information, a learner is ready to learn but lacks certain prerequisites. Scaffolding is an interactive process in which a teacher or facilitator assists such a learner to build a "structure" to contain and frame the new information. Scaffolding can be provided by…

  3. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  4. Modeling aggregation of dust monomers in low gravity environments

    NASA Astrophysics Data System (ADS)

    Doyon, Julien; Rioux, Claude

    The modeling of aggregation phenomena in microgravity is of paramount relevance to the understanding of the formation of planets. Relevant experiments have been carried out in a ground-based laboratory and on aircraft providing low gravity during parabolic flight.1 Other possible environments are rockets, shuttles and the International Space Station. Numerical simulation of aggregation can provide us with a tool to understand the formal and theoretical background of the phenomena. The comparison between low-gravity experiments and modeling predictions may confirm a theory. Also, experiments that are hard to perform can be simulated on computers, allowing a vast choice of physical properties. Simulations to date have been constrained to ensembles of 100 to 1000 monomers.2 We have been able to extend such numbers to 10 000 monomers and the final goal is about 100 000 monomers, where gravitational effects become relevant, yielding spheroidal systems of particles (planetesimals and planetoids). Simulations made are assumed to be diffusion processes where colliding particles will stick together with a certain probability. Future work shall include other interactions like electrostatic or magnetic forces. Recent results are to be shown at the meeting. I acknowledge the support from the ELIPS program (jointly between the Canadian and European space agencies). The guidance of Prof. Slobodrian is warmly thanked. References: 1. R.J. Slobodrian, C. Rioux and J.-C. Leclerc, Microgravity Research and Applications in Physical Sciences and Biotechnology, Proceedings of the First International Symposium, Sorrento, Italy (2000), ESA SP-454, pp. 779-786, and refs. therein. 2. P. Deladurantaye, C. Rioux and R.J. Slobodrian, Chaos, Solitons & Fractals (1997), pp. 1693-1708. Carl Robert and Eric Litvak, Software "Fractal", private communication.

  5. An integrated model of social environment and social context for pediatric rehabilitation.

    PubMed

    Batorowicz, Beata; King, Gillian; Mishra, Lipi; Missiuna, Cheryl

    2016-06-01

    This article considers the conceptualization and operationalization of "social environment" and "social context" with implications for research and practice with children and youth with impairments. We first discuss social environment and social context as constructs important for understanding interaction between external environmental qualities and the individual's experience. The article considers existing conceptualizations within psychological and sociological bodies of literature, research using these concepts, current developmental theories and issues in the understanding of environment and participation within rehabilitation science. We then describe a model that integrates a person-focused perspective with an environment-focused perspective and that outlines the mechanisms through which children/youth and social environment interact and transact. Finally, we consider the implications of the proposed model for research and clinical practice. This conceptual model directs researchers and practitioners toward interventions that will address the mechanisms of child-environment interaction and that will build capacity within both children and their social environments, including families, peer groups and communities. Health is created and lived by people within the settings of their everyday life; where they learn, work, play, and love [p.2]. Implications for Rehabilitation: Understanding how social environment and personal factors interact over time to affect the development of children/youth can influence the design of services for children and youth with impairments. The model described integrates the individual-focused and environment-focused perspectives and outlines the mechanisms of the ongoing reciprocal interaction between children/youth and their social environments: provision of opportunities, resources and supports and contextual processes of choice, active engagement and collaboration. Addressing these mechanisms could contribute to creating

  6. Problems in modeling man machine control behavior in biodynamic environments

    NASA Technical Reports Server (NTRS)

    Jex, H. R.

    1972-01-01

    Reviewed are some current problems in modeling man-machine control behavior in a biodynamic environment. It is given in two parts: (1) a review of the models which are appropriate for manual control behavior and the added elements necessary to deal with biodynamic interfaces; and (2) a review of some biodynamic interface pilot/vehicle problems which have occurred, been solved, or need to be solved.

  7. MODELING THE FATE OF TOXIC ORGANIC MATERIALS IN AQUATIC ENVIRONMENTS

    EPA Science Inventory

    Documentation is given for PEST, a dynamic simulation model for evaluating the fate of toxic organic materials (TOM) in freshwater environments. PEST represents the time-varying concentration (in ppm) of a given TOM in each of as many as 16 carrier compartments; it also computes ...

  8. Modelling between Epistemological Beliefs and Constructivist Learning Environment

    ERIC Educational Resources Information Center

    Çetin-Dindar, Ayla; Kirbulut, Zübeyde Demet; Boz, Yezdan

    2014-01-01

    The purpose of this study was to model the relationship between pre-service chemistry teachers' epistemological beliefs and their preference to use constructivist-learning environment in their future class. The sample was 125 pre-service chemistry teachers from five universities in Turkey. Two instruments were used in this study. One of the…

  9. Modeling battlefield sensor environments with the views workbench

    SciTech Connect

    Woyna, M.A.; Christiansen, J.H.; Hield, C.W.; Simunich, K.L.

    1994-08-01

    The Visual Intelligence and Electronic Warfare Simulation (VIEWS) Workbench software system has been developed by Argonne National Laboratory (ANL) to enable Army intelligence and electronic warfare (IEW) analysts at Unix workstations to conveniently build detailed IEW battlefield scenarios, or "sensor environments," to drive the Army's high-resolution IEW sensor performance models. VIEWS is fully object-oriented, including the underlying database.

  10. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is simplified and abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  11. Containerless processing of single crystals in low-G environment

    NASA Technical Reports Server (NTRS)

    Walter, H. U.

    1974-01-01

    Experiments on containerless crystal growth from the melt were conducted during Skylab missions SL3 and SL4 (Skylab Experiment M-560). Six samples of InSb were processed, one of them heavily doped with selenium. The concept of the experiment is discussed and related to general crystal growth methods and their merits as techniques for containerless processing in space. The morphology of the crystals obtained is explained in terms of volume changes associated with solidification and wetting conditions during solidification. All samples exhibit extremely well developed growth facets. Analysis by X-ray topographical methods and chemical etching shows that the crystals are of high structural perfection. Average dislocation density as revealed by etching is of the order of 100 per sq cm; no dislocation clusters could be observed in the space-grown samples. A sequence of striations that is observed in the first half of the selenium-doped sample is explained as being caused by periodic surface breakdown.

  12. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. The VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms which can be on different types of hardware and in languages other than VHP. The VHP system is of particular interest to systems in which the visualization of remote processes is required such as robotics for telescience applications.

  13. Sensitivity of membranes to their environment. Role of stochastic processes.

    PubMed Central

    Offner, F F

    1984-01-01

    Ionic flow through biomembranes often exhibits a sensitivity to the environment, which is difficult to explain by classical theory, which usually assumes that the free energy available to change the membrane permeability results from the environmental change acting directly on the permeability control mechanism. This implies, for example, that a change ΔV in the transmembrane potential can produce a maximum free-energy change, ΔV·q, on a gate (control mechanism) carrying a charge q. The analysis presented here shows that when stochastic fluctuations are considered, under suitable conditions (gate cycle times rapid compared with the field relaxation time within a channel), the change in free energy is limited, not by the magnitude of the stimulus, but by the electrochemical potential difference across the membrane, which may be very much greater. Conformational channel gates probably relax more slowly than the field within the channel; this would preclude appreciable direct amplification of the stimulus. It is shown, however, that the effect of impermeable cations such as Ca++ is to restore the amplification of the stimulus through its interaction with the electric field. The analysis predicts that the effect of Ca++ should be primarily to affect the number of channels that are open, while only slightly affecting the conductivity of an open channel. PMID:6093903

  14. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  15. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-08-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot

  16. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-11-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 μm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 Field Campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  17. An approach to accidents modeling based on compounds road environments.

    PubMed

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into quite homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results clearly showed that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. PMID:23376544

  18. Implementing a Gaussian Process Learning Algorithm in Mixed Parallel Environment

    SciTech Connect

    Chandola, Varun; Vatsavai, Raju

    2011-01-01

    In this paper, we present a scalability analysis of a parallel Gaussian process training algorithm to simultaneously analyze a massive number of time series. We study three different parallel implementations: using threads, MPI, and a hybrid implementation using threads and MPI. We compare the scalability for the multi-threaded implementation on three different hardware platforms: a Mac desktop with two quad-core Intel Xeon processors (16 virtual cores), a Linux cluster node with four quad-core 2.3 GHz AMD Opteron processors, and SGI Altix ICE 8200 cluster node with two quad-core Intel Xeon processors (16 virtual cores). We also study the scalability of the MPI based and the hybrid MPI and thread based implementations on the SGI cluster with 128 nodes (2048 cores). Experimental results show that the hybrid implementation scales better than the multi-threaded and MPI based implementations. The hybrid implementation, using 1536 cores, can analyze a remote sensing data set with over 4 million time series in nearly 5 seconds while the serial algorithm takes nearly 12 hours to process the same data set.
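
    The organization of such a hybrid scheme can be sketched as follows; this is purely illustrative and not the paper's code. MPI ranks partition the collection of time series and a thread pool inside each rank fits independent Gaussian process models. The sketch assumes mpi4py and scikit-learn are available, and all function names, sizes and kernel settings are hypothetical; in pure Python the GIL limits the benefit of threads, so this mirrors only the structure, not the performance, of the original implementation.

    # Hybrid MPI + thread sketch for fitting many independent GP models.
    from concurrent.futures import ThreadPoolExecutor

    import numpy as np
    from mpi4py import MPI
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def fit_one(series):
        """Fit a GP to a single time series and return its log marginal likelihood."""
        t = np.arange(len(series), dtype=float).reshape(-1, 1)
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=5.0))
        gp.fit(t, series)
        return gp.log_marginal_likelihood_value_

    def main(n_series=1000, length=46, threads=8):
        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        rng = np.random.default_rng(rank)
        # Each rank owns a contiguous block of (here synthetic) time series.
        local = rng.normal(size=(n_series // size, length))

        with ThreadPoolExecutor(max_workers=threads) as pool:
            local_scores = list(pool.map(fit_one, local))

        # Gather a per-rank summary statistic on rank 0.
        all_scores = comm.gather(np.mean(local_scores), root=0)
        if rank == 0:
            print("mean log marginal likelihood per rank:", all_scores)

    if __name__ == "__main__":
        main()

    A run such as "mpirun -n 8 python gp_hybrid.py" (the script name is hypothetical) would give each rank its share of the series, with the thread pool providing the second level of parallelism within a node.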

  19. Prevalence and concentration of Salmonella and Campylobacter in the processing environment of small-scale pastured broiler farms

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A growing niche in the locally grown food movement is the small scale production of broiler chickens using the pasture-raised poultry production model. Little research exists that focuses on Salmonella and Campylobacter contamination in the environment associated with on-farm processing of pasture-r...

  20. Mindseye: a visual programming and modeling environment for imaging science

    NASA Astrophysics Data System (ADS)

    Carney, Thom

    1998-07-01

    Basic vision science research has reached the point that many investigators are now designing quantitative models of human visual function in areas such as pattern discrimination, motion detection, optical flow, color discrimination, adaptation and stereopsis. These models have practical significance in their application to image compression technologies and as tools for evaluating image quality. We have been working on a vision modeling environment, called Mindseye, that is designed to simplify the implementation and testing of general-purpose spatio-temporal models of human vision. Mindseye is an evolving general-purpose vision-modeling environment that embodies the general structures of the visual system and provides a set of modular tools within a flexible platform tailored to the needs of researchers. The environment employs a user-friendly graphics interface with on-line documentation that describes the functionality of the individual modules. Mindseye, while functional, is still research in progress. We are seeking input from the image compression and evaluation community as well as from the vision science community as to the potential utility of Mindseye, and how it might be enhanced to meet future needs.

  1. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
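
    The cost function described (weighted control effort plus squared tracking error) commonly takes a quadratic form; the following is a generic sketch and not necessarily the exact formulation used by the authors:

    J = \sum_{k=1}^{N_p} \left\| y(t+k \mid t) - r(t+k) \right\|_Q^2 + \sum_{k=0}^{N_c - 1} \left\| \Delta u(t+k) \right\|_R^2

    where y is the predicted crop response, r the reference production schedule, Δu the changes in the light-intensity, air-temperature and CO2 set points, Q and R weighting matrices, and N_p and N_c the prediction and control horizons.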

  2. Modeling and control for closed environment plant production systems.

    PubMed

    Fleisher, David H; Ting, K C

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal. PMID:12882224

  3. Snow process monitoring in mountain forest environments with a digital camera network

    NASA Astrophysics Data System (ADS)

    Dong, Chunyu; Menzel, Lucas

    2016-04-01

    Snow processes are important components of the hydrologic cycle in mountainous areas and at high latitudes. Sparse observations in remote regions, in combination with complex topography, local climate specifics and the impact of heterogeneous vegetation cover, complicate a detailed investigation of snow-related processes. In this study, a camera network is applied to monitor the complex snow processes with high temporal resolution in montane forest environments (800-1200 m a.s.l.) in southwestern Germany. A typical feature of this region is the high temporal variability of weather conditions, with frequent snow accumulation and ablation processes and recurrent snow interception on conifers. We developed a semi-automatic procedure to interpret snow depths from the digital images, which shows high consistency with manual readings and station-based measurements. To extract the snow canopy interception dynamics from the pictures, six binary classification methods are compared. The MaxEntropy classifier shows clearly better performance than the others under various illumination conditions and is therefore selected to quantify snow interception. The snow accumulation and ablation processes on the ground as well as the snow loading and unloading in forest canopies are investigated based on the snow parameters derived from the time-lapse photography monitoring. In addition, the influences of meteorological conditions, forest cover and elevation on snow processes are considered. Further, our investigations serve to improve the snow and interception modules of a hydrological model. We found that time-lapse photography proves to be an effective and low-cost approach to collect useful snow-related information which supports our understanding of snow processes and the further development of hydrological models. We will present selected results from our investigations over two consecutive winters.

  4. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments to personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for reaching the main goals of the corresponding systems efficiently. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give some meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices considering different approaches in the Ambient Assisted Living domain. We also note different ongoing standardization efforts in this area. We then discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, and finally draw several conclusions about the reviewed works. PMID:24643006

  5. The Icelandic volcanic aeolian environment: Processes and impacts - A review

    NASA Astrophysics Data System (ADS)

    Arnalds, Olafur; Dagsson-Waldhauserova, Pavla; Olafsson, Haraldur

    2016-03-01

    Iceland has the largest area of volcaniclastic sandy desert on Earth, some 22,000 km2. The sand has been mostly produced by glacio-fluvial processes, leaving behind fine-grained unstable sediments which are later re-distributed by repeated aeolian events. Volcanic eruptions add to this pool of unstable sediments, often from subglacial eruptions. Icelandic desert surfaces are divided into sand fields, sandy lavas and sandy lag gravel, each with separate aeolian surface characteristics such as threshold velocities. Storms are frequent due to Iceland's location on the North Atlantic storm track. Dry winds occur on the leeward sides of mountains and glaciers, in spite of the high moisture content of the Atlantic cyclones. Surface winds often move hundreds to more than 1000 kg m-1 per annum, and more than 10,000 kg m-1 have been measured in a single storm. Desertification occurs when aeolian processes push sand fronts forward; these processes have destroyed many previously fully vegetated ecosystems since the settlement of Iceland in the late ninth century. There are about 135 dust events per annum, ranging from minor storms to >300,000 t of dust emitted in single storms. Dust production is on the order of 30-40 million tons annually, some traveling over 1000 km and deposited on land and sea. Dust deposited on deserts tends to be re-suspended during subsequent storms. High PM10 concentrations occur during major dust storms. They are more frequent in the wake of volcanic eruptions, such as after the Eyjafjallajökull 2010 eruption. Airborne dust affects human health, with negative effects enhanced by the tubular morphology of the grains and the basaltic composition with its high metal content. Dust deposition on snow and glaciers intensifies melting. Moreover, the dust production probably also influences atmospheric conditions and parameters that affect climate change.

  6. Combining Wireless Sensor Networks and Groundwater Transport Models: Protocol and Model Development in a Simulative Environment

    NASA Astrophysics Data System (ADS)

    Barnhart, K.; Urteaga, I.; Han, Q.; Porta, L.; Jayasumana, A.; Illangasekare, T.

    2007-12-01

    Groundwater transport modeling is intended to aid in remediation processes by providing prediction of plume location and by helping to bridge data gaps in the typically undersampled subsurface environment. Increased availability of computer resources has made computer-based transport models almost ubiquitous in calculating health risks, determining cleanup strategies, guiding environmental regulatory policy, and in determining culpable parties in lawsuits. Despite their broad use, very few studies exist which verify model correctness or even usefulness, and those that do have shown significant discrepancies between predicted and actual results. Better predictions can only be gained from additional and higher quality data, but this is an expensive proposition using current sampling techniques. A promising technology is the use of wireless sensor networks (WSNs), which comprise wireless nodes (motes) coupled to in-situ sensors that are capable of measuring hydrological parameters. As the motes are typically battery powered, power consumption is a major concern in routing algorithms. By supplying predictions about the direction and arrival time of the contaminant, the transport model can make the application-driven routing protocol more efficient. A symbiotic relationship then exists between the WSN, which is supplying the data to calibrate the transport model, and the model, which may be supplying predictive information to the WSN for optimum monitoring performance. Many challenges exist before the above can be realized: WSN protocols must mature, as must sensor technology, and inverse models and tools must be developed for integration into the system. As current model calibration, even automatic calibration, still often requires manual tweaking of calibration parameters, implementing this in a real-time closed-loop process may require significant work. Based on insights from a previous proof-of-concept intermediate-scale tank experiment, we are developing the models, tools

  7. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, statistical process control (SPC) based on critical dimension (CD) control in the lithography process becomes more difficult. The shrinking process window has increased the need for a more accurate determination of the process window center. However, in practical fabrication we found that systematic error introduced by metrology and/or the resist process can significantly impact the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus-exposure matrix (FEM), the model will fit these systematic errors rather than filter them out. This will impact the process window analysis and the determination of the best process condition. In this paper, we propose to use a calibrated first-principles model to perform process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable window analysis result.
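
    For contrast with the calibrated first-principles approach advocated here, a purely empirical polynomial fit to focus-exposure-matrix data, the kind of fit the paper argues absorbs systematic errors, might look like the sketch below. All function names, polynomial terms and numerical values are hypothetical.

    # Hedged sketch: fit a low-order polynomial response surface CD(focus, dose)
    # to focus-exposure-matrix (FEM) data by ordinary least squares.
    import numpy as np

    def design_matrix(focus, dose):
        # quadratic in focus, linear in dose, with a cross term (illustrative choice)
        return np.column_stack([
            np.ones_like(focus), focus, focus**2, dose, focus * dose,
        ])

    def fit_fem(focus, dose, cd):
        A = design_matrix(focus, dose)
        coeffs, *_ = np.linalg.lstsq(A, cd, rcond=None)
        return coeffs

    if __name__ == "__main__":
        # synthetic FEM: CD grows quadratically away from best focus
        f = np.repeat(np.linspace(-0.3, 0.3, 7), 5)        # focus offsets (um)
        d = np.tile(np.linspace(28.0, 32.0, 5), 7)         # dose (mJ/cm^2)
        cd = 0.13 + 0.4 * f**2 - 0.002 * (d - 30.0) + 0.001 * np.random.randn(f.size)
        print("fitted coefficients:", fit_fem(f, d, cd))

    Because the regression simply minimizes residuals, any systematic metrology or resist bias present in the CD data is absorbed into the fitted coefficients, which is precisely the weakness the calibrated model is meant to avoid.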

  8. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. The Ares Real Time Environment for Modeling, Simulation and Integration (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems which interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS System Integration Lab and the STIF architecture are reviewed. The functional components of ARTEMIS are outlined, and an overview of the models and a block diagram are presented.

  9. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.

  10. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The model-predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to inaccurate preform permeability values used in the simulation.

  11. Sedimentary Environments and Processes in the Vicinity of Quicks Hole, Elizabeth Islands, Massachusetts

    NASA Astrophysics Data System (ADS)

    Poppe, L. J.; Ackerman, S. D.; Moser, M. S.; Stewart, H. F.; Foster, D. S.; Blackwood, D. S.; Butman, B.

    2007-05-01

    Continuous-coverage multibeam bathymetric models and sidescan sonar imagery, verified with bottom sampling and photography, provide 1) detailed basemaps that yield topographic and geological perspectives of the sea floor, 2) a fundamental framework for research and management activities, and 3) information on sedimentary environments and processes. Interpretations presented here are based on NOAA hydrographic survey H11076 that covers approximately 23 sq. km around Quicks Hole, a major passage through the Elizabeth Islands chain, offshore southeastern Massachusetts. Bouldery gravels overgrown with seaweed and sessile fauna dominate the sea floor in Quicks Hole, along shorelines, and on isolated bathymetric highs. These deposits, which reflect environments of erosion and nondeposition, are winnowed lags of till from the exposed Buzzards Bay moraine. The sea floor south of the Hole in Vineyard Sound is characterized by environments associated with coarse bedload transport and covered with transverse and barchanoid sand waves. Transverse waves exceed 7 m in amplitude, have slip faces predominantly oriented to the west and southwest, and have straight, slightly sinuous, or curved crests. Megaripples, which mimic asymmetry of the transverse waves but not necessarily their orientation, are commonly present on stoss slopes; current ripples are ubiquitous. These smaller bedforms suggest that transport is active and that sand waves are propagating under the present hydraulic regime. Net sediment transport is primarily to the west and southwest as evidenced by comparisons with data from an earlier hydrographic survey, orientation of barchanoid waves, and asymmetry of transverse waves and of scour marks around boulders and shipwrecks. The sea floor across the northern part of the study area in Buzzards Bay and away from the opening to Quicks Hole is more protected from wind- and tidally-driven currents. Environments here are primarily characterized by processes associated

  12. ESO C Library for an Image Processing Software Environment (eclipse)

    NASA Astrophysics Data System (ADS)

    Devillard, N.

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems. Running on all Unix-like platforms, eclipse is portable. A high-level interface to Python is foreseen that would allow programmers to prototype their applications much faster than through C programs.

  13. Eclipse: ESO C Library for an Image Processing Software Environment

    NASA Astrophysics Data System (ADS)

    Devillard, Nicolas

    2011-12-01

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.

  14. A new Mars radiation environment model with visualization.

    PubMed

    De Angelis, G; Clowdsley, M S; Singleterry, R C; Wilson, J W

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model--version 2001 (Mars-GRAM 2001). The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. PMID:15880920

  15. A new Mars radiation environment model with visualization

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clowdsley, M. S.; Singleterry, R. C.; Wilson, J. W.

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model--version 2001 (Mars-GRAM 2001). The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. c2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  16. INTEGRATED FISCHER TROPSCH MODULAR PROCESS MODEL

    SciTech Connect

    Donna Post Guillen; Richard Boardman; Anastasia M. Gribik; Rick A. Wood; Robert A. Carrington

    2007-12-01

    With declining petroleum reserves, increased world demand, and unstable politics in some of the world’s richest oil producing regions, the capability for the U.S. to produce synthetic liquid fuels from domestic resources is critical to national security and economic stability. Coal, biomass and other carbonaceous materials can be converted to liquid fuels using several conversion processes. The leading candidate for large-scale conversion of coal to liquid fuels is the Fischer Tropsch (FT) process. Process configuration, component selection, and performance are interrelated and dependent on feed characteristics. This paper outlines a flexible modular approach to model an integrated FT process that utilizes a library of key component models, supporting kinetic data and materials and transport properties allowing rapid development of custom integrated plant models. The modular construction will permit rapid assessment of alternative designs and feed stocks. The modeling approach consists of three thrust areas, or “strands” – model/module development, integration of the model elements into an end to end integrated system model, and utilization of the model for plant design. Strand 1, model/module development, entails identifying, developing, and assembling a library of codes, user blocks, and data for FT process unit operations for a custom feedstock and plant description. Strand 2, integration development, provides the framework for linking these component and subsystem models to form an integrated FT plant simulation. Strand 3, plant design, includes testing and validation of the comprehensive model and performing design evaluation analyses.

  17. Multidimensional vibrational spectroscopy for tunneling processes in a dissipative environment.

    PubMed

    Ishizaki, Akihito; Tanimura, Yoshitaka

    2005-07-01

    Simulating tunneling processes, as well as observing them, is a challenging problem in many areas. In this study, we consider a double-well potential system coupled to a heat bath with linear-linear (LL) and square-linear (SL) system-bath interactions. The LL interaction leads to longitudinal (T1) and transversal (T2) homogeneous relaxations, whereas the SL interaction leads to the inhomogeneous dephasing (T2*) relaxation in the white noise limit with a rotating wave approximation. We discuss the dynamics of the double-well system under infrared (IR) laser excitations from a Gaussian-Markovian quantum Fokker-Planck equation approach, which was developed by generalizing Kubo's stochastic Liouville equation. An analytical expression of the Green function is obtained for the case of two-state-jump modulation by performing the Fourier-Laplace transformation. We then calculate a two-dimensional infrared signal, which is defined by the four-body correlation function of the optical dipole, for various noise correlation times, system-bath coupling parameters, and temperatures. It is shown that the bath-induced vibrational excitation and relaxation dynamics between the tunneling splitting levels can be detected as isolated off-diagonal peaks in third-order two-dimensional infrared (2D-IR) spectroscopy for a specific phase matching condition. Furthermore, this spectroscopy also allows us to directly evaluate the rate constants for tunneling reactions, which relate to the coherence between the splitting levels; it can be regarded as a novel technique for measuring chemical reaction rates. We depict the change of reaction rates as a function of system-bath coupling strength and temperature through the 2D-IR signal. PMID:16035851
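
    The double-well system referred to is conventionally described by a quartic potential; a generic form, not necessarily the parameterization used in the paper, is

    V(q) = -\tfrac{1}{2}\, a\, q^{2} + \tfrac{1}{4}\, b\, q^{4}, \qquad a, b > 0,

    whose two lowest eigenstates are separated by the tunneling splitting that the bath-induced dynamics and the 2D-IR off-diagonal peaks probe.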

  18. Modelling the near-Earth space environment using LDEF data

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares actual impact crater measurement data and results from the Space Environment (SPENV) Program, developed in-house at POD, with theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm in diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is

  19. Modelling Biological Processes Using Simple Matrices.

    ERIC Educational Resources Information Center

    Paton, Ray

    1991-01-01

    A variety of examples are given from different areas of biology to illustrate the general applicability of matrix algebra to discrete models. These models of biological systems are concerned with relations between processes occurring in discrete time intervals. Diffusion, ecosystems, and different types of cells are modeled. (KR/Author)

  20. Quantum jump model for a system with a finite-size environment

    NASA Astrophysics Data System (ADS)

    Suomela, S.; Kutvonen, A.; Ala-Nissila, T.

    2016-06-01

    Measuring the thermodynamic properties of open quantum systems poses a major challenge. A calorimetric detection has been proposed as a feasible experimental scheme to measure work and fluctuation relations in open quantum systems. However, the detection requires a finite size for the environment, which influences the system dynamics. This process cannot be modeled with the standard stochastic approaches. We develop a quantum jump model suitable for systems coupled to a finite-size environment. We use the method to study the common fluctuation relations and prove that they are satisfied.

  1. Martian Radiation Environment: Model Calculations and Recent Measurements with "MARIE"

    NASA Technical Reports Server (NTRS)

    Saganti, P. B.; Cucinotta, F. A.; zeitlin, C. J.; Cleghorn, T. F.

    2004-01-01

    The Galactic Cosmic Ray spectra in Mars orbit were generated with the recently expanded HZETRN (High Z and Energy Transport) and QMSFRG (Quantum Multiple-Scattering theory of nuclear Fragmentation) model calculations. These model calculations are compared with the first eighteen months of measured data from the MARIE (Martian Radiation Environment Experiment) instrument onboard the 2001 Mars Odyssey spacecraft that is currently in Martian orbit. The dose rates observed by the MARIE instrument are within 10% of the model calculated predictions. Model calculations are compared with the MARIE measurements of dose, dose-equivalent values, along with the available particle flux distribution. Model calculated particle flux includes GCR elemental composition of atomic number, Z = 1-28 and mass number, A = 1-58. Particle flux calculations specific for the current MARIE mapping period are reviewed and presented.

  2. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3- Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
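
    Resin flow through the preform in such LCM models is commonly described by Darcy's law; in one-dimensional form, stated generically rather than as the specific empirical relations fitted in this study,

    u = -\frac{K}{\mu}\,\frac{\partial P}{\partial x},

    where u is the volume-averaged resin velocity, K the preform permeability, \mu the resin viscosity, and \partial P/\partial x the pressure gradient driving infiltration; the characterized compaction behavior enters through the dependence of K and the preform thickness on the applied pressure.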

  3. Stochastic model of the residual acceleration environment in microgravity

    NASA Technical Reports Server (NTRS)

    Vinals, Jorge

    1994-01-01

    We describe a theoretical investigation of the effects that stochastic residual accelerations (g-jitter) onboard spacecraft can have on experiments conducted in a microgravity environment. We first introduce a stochastic model of the residual acceleration field, and develop a numerical algorithm to solve the equations governing fluid flow that allow for a stochastic body force. We next summarize our studies of two generic situations: stochastic parametric resonance and the onset of convective flow induced by a fluctuating acceleration field.
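
    The governing equations with a stochastic body force typically take the form of the incompressible Navier-Stokes equations driven by the fluctuating residual acceleration; a generic sketch, not necessarily the authors' exact formulation, is

    \rho \left( \frac{\partial \mathbf{u}}{\partial t} + \mathbf{u} \cdot \nabla \mathbf{u} \right) = -\nabla p + \mu \nabla^{2} \mathbf{u} + \rho\, \mathbf{g}(t),

    where \mathbf{g}(t) is the residual acceleration, modeled as a stochastic process such as narrow-band Gaussian noise rather than a constant gravity vector.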

  4. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    NASA Technical Reports Server (NTRS)

    Klein, S. A.

    1994-01-01

    The Distributed processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network or expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic tradeoffs as well as interactions within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories, a number of intermediate cost and technical calculation results, as well as graphical presentation of Costs vs. Percent Distribution for various parameters. The model has been implemented on an IBM PC using the LOTUS 1-2-3 spreadsheet environment and was developed in 1986. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.

  5. Modeling Cellular Processes in 3-D

    PubMed Central

    Mogilner, Alex; Odde, David

    2011-01-01

    Recent advances in photonic imaging and fluorescent protein technology offer unprecedented views of molecular space-time dynamics in living cells. At the same time, advances in computing hardware and software enable modeling of ever more complex systems, from global climate to cell division. As modeling and experiment become more closely integrated, we must address the issue of modeling cellular processes in 3-D. Here, we highlight recent advances related to 3-D modeling in cell biology. While some processes require full 3-D analysis, we suggest that others are more naturally described in 2-D or 1-D. Keeping the dimensionality as low as possible reduces computational time and makes models more intuitively comprehensible; however, the ability to test full 3-D models will build greater confidence in models generally and remains an important emerging area of cell biological modeling. PMID:22036197

  6. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.

  7. Modeling of the Adiabatic and Isothermal Methanation Process

    NASA Astrophysics Data System (ADS)

    Porubova, Jekaterina; Bazbauers, Gatis; Markova, Darja

    2011-01-01

    Increased use of biomass offers one way to reduce anthropogenic impact on the environment. Using various biomass conversion processes, it is possible to obtain different types of fuels: solid (e.g., bio-carbon), liquid (e.g., biodiesel and ethanol), and gaseous (e.g., biomethane). Biomethane can be used in the transport and energy sectors, and the total methane production efficiency can reach 65%. By modeling the adiabatic and isothermal methanation processes, the process that is most effective in terms of methane production is identified, and the influence of the process parameters on the overall efficiency of methane production is determined.

  8. Periglacial process research for improved understanding of climate change in periglacial environments

    NASA Astrophysics Data System (ADS)

    Hvidtfeldt Christiansen, Hanne

    2010-05-01

    Periglacial landscapes extend widely outside the glaciated areas and the areas underlain by permafrost and with seasonal frost. Yet recent cryosphere research related to periglacial geomorphology has given significant attention to a direct climate-permafrost relationship. The focus is on the permafrost thermal state, including the thickness of the active layer, often simplifying how these two key conditions are directly climatically controlled. There has been less focus on the understanding and quantification of the different periglacial processes, which largely control the consequences of changing climatic conditions on permafrost and on seasonal frost across periglacial environments. It is the complex relationship between climate, micro-climate and local geomorphological, geological and ecological conditions which controls periglacial processes. In several cases local erosion or deposition will affect the rates of landform change significantly more than any climate change. Thus detailed periglacial process studies will refine the predictions of how periglacial landscapes can be expected to respond to climatic changes, and can be built into Earth System Modelling. In particular, combining direct field observations and measurements with remote sensing and geochronological studies of periglacial landforms enables a significantly improved understanding of periglacial process rates. An overview of the state of research on key periglacial processes is given, focusing on ice-wedge and solifluction landforms and on seasonal ground thermal dynamics, all with examples from the high Arctic in Svalbard. Thermal contraction cracking and its seasonal meteorological control are presented, and potential thermal erosion of ice-wedges leading to development of thermokarst is discussed. Local and meteorological controls on solifluction rates are presented and their climatic control indicated. Seasonal ground thermal processes and their dependence on local

  9. Prevalence and survival of Listeria monocytogenes in Danish aquatic and fish-processing environments.

    PubMed

    Hansen, Cisse Hedegaard; Vogel, Birte Fonnesbech; Gram, Lone

    2006-09-01

    Listeria monocytogenes contamination of ready-to-eat food products such as cold-smoked fish is often caused by pathogen subtypes persisting in food-processing environments. The purpose of the present study was to determine whether these L. monocytogenes subtypes can be found in the outside environment, i.e., outside food processing plants, and whether they survive better in the aquatic environment than do other strains. A total of 400 samples were collected from the outside environment, fish slaughterhouses, fish farms, and a smokehouse. L. monocytogenes was not detected in a freshwater stream, but prevalence increased with the degree of human activity: 2% in seawater fish farms, 10% in freshwater fish farms, 16% in fish slaughterhouses, and 68% in a fish smokehouse. The fish farms and slaughterhouses processed Danish rainbow trout, whereas the smokehouse was used for farm-raised Norwegian salmon. No variation with season was observed. Inside the processing plants, the pattern of randomly amplified polymorphic DNA (RAPD) types was homogeneous, but greater diversity existed among isolates from the outside environments. The RAPD type dominating the inside of the fish smokehouse was found only sporadically in outside environments. To examine survival in different environments, L. monocytogenes or Listeria innocua strains were inoculated into freshwater and saltwater microcosms. Pathogen counts decreased over time in Instant Ocean and remained constant in phosphate-buffered saline. In contrast, counts decreased rapidly in natural seawater and fresh water. The count reduction was much slower when the natural waters were autoclaved or filtered (0.2-microm pore size), indicating that the pathogen reduction in natural waters was attributable to a biological mechanism, e.g., protozoan grazing. A low prevalence of L. monocytogenes was found in the outside environment, and the bacteria did not survive well in natural environments. Therefore, L. monocytogenes in the outer

  10. A model for dispersion of contaminants in the subway environment

    SciTech Connect

    Coke, L. R.; Sanchez, J. G.; Policastro, A. J.

    2000-05-03

    Although subway ventilation has been studied extensively, very little has been published on dispersion of contaminants in the subway environment. This paper presents a model that predicts dispersion of contaminants in a complex subway system. It accounts for the combined transient effects of train motion, station airflows, train car air exchange rates, and source release properties. Results are presented for a range of typical subway scenarios. The effects of train piston action and train car air exchange are discussed. The model could also be applied to analyze the environmental impact of hazardous materials releases such as chemical and biological agents.
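
    The sketch below is a deliberately simplified one-dimensional advection-dispersion calculation of a contaminant pulse carried along a tunnel by a piston-driven air flow; it is not the model described above, and the tunnel length, air speed, and dispersion coefficient are hypothetical.

```python
import numpy as np

# Deliberately simplified 1-D advection-dispersion of a unit contaminant
# release carried along a tunnel by a piston-driven air flow (explicit upwind
# advection, central-difference dispersion). This is not the model described
# above; tunnel length, air speed and dispersion coefficient are hypothetical.
L, nx = 1000.0, 500                    # tunnel length [m], number of grid cells
dx = L / nx
u, Dax = 3.0, 5.0                      # air speed [m/s], axial dispersion [m^2/s]
dt = 0.4 * min(dx / u, dx * dx / (2.0 * Dax))
c = np.zeros(nx)
c[nx // 10] = 1.0                      # unit release near the upstream end
for _ in range(1000):
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = Dax * (np.roll(c, 1) - 2.0 * c + np.roll(c, -1)) / dx**2
    c = c + dt * (adv + dif)           # periodic ends are kept purely for brevity
print("peak concentration is now near x =", round(c.argmax() * dx, 1), "m")
```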

  11. Comprehensive computational model for thermal plasma processing

    NASA Astrophysics Data System (ADS)

    Chang, C. H.

    A new numerical model is described for simulating thermal plasmas containing entrained particles, with emphasis on plasma spraying applications. The plasma is represented as a continuum multicomponent chemically reacting ideal gas, while the particles are tracked as discrete Lagrangian entities coupled to the plasma. The overall computational model is embodied in a new computer code called LAVA. Computational results are presented from a transient simulation of alumina spraying in a turbulent argon-helium plasma jet in an air environment, including torch geometry, substrate, and multiple species with chemical reactions. Plasma-particle interactions including turbulent dispersion have been modeled in a fully self-consistent manner.

  12. Sediment connectivity: addressing the non-linearity of erosional processes within spatially and temporally variable environments

    NASA Astrophysics Data System (ADS)

    Turnbull, Laura; Bracken, Louise; Wainwright, John

    2014-05-01

    A major challenge for geomorphologists is to scale up small-magnitude erosional processes to predict landscape form and landscape-scale sediment flux. Here, we present a sediment connectivity framework, showing the controls and dynamics of sediment transport which govern erosional processes across multiple scales. This framework is based on the concept that the interplay of structural components (morphology) and process components (flow of energy/transport vectors and materials) determines the long-term behaviour of the sediment flux, which is manifest as a change in landform. The sediment connectivity framework therefore incorporates all aspects of the geomorphic system that control sediment flux. Because of the link between process (flux) and form, sediment connectivity is a product of sediment entrainment and sediment-transport distance and the emergent characteristics of sediment deposition and sediment residence times. Therefore, depending on the dominant processes in operation and their spatial and temporal configuration, the scaling of erosion differs in form and extent. Sediment-transport distances are an integral component of this sediment connectivity framework, as they provide a means of addressing the non-linearity of erosional processes within spatially and temporally variable environments. We apply this sediment-connectivity framework to test how structural and process components of a system alter sediment flux. Specifically, we use a modelling-based approach to investigate how antecedent soil-moisture content and rainfall characteristics affect hydrological and sediment connectivity over a shrub-encroachment gradient in the southwest USA, a region that is undergoing rapid vegetation transitions. We carried out scenario-based runoff and erosion modelling using MAHLERAN to investigate the response of runoff and erosion to changes in soil moisture and rainfall characteristics. Using outputs from these simulations, we quantify hydrological and sediment

  13. Space Environment Effects: Low-Altitude Trapped Radiation Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Pfitzer, K. A.

    1998-01-01

    Accurate models of the Earth's trapped energetic proton environment are required for both piloted and robotic space missions. For piloted missions, the concern is mainly total dose to the astronauts, particularly in long-duration missions and during extravehicular activity (EVA). As astronomical and remote-sensing detectors become more sensitive, the proton flux can induce unwanted backgrounds in these instruments. Due to this unwanted background, the following description details the development of a new model for the low-altitude trapped proton environment. The model is based on nearly 20 years of data from the TIROS/NOAA weather satellites. The model, which has been designated NOAAPRO (for NOAA protons), predicts the integral omnidirectional proton flux in three energy ranges: >16, >36, and >80 MeV. It contains a true solar cycle variation and accounts for the secular variation in the Earth's magnetic field. It also extends to lower values of the magnetic L parameter than does AP8. Thus, the model addresses the major shortcomings of AP8.

  14. Gene-Environment Processes Linking Aggression, Peer Victimization, and the Teacher-Child Relationship

    ERIC Educational Resources Information Center

    Brendgen, Mara; Boivin, Michel; Dionne, Ginette; Barker, Edward D.; Vitaro, Frank; Girard, Alain; Tremblay, Richard; Perusse, Daniel

    2011-01-01

    Aggressive behavior in middle childhood is at least partly explained by genetic factors. Nevertheless, estimations of simple effects ignore possible gene-environment interactions (G x E) or gene-environment correlations (rGE) in the etiology of aggression. The present study aimed to simultaneously test for G x E and rGE processes between…

  15. A Delineation of the Cognitive Processes Manifested in a Social Annotation Environment

    ERIC Educational Resources Information Center

    Li, S. C.; Pow, J. W. C.; Cheung, W. C.

    2015-01-01

    This study aims to examine how students' learning trajectories progress in an online social annotation environment, and how their cognitive processes and levels of interaction correlate with their learning outcomes. Three different types of activities (cognitive, metacognitive and social) were identified in the online environment. The time…

  16. Radiation Belt Environment Model: Application to Space Weather and Beyond

    NASA Technical Reports Server (NTRS)

    Fok, Mei-Ching H.

    2011-01-01

    Understanding the dynamics and variability of the radiation belts is of great scientific and space weather significance. A physics-based Radiation Belt Environment (RBE) model has been developed to simulate and predict the radiation particle intensities. The RBE model considers the influences from the solar wind, ring current and plasmasphere. It takes into account the particle drift in realistic, time-varying magnetic and electric fields, and includes diffusive effects of wave-particle interactions with various wave modes in the magnetosphere. The RBE model has been used to perform event studies and real-time prediction of energetic electron fluxes. In this talk, we will describe the RBE model equation, inputs and capabilities. Recent advancements in space weather applications and artificial radiation belt studies will be discussed as well.

  17. Modelling foraging ants in a dynamic and confined environment.

    PubMed

    Bandeira de Melo, Elton B; Araújo, Aluízio F R

    2011-04-01

    In social insects, the superposition of simple individual behavioral rules leads to the emergence of complex collective patterns and helps solve difficult problems inherent to surviving in hostile habitats. Modelling ant colony foraging reveals strategies arising from the insects' self-organization and helps develop new computational strategies for solving complex problems. This paper presents advances in modelling ants' behavior when foraging in a confined and dynamic environment, based on experiments with the Argentine ant Linepithema humile in a relatively complex artificial network. We propose a model which overcomes the problem of stagnation observed in earlier models by taking into account additional biological aspects, by using non-linear functions for the deposit, perception and evaporation of pheromone, and by introducing new mechanisms to represent randomness and the exploratory behavior of the ants. PMID:21236313
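
    A minimal sketch of the kind of non-linear pheromone dynamics involved is given below for a single two-branch junction, using a Deneubourg-style choice function with deposit and evaporation; the functional forms and parameter values are illustrative and are not those of the model proposed in the paper.

```python
import numpy as np

# Minimal sketch of non-linear pheromone dynamics at a single two-branch
# junction, using a Deneubourg-style choice function with deposit and
# evaporation; the functional forms and parameters are illustrative and are
# not those of the model proposed in the paper.
rng = np.random.default_rng(1)
tau = np.array([0.1, 0.1])      # pheromone concentration on branches A and B
k, alpha = 5.0, 2.0             # perception threshold and choice non-linearity
rho, q = 0.02, 1.0              # evaporation rate per ant passage, deposit per ant
for _ in range(2000):
    p_a = (k + tau[0])**alpha / ((k + tau[0])**alpha + (k + tau[1])**alpha)
    branch = 0 if rng.random() < p_a else 1
    tau[branch] += q            # the chosen branch is reinforced
    tau *= (1.0 - rho)          # evaporation keeps the trails from saturating
print("final trail strengths (A, B):", tau.round(2))
```

    With the non-linear choice exponent above, one branch is eventually amplified at the expense of the other, which is the basic positive-feedback mechanism such foraging models build on.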

  18. Precipitates/Salts Model Calculations for Various Drift Temperature Environments

    SciTech Connect

    P. Marnier

    2001-12-20

    The objective and scope of this calculation is to assist Performance Assessment Operations and the Engineered Barrier System (EBS) Department in modeling the geochemical effects of evaporation within a repository drift. This work is developed and documented using procedure AP-3.12Q, Calculations, in support of ''Technical Work Plan For Engineered Barrier System Department Modeling and Testing FY 02 Work Activities'' (BSC 2001a). The primary objective of this calculation is to predict the effects of evaporation on the abstracted water compositions established in ''EBS Incoming Water and Gas Composition Abstraction Calculations for Different Drift Temperature Environments'' (BSC 2001c). A secondary objective is to predict evaporation effects on observed Yucca Mountain waters for subsequent cement interaction calculations (BSC 2001d). The Precipitates/Salts model is documented in an Analysis/Model Report (AMR), ''In-Drift Precipitates/Salts Analysis'' (BSC 2001b).

  19. Threshold dynamics of a malaria transmission model in periodic environment

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Teng, Zhidong; Zhang, Tailei

    2013-05-01

    In this paper, we propose a malaria transmission model with periodic environment. The basic reproduction number R0 is computed for the model and it is shown that the disease-free periodic solution of the model is globally asymptotically stable when R0<1, that is, the disease goes extinct when R0<1, while the disease is uniformly persistent and there is at least one positive periodic solution when R0>1. It indicates that R0 is the threshold value determining the extinction and the uniform persistence of the disease. Finally, some examples are given to illustrate the main theoretical results. The numerical simulations show that, when the disease is uniformly persistent, different dynamic behaviors may be found in this model, such as the global attractivity and the chaotic attractor.
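
    The threshold behaviour described above can be illustrated with a toy seasonally forced Ross-Macdonald-type host-vector model; this is not the model analysed in the paper, and every parameter value below is hypothetical.

```python
import numpy as np

# Toy seasonally forced Ross-Macdonald-type host-vector model integrated with
# a simple Euler scheme to illustrate threshold behaviour (extinction versus
# persistence) in a periodic environment.
def simulate(a0, years=20, dt=0.05):
    b, c = 0.3, 0.3             # infection probabilities per bite (vector->host, host->vector)
    r, mu = 1.0 / 50, 1.0 / 14  # human recovery and mosquito mortality rates [1/day]
    m = 5.0                     # mosquitoes per human
    Ih, Iv, t = 0.01, 0.01, 0.0
    while t < 365.0 * years:
        a = a0 * (1.0 + 0.5 * np.sin(2.0 * np.pi * t / 365.0))  # seasonal biting rate
        dIh = m * a * b * Iv * (1.0 - Ih) - r * Ih
        dIv = a * c * Ih * (1.0 - Iv) - mu * Iv
        Ih, Iv, t = Ih + dt * dIh, Iv + dt * dIv, t + dt
    return Ih

print("low biting rate, infected fraction :", f"{simulate(0.05):.2e}")   # dies out
print("high biting rate, infected fraction:", f"{simulate(0.30):.2f}")   # persists
```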

  20. Merging Ultrasonic Sensor Readings Into A Consistent Environment Model

    NASA Astrophysics Data System (ADS)

    Peremans, Herbert; van Campenhout, Jan M.

    1990-03-01

    The algorithm presented in this paper constructs a geometric model of the environment using ultrasonic sensors. To do this in a reliable way, it has to take different error sources into account. Unlike other approaches, where a low-level, pixel based, probabilistic model is constructed to represent the uncertainty arising from false measurements, a high level, geometric, model is constructed. It is shown that a high level model, besides being faster to construct, is more appropriate for taking into account the typical characteristics of ultrasonic sensors. The algorithm detects and eliminates inconsistent measurements by combining evidence gathered from different points of view. This is made possible by extracting from the measurements not only information concerning the position of obstacles, but also information about regions that must be empty when seen from a certain angle. To conclude, some examples of the behaviour of this algorithm in real-world situations are presented.

  1. Predicting Material Performance in the Space Environment from Laboratory Test Data, Static Design Environments, and Space Weather Models

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Edwards, David L.

    2008-01-01

    Qualifying materials for use in the space environment is typically accomplished with laboratory exposures to simulated UV/EUV, atomic oxygen, and charged particle radiation environments with in-situ or subsequent measurements of material properties of interest to the particular application. The choice of environment exposure levels is derived from static design environments intended to represent either mean or extreme conditions that are anticipated to be encountered during a mission. The real space environment, however, is quite variable. Predictions of the on-orbit performance of a material qualified to laboratory environments can be done using information on 'space weather' variations in the real environment. This presentation will first review the variability of space environments of concern for material degradation and then demonstrate techniques for using test data to predict material performance in a variety of space environments from low Earth orbit to interplanetary space using historical measurements and space weather models.

  2. The effects of physical environments in medical wards on medication communication processes affecting patient safety.

    PubMed

    Liu, Wei; Manias, Elizabeth; Gerdtz, Marie

    2014-03-01

    Physical environments of clinical settings play an important role in health communication processes. Effective medication management requires seamless communication among health professionals of different disciplines. This paper explores how physical environments affect communication processes for managing medications and patient safety in acute care hospital settings. Findings highlighted the impact of environmental interruptions on communication processes about medications. In response to frequent interruptions and limited space within working environments, nurses, doctors and pharmacists developed adaptive practices in the local clinical context. Communication difficulties were associated with the ward physical layout, the controlled drug key and the medication retrieving device. Health professionals should be provided with opportunities to discuss the effects of ward environments on medication communication processes and how this impacts medication safety. Hospital administrators and architects need to consider health professionals' views and experiences when designing hospital spaces. PMID:24486620

  3. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for environment requires consideration of several indexes of environmental impact including ozone depletion and global warming potentials, human and aquatic toxicity, and photochemical oxidation, and acid rain potentials. Current methodologies like t...

  4. Explicitly representing soil microbial processes in Earth system models

    NASA Astrophysics Data System (ADS)

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.; Georgiou, Katerina; Hararuk, Oleksandra; He, Yujie; Hopkins, Francesca; Luo, Yiqi; Smith, Matthew J.; Sulman, Benjamin; Todd-Brown, Katherine; Wang, Ying-Ping; Xia, Jianyang; Xu, Xiaofeng

    2015-10-01

    Microbes influence soil organic matter decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) will make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models, we suggest the following: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.
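
    As a minimal sketch in the spirit of the simple microbial-explicit formulations the roadmap discusses, the snippet below integrates a two-pool soil carbon model with forward Michaelis-Menten uptake; pool structure, inputs, and parameter values are illustrative, not an ESM scheme.

```python
# Minimal two-pool microbial-explicit decomposition sketch: soil organic carbon
# (C) is consumed by microbial biomass (B) via forward Michaelis-Menten uptake
# with a fixed carbon-use efficiency, and dead microbes return to the SOC pool.
# Parameter values and pool structure are illustrative, not an ESM scheme.
def step(C, B, dt=0.1):
    Vmax, Km = 0.1, 300.0   # max specific uptake [1/day], half-saturation [gC/m2]
    CUE, kB = 0.4, 0.02     # carbon-use efficiency [-], microbial turnover [1/day]
    litter_in = 3.0         # litter carbon input [gC/m2/day]
    uptake = Vmax * B * C / (Km + C)
    dC = litter_in - uptake + kB * B
    dB = CUE * uptake - kB * B
    return C + dt * dC, B + dt * dB

C, B = 1000.0, 20.0         # initial SOC and microbial biomass [gC/m2]
for _ in range(int(100 * 365 / 0.1)):   # integrate ~100 years toward steady state
    C, B = step(C, B)
print(f"near steady state: SOC ~ {C:.0f} gC/m2, microbial biomass ~ {B:.1f} gC/m2")
```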

  5. Explicitly Representing Soil Microbial Processes In Earth System Models

    SciTech Connect

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.; Georgiou, Katrina; Hararuk, Oleksandra; He, Yujie; Hopkins, Francesca; Luo, Yiqi; Smith, Mathew J.; Sulman, Benjamin; Todd-Brown, Katherine EO; Wang, Ying-Ping; Xia, Jianyang; Xu, Xiaofeng

    2015-10-26

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  6. Forest Canopy Processes in a Regional Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Makar, Paul; Staebler, Ralf; Akingunola, Ayodeji; Zhang, Junhua; McLinden, Chris; Kharol, Shailesh; Moran, Michael; Robichaud, Alain; Zhang, Leiming; Stroud, Craig; Pabla, Balbir; Cheung, Philip

    2016-04-01

    Forest canopies have typically been absent or highly parameterized in regional chemical transport models. Some forest-related processes are often considered: for example, biogenic emissions from the forests are included as a flux lower boundary condition on vertical diffusion, as is deposition to vegetation. However, real forest canopies comprise a much more complicated set of processes, at scales below the "transport model-resolved scale" of vertical levels usually employed in regional transport models. Advective and diffusive transport within the forest canopy typically scale with the height of the canopy, and the former process tends to dominate over the latter. Emissions of biogenic hydrocarbons arise from the foliage, which may be located tens of metres above the surface, while emissions of biogenic nitric oxide from decaying plant matter are located at the surface, in contrast to the surface flux boundary condition usually employed in chemical transport models. Deposition, similarly, is usually parameterized as a flux boundary condition, but may be differentiated between fluxes to vegetation and fluxes to the surface when the canopy scale is considered. The chemical environment also changes within forest canopies: shading, temperature, and relative humidity changes with height within the canopy may influence chemical reaction rates. These processes have been observed in a host of measurement studies, and have been simulated using site-specific one-dimensional forest canopy models. Their influence on regional scale chemistry has been unknown, until now. In this work, we describe the results of the first attempt to include complex canopy processes within a regional chemical transport model (GEM-MACH). The original model core was subdivided into "canopy" and "non-canopy" subdomains. In the former, three additional near-surface layers based on spatially and seasonally varying satellite-derived canopy height and leaf area index were added to the original model

  7. Modeling and optimum time performance for concurrent processing

    NASA Astrophysics Data System (ADS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-08-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.
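
    One kind of bound such marked-graph models yield can be illustrated by hand: the steady-state time between outputs is bounded below by the maximum cycle ratio of the graph. The graph, compute times, and token counts below are hypothetical and are not taken from the ATAMM examples.

```python
# Illustrative lower bound on steady-state iteration period for a small
# dataflow (marked) graph: the achievable time between outputs is bounded
# below by the maximum, over directed cycles, of the total node compute time
# in the cycle divided by the number of initial tokens on the cycle.
node_time = {"A": 2.0, "B": 3.0, "C": 1.5, "D": 2.5}   # node compute times

# each entry: (nodes forming a directed cycle, initial tokens on that cycle)
cycles = [
    (["A", "B"], 1),            # feedback edge from B to A carrying one token
    (["A", "B", "C", "D"], 2),  # outer loop with two tokens in flight
]

tbo_lower_bound = max(
    sum(node_time[n] for n in nodes) / tokens for nodes, tokens in cycles
)
print("lower bound on time between outputs (TBO):", tbo_lower_bound)
```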

  8. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  9. Representing Microbial Processes in Environmental Reactive Transport Models

    NASA Astrophysics Data System (ADS)

    van Cappellen, P.

    2009-04-01

    The activities of microorganisms profoundly impact the chemical structure and biogeochemical dynamics of surface and subsurface environments. In the context of reactive transport modeling, a major challenge is to derive, calibrate and validate rate expressions for microbially-mediated reaction processes. This challenge is best met by combining field observations, laboratory experiments and theory. In my presentation, I will illustrate such an integrated approach for the case of microbial respiration in aquatic sediments. Topics that will be dealt with are model consistency, interpretation of experimental data, bioenergetics, transient behavior and model performance.

  10. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685
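
    The neutral baseline against which the excess segregating sites are judged is the Watterson expectation; the short sketch below computes it for an illustrative sample size and population mutation rate.

```python
# Neutral-coalescent baseline for the expected number of segregating sites in
# a sample of n genes, E[S] = theta * sum_{i=1}^{n-1} 1/i (Watterson). The
# point made above is that balancing selection with low inter-allelic mutation
# rates inflates S well above this value; n and theta below are illustrative.
def expected_segregating_sites(n, theta):
    return theta * sum(1.0 / i for i in range(1, n))

n, theta = 20, 5.0          # sample size and population mutation rate 4*N*mu
print("neutral expectation E[S]:", round(expected_segregating_sites(n, theta), 2))
```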

  11. Limitations of Gene × Environment Interaction Models in Psychiatry

    PubMed Central

    Munafò, Marcus R.; Zammit, Stanley; Flint, Jonathan

    2016-01-01

    Background Psychiatric disorders run in families, and early twin, family and adoption studies confirmed that this was due in part to shared genetic inheritance. While candidate gene studies largely failed to reliably identify genetic variants associated with psychiatric disorders, genomewide association studies are beginning to do so. However, the proportion of phenotypic variance explained remains well below what would be expected from previous heritability estimates. Scope We review possible reasons for this “missing heritability”, and in particular whether incorporating gene by environment interactions into our models will substantially improve our understanding of the aetiology of psychiatric disorders, and inform clinical perceptions and practice. Findings We discuss potential limitations of the gene by environment interaction approach. In particular, we discuss whether these are likely to be a major contributor to psychiatric disorders at the level of the specific interaction (as opposed to at an aggregate level). Conclusions Gene by environment interaction studies offered initial promise that a far greater proportion of phenotypic variance could be explained by incorporating measures of environmental exposures into genetic studies. However, in our opinion there are few (if any) clear examples of gene by environment interactions in psychiatry, and their scope for informing either our understanding of disease pathology or clinical practice remains limited at present. PMID:24828285

  12. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world. PMID:11323249

  13. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schrödinger equation to describe the evolution of people's mental states. A shortcoming of Schrödinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
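
    As a minimal, generic sketch of the kind of open-system dynamics referred to here, the snippet below integrates a Lindblad-type master equation for a two-state "mental state", with a Hamiltonian coupling the two alternatives and a dephasing operator standing in for the environmental 'bath'; the operators and rates are illustrative and not taken from the quantum-like decision literature.

```python
import numpy as np

# Lindblad-type master equation for a two-state density matrix rho:
# drho/dt = -i[H, rho] + gamma*(L rho L^dag - 1/2 {L^dag L, rho}),
# integrated with a simple Euler scheme.
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
H, L, gamma = 0.5 * sx, sz, 0.2          # Hamiltonian, jump operator, decoherence rate
rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)   # start fully in option |0>
dt = 0.01
for _ in range(1000):
    commutator = -1j * (H @ rho - rho @ H)
    dissipator = gamma * (L @ rho @ L.conj().T
                          - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
    rho = rho + dt * (commutator + dissipator)
print("probability of option 0 after decoherence:", round(rho[0, 0].real, 3))
```

    The dephasing term damps the off-diagonal coherences while the Hamiltonian drives oscillation between the alternatives, so the populations relax toward a stationary mixture rather than oscillating indefinitely.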

  14. Evolution of quantum-like modeling in decision making processes

    SciTech Connect

    Khrennikova, Polina

    2012-12-18

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so called 'quantum like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of Schroedinger equation to describe the evolution of people's mental states. A shortcoming of Schroedinger equation is its inability to capture dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling the human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  15. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.

  16. Processing of Soot in an Urban Environment: Case Study from the Mexico City Metropolitan Area

    SciTech Connect

    Johnson, Kirsten S.; Zuberi, Bilal M.; Molina, Luisa; Molina, Mario J.; Iedema, Martin J.; Cowin, James P.; Gaspar, Daniel J.; Wang, Chong M.; Laskin, Alexander

    2005-11-14

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2–2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy dispersed X-ray spectroscopy, and secondary ionization mass spectrometry show that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  17. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    SciTech Connect

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer; Ronald L. Boring; Alan R. Mecham

    2007-08-01

    3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing an understanding of human actions, movements and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where they have the potential to support more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels, and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load and increased complexity), call for further research. Baseline assessments of novel control room equipment and configurations need to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper discusses recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool for next generation control room human factors and ergonomic assessment. Though this discussion primarily focuses on control room design, such a merger between 3D visualization and cognitive modeling can be extended to other areas such as training and scenario planning.

  18. Method of moment solutions to scattering problems in a parallel processing environment

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Partee, Jonathan; Patterson, Jean

    1991-01-01

    This paper describes the implementation of a parallelized method of moments (MOM) code into an interactive workstation environment. The workstation allows interactive solid body modeling and mesh generation, MOM analysis, and the graphical display of results. After describing the parallel computing environment, the implementation and results of parallelizing a general MOM code are presented in detail.

  19. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
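
    A back-of-the-envelope Monte Carlo sketch of the kind of trade-off such a simulation model lets a manager quantify is given below; it is not the PATT / IEEE 12207 model described above, and the effort figures and rework penalties are hypothetical.

```python
import numpy as np

# Monte Carlo sketch contrasting a single waterfall pass with a spiral process
# that absorbs requirement changes at each iteration. All numbers are
# hypothetical, purely to illustrate the cost/flexibility trade-off discussed.
rng = np.random.default_rng(42)
n = 10_000                                   # Monte Carlo samples

def waterfall():
    base = rng.normal(100.0, 10.0, n)        # nominal effort [person-months]
    late_changes = rng.poisson(2.0, n)       # requirement changes discovered late
    return base + 25.0 * late_changes        # late changes force costly rework

def spiral():
    base = rng.normal(110.0, 10.0, n)        # overhead of repeated risk/design cycles
    changes = rng.poisson(2.0, n)            # the same changes, absorbed early
    return base + 5.0 * changes              # each change is much cheaper to handle

print("waterfall mean effort:", round(waterfall().mean(), 1))
print("spiral    mean effort:", round(spiral().mean(), 1))
```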

  20. The (Mathematical) Modeling Process in Biosciences

    PubMed Central

    Torres, Nestor V.; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology. PMID:26734063

  1. A real-time, interactive steering environment for integrated ground water modeling.

    PubMed

    Li, Shu-Guang; Liu, Qun

    2006-01-01

    We present in this note an innovative software environment, called Interactive Ground Water (IGW), for unified deterministic and stochastic ground water modeling. Based on efficient computational algorithms, IGW allows simulating three-dimensional (3D) unsteady flow and transport in saturated media subject to systematic and "random" stresses and geological and chemical heterogeneity. Adopting a new computing paradigm, IGW eliminates the fragmentation in the traditional modeling schemes and allows fully utilizing today's dramatically increased computing power. For many problems, IGW enables real-time modeling, visualization, mapping, and analysis. The software environment functions as a "numerical laboratory" in which an investigator may freely explore the following: creating visually an aquifer system of desired configurations, interactively applying stresses and boundary conditions, and then investigating and visualizing on the fly the geology and flow and transport dynamics. At any time, a researcher can pause to interact dynamically with virtually any aspect of the modeling process and then resume the integrated visual exploration; he or she can initiate, pause, or resume particle tracking, plume modeling, subscale modeling, stochastic modeling, monitoring, and budget analyses. IGW continually provides results that are dynamically processed, overlaid, and displayed. It dynamically merges modeling inputs and outputs into composite two-dimensional/3D images, integrating related data to provide a more complete view of the complex interplay among the geology, hydrology, flow system, and transport. These unique capabilities of real-time modeling, steering, analysis, and mapping expand the utility of models as tools for research, education, and professional investigations. PMID:16961499

  2. Electromagnetic Devices and Processes in Environment Protection. Proceedings of International Conference ELMECO 1994

    NASA Astrophysics Data System (ADS)

    1994-09-01

    The electrical power industry has contributed substantially to the destruction of the natural environment and that is why electrical engineers are particularly obliged to repair the damage and to reduce the risks as much as possible. For this reason, an approach to train specialists in the field of 'Electromagnetic processes and devices in environment protection' was undertaken by the Department of Fundamental Electrical Engineering, Lublin Technical University in 1990. Resulting research activities created the need to exchange the experience of specialists who deal with this field. The following categories were discussed at the conference and the papers that comprise them are presented within this report: ozone and plasma generators; materials and devices in environment protection; electromagnetic processes in environment protection; electromagnetic fields and devices; and noise and electromagnetic disturbance influence on human environment.

  3. The Epidemic Process and The Contagion Model

    ERIC Educational Resources Information Center

    Worthen, Dennis B.

    1973-01-01

    Goffman's epidemic theory is presented and compared to the contagion theory developed by Menzel. An attempt is made to compare the two models presented and examine their similarities and differences. The conclusion drawn is that the two models are very similar in their approach to understanding communication processes. (14 references) (Author/SJ)

  4. Information-Processing Models of Cognition.

    ERIC Educational Resources Information Center

    Simon, Herbert A.

    1981-01-01

    Reviews recent progress in modeling human cognition, in particular the use of computers in generating models. Topics covered include the information processing approach to cognition, problem solving, semantic memory, pattern induction, and learning and cognitive development. A 164-item reference list is attached. (JL)

  5. Modeling of fluidized bed silicon deposition process

    NASA Technical Reports Server (NTRS)

    Kim, K.; Hsu, G.; Lutwack, R.; Praturi, A. K.

    1977-01-01

    The model is intended for use as a means of improving fluidized bed reactor design and for the formulation of the research program in support of the contracts of Silicon Material Task for the development of the fluidized bed silicon deposition process. A computer program derived from the simple modeling is also described. Results of some sample calculations using the computer program are shown.

  6. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of supporting an integrative representation of transport, transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems

  7. Recent Developments in the Radiation Belt Environment Model

    NASA Technical Reports Server (NTRS)

    Fok, M.-C.; Glocer, A.; Zheng, Q.; Horne, R. B.; Meredith, N. P.; Albert, J. M.; Nagai, T.

    2010-01-01

    The fluxes of energetic particles in the radiation belts are found to be strongly controlled by the solar wind conditions. In order to understand and predict the radiation particle intensities, we have developed a physics-based Radiation Belt Environment (RBE) model that considers the influences from the solar wind, ring current and plasmasphere. Recently, an improved calculation of wave-particle interactions has been incorporated. In particular, the model now includes cross diffusion in energy and pitch-angle. We find that the exclusion of cross diffusion could cause significant overestimation of electron flux enhancement during storm recovery. The RBE model is also connected to MHD fields so that the response of the radiation belts to fast variations in the global magnetosphere can be studied. We are able to reproduce the rapid flux increase during a substorm dipolarization on 4 September 2008. The timing is much shorter than the time scale of wave-associated acceleration.

  8. New V and V Tools for Diagnostic Modeling Environment (DME)

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)

    2002-01-01

    The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.

  9. A network-based training environment: a medical image processing paradigm.

    PubMed

    Costaridou, L; Panayiotakis, G; Sakellaropoulos, P; Cavouras, D; Dimopoulos, J

    1998-01-01

    The capability of interactive multimedia and Internet technologies is investigated with respect to the implementation of a distance learning environment. The system is built according to a client-server architecture, based on the Internet infrastructure, composed of server nodes conceptually modelled as WWW sites. Sites are implemented by customization of available components. The environment integrates network-delivered interactive multimedia courses, network-based tutoring, SIG support, information databases of professional interest, as well as course and tutoring management. This capability has been demonstrated by means of an implemented system, validated with digital image processing content, specifically image enhancement. Image enhancement methods are theoretically described and applied to mammograms. Emphasis is given to the interactive presentation of the effects of algorithm parameters on images. The system end-user access depends on available bandwidth, so high-speed access can be achieved via LAN or local ISDN connections. Network based training offers new means of improved access and sharing of learning resources and expertise, as promising supplements in training. PMID:9922949
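
    As a minimal sketch of one enhancement method such a module might present interactively, the snippet below performs global histogram equalization of an 8-bit image; it is applied to a synthetic low-contrast test image rather than a mammogram, and is only one of many enhancement techniques the cited environment could cover.

```python
import numpy as np

# Global histogram equalization of an 8-bit image: build the intensity
# histogram, convert it to a cumulative distribution, and use it as a lookup
# table that stretches the occupied intensity range across 0-255.
rng = np.random.default_rng(0)
img = rng.normal(120.0, 10.0, (256, 256)).clip(0, 255).astype(np.uint8)  # low contrast
hist = np.bincount(img.ravel(), minlength=256)
cdf = hist.cumsum() / img.size                    # cumulative distribution of intensities
lut = np.round(255.0 * cdf).astype(np.uint8)      # intensity remapping table
enhanced = lut[img]                               # apply the lookup table pixel-wise
print("input  range:", img.min(), "-", img.max())
print("output range:", enhanced.min(), "-", enhanced.max())
```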

  10. Reservoir and contaminated sediments impacts in high-Andean environments: Morphodynamic interactions with biogeochemical processes

    NASA Astrophysics Data System (ADS)

    Escauriaza, C. R.; Contreras, M. T.; Müllendorff, D. A.; Pasten, P.; Pizarro, G. E.

    2014-12-01

    Rapid changes due to anthropic interventions in high-altitude environments, such as the Altiplano region in South America, require new approaches to understand the connections between physical and biogeochemical processes. Alterations of the water quality linked to the river morphology can affect the ecosystems and human development in the long-term. The future construction of a reservoir in the Lluta river, located in northern Chile, will change the spatial distribution of arsenic-rich sediments, which can have significant effects on the lower parts of the watershed. In this investigation we develop a coupled numerical model to predict and evaluate the interactions between morphodynamic changes in the Lluta reservoir, and conditions that can potentially desorb arsenic from the sediments. Assuming that contaminants are mobilized under anaerobic conditions, we calculate the oxygen concentration within the sediments to study the interactions of the delta progradation with the potential arsenic release. This work provides a framework for future studies aimed at analyzing the complex connections between morphodynamics and water quality, when contaminant-rich sediments accumulate in a reservoir. The tool can also help to design effective risk management and remediation strategies in these extreme environments. Research has been supported by Fondecyt grant 1130940 and CONICYT/FONDAP Grant 15110017

  11. Database integration in a multimedia-modeling environment

    SciTech Connect

    Dorow, Kevin E.

    2002-09-02

    Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to allow modelers and database owners to collaborate by defining this metadata in such a way that allows interaction between their components. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution / data retrieval, and a central clearing house for metadata and modeling / database resources. Cross-platform compatibility (using Java) and standard communications protocols (http / https) allow these parts to run in a wide variety of computing environments (Local Area Networks, Internet, etc.), and, therefore, this framework provides many benefits. Because of the specific data relationships described in the metadata, the amount of data that have to be transferred is kept to a minimum (only the data that fulfill a specific request are provided as opposed to transferring the complete contents of a data source). This allows for real-time data extraction from the actual source. Also, the framework sets up collaborative responsibilities such that the different types of participants have control over the areas in which they have domain knowledge-the modelers are responsible for defining the data relevant to their models, while the database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism allows for the ability to control access to the data and what data are made available.
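
    As a hypothetical sketch of the metadata idea described above, the fragment below shows how a modeler-declared list of parameters could be mapped by a database owner to remote tables and columns, from which the framework builds a minimal extraction query. The mapping format, class and function names, and the SQL shape are all invented for illustration and are not the framework's actual schema or API.

    ```python
    # Invented illustration: modeler-side parameter names mapped to remote
    # database columns, and the minimal per-table extraction statements.
    from dataclasses import dataclass

    @dataclass
    class FieldMapping:
        model_parameter: str   # name the model expects, e.g. "soil_ph"
        table: str             # remote table holding the value
        column: str            # remote column name
        units: str             # units stored in the source

    def build_extraction_sql(mappings, site_id_column, site_id):
        """Build SELECTs that retrieve only the mapped columns for one site."""
        by_table = {}
        for m in mappings:
            by_table.setdefault(m.table, []).append(m)
        statements = []
        for table, cols in by_table.items():
            select_list = ", ".join(f"{c.column} AS {c.model_parameter}" for c in cols)
            statements.append(
                f"SELECT {select_list} FROM {table} WHERE {site_id_column} = {site_id};")
        return statements

    mappings = [
        FieldMapping("soil_ph", "site_chemistry", "ph_field", "pH units"),
        FieldMapping("arsenic_ppm", "site_chemistry", "as_total", "mg/kg"),
        FieldMapping("depth_to_water", "hydrology", "dtw_m", "m"),
    ]
    for sql in build_extraction_sql(mappings, "site_id", 42):
        print(sql)
    ```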

  12. The Integration of Word Processing with Data Processing in an Educational Environment. Final Report.

    ERIC Educational Resources Information Center

    Patterson, Lorna; Schlender, Jim

    A project examined the Office of the Future and determined trends regarding an integration of word processing and data processing. It then sought to translate those trends into an educational package to develop the potential information specialist. A survey instrument completed by 33 office managers and word processing and data processing…

  13. Quantitative modeling of soil genesis processes

    NASA Technical Reports Server (NTRS)

    Levine, E. R.; Knox, R. G.; Kerber, A. G.

    1992-01-01

    For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.

  14. Using Perspective to Model Complex Processes

    SciTech Connect

    Kelsey, R.L.; Bisset, K.R.

    1999-04-04

    The notion of perspective, when supported in an object-based knowledge representation, can facilitate better abstractions of reality for modeling and simulation. The object modeling of complex physical and chemical processes is made more difficult in part due to the poor abstractions of state and phase changes available in these models. The notion of perspective can be used to create different views to represent the different states of matter in a process. These techniques can lead to a more understandable model. Additionally, the ability to record the progress of a process from start to finish is problematic. It is desirable to have a historic record of the entire process, not just the end result of the process. A historic record should facilitate backtracking and re-start of a process at different points in time. The same representation structures and techniques can be used to create a sequence of process markers to represent a historic record. By using perspective, the sequence of markers can have multiple and varying views tailored for a particular user's context of interest.
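
    A toy sketch of the two ideas above, using invented class names: an object that exposes different perspective views of the same material as its state changes, and a list of process markers that records the history of a run so it can be inspected or restarted part-way through.

    ```python
    # Invented illustration of perspectives and process markers.
    from copy import deepcopy

    class Material:
        def __init__(self, name, temperature, phase):
            self.name, self.temperature, self.phase = name, temperature, phase

        def view(self, perspective):
            """Return only the attributes relevant to a given perspective."""
            if perspective == "thermal":
                return {"temperature": self.temperature}
            if perspective == "phase":
                return {"phase": self.phase}
            return vars(self)

    class Process:
        def __init__(self, material):
            self.material = material
            self.markers = []                    # historic record of the run

        def step(self, label, **changes):
            for attr, value in changes.items():
                setattr(self.material, attr, value)
            self.markers.append((label, deepcopy(vars(self.material))))

        def restart_from(self, index):
            """Restore the material to the state recorded at marker `index`."""
            _, state = self.markers[index]
            self.material = Material(**state)

    p = Process(Material("water", temperature=20.0, phase="liquid"))
    p.step("heat to boiling", temperature=100.0)
    p.step("vaporize", phase="gas")
    print(p.material.view("phase"))              # {'phase': 'gas'}
    p.restart_from(0)                            # back-track to the first marker
    ```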

  15. Parabolic Anderson Model in a Dynamic Random Environment: Random Conductances

    NASA Astrophysics Data System (ADS)

    Erhard, D.; den Hollander, F.; Maillard, G.

    2016-06-01

    The parabolic Anderson model is defined as the partial differential equation ∂u(x,t)/∂t = κΔu(x,t) + ξ(x,t)u(x,t), x ∈ ℤ^d, t ≥ 0, where κ ∈ [0, ∞) is the diffusion constant, Δ is the discrete Laplacian, and ξ is a dynamic random environment that drives the equation. The initial condition u(x,0) = u_0(x), x ∈ ℤ^d, is typically taken to be non-negative and bounded. The solution of the parabolic Anderson equation describes the evolution of a field of particles performing independent simple random walks with binary branching: particles jump at rate 2dκ, split into two at rate ξ ∨ 0, and die at rate (-ξ) ∨ 0. In earlier work we looked at the Lyapunov exponents λ_p(κ) = lim_{t→∞} (1/t) log E([u(0,t)]^p)^{1/p}, p ∈ ℕ, and λ_0(κ) = lim_{t→∞} (1/t) log u(0,t). For the former we derived quantitative results on the κ-dependence for four choices of ξ: space-time white noise, independent simple random walks, the exclusion process and the voter model. For the latter we obtained qualitative results under certain space-time mixing conditions on ξ. In the present paper we investigate what happens when κΔ is replaced by Δ^𝓚, where 𝓚 = {𝓚(x, y) : x, y ∈ ℤ^d, x ~ y} is a collection of random conductances between neighbouring sites replacing the constant conductances κ in the homogeneous model. We show that the associated annealed Lyapunov exponents λ_p(𝓚), p ∈ ℕ, are given by the formula λ_p(𝓚) = sup{λ_p(κ) : κ ∈ Supp(𝓚)}, where, for a fixed realisation of 𝓚, Supp(𝓚) is the set of values taken by the 𝓚-field. We also show that for the associated quenched Lyapunov exponent λ_0(𝓚) this formula only provides a lower bound, and we conjecture that an upper bound holds when Supp(𝓚) is replaced by its convex hull. Our proof is valid for three classes of reversible ξ, and for all 𝓚

  16. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and super storm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea ice free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets, storyboards, and teacher follow-up materials associated with the simulations are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  17. Modeling the VARTM Composite Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Loos, Alfred C.; Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal

    2004-01-01

    A comprehensive simulation model of the Vacuum Assisted Resin Transfer Molding (VARTM) composite manufacturing process has been developed. For isothermal resin infiltration, the model incorporates submodels which describe cure of the resin and changes in resin viscosity due to cure, resin flow through the reinforcement preform and distribution medium, and compaction of the preform during the infiltration. The accuracy of the model was validated by measuring the flow patterns during resin infiltration of flat preforms. The modeling software was used to evaluate the effects of the distribution medium on resin infiltration of a flat preform. Different distribution medium configurations were examined using the model and the results were compared with data collected during resin infiltration of a carbon fabric preform. The results of the simulations show that the approach used to model the distribution medium can significantly affect the predicted resin infiltration times. Resin infiltration into the preform can be accurately predicted only when the distribution medium is modeled correctly.
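
    For a rough sense of the infiltration-time behaviour such models capture, the classical one-dimensional Darcy's-law estimate for constant-pressure injection into a preform is t_fill = φ μ L² / (2 K ΔP). The sketch below evaluates this textbook estimate with invented, order-of-magnitude property values; it is not the simulation model described above.

    ```python
    # Back-of-the-envelope 1-D Darcy's-law fill time for constant-pressure
    # resin infiltration; all property values are invented placeholders.
    def vartm_fill_time(porosity, viscosity_pa_s, length_m, permeability_m2, delta_p_pa):
        """t_fill = phi * mu * L^2 / (2 * K * dP), in seconds."""
        return porosity * viscosity_pa_s * length_m**2 / (2.0 * permeability_m2 * delta_p_pa)

    t = vartm_fill_time(porosity=0.5,
                        viscosity_pa_s=0.2,        # resin viscosity, Pa*s
                        length_m=0.5,              # flow length, m
                        permeability_m2=2e-10,     # preform permeability, m^2
                        delta_p_pa=1.0e5)          # vacuum pressure difference, Pa
    print(f"estimated fill time: {t/60:.1f} min")  # ~10.4 min with these placeholders
    ```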

  18. The deterministic SIS epidemic model in a Markovian random environment.

    PubMed

    Economou, Antonis; Lopez-Herrero, Maria Jesus

    2016-07-01

    We consider the classical deterministic susceptible-infective-susceptible epidemic model, where the infection and recovery rates depend on a background environmental process that is modeled by a continuous time Markov chain. This framework is able to capture several important characteristics that appear in the evolution of real epidemics in large populations, such as seasonality effects and environmental influences. We propose computational approaches for the determination of various distributions that quantify the evolution of the number of infectives in the population. PMID:26515172
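
    The modeling setup can be illustrated with a toy simulation (not the computational approach proposed in the paper): the deterministic SIS equation dI/dt = β I (N - I)/N - γ I, with the pair (β, γ) switching between two environmental states according to a continuous-time Markov chain. All rate values below are invented.

    ```python
    # Toy SIS dynamics with Markov-switching environment; rates are invented.
    import random

    N = 1000.0
    rates = {0: (0.30, 0.10),        # state 0: (infection rate, recovery rate)
             1: (0.12, 0.20)}        # state 1: harsher environment for the pathogen
    switch_rate = {0: 0.05, 1: 0.08} # CTMC transition rates out of each state

    def simulate(I0=10.0, t_end=365.0, dt=0.01, seed=1):
        random.seed(seed)
        t, I, env = 0.0, I0, 0
        next_switch = random.expovariate(switch_rate[env])
        history = []
        while t < t_end:
            if t >= next_switch:                  # environment jumps to the other state
                env = 1 - env
                next_switch = t + random.expovariate(switch_rate[env])
            beta, gamma = rates[env]
            I += dt * (beta * I * (N - I) / N - gamma * I)   # Euler step of the SIS ODE
            t += dt
            history.append((t, env, I))
        return history

    hist = simulate()
    print("final number of infectives:", round(hist[-1][2], 1))
    ```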

  19. A Model for Design of Tailored Working Environment Intervention Programmes for Small Enterprises

    PubMed Central

    Kvorning, Laura V; Rasmussen, Charlotte DN; Smith, Louise H; Flyvholm, Mari-Ann

    2012-01-01

    Objectives Small enterprises have higher exposure to occupational hazards compared to larger enterprises and further, they have fewer resources to control the risks. In order to improve the working environment, development of efficient measures is therefore a major challenge for regulators and other stakeholders. The aim of this paper is to develop a systematic model for the design of tailored intervention programmes meeting the needs of small enterprises. Methods An important challenge for the design process is the transfer of knowledge from one context to another. The concept of realist analysis can provide insight into mechanisms by which intervention knowledge can be transferred from one context to another. We use this theoretical approach to develop a design model. Results The model consists of five steps: 1) Defining occupational health and safety challenges of the target group, 2) selecting methods to improve the working environment, 3) developing theories about mechanisms which motivate the target group, 4) analysing the specific context of the target group for small enterprise programmes including owner-management role, social relations, and the perception of the working environment, and 5) designing the intervention based on the preceding steps. We demonstrate how the design model can be applied in practice by the development of an intervention programme for small enterprises in the construction industry. Conclusion The model provides a useful tool for a systematic design process. The model makes it transparent for both researchers and practitioners as to how existing knowledge can be used in the design of new intervention programmes. PMID:23019530

  20. Silicon EFG process development by multiscale modeling

    NASA Astrophysics Data System (ADS)

    Müller, M.; Birkmann, B.; Mosel, F.; Westram, I.; Seidl, A.

    2010-04-01

    An overview of simulation models in use for optimizing the edge-defined film-fed growth (EFG) process of thin-walled hollow silicon tubes at WACKER SCHOTT Solar is presented. The simulations span the length scales from complete furnace models over growth simulations with a mesoscopic description of the crystalline character of silicon down to solidification simulations with atomic resolution. Results gained from one model are used as input parameters or boundary conditions on other levels. Examples for the application of these models and their impact on process design are given. These include the reduction of tube thickness variations, the control of tube deformations, residual stresses and dislocation densities and the identification of twin formation processes typical for EFG silicon.

  1. Method for modelling sea surface clutter in complicated propagation environments

    NASA Astrophysics Data System (ADS)

    Dockery, G. D.

    1990-04-01

    An approach for predicting clutter levels in complicated propagation conditions using an advanced propagation model and one of several empirical clutter cross-section models is described. Incident power and grazing angle information is obtained using a parabolic equation/Fourier split-step technique to predict the distribution of energy in complicated, range-varying environments. Such environments also require the use of an algorithm that establishes a physically reasonable range-interpolation scheme for the measured refractivity profiles. The reflectivity of the sea surface is represented using a clutter cross-section model that was developed originally by the Georgia Institute of Technology and subsequently modified to include the effects of arbitrary refractive conditions. Predicted clutter power levels generated by the new procedure are compared with clutter measured at 2.9 GHz during propagation experiments conducted at the NASA Wallops Flight Facility on Virginia's Eastern Shore. During these experiments, high-resolution refractivity data were collected in both range and altitude by an instrumented helicopter.
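
    The propagation technique referred to above, the parabolic equation solved by a Fourier split-step march, can be sketched in a few lines. The fragment below propagates a Gaussian aperture field through an invented modified-refractivity profile at 2.9 GHz; absorbing boundaries, terrain, and the clutter cross-section model are omitted, and none of the numbers correspond to the Wallops experiments.

    ```python
    # Minimal split-step Fourier march of the standard parabolic wave equation.
    import numpy as np

    c0 = 3e8
    freq = 2.9e9                        # S-band, as in the measurements
    k0 = 2 * np.pi * freq / c0

    nz, dz, dx = 2048, 0.5, 50.0        # vertical grid spacing (m), range step (m)
    z = np.arange(nz) * dz
    kz = 2 * np.pi * np.fft.fftfreq(nz, d=dz)

    # invented modified-refractivity profile: standard gradient plus a weak duct
    m_units = 330 + 0.118 * z - 20 * np.exp(-z / 40.0)
    n2_minus_1 = 2e-6 * m_units         # (n^2 - 1) ~ 2*M*1e-6

    # Gaussian antenna aperture field at range 0, centred at 30 m height
    u = np.exp(-((z - 30.0) ** 2) / (2 * 10.0 ** 2)).astype(complex)

    diffraction = np.exp(-1j * kz ** 2 * dx / (2 * k0))   # free-space half of each step
    refraction = np.exp(1j * k0 * n2_minus_1 * dx / 2.0)  # environment half of each step

    for _ in range(200):                # march out to 10 km in range
        u = np.fft.ifft(diffraction * np.fft.fft(u))
        u = refraction * u

    print("height of peak field after 10 km:", z[int(np.argmax(np.abs(u)))], "m")
    ```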

  2. Geometrical model for malaria parasite migration in structured environments

    NASA Astrophysics Data System (ADS)

    Battista, Anna; Frischknecht, Friedrich; Schwarz, Ulrich S.

    2014-10-01

    Malaria is transmitted to vertebrates via a mosquito bite, during which rodlike and crescent-shaped parasites, called sporozoites, are injected into the skin of the host. Searching for a blood capillary to penetrate, sporozoites move quickly in locally helical trajectories that are frequently perturbed by interactions with the extracellular environment. Here we present a theoretical analysis of the active motility of sporozoites in a structured environment. The sporozoite is modelled as a self-propelled rod with spontaneous curvature and bending rigidity. It interacts with hard obstacles through collision rules inferred from experimental observation of two-dimensional sporozoite movement in pillar arrays. Our model shows that complex motion patterns arise from the geometrical shape of the parasite and that its mechanical flexibility is crucial for stable migration patterns. Extending the model to three dimensions reveals that a bent and twisted rod can associate to cylindrical obstacles in a manner reminiscent of the association of sporozoites to blood capillaries, supporting the notion of a prominent role of cell shape during malaria transmission.
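
    A much-simplified two-dimensional sketch of this kind of model is given below: a self-propelled point particle with a spontaneous turning rate and angular noise moving through a square array of circular pillars, with a crude slide-along-the-surface collision rule. The parameters and the collision rule are stand-ins for illustration, not the calibrated flexible-rod model of the paper.

    ```python
    # Toy 2-D curved swimmer in a pillar array; parameters are invented.
    import math, random

    v, curvature, noise = 1.0, 0.3, 0.05      # speed, spontaneous turn rate, angular noise
    pillar_radius, spacing = 3.0, 10.0        # pillar geometry (same length units)
    dt, steps = 0.05, 4000

    def nearest_pillar(x, y):
        """Center of the nearest pillar in an infinite square lattice."""
        return round(x / spacing) * spacing, round(y / spacing) * spacing

    random.seed(0)
    x, y, theta = 5.0, 5.0, 0.0
    trajectory = []
    for _ in range(steps):
        theta += curvature * dt + random.gauss(0.0, noise) * math.sqrt(dt)
        x_new = x + v * math.cos(theta) * dt
        y_new = y + v * math.sin(theta) * dt
        cx, cy = nearest_pillar(x_new, y_new)
        d = math.hypot(x_new - cx, y_new - cy)
        if 0.0 < d < pillar_radius:           # collision: push back to the pillar surface
            x_new = cx + (x_new - cx) * pillar_radius / d
            y_new = cy + (y_new - cy) * pillar_radius / d
        x, y = x_new, y_new
        trajectory.append((x, y))
    ```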

  3. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR), and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under Naval Oceanographic Office (NAVOCEANO) funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted the Colorado University's numerical ocean model, known as CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphical User Interface (GUI).

  4. Incorporating process variability into stormwater quality modelling.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2015-11-15

    Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes. This acts as a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study undertaken developed three theoretical scenarios based on research findings that variations in particle size fractions <150 μm and >150 μm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of variability characteristics of pollutant build-up and wash-off processes in stormwater quality models. The research study outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes. PMID:26179783
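
    For orientation, the two classical process equations that such scenarios build on are exponential pollutant build-up over antecedent dry days and exponential wash-off during a storm. The sketch below evaluates these textbook forms with illustrative coefficients; it is not the scenario formulation developed in the study.

    ```python
    # Classical exponential build-up and wash-off; coefficients are illustrative.
    import math

    def buildup(days_dry, b_max=60.0, k_b=0.4):
        """Surface pollutant load (kg/ha) after `days_dry` antecedent dry days."""
        return b_max * (1.0 - math.exp(-k_b * days_dry))

    def washoff(initial_load, rain_mm_per_h, duration_h, k_w=0.02):
        """Load removed (kg/ha) by a storm of given intensity and duration."""
        return initial_load * (1.0 - math.exp(-k_w * rain_mm_per_h * duration_h))

    load = buildup(days_dry=7)                        # ~56 kg/ha with these values
    removed = washoff(load, rain_mm_per_h=20.0, duration_h=2.0)
    print(f"built up {load:.1f} kg/ha, washed off {removed:.1f} kg/ha")
    ```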

  5. Radiation Environment Variations at Mars - Model Calculations and Measurements

    NASA Astrophysics Data System (ADS)

    Saganti, Premkumar; Cucinotta, Francis

    Variations in the space radiation environment due to changes in the GCR (Galactic Cosmic Ray) flux from the past (#23) solar cycle to the current one (#24) have been intriguing in many ways, with an unprecedentedly long duration of the recent solar minimum condition and a very low peak activity of the current solar maximum. Model-calculated radiation data and assessment of variations in the particle flux - protons, alpha particles, and heavy ions of the GCR environment - are essential for understanding radiation risk and for any future intended long-duration human exploration missions. During the past solar cycle, we had a more active and higher solar maximum (2001-2003) condition. In the beginning of the current solar cycle (#24), we experienced a very long solar minimum (2009-2011) condition with a lower peak activity (2013-2014). At Mars, radiation measurements in orbit were obtained (onboard the 2001 Mars Odyssey spacecraft) during the past (#23) solar maximum condition. Radiation measurements on the surface of Mars are currently being made (onboard the Mars Science Laboratory, 2012 - Curiosity) during the current (#24) solar peak activity (August 2012 - present). We present our model-calculated radiation environment at Mars during the solar maxima of solar cycles #23 and #24. We compare our earlier model calculations (Cucinotta et al., J. Radiat. Res., 43, S35-S39, 2002; Saganti et al., J. Radiat. Res., 43, S119-S124, 2002; and Saganti et al., Space Science Reviews, 110, 143-156, 2004) with the most recent radiation measurements on the surface of Mars (2012 - present).

  6. Process models for telehealth: an industrial approach to quality management of distant medical practice.

    PubMed Central

    Kangarloo, H.; Dionisio, J. D.; Sinha, U.; Johnson, D.; Taira, R. K.

    1999-01-01

    Process modeling is explored as an approach for prospectively managing the quality of a telemedicine/telehealth service. This kind of prospective quality management is more appropriate for dynamic health care environments compared to traditional quality assurance programs. A vector model approach has also been developed to match a process model to the needs of a particular site. PMID:10566418

  7. Hydrogeochemical processes in ground water in a tropical karst environment of Southern Mexico

    NASA Astrophysics Data System (ADS)

    Mota, Sandra; Escolero, Oscar

    2015-04-01

    Karstic aquifers are of great strategic importance in many regions of the world. These aquifers belong to carbonate formations which have been affected by fissuration and dissolution (karstification) processes. The specific organization of the flows in this type of aquifer determines the methodologies to be used in its exploration, although much is still unknown about the processes occurring in tropical environments. The overall aim of this research is to identify the hydrogeochemical processes affecting groundwater in the Rio Grande Basin of Comitan, in the state of Chiapas, Mexico. The Rio Grande Basin comprises 54 delimited sub-basins with a total area of 6126.67 km2. The geology of the area is characterized by lithology dominated by Mesozoic sedimentary rocks of the Lower Cretaceous series; clastic and carbonate rocks of the limestone-dolomite type are the oldest. Another lithological association present in the area is limestone-shale. Consistent with the previous unit, a deposit of sediments consisting of shale, sandstone and limestone occurs. On these earlier formations, layers of siltstone and sandstone with interbedded limestone were deposited. Deep and shallow wells used to supply water to the population were used to establish a monitoring network aimed at identifying the types of groundwater and the processes occurring in the karstic aquifer. A sampling campaign was carried out in September 2014, in which 50 groundwater extraction sites were sampled, 20 of them deep wells and 30 shallow wells. The physicochemical parameters were measured in the field, while the chemical constituents were analyzed in the laboratory. From the data obtained, diagrams were drawn to identify the hydrogeochemical facies of the sampled groundwater, together with contour maps of chemical content and some measured parameters. Likewise, the field data have been interpreted with the help of hydrogeochemical models to identify the processes that may be changing water quality in the

  8. Biomedical Simulation Models of Human Auditory Processes

    NASA Technical Reports Server (NTRS)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models are developed that explore the noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data which provides explicit external ear, ear canal, middle ear ossicular bones and cochlea geometry. Results from these studies have enabled a greater understanding of hearing protector to flesh dynamics as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast related impulse noise on human auditory mechanisms and brain tissue.

  9. Utilizing Vector Space Models for User Modeling within e-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, E.; Kilbride, J.

    2008-01-01

    User modeling has been found to enhance the effectiveness and/or usability of software systems through the representation of certain properties of a particular user. This paper presents the research and the results of the development of a user modeling system for the implementation of student models within e-learning environments, utilizing vector…

  10. Comparing Two Types of Model Progression in an Inquiry Learning Environment with Modelling Facilities

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton

    2011-01-01

    The educational advantages of inquiry learning environments that incorporate modelling facilities are often challenged by students' poor inquiry skills. This study examined two types of model progression as means to compensate for these skill deficiencies. Model order progression (MOP), the predicted optimal variant, gradually increases the…

  11. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    SciTech Connect

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  12. Modeling the effect of outdoor particle concentrations on indoor concentrations in a heated environment

    SciTech Connect

    Pandian, M.D. )

    1988-01-01

    Exposure to suspended particulate matter in the home or workplace can produce adverse human health effects. Sources of suspended particulate matter include cigarette smoke, consumer spray products, and dust from cement manufacture, metal processing, and coal-fired power generation. The particle concentrations in these indoor environments can be determined from experimental studies or modeling techniques. Many experimental studies have been conducted to determine the mass concentration of total suspended particulate matter, usually expressed in μg/m³, and the elemental composition of particulate matter in these environments. However, there is not much reported data on particle size distributions in indoor environments. One of the early indoor modeling efforts was undertaken by Shair and Heitner, who conducted a theoretical analysis for relating indoor pollutant concentrations to those outdoors. The author describes the theoretical analysis and compares it to results obtained from experiments on conditioned cigarette smoke particle concentrations in a room at 20°C and 60%.

  13. Virtual building environments (VBE) - Applying information modeling to buildings

    SciTech Connect

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a suite of industry software applications operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  14. Spring-Model-Based Wireless Localization in Cooperative User Environments

    NASA Astrophysics Data System (ADS)

    Ke, Wei; Wu, Lenan; Qi, Chenhao

    To overcome the shortcomings of conventional cellular positioning, a novel cooperative location algorithm that uses the available peer-to-peer communication between the mobile terminals (MTs) is proposed. The main idea behind the proposed approach is to incorporate the long- and short-range location information to improve the estimation of the MT's coordinates. Since short-range communications among MTs are characterized by high line-of-sight (LOS) probability, an improved spring-model-based cooperative location method can be exploited to provide low-cost improvement for cellular-based location in the non-line-of-sight (NLOS) environments.
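
    The spring-model idea can be illustrated with a toy relaxation: measured ranges between terminals act as spring rest lengths, and the position estimates of the mobile terminals are moved iteratively along the net spring force while anchors stay fixed. The anchor layout, noise levels, links and step size below are invented, and this is not the algorithm exactly as published.

    ```python
    # Toy spring relaxation for cooperative localization; all values invented.
    import math, random

    random.seed(2)
    true_pos = {"A": (0.0, 0.0), "B": (100.0, 0.0), "C": (50.0, 90.0),
                "MT1": (40.0, 30.0), "MT2": (70.0, 55.0)}
    anchors = {"A", "B", "C"}                       # base stations, positions known
    links = [("A", "MT1"), ("B", "MT1"), ("C", "MT1"),
             ("A", "MT2"), ("B", "MT2"), ("C", "MT2"), ("MT1", "MT2")]

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # noisy range measurements; the short MT-MT (LOS) link is less noisy
    ranges = {(i, j): dist(true_pos[i], true_pos[j])
                      + random.gauss(0.0, 1.0 if i.startswith("MT") else 4.0)
              for i, j in links}

    est = {n: (p if n in anchors else (random.uniform(0, 100), random.uniform(0, 100)))
           for n, p in true_pos.items()}

    for _ in range(500):                             # relaxation iterations
        force = {n: [0.0, 0.0] for n in est}
        for (i, j), r in ranges.items():
            d = dist(est[i], est[j])
            if d == 0.0:
                continue
            stretch = d - r                          # positive if the spring is too long
            ux, uy = (est[j][0] - est[i][0]) / d, (est[j][1] - est[i][1]) / d
            force[i][0] += stretch * ux; force[i][1] += stretch * uy
            force[j][0] -= stretch * ux; force[j][1] -= stretch * uy
        for n in est:
            if n not in anchors:                     # only mobile terminals move
                est[n] = (est[n][0] + 0.1 * force[n][0], est[n][1] + 0.1 * force[n][1])

    print({n: (round(x, 1), round(y, 1)) for n, (x, y) in est.items() if n not in anchors})
    ```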

  15. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th 2010. The underlying algorithms are provided by three different sources: Geocoding & radiometric calibration (2) is a standard functionality from the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted with IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These 3 processing steps are combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, based on SAR data. Moreover, this model can be dissolved from its desktop environment and published to users across the ArcGIS Server enterprise. Thus disaster zones, e.g. after severe flooding, can be automatically identified and mapped to support local task forces - using an operational workflow for SAR image analysis, which can be executed by the responsible operators without SAR expert knowledge.

  16. Predicting plants -modeling traits as a function of environment

    NASA Astrophysics Data System (ADS)

    Franklin, Oskar

    2016-04-01

    A central problem in understanding and modeling vegetation dynamics is how to represent the variation in plant properties and function across different environments. Addressing this problem there is a strong trend towards trait-based approaches, where vegetation properties are functions of the distributions of functional traits rather than of species. Recently there has been enormous progress in quantifying trait variability and its drivers and effects (Van Bodegom et al. 2012; Adler et al. 2014; Kunstler et al. 2015) based on wide-ranging datasets on a small number of easily measured traits, such as specific leaf area (SLA), wood density and maximum plant height. However, plant function depends on many other traits and while the commonly measured trait data are valuable, they are not sufficient for driving predictive and mechanistic models of vegetation dynamics -especially under novel climate or management conditions. For this purpose we need a model to predict functional traits, also those not easily measured, and how they depend on the plants' environment. Here I present such a mechanistic model based on fitness concepts and focused on traits related to water and light limitation of trees, including: wood density, drought response, allocation to defense, and leaf traits. The model is able to predict observed patterns of variability in these traits in relation to growth and mortality, and their responses to a gradient of water limitation. The results demonstrate that it is possible to mechanistically predict plant traits as a function of the environment based on an eco-physiological model of plant fitness. References Adler, P.B., Salguero-Gómez, R., Compagnoni, A., Hsu, J.S., Ray-Mukherjee, J., Mbeau-Ache, C. et al. (2014). Functional traits explain variation in plant life-history strategies. Proc. Natl. Acad. Sci. U. S. A., 111, 740-745. Kunstler, G., Falster, D., Coomes, D.A., Hui, F., Kooyman, R.M., Laughlin, D.C. et al. (2015). Plant functional traits

  17. Models of Solar Wind Structures and Their Interaction with the Earth's Space Environment

    NASA Astrophysics Data System (ADS)

    Watermann, J.; Wintoft, P.; Sanahuja, B.; Saiz, E.; Poedts, S.; Palmroth, M.; Milillo, A.; Metallinou, F.-A.; Jacobs, C.; Ganushkina, N. Y.; Daglis, I. A.; Cid, C.; Cerrato, Y.; Balasis, G.; Aylward, A. D.; Aran, A.

    2009-11-01

    The discipline of “Space Weather” is built on the scientific foundation of solar-terrestrial physics but with a strong orientation toward applied research. Models describing the solar-terrestrial environment are therefore at the heart of this discipline, for both physical understanding of the processes involved and establishing predictive capabilities of the consequences of these processes. Depending on the requirements, purely physical models, semi-empirical or empirical models are considered to be the most appropriate. This review focuses on the interaction of solar wind disturbances with geospace. We cover interplanetary space, the Earth’s magnetosphere (with the exception of radiation belt physics), the ionosphere (with the exception of radio science), the neutral atmosphere and the ground (via electromagnetic induction fields). Space weather relevant state-of-the-art physical and semi-empirical models of the various regions are reviewed. They include models for interplanetary space, its quiet state and the evolution of recurrent and transient solar perturbations (corotating interaction regions, coronal mass ejections, their interplanetary remnants, and solar energetic particle fluxes). Models of coupled large-scale solar wind-magnetosphere-ionosphere processes (global magnetohydrodynamic descriptions) and of inner magnetosphere processes (ring current dynamics) are discussed. Achievements in modeling the coupling between magnetospheric processes and the neutral and ionized upper and middle atmospheres are described. Finally we mention efforts to compile comprehensive and flexible models from selections of existing modules applicable to particular regions and conditions in interplanetary space and geospace.

  18. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation, based on a software product line model, of a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  19. Task Results Processing for the Needs of Task-Oriented Design Environments

    ERIC Educational Resources Information Center

    Zheliazkova, Irina; Kolev, R.

    2008-01-01

    This paper presents learners' task results gathered by means of an example task-oriented environment for knowledge testing and processed by EXCEL. The processing is domain- and task-independent and includes automatic calculation of several important task and session's parameters, drawing specific graphics, generating tables, and analyzing the…

  20. Molecular Characterization and Serotyping of Salmonella Isolated from the Shell Egg Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Introduction: Salmonellosis may be contracted by the consumption of raw or undercooked eggs. In order to develop effective sanitation practices it is helpful to understand the location of Salmonella reservoirs in processing environments. Shell egg processing reservoirs for Salmonella...

  1. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  2. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  3. Dynamic occupancy models for explicit colonization processes.

    PubMed

    Broms, Kristin M; Hooten, Mevin B; Johnson, Devin S; Altwegg, Res; Conquest, Loveday L

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations. PMID:27008788
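
    A small simulation in the spirit of the model described (not the authors' fitted model) is sketched below: each unoccupied site's colonization probability grows with the number of occupied neighbors and with its own habitat quality, a small long-distance dispersal term lets isolated sites be colonized, and occupied sites go extinct with a fixed probability. All probabilities and the grid are invented, and detectability is ignored.

    ```python
    # Toy neighbor-dependent colonization/extinction dynamics on a grid.
    import random

    random.seed(3)
    SIZE, YEARS = 20, 15
    gamma0, gamma_nbr, ldd, eps = 0.02, 0.15, 0.01, 0.10
    quality = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
    occ = [[0] * SIZE for _ in range(SIZE)]
    occ[SIZE // 2][SIZE // 2] = 1                   # single founding population

    def occupied_neighbors(z, i, j):
        # 4-neighborhood on a wrapped (torus) grid, for simplicity
        return sum(z[(i + di) % SIZE][(j + dj) % SIZE]
                   for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

    for _ in range(YEARS):
        new = [row[:] for row in occ]
        for i in range(SIZE):
            for j in range(SIZE):
                if occ[i][j]:                        # local extinction
                    if random.random() < eps:
                        new[i][j] = 0
                else:                                # colonization
                    n = occupied_neighbors(occ, i, j)
                    p = min(1.0, gamma0 + gamma_nbr * n * quality[i][j] + ldd)
                    if random.random() < p:
                        new[i][j] = 1
        occ = new

    print("occupied sites after", YEARS, "years:", sum(map(sum, occ)))
    ```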

  4. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and its spread via short distance movement, rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization including short- vs. long-distance dispersal, habitat quality, and distance from source populations.

  5. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experience with the use of this notation for process modelling within Pathology, in Spain or elsewhere, is known to us. We present our experience in the elaboration of the conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of the samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals. PMID:18673511

  6. Building intuition of iron evolution during solar cell processing through analysis of different process models

    NASA Astrophysics Data System (ADS)

    Morishige, Ashley E.; Laine, Hannu S.; Schön, Jonas; Haarahiltunen, Antti; Hofstetter, Jasmin; del Cañizo, Carlos; Schubert, Martin C.; Savin, Hele; Buonassisi, Tonio

    2015-09-01

    An important aspect of Process Simulators for photovoltaics is prediction of defect evolution during device fabrication. Over the last twenty years, these tools have accelerated process optimization, and several Process Simulators for iron, a ubiquitous and deleterious impurity in silicon, have been developed. The diversity of these tools can make it difficult to build intuition about the physics governing iron behavior during processing. Thus, in one unified software environment and using self-consistent terminology, we combine and describe three of these Simulators. We vary structural defect distribution and iron precipitation equations to create eight distinct Models, which we then use to simulate different stages of processing. We find that the structural defect distribution influences the final interstitial iron concentration ([Fe_i]) more strongly than the iron precipitation equations. We identify two regimes of iron behavior: (1) diffusivity-limited, in which iron evolution is kinetically limited and bulk [Fe_i] predictions can vary by an order of magnitude or more, and (2) solubility-limited, in which iron evolution is near thermodynamic equilibrium and the Models yield similar results. This rigorous analysis provides new intuition that can inform Process Simulation, material, and process development, and it enables scientists and engineers to choose an appropriate level of Model complexity based on wafer type and quality, processing conditions, and available computation time.
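
    As a purely illustrative aside (not one of the eight Models analyzed in the paper), the diffusivity-limited versus solubility-limited behaviour mentioned above can be mimicked with a Ham's-law style, diffusion-limited precipitation rate integrated through a linear cool-down. The Arrhenius parameters, precipitate radius and site density below are rough placeholders, not fitted values.

    ```python
    # Toy diffusion-limited Fe precipitation during a linear cool-down:
    #   d[Fe_i]/dt = -4*pi*r*N*D(T) * ([Fe_i] - [Fe_eq](T))
    # All parameter values are invented, order-of-magnitude placeholders.
    import math

    kB = 8.617e-5                                    # eV/K

    def D_fe(T):                                     # interstitial Fe diffusivity, cm^2/s
        return 1.0e-3 * math.exp(-0.67 / (kB * T))

    def fe_solubility(T):                            # equilibrium [Fe_i], cm^-3
        return 5.0e22 * math.exp(-2.94 / (kB * T))

    N_sites, radius = 1.0e10, 3.0e-7                 # precipitation sites cm^-3, radius cm

    def anneal(fe_i, T_start, T_end, rate_K_per_s, dt=0.01):
        T = T_start
        while T > T_end:
            loss = 4 * math.pi * radius * N_sites * D_fe(T) * max(fe_i - fe_solubility(T), 0.0)
            fe_i = max(fe_i - loss * dt, fe_solubility(T))   # never undershoot solubility
            T -= rate_K_per_s * dt
            yield T, fe_i

    final = list(anneal(fe_i=1.0e14, T_start=1100 + 273, T_end=600 + 273, rate_K_per_s=5.0))[-1]
    print(f"[Fe_i] after cool-down: {final[1]:.2e} cm^-3")
    ```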

  7. Stochastic differential equation model to Prendiville processes

    SciTech Connect

    Granita; Bahar, Arifah

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes a linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in a finite interval. The continuous time Markov chain can be approximated by a stochastic differential equation (SDE). This paper discusses the stochastic differential equation of the Prendiville process. The work started with the forward Kolmogorov equation of the continuous time Markov chain of the Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in the Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
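
    The approximation route described above can be illustrated under assumed linear rates, with births at λ(x) = a(N - x) and deaths at μ(x) = bx, which give the linearly decreasing growth rate the abstract mentions. The resulting SDE, dX = (λ(X) - μ(X)) dt + sqrt(λ(X) + μ(X)) dW, is simulated below with Euler-Maruyama; the rate constants are illustrative only and the rate forms are an assumption, not taken from the paper.

    ```python
    # Euler-Maruyama simulation of an SDE approximation to a logistic-type
    # birth-death chain with assumed linear rates; constants are illustrative.
    import math, random

    a, b, N = 0.8, 0.5, 100.0

    def lam(x): return a * (N - x)       # birth rate, decreasing in x
    def mu(x):  return b * x             # death rate

    def euler_maruyama(x0=10.0, t_end=20.0, dt=0.01, seed=4):
        random.seed(seed)
        x, path = x0, [x0]
        for _ in range(int(t_end / dt)):
            drift = lam(x) - mu(x)
            diffusion = math.sqrt(max(lam(x) + mu(x), 0.0))
            x += drift * dt + diffusion * math.sqrt(dt) * random.gauss(0.0, 1.0)
            x = min(max(x, 0.0), N)      # keep the path inside the finite interval
            path.append(x)
        return path

    path = euler_maruyama()
    print("mean of last 500 points:", round(sum(path[-500:]) / 500, 1))
    # deterministic equilibrium for these rates: a*N/(a+b) ~ 61.5
    ```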

  8. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction. PMID:25974936

  9. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  10. Exposure modeling of engineered nanoparticles in the environment.

    PubMed

    Mueller, Nicole C; Nowack, Bernd

    2008-06-15

    The aim of this study was to use a life-cycle perspective to model the quantities of engineered nanoparticles released into the environment. Three types of nanoparticles were studied: nano silver (nano-Ag), nano titanium dioxide (nano-TiO2), and carbon nanotubes (CNT). The quantification was based on a substance flow analysis from products to air, soil, and water in Switzerland. The following parameters were used as model inputs: estimated worldwide production volume, allocation of the production volume to product categories, particle release from products, and flow coefficients within the environmental compartments. The predicted environmental concentrations (PEC) were then compared to the predicted no effect concentrations (PNEC) derived from the literature to estimate a possible risk. The expected concentrations of the three nanoparticles in the different environmental compartments vary widely, caused by the different life cycles of the nanoparticle-containing products. The PEC values for nano-TiO2 in water are 0.7-16 μg/L, close to or higher than the PNEC value for nano-TiO2 (< 1 μg/L). The risk quotients (PEC/PNEC) for CNT and nano-Ag were much smaller than one, giving no reason to expect adverse effects from those particles. The results of this study make it possible for the first time to carry out a quantitative risk assessment of nanoparticles in the environment and suggest further detailed studies of nano-TiO2. PMID:18605569
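
    The bookkeeping behind a PEC estimate and a risk quotient can be sketched very simply (this is not the study's substance flow analysis): a production volume is split across compartments by assumed release fractions, diluted into a compartment volume, and compared with a PNEC. All numbers below are invented.

    ```python
    # Toy PEC and PEC/PNEC calculation; every input value is invented.
    def pec_water(production_t_per_year, fraction_to_water, water_volume_m3):
        """Predicted environmental concentration in water, micrograms per litre."""
        mass_ug = production_t_per_year * 1e12 * fraction_to_water   # tonnes -> micrograms
        litres = water_volume_m3 * 1e3
        return mass_ug / litres

    pec = pec_water(production_t_per_year=50.0,      # assumed annual input to the region
                    fraction_to_water=0.05,          # assumed release fraction to water
                    water_volume_m3=5.0e9)           # assumed receiving water volume
    pnec = 1.0                                       # micrograms per litre (assumed)
    print(f"PEC = {pec:.3f} ug/L, PEC/PNEC = {pec / pnec:.3f}")
    ```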

  11. Schools as host environments: toward a schoolwide reading improvement model.

    PubMed

    Kame'enui, E J; Simmons, D C; Coyne, M D

    2000-01-01

    Despite vast differences among school districts across the country, all students must learn how to read in a complex "host-environment" called a school. A challenge in beginning reading, therefore, is to transcend these differences and focus, instead, on the essential task of teaching reading in schools. Teaching reading involves attending to what we know about beginning reading and the alphabetic writing system, the difficulties of reading, and the challenges associated with dyslexia. Teaching reading in a school requires that interventions be tailored to the unique needs of an individual school and implemented and sustained at the school building level. In this article, we outline the Schoolwide Reading Improvement Model (SRIM). This model is characterized by the strategic integration of research-based practices in assessment, instructional design, and beginning reading instruction. Additionally, the SRIM acknowledges the specific needs of individual schools and is customized to provide the best fit with each unique "host-environment." First we provide a description of each major stage of the SRIM and then an example of its application in a school district in western Oregon. PMID:20563779

  12. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. PMID:25494697

  13. Session on modeling of radiative transfer processes

    NASA Technical Reports Server (NTRS)

    Flatau, Piotr

    1993-01-01

    The session on modeling of radiative transfer processes is reviewed. Six critical issues surfaced in the discussion concerning scale-interactive radiative processes relevant to mesoscale convective systems (MCS's). These issues are the need to expand basic knowledge of how MCS's influence climate through extensive cloud shields and increased humidity in the upper troposphere; to improve radiation parameterizations used in mesoscale and general circulation models (GCMs); to improve our basic understanding of the influence of radiation on MCS dynamics due to diabatic heating, production of condensate, and vertical and horizontal heat fluxes; to quantify our understanding of radiative impacts of MCS's on the surface and free atmosphere energy budgets; to quantify and identify radiative and microphysical processes important in the evolution of MCS's; and to improve the capability to remotely sense MCS radiative properties from space and ground-based systems.

  14. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  16. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    SciTech Connect

    Currier, R.P.

    1994-10-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous-flow tubular reactor vessel. The waste is maintained at reaction temperatures of 300-550 °C, where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g., axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported.
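
    The report's emphasis on inert tracer experiments and axial dispersion can be illustrated with the standard method-of-moments analysis of a tracer response curve. The sketch below uses synthetic data and the small-dispersion closure var_theta ~ 2/Pe; it is a generic textbook calculation, not the LANL model.

        import numpy as np

        # Generic method-of-moments analysis of an inert tracer response curve
        # (an illustration, not the LANL model). t is time, c is outlet concentration.
        dt = 0.5
        t = np.arange(0.0, 300.0, dt)                          # s, hypothetical sampling
        c = np.exp(-(t - 120.0) ** 2 / (2 * 15.0 ** 2))        # synthetic pulse response

        area = np.sum(c) * dt                                  # zeroth moment
        t_mean = np.sum(t * c) * dt / area                     # mean residence time
        var = np.sum((t - t_mean) ** 2 * c) * dt / area        # RTD variance
        var_theta = var / t_mean ** 2                          # dimensionless variance

        # Small-dispersion approximation for a dispersed plug-flow reactor:
        # var_theta ~ 2/Pe, so Pe ~ 2/var_theta (assumed closure).
        peclet = 2.0 / var_theta
        print(f"mean residence time = {t_mean:.1f} s, Peclet number ~ {peclet:.0f}")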

  17. Physical Processes and Real-Time Chemical Measurement of the Insect Olfactory Environment

    PubMed Central

    Abrell, Leif; Hildebrand, John G.

    2009-01-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems. PMID:18548311

  18. A process algebra model of QED

    NASA Astrophysics Data System (ADS)

    Sulis, William

    2016-03-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics.

  19. X-ray emission processes in stars and their immediate environment

    PubMed Central

    Testa, Paola

    2010-01-01

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible. PMID:20360562

  20. X-ray emission processes in stars and their immediate environment.

    PubMed

    Testa, Paola

    2010-04-20

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high-energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible. PMID:20360562

  21. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  1. Retort process modelling for Indian traditional foods.

    PubMed

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold-point temperature. Initial process conditions, retort temperature and percent solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot-based sweet product) and Upama (wheat-based snack product). The predicted and experimental temperature profiles agreed within ±10 % error, which is a good match considering the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods. PMID:26396305
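
    A minimal sketch of the lumped-parameter idea follows, assuming a single first-order time constant for the cold point. The time constant and temperatures are hypothetical; the paper's unified model additionally accounts for solids content and initial process conditions.

        import numpy as np

        # Minimal first-order lumped-parameter heating sketch (illustrative only;
        # not the unified model described in the abstract).
        tau = 18.0        # min, hypothetical time constant of the cold point
        T_retort = 121.0  # degC, retort temperature
        T = 30.0          # degC, initial product temperature
        dt = 0.5          # min, time step

        history = []
        for step in range(int(90 / dt)):          # 90-minute process
            T += dt * (T_retort - T) / tau        # explicit Euler update of dT/dt = (T_retort - T)/tau
            history.append(T)

        print(f"cold-point temperature after 90 min: {history[-1]:.1f} degC")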

  2. The DAB model of drawing processes

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry W.

    1989-01-01

    The problem of automatic drawing was investigated in two ways. First, a DAB model of drawing processes was introduced. DAB stands for three types of knowledge hypothesized to support drawing abilities, namely, Drawing Knowledge, Assimilated Knowledge, and Base Knowledge. Speculation concerning the content and character of each of these subsystems of the drawing process is introduced and the overall adequacy of the model is evaluated. Second, eight experts were each asked to understand six engineering drawings and to think aloud while doing so. It is anticipated that a concurrent protocol analysis of these interviews can be carried out in the future. Meanwhile, a general description of the videotape database is provided. In conclusion, the DAB model was praised as a worthwhile first step toward solution of a difficult problem, but was considered by and large inadequate to the challenge of automatic drawing. Suggestions for improvements on the model were made.

  3. Soil processes parameterization in meteorological model.

    NASA Astrophysics Data System (ADS)

    Mazur, Andrzej; Duniec, Grzegorz

    2014-05-01

    In August 2012 the Polish Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI) started a collaboration with the Institute of Agrophysics - Polish Academy of Sciences (IA-PAS) in order to improve the parameterization of soil processes in the high-resolution COSMO meteorological model (horizontal grid size of 2.8 km). This cooperation turned into a project named "New approach to parameterization of physical processes in soil in a numerical model". The new set of soil process parameterizations is being developed considering many physical and microphysical processes in soil. Currently, the main effort is focused on the description of bare soil evaporation, soil water transport and the runoff from soil layers. Preliminary results from the new mathematical formulation of bare soil evaporation implemented in the COSMO model will be presented. Moreover, recognizing the constant need for further improvement, the authors outline future plans and topics for further study. It is planned to combine the new approach with the TILE and MOSAIC parameterizations, previously investigated as part of the TERRA-MultiLevel module of the COSMO model, and to use measurement data received from IA-PAS and from the Satellite Remote Sensing Center in soil-related COSMO model numerical experiments.

  4. Attrition and abrasion models for oil shale process modeling

    SciTech Connect

    Aldis, D.F.

    1991-10-25

    As oil shale is processed, fine particles much smaller than the original shale are created. This process is called attrition or, more accurately, abrasion. In this paper, models of abrasion are presented for oil shale being processed in several unit operations. Two of these unit operations, a fluidized bed and a lift pipe, are used in the Lawrence Livermore National Laboratory Hot-Recycle-Solid (HRS) process being developed for the above-ground processing of oil shale. Two earlier reports describe studies of the attrition of oil shale in unit operations used in the HRS process. Carley reported results for attrition in a lift pipe for oil shale that had been pre-processed either by retorting or by retorting and then burning. The second report, by Taylor and Beavers, covered fluidized-bed processing of raw shale, retorted shale, and shale that had been retorted and then burned. In this paper, empirical models are derived from these experimental studies for the processes occurring in the HRS process. The derived models are presented along with comparisons with experimental results.

  5. Gaussian Process Modeling of Protein Turnover.

    PubMed

    Rahman, Mahbubur; Previs, Stephen F; Kasumov, Takhar; Sadygov, Rovshan G

    2016-07-01

    We describe a stochastic model to compute in vivo protein turnover rate constants from stable-isotope labeling and high-throughput liquid chromatography-mass spectrometry experiments. We show that the often-used one- and two-compartment nonstochastic models allow explicit solutions from the corresponding stochastic differential equations. The resulting stochastic process is a Gaussian process with an Ornstein-Uhlenbeck covariance matrix. We applied the stochastic model to a large-scale data set from (15)N labeling and compared its performance metrics with those of the nonstochastic curve fitting. The comparison showed that for more than 99% of proteins, the stochastic model produced better fits to the experimental data (based on residual sum of squares). The model was used for extracting protein-decay rate constants from mouse brain (slow turnover) and liver (fast turnover) samples. We found that the most affected (compared to two-exponent curve fitting) results were those for liver proteins. The ratio of the median of degradation rate constants of liver proteins to those of brain proteins increased 4-fold in stochastic modeling compared to the two-exponent fitting. Stochastic modeling predicted stronger differences of protein turnover processes between mouse liver and brain than previously estimated. The model is independent of the labeling isotope. To show this, we also applied the model to protein turnover studied in induced heart failure in rats, in which metabolic labeling was achieved by administering heavy water. No changes in the model were necessary for adapting to heavy-water labeling. The approach has been implemented in a freely available R code. PMID:27229456
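
    The key structural statement here is that the turnover model reduces to a Gaussian process with an Ornstein-Uhlenbeck covariance, k(t, t') = sigma^2 * exp(-theta * |t - t'|). The sketch below simply builds that covariance and draws samples; the parameter values and the exponential-decay mean are hypothetical, not taken from the paper.

        import numpy as np

        # Sketch of a Gaussian process with an Ornstein-Uhlenbeck covariance,
        # k(t, t') = sigma^2 * exp(-theta * |t - t'|). Values are hypothetical.
        theta, sigma = 0.15, 0.05          # decay rate (1/day) and noise scale
        t = np.linspace(0.0, 30.0, 31)     # labeling time points (days)

        K = sigma ** 2 * np.exp(-theta * np.abs(t[:, None] - t[None, :]))
        mean = np.exp(-0.1 * t)            # e.g., an exponential-decay mean trend

        rng = np.random.default_rng(0)
        samples = rng.multivariate_normal(mean, K, size=3)   # draws around the trend
        print(samples.shape)               # (3, 31)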

  6. Knowledge-based approach to fault diagnosis and control in distributed process environments

    NASA Astrophysics Data System (ADS)

    Chung, Kwangsue; Tou, Julius T.

    1991-03-01

    This paper presents a new design approach to knowledge-based decision support systems for fault diagnosis and control for quality assurance and productivity improvement in automated manufacturing environments. Based on the observed manifestations, the knowledge-based diagnostic system hypothesizes a set of the most plausible disorders by mimicking the reasoning process of a human diagnostician. The data integration technique is designed to generate error-free hierarchical category files. A novel approach to diagnostic problem solving has been proposed by integrating the PADIKS (Pattern-Directed Knowledge-Based System) concept and the symbolic model of diagnostic reasoning based on the categorical causal model. The combination of symbolic causal reasoning and pattern-directed reasoning produces a highly efficient diagnostic procedure and generates a more realistic expert behavior. In addition, three distinctive constraints are designed to further reduce the computational complexity and to eliminate non-plausible hypotheses involved in the multiple disorders problem. The proposed diagnostic mechanism, which consists of three different levels of reasoning operations, significantly reduces the computational complexity in the diagnostic problem with uncertainty by systematically shrinking the hypotheses space. This approach is applied to the test and inspection data collected from a PCB manufacturing operation.

  7. Modeling Asymmetric Rolling Process of Mg alloys

    SciTech Connect

    Cho, Jaehyung; Kim, Hyung-Wuk; Kang, Suk-Bong

    2010-06-15

    Asymmetric deformation during rolling can arise in various ways: differences in the radii, speeds, or friction of the top and bottom rolls. Asymmetric warm rolling processes of magnesium alloys were modeled using a Lagrangian incremental approach. A constitutive equation representing the flow behavior of AZ31 magnesium alloys during warm deformation was implemented in the model. Various roll speed ratios were introduced to investigate the deformation behavior of the magnesium alloys. Bending and texturing of the strips were examined.

  8. Building phenomenological models of complex biological processes

    NASA Astrophysics Data System (ADS)

    Daniels, Bryan; Nemenman, Ilya

    2009-11-01

    A central goal of any modeling effort is to make predictions regarding experimental conditions that have not yet been observed. Overly simple models will not be able to fit the original data well, but overly complex models are likely to overfit the data and thus produce bad predictions. Modern quantitative biology modeling efforts often err on the complexity side of this balance, using myriads of microscopic biochemical reaction processes with a priori unknown kinetic parameters to model relatively simple biological phenomena. In this work, we show how Bayesian model selection (which is mathematically similar to low temperature expansion in statistical physics) can be used to build coarse-grained, phenomenological models of complex dynamical biological processes, which have better predictive powers than microscopically correct but poorly constrained mechanistic molecular models. We illustrate this on the example of a multiply-modifiable protein molecule, which is a simplified description of multiple biological systems, such as immune receptors and the RNA polymerase complex. Our approach is similar in spirit to the phenomenological Landau expansion for the free energy in the theory of critical phenomena.
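
    The argument for penalizing model complexity can be illustrated with the BIC approximation to Bayesian model selection on a toy regression problem. The sketch below compares polynomial fits of increasing order on synthetic data; it is an illustration of the selection principle, not the biochemical models discussed in the abstract.

        import numpy as np

        # Toy complexity control via the BIC approximation to Bayesian model
        # selection (polynomial fits on synthetic data; everything is hypothetical).
        rng = np.random.default_rng(1)
        x = np.linspace(-1, 1, 40)
        y = 1.0 + 2.0 * x - 1.5 * x ** 2 + rng.normal(0, 0.1, x.size)  # true model: quadratic

        best = None
        for degree in range(0, 8):
            coeffs = np.polyfit(x, y, degree)
            resid = y - np.polyval(coeffs, x)
            rss = float(np.sum(resid ** 2))
            k = degree + 1                                   # number of parameters
            n = x.size
            bic = n * np.log(rss / n) + k * np.log(n)        # Gaussian-likelihood BIC
            if best is None or bic < best[1]:
                best = (degree, bic)

        print(f"BIC-selected polynomial degree: {best[0]}")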

  9. Modeling Multi-process Transport of Pathogens in Porous Media

    NASA Astrophysics Data System (ADS)

    Cheng, L.; Brusseau, M. L.

    2004-12-01

    The transport behavior of microorganisms in porous media is of interest with regard to the fate of pathogens associated with wastewater recharge, riverbank filtration, and land application of biosolids. This interest has fomented research on the transport of pathogens in the subsurface environment. The factors influencing pathogen transport within the subsurface environment include advection, dispersion, filtration, and inactivation. The filtration process, which mediates the magnitude and rate of pathogen retention, comprises several mechanisms such as attachment to porous-medium surfaces, straining, and sedimentation. We present a mathematical model wherein individual filtration mechanisms are explicitly incorporated along with advection, dispersion, and inactivation. The performance of the model is evaluated by applying it to several data sets obtained from miscible-displacement experiments conducted using various pathogens. Input parameters are obtained to the extent possible from independent means.
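
    A minimal one-dimensional sketch of the transport balance named in the abstract (advection, dispersion, and first-order sinks) follows. Unlike the paper's model, the individual filtration mechanisms are lumped into a single attachment rate, and all parameters and boundary conditions are hypothetical.

        import numpy as np

        # Minimal 1D advection-dispersion sketch with lumped first-order attachment
        # and inactivation (the paper resolves individual filtration mechanisms;
        # this illustration does not). Parameters and grid are hypothetical.
        L, nx = 0.30, 121                 # column length (m), grid points
        dx = L / (nx - 1)
        v, D = 1.0e-4, 5.0e-7             # pore velocity (m/s), dispersion coeff (m^2/s)
        k_att, mu = 2.0e-4, 5.0e-5        # attachment and inactivation rates (1/s)
        dt = 0.4 * dx ** 2 / D            # conservative explicit time step

        C = np.zeros(nx)
        for step in range(2000):
            C[0] = 1.0                    # constant-concentration inlet
            adv = -v * (C[1:-1] - C[:-2]) / dx                  # upwind advection
            disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx ** 2  # dispersion
            sink = -(k_att + mu) * C[1:-1]                       # attachment + inactivation
            C[1:-1] += dt * (adv + disp + sink)
            C[-1] = C[-2]                 # zero-gradient outlet

        print(f"relative outlet concentration: {C[-1]:.3e}")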

  10. Using 222Rn as a tracer of geophysical processes in underground environments

    NASA Astrophysics Data System (ADS)

    Lacerda, T.; Anjos, R. M.; Valladares, D. L.; da Silva, A. A. R.; Rizzotto, M.; Velasco, H.; de Rosas, J. P.; Ayub, J. Juri; Yoshimura, E. M.

    2014-11-01

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are now used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations in outside temperature. The results also indicate that the radon distribution pattern is a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  11. Using {sup 222}Rn as a tracer of geophysical processes in underground environments

    SciTech Connect

    Lacerda, T.; Anjos, R. M.; Silva, A. A. R. da; Yoshimura, E. M.

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are now used for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations in outside temperature. The results also indicate that the radon distribution pattern is a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  12. Environment.

    ERIC Educational Resources Information Center

    White, Gilbert F.

    1980-01-01

    Presented are perspectives on the emergence of environmental problems. Six major trends in scientific thinking are identified including: holistic approaches to examining environments, life support systems, resource management, risk assessment, streamlined methods for monitoring environmental change, and emphasis on the global framework. (Author/SA)

  13. Process diagnostics for precision grinding brittle materials in a production environment

    SciTech Connect

    Blaedel, K L; Davis, P J; Piscotty, M A

    1999-04-01

    Precision grinding processes are steadily migrating from research laboratory environments into manufacturing production lines as precision machines and processes become increasingly more commonplace throughout industry. Low-roughness, low-damage precision grinding is gaining widespread commercial acceptance for a host of brittle materials including advanced structural ceramics. The development of these processes is often problematic and requires diagnostic information and analysis to harden the processes for manufacturing. This paper presents a series of practical precision grinding tests developed and practiced at Lawrence Livermore National Laboratory that yield important information to help move a new process idea into production.

  14. Processing and Modeling of Porous Copper Using Sintering Dissolution Process

    NASA Astrophysics Data System (ADS)

    Salih, Mustafa Abualgasim Abdalhakam

    The development of porous metals has produced materials with improved properties compared with non-metals and solid metals. Porous metals can be classified as either open cell or closed cell. An open-cell structure allows a fluid medium to pass through it; a closed-cell structure is made up of adjacent sealed pores with shared cell walls. Metal foams offer higher strength-to-weight ratios, increased impact energy absorption, and a greater tolerance to high temperatures and adverse environmental conditions when compared to bulk materials. Copper and its alloys are examples of these, well known for high strength and good mechanical, thermal and electrical properties. In the present study, porous Cu was made by a powder metallurgy process using three different space holders: sodium chloride, sodium carbonate and potassium carbonate. Several different samples were produced using different volume fractions. The densities of the porous metals were measured and compared to the theoretical density calculated using an equation developed for these foams. The porous structure was obtained by removal of the spacer materials through the sintering process; the sintering schedule for each spacer material depends on its melting point. Processing, characterization, and mechanical property testing were completed. The tests include density measurements, compression tests, computed tomography (CT) and scanning electron microscopy (SEM). The captured morphological images are utilized to generate the object-oriented finite element (OOF) analysis for the porous copper. Porous copper was formed with porosities in the range of 40-66% and densities from 3 to 5.2 g/cm3. A study of two different methods to measure porosity was completed. OOF (Object Oriented Finite Elements) is a desktop software application for studying the relationship between the microstructure of a material and its overall mechanical, dielectric, or thermal properties using finite element models based on

  15. Performance analysis of no-vent fill process for liquid hydrogen tank in terrestrial and on-orbit environments

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yanzhong; Zhang, Feini; Ma, Yuan

    2015-12-01

    Two finite difference computer models, aimed at predicting the no-vent fill process in normal-gravity and microgravity environments respectively, are developed to investigate the filling performance of a liquid hydrogen (LH2) tank. In the normal-gravity model, the tank/fluid system is divided into five control volumes: ullage, bulk liquid, gas-liquid interface, ullage-adjacent wall, and liquid-adjacent wall. In the microgravity model, a vapor-liquid thermal equilibrium state is maintained throughout the process, and only two nodes, representing the fluid and wall regions, are used. To capture the liquid-wall heat transfer accurately, a series of heat transfer mechanisms are considered and modeled successively, including film boiling, transition boiling, nucleate boiling and liquid natural convection. The two models are validated by comparing their predictions with experimental data, which show good agreement. The two models are then used to investigate the performance of no-vent fill under different conditions, and several conclusions are obtained. In the normal-gravity environment the no-vent fill experiences a continuous pressure rise during the whole process and the maximum pressure occurs at the end of the operation, while in the microgravity case the maximum pressure occurs at the beginning stage of the process. Moreover, increasing the inlet mass flux has an apparent influence on the pressure evolution of the no-vent fill process in normal gravity but little influence in microgravity. A larger initial wall temperature brings about more significant liquid evaporation during the filling operation, and therefore a higher pressure evolution, whether the filling process occurs under normal-gravity or microgravity conditions. Reducing the inlet liquid temperature can improve the filling performance in normal gravity, but cannot significantly reduce the maximum pressure in microgravity. The presented work benefits the

  16. The Chandra X-Ray Observatory Radiation Environment Model

    NASA Technical Reports Server (NTRS)

    Blackwell, W. C.; Minow, Joseph I.; Smith, Shawn; Swift, Wesley R.; ODell, Stephen L.; Cameron, Robert A.

    2003-01-01

    CRMFLX (Chandra Radiation Model of ion FluX) is an environmental risk mitigation tool for use as a decision aid in planning the operating times for Chandra's Advanced CCD Imaging Spectrometer (ACIS) detector. Accurate prediction of the proton flux environment at energies of 100 - 200 keV is needed in order to protect the ACIS detector against proton degradation. Unfortunately, protons of this energy are abundant in the region of space in which Chandra must operate, and the on-board Electron, Proton, and Helium Instrument (EPHIN) does not measure proton flux levels in the required energy range. In addition to the concerns arising from the radiation belts, substorm injections of plasma from the magnetotail may increase the proton flux by orders of magnitude in this energy range. The Earth's magnetosphere is a dynamic entity, with the size and location of the magnetopause driven by the highly variable solar wind parameters (number density, velocity, and magnetic field components). Operating schedules for the telescope must be set weeks in advance, decisions that are complicated by the variability of the environment. CRMFLX is an engineering model developed to address these problems and provides proton flux and fluence statistics for the terrestrial outer magnetosphere, magnetosheath, and solar wind for use in scheduling ACIS operations. CRMFLX implements a number of standard models to predict the bow shock, magnetopause, and plasma sheet boundaries based on the sampling of historical solar wind data sets. Measurements from the GEOTAIL and POLAR spacecraft are used to create the proton flux database. This paper describes the recently released CRMFLX v2 implementation that includes an algorithm that propagates flux from an observation location to other regions of the magnetosphere based on convective ExB and VB-curvature particle drift motions in electric and magnetic fields. This technique has the advantage of more completely filling out the database and makes maximum

  17. Empirical Modeling of Plant Gas Fluxes in Controlled Environments

    NASA Technical Reports Server (NTRS)

    Cornett, Jessie David

    1994-01-01

    As humans extend their reach beyond the Earth, bioregenerative life support systems must replace the resupply and physical/chemical systems now used. The Controlled Ecological Life Support System (CELSS) will utilize plants to recycle the carbon dioxide (CO2) and excrement produced by humans and return oxygen (O2), purified water and food. CELSS design requires knowledge of gas flux levels for net photosynthesis (PS(sub n)), dark respiration (R(sub d)) and evapotranspiration (ET). Full-season gas flux data regarding these processes for wheat (Triticum aestivum), soybean (Glycine max) and rice (Oryza sativa) from published sources were used to develop empirical models. Univariate models relating crop age (days after planting) and gas flux were fit by simple regression. Models are either high-order (5th to 8th) or more complex polynomials whose curves describe crop development characteristics. The models provide good estimates of gas flux maxima, but are of limited utility. To broaden the applicability, data were transformed to dimensionless or correlation formats and, again, fit by regression. Polynomials, similar to those in the initial effort, were selected as the most appropriate models. These models indicate that, within a cultivar, gas flux patterns appear remarkably similar prior to maximum flux, but exhibit considerable variation beyond this point. This suggests that more broadly applicable models of plant gas flux are feasible, but univariate models defining gas flux as a function of crop age are too simplistic. Multivariate models using CO2 and crop age were fit for PS(sub n) and R(sub d) by multiple regression. In each case, the selected model is a subset of a full third-order model with all possible interactions. These models are improvements over the univariate models because they incorporate more than the single factor, crop age, as the primary variable governing gas flux. They are still limited, however, by their reliance on the other environmental
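
    The univariate fitting step described here amounts to regressing a gas-flux series on crop age with a high-order polynomial. The sketch below reproduces that idea on synthetic data; the degree, the synthetic flux curve and the noise level are assumptions, not the published fits.

        import numpy as np

        # Minimal sketch of the univariate fitting idea: regress a gas-flux series
        # on crop age with a polynomial (synthetic data; all values hypothetical).
        age = np.arange(0, 81)                                    # days after planting
        flux = 30 * np.exp(-((age - 40) / 18.0) ** 2)             # synthetic PSn-like curve
        flux += np.random.default_rng(2).normal(0, 1.0, age.size)

        coeffs = np.polyfit(age, flux, deg=6)                     # 6th-order polynomial
        fitted = np.polyval(coeffs, age)
        rmse = float(np.sqrt(np.mean((fitted - flux) ** 2)))
        print(f"RMSE of 6th-order fit: {rmse:.2f} (same units as flux)")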

  18. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  19. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off-nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  20. Development of an interdisciplinary model cluster for tidal water environments

    NASA Astrophysics Data System (ADS)

    Dietrich, Stephan; Winterscheid, Axel; Jens, Wyrwa; Hartmut, Hein; Birte, Hein; Stefan, Vollmer; Andreas, Schöl

    2013-04-01

    Global climate change has a high potential to influence both the persistence and the transport pathways of water masses and their constituents in tidal waters and estuaries. These processes are linked through dispersion, thus directly influencing the sediment and suspended solid matter budgets, and thus the river morphology. Furthermore, the hydrologic regime has an impact on the transport of nutrients, phytoplankton, suspended matter, and temperature, which determine the oxygen content within water masses, a major parameter describing water quality. This project aims at the implementation of a so-called (numerical) model cluster for tidal waters, which includes the model compartments hydrodynamics, morphology and ecology. For the implementation of this cluster it is required to continue with the integration of different models that work over a wide range of spatial and temporal scales. The model cluster is thus expected to lead to more precise knowledge of the feedback processes between the individual interdisciplinary model compartments. In addition to field measurements, this model cluster will provide a complementary scientific basis required to address a spectrum of research questions concerning the integral management of estuaries within the Federal Institute of Hydrology (BfG, Germany). This will in particular include aspects like sediment and water quality management as well as adaptation strategies to climate change. The core of the model cluster will consist of the 3D hydrodynamic model Delft3D (Roelvink and van Banning, 1994); long-term hydrodynamics in the estuaries are simulated with the Hamburg Shelf Ocean Model HAMSOM (Backhaus, 1983; Hein et al., 2012). The simulation results will be compared with the unstructured-grid-based SELFE model (Zhang and Bapista, 2008). The additional coupling of the BfG-developed 1D water quality model QSim (Kirchesch and Schöl, 1999; Hein et al., 2011) with the morphological/hydrodynamic models is an

  1. Model-based internal wave processing

    SciTech Connect

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem, based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions, given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.
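
    The state-space framing can be illustrated with a generic discrete linear model x[k+1] = A x[k] + w[k], y[k] = C x[k] + v[k]. The matrices and noise levels below are hypothetical stand-ins for illustration only, not the normal-mode propagation model derived in the report.

        import numpy as np

        # Generic discrete state-space propagation with process and measurement
        # noise -- a stand-in for the kind of representation the report builds
        # from the normal-mode boundary-value problem (matrices hypothetical).
        rng = np.random.default_rng(3)
        A = np.array([[0.99, 0.10],
                      [-0.10, 0.99]])           # lightly damped oscillatory dynamics
        C = np.array([[1.0, 0.0]])              # observe the first state only

        x = np.array([1.0, 0.0])
        measurements = []
        for k in range(200):
            x = A @ x + rng.normal(0, 0.01, 2)          # process noise w[k]
            y = C @ x + rng.normal(0, 0.05, 1)          # measurement noise v[k]
            measurements.append(float(y[0]))

        print(f"simulated {len(measurements)} noisy measurements")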

  2. Dynamical modeling of laser ablation processes

    SciTech Connect

    Leboeuf, J.N.; Chen, K.R.; Donato, J.M.; Geohegan, D.B.; Liu, C.L.; Puretzky, A.A.; Wood, R.F.

    1995-09-01

    Several physics and computational approaches have been developed to globally characterize phenomena important for film growth by pulsed laser deposition of materials. These include thermal models of laser-solid target interactions that initiate the vapor plume; plume ionization and heating through laser absorption beyond local thermodynamic equilibrium mechanisms; gas dynamic, hydrodynamic, and collisional descriptions of plume transport; and molecular dynamics models of the interaction of plume particles with the deposition substrate. The complexity of the phenomena involved in the laser ablation process is matched by the diversity of the modeling task, which combines materials science, atomic physics, and plasma physics.

  3. Computer modeling of gas flow and gas loading of rock in a bench blasting environment

    SciTech Connect

    Preece, D.S.; Baer, M.R. ); Knudsen, S.D. )

    1991-01-01

    Numerical modeling can contribute greatly to an understanding of the physics involved in the blasting process. This paper will describe the latest enhancements to the blast modeling code DMC (Distinct Motion Code) (Taylor and Preece, 1989) and will demonstrate the ability of DMC to model gas flow and rock motion in a bench blasting environment. DMC has been used previously to model rock motion associated with blasting in a cratering environment (Preece and Taylor, 1990) and in confined-volume blasting associated with in-situ oil shale retorting (Preece, 1990a, b). These applications of DMC treated the explosive loading as force versus time functions on specific spheres, which were adjusted to obtain correct face velocities. It was recognized that a great need in explosives modeling was the coupling of an ability to simulate gas flow with the rock motion simulation capability of DMC. This was accomplished by executing a finite difference code that computes gas flow through a porous medium (Baer and Gross, 1989) in conjunction with DMC. The marriage of these two capabilities has been documented by Preece and Knudsen, 1991. The capabilities that have been added recently to DMC and which will be documented in this paper include: (1) addition of a new equation of state for the explosive gases; (2) modeling of gas flow and sphere loading in a bench environment. 8 refs., 5 figs.

  4. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
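
    SCAN and ESACF are SAS-specific identification procedures. As a rough stand-in for readers without SAS, the sketch below differences a synthetic integrated series and searches ARMA orders by an information criterion using statsmodels; this is a different identification technique from the ones evaluated in the article, and the simulated series is hypothetical.

        import numpy as np
        from statsmodels.tsa.stattools import arma_order_select_ic

        # Simulate an ARIMA(1,1,1)-type series: an ARMA(1,1) process integrated once.
        rng = np.random.default_rng(4)
        noise = rng.normal(size=500)
        arma = np.zeros(500)
        for t in range(1, 500):
            arma[t] = 0.6 * arma[t - 1] + noise[t] + 0.4 * noise[t - 1]
        y = np.cumsum(arma)                               # integrate once (unit root)

        dy = np.diff(y)                                   # remove the unit root (d = 1)
        res = arma_order_select_ic(dy, max_ar=3, max_ma=3, ic="bic")
        print(f"BIC-selected (p, q) for the differenced series: {res.bic_min_order}")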

  5. Mathematical Modelling of Continuous Biotechnological Processes

    ERIC Educational Resources Information Center

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  6. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  7. The SERIOL2 Model of Orthographic Processing

    ERIC Educational Resources Information Center

    Whitney, Carol; Marton, Yuval

    2013-01-01

    The SERIOL model of orthographic analysis proposed mechanisms for converting visual input into a serial encoding of letter order, which involved hemisphere-specific processing at the retinotopic level. As a test of SERIOL predictions, we conducted a consonant trigram-identification experiment, where the trigrams were briefly presented at various…

  8. Content, Process, and Product: Modeling Differentiated Instruction

    ERIC Educational Resources Information Center

    Taylor, Barbara Kline

    2015-01-01

    Modeling differentiated instruction is one way to demonstrate how educators can incorporate instructional strategies to address students' needs, interests, and learning styles. This article discusses how secondary teacher candidates learn to focus on content--the "what" of instruction; process--the "how" of instruction;…

  9. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost - many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  10. Hot blast stove process model and model-based controller

    SciTech Connect

    Muske, K.R.; Howse, J.W.; Hansen, G.A.; Cagliostro, D.J.; Chaubal, P.C.

    1998-12-31

    This paper describes the process model and model-based control techniques implemented on the hot blast stoves for the No. 7 Blast Furnace at the Inland Steel facility in East Chicago, Indiana. A detailed heat transfer model of the stoves is developed and verified using plant data. This model is used as part of a predictive control scheme to determine the minimum amount of fuel necessary to achieve the blast air requirements. The model is also used to predict maximum and minimum temperature constraint violations within the stove so that the controller can take corrective actions while still achieving the required stove performance.

  11. Modeling chondrocyte patterns by elliptical cluster processes.

    PubMed

    Meinhardt, Martin; Lück, Sebastian; Martin, Pascal; Felka, Tino; Aicher, Wilhelm; Rolauffs, Bernd; Schmidt, Volker

    2012-02-01

    Superficial zone chondrocytes (CHs) of human joints are spatially organized in distinct horizontal patterns. Among other factors, the type of spatial CH organization within a given articular surface depends on whether the cartilage has been derived from an intact joint or the joint is affected by osteoarthritis (OA). Furthermore, specific variations of the type of spatial organization are associated with particular states of OA. This association may prove relevant for early disease recognition based on a quantitative structural characterization of CH patterns. Therefore, we present a point process model describing the distinct morphology of CH patterns within the articular surface of intact human cartilage. This reference model for intact CH organization can be seen as a first step towards a model-based statistical diagnostic tool. Model parameters are fitted to fluorescence microscopy data by a novel statistical methodology utilizing tools from cluster and principal component analysis. This way, the complex morphology of surface CH patterns is represented by a relatively small number of model parameters. We validate the point process model by comparing biologically relevant structural characteristics between the fitted model and data derived from photomicrographs of the human articular surface using techniques from spatial statistics. PMID:22155191
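
    The model class can be illustrated by simulating a Poisson cluster process whose offspring are scattered with an anisotropic (elliptical) Gaussian kernel. The window size, intensities and covariance below are hypothetical and are not the parameters fitted to the articular-surface data.

        import numpy as np

        # Minimal simulation of a Poisson cluster process with elliptical
        # (anisotropic Gaussian) clusters -- an illustration of the model class,
        # not the fitted chondrocyte-pattern model.
        rng = np.random.default_rng(5)
        window = 1000.0                               # square observation window (um)
        n_parents = rng.poisson(20)                   # cluster centres
        parents = rng.uniform(0, window, size=(n_parents, 2))

        cov = np.array([[900.0, 0.0],                 # ~30 um x 10 um principal axes
                        [0.0, 100.0]])                # -> elongated, elliptical clusters
        points = []
        for cx, cy in parents:
            n_off = rng.poisson(8)                    # cells per cluster
            offsets = rng.multivariate_normal([0.0, 0.0], cov, size=n_off)
            points.append(offsets + [cx, cy])

        cells = np.vstack(points)
        print(f"simulated {cells.shape[0]} cell positions in {n_parents} clusters")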

  12. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  13. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  14. Commentary on the shifting processes model: a conceptual model for weight management.

    PubMed

    Pagoto, Sherry; Rodrigues, Stephanie

    2013-12-01

    Macchi and colleagues propose a theoretical model that merges concepts from the biopsychosocial model and family systems theory to produce a broader framework for understanding weight loss and maintenance (see record 2013-28564-001). The Shifting Processes Model views individual weight loss and maintenance in the context of family dynamics, including family eating and exercise habits, home environment, and family relationships. The authors reason that traditional models put the burden of change on the individual rather than the family system, when the latter is an important context of individual behavior. PMID:24377766

  15. Space Environment Modelling with the Use of Artificial Intelligence Methods

    NASA Astrophysics Data System (ADS)

    Lundstedt, H.; Wintoft, P.; Wu, J.-G.; Gleisner, H.; Dovheden, V.

    1996-12-01

    Space based technological systems are affected by the space weather in many ways. Several severe failures of satellites have been reported at times of space storms. Our society also increasingly depends on satellites for communication, navigation, exploration, and research. Predictions of the conditions in the satellite environment have therefore become very important. We will here present predictions made with the use of artificial intelligence (AI) techniques, such as artificial neural networks (ANN) and hybrids of AI methods. We are developing a space weather model based on intelligent hybrid systems (IHS). The model consists of different forecast modules; each module predicts the space weather on a specific time-scale. The time-scales range from minutes to months with the fundamental time-scales of 1-5 minutes, 1-3 hours, 1-3 days, and 27 days. Solar and solar wind data are used as input data. From solar magnetic field measurements, either made on the ground at Wilcox Solar Observatory (WSO) at Stanford, or made from space by the satellite SOHO, solar wind parameters can be predicted and modelled with ANN and MHD models. Magnetograms from WSO are available on a daily basis. However, from SOHO magnetograms will be available every 90 minutes. SOHO magnetograms as input to ANNs will therefore make it possible to even predict solar transient events. Geomagnetic storm activity can today be predicted with very high accuracy by means of ANN methods using solar wind input data. However, at present real-time solar wind data are only available during part of the day from the satellite WIND. With the launch of ACE in 1997, solar wind data will, on the other hand, be available 24 hours per day. The conditions of the satellite environment are not only disturbed at times of geomagnetic storms but also at times of intense solar radiation and highly energetic particles. These events are associated with increased solar activity. Predictions of these events are therefore
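
    As an illustration of the ANN-based prediction modules described above, the sketch below trains a small feed-forward network to map solar-wind parameters (speed, density, Bz) to a geomagnetic activity index. The data are synthetic and the network configuration is an assumption; it is not the authors' model, which uses real WSO/SOHO/WIND inputs.

```python
# Illustrative only: synthetic solar-wind data and a generic feed-forward regressor,
# standing in for the ANN forecast modules described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
speed = rng.uniform(300, 800, n)       # solar wind speed, km/s
density = rng.uniform(1, 20, n)        # proton density, cm^-3
bz = rng.uniform(-20, 10, n)           # IMF Bz, nT (southward Bz drives storms)
# Synthetic "activity index": responds to fast wind combined with southward Bz.
index = 0.005 * speed * np.abs(np.minimum(bz, 0)) + 0.5 * density + rng.normal(0, 3, n)

X = np.column_stack([speed, density, bz])
X_train, X_test, y_train, y_test = train_test_split(X, index, random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0))
ann.fit(X_train, y_train)
print("R^2 on held-out data:", round(ann.score(X_test, y_test), 3))
```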

  16. ISLE (Image and Signal Lisp Environment): A functional language interface for signal and image processing

    SciTech Connect

    Azevedo, S.G.; Fitch, J.P.

    1987-05-01

    Conventional software interfaces which utilize imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal processing software (SIG). Existing "functional language" interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal Lisp Environment (ISLE) will be discussed as an example of an interpreted functional language interface based on Common LISP. Additional benefits include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence software.

  17. Machine platform and software environment for rapid optics assembly process development

    NASA Astrophysics Data System (ADS)

    Sauer, Sebastian; Müller, Tobias; Haag, Sebastian; Zontar, Daniel

    2016-03-01

    The assembly of optical components for laser systems is proprietary knowledge and typically done by well-trained personnel in a clean room environment, as it has a major impact on the overall laser performance. Rising numbers of laser systems drive laser production toward industrial-level automation solutions that allow for high volumes while ensuring stable quality, many variants, and low cost. Therefore, an easily programmable, expandable and reconfigurable machine with an intuitive and flexible software environment for process configuration is required. With Fraunhofer IPT's expertise on optical assembly processes, the next step towards industrializing the production of optical systems is made.

  18. A model evaluation checklist for process-based environmental models

    NASA Astrophysics Data System (ADS)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
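
    For reference, the Nash-Sutcliffe efficiency mentioned under point (1) is straightforward to compute: NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))², where a value of 1 is a perfect fit and 0 means the model is no better than the observed mean. The short function below only illustrates that definition; it is not part of the checklist itself, and the example values are invented.

```python
# Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better than the observed mean.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = np.array([0.9, 1.4, 3.2, 2.1, 1.0, 0.8])   # e.g. daily phosphorus concentrations (invented)
sim = np.array([1.0, 1.2, 2.8, 2.4, 1.1, 0.9])
print(round(nash_sutcliffe(obs, sim), 3))
```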

  19. Coal-to-Liquids Process Model

    Energy Science and Technology Software Center (ESTSC)

    2006-01-01

    A comprehensive Aspen Plus model has been developed to rigorously model coal-to-liquids processes. This portion was developed under Laboratory Directed Research and Development (LDRD) funding. The model is built in a modular fashion to allow rapid reconfiguration for evaluation of process options. Aspen Plus is the framework in which the model is developed. The coal-to-liquids simulation package is an assembly of Aspen Hierarchy Blocks representing subsections of the plant. Each of these Blocks is considered an individual component of the Copyright, which may be extracted and licensed as an individual component, but which may be combined with one or more other components to model general coal-conversion processes, including the following plant operations: (1) coal handling and preparation, (2) coal pyrolysis, combustion, or gasification, (3) syngas conditioning and cleanup, (4) sulfur recovery using Claus-SCOT unit operations, (5) Fischer-Tropsch liquid fuels synthesis, (6) hydrocracking of high molecular weight paraffins, (7) hydrotreating of low molecular weight paraffins and olefins, (8) gas separations, and (9) power generation representing integrated combined cycle technology.

  20. Comparison of the Beta and the Hidden Markov Models of Trust in Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Moe, Marie E. G.; Helvik, Bjarne E.; Knapskog, Svein J.

    Computational trust and reputation models are used to aid the decision-making process in complex dynamic environments, where we are unable to obtain perfect information about the interaction partners. In this paper we present a comparison of our proposed hidden Markov trust model to the Beta reputation system. The hidden Markov trust model takes the time between observations into account; it also distinguishes between system states and uses methods previously applied to intrusion detection for the prediction of which state an agent is in. We show that the hidden Markov trust model performs better when it comes to the detection of changes in the behavior of agents, due to its greater richness in model features. This means that our trust model may be more realistic in dynamic environments. However, the increased model complexity also leads to bigger challenges in estimating parameter values for the model. We also show that the hidden Markov trust model can be parameterized so that it responds similarly to the Beta reputation system.
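
    For orientation, the Beta reputation system used as the baseline above scores an agent by the expected value of a Beta(r + 1, s + 1) distribution after r positive and s negative observations. The sketch below shows that update together with an exponential forgetting factor, a common extension for dynamic environments; the hidden Markov trust model itself (latent states, inter-observation times) is not reproduced here.

```python
# Beta reputation sketch: expectation of Beta(r + 1, s + 1) after r positive, s negative outcomes.
def beta_reputation(r, s):
    return (r + 1.0) / (r + s + 2.0)

def update(r, s, outcome, forgetting=0.95):
    """Age old evidence, then record the new outcome (True = positive interaction)."""
    r, s = forgetting * r, forgetting * s
    return (r + 1, s) if outcome else (r, s + 1)

r = s = 0.0
for outcome in [True, True, True, False, False, False, False]:   # behaviour changes halfway
    r, s = update(r, s, outcome)
    print(round(beta_reputation(r, s), 3))
```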

  1. [Cellular model of blood coagulation process].

    PubMed

    Bijak, Michał; Rzeźnicka, Paulina; Saluk, Joanna; Nowak, Paweł

    2015-07-01

    Blood coagulation is a process whose main objective is the prevention of blood loss when the integrity of the blood vessel is damaged. Over the years, a number of concepts characterizing the mechanism of thrombus formation have been presented. From the 1960s, the prevailing view was the cascade model of coagulation, in which formation of the fibrin clot is determined by two pathways, called the extrinsic and intrinsic pathways. In the 1990s, Monroe and Hoffman presented their concept of the blood coagulation process, which complements the previously accepted model with the participation of cells, especially blood platelets, whose role is to provide a negatively charged phospholipid surface and thereby allow the formation of the coagulation enzymatic complexes. They called this concept the cellular model of coagulation. The aim of this work was to present this model of blood coagulation in detail, including descriptions of its various phases. PMID:26277170

  2. Modeling of the vacuum plasma spray process

    SciTech Connect

    Varacalle, D.J. Jr.; Neiser, R.A.; Smith, M.F.

    1992-10-01

    Experimental and analytical studies have been conducted to investigate gas, particle, and coating dynamics in the vacuum plasma spray (VPS) process for a tungsten powder. VPS coatings were examined metallographically and the results compared with the model's predictions. The plasma was numerically modeled from the cathode tip to the spray distance in the free plume for the experimental conditions of this study. This information was then used as boundary conditions to solve the particle dynamics. The predicted temperature and velocity of the powder particles at standoff were then used as initial conditions for a coating dynamics code. The code predicts the coating morphology for the specific process parameters. The predicted characteristics exhibit good correlation with the observed coating properties.

  3. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1993-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS (Earth Observing System) timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  4. A Spatial Analysis and Modeling System (SAMS) for environment management

    NASA Technical Reports Server (NTRS)

    Vermillion, Charles H.; Stetina, Fran; Hill, John; Chan, Paul; Jaske, Robert; Rochon, Gilbert

    1992-01-01

    This is a proposal to develop a uniform global environmental data gathering and distribution system to support the calibration and validation of remotely sensed data. SAMS is based on an enhanced version of FEMA's Integrated Emergency Management Information Systems and the Department of Defense's Air Land Battlefield Environment Software Systems. This system consists of state-of-the-art graphics and visualization techniques, simulation models, database management and expert systems for conducting environmental and disaster preparedness studies. This software package will be integrated into various Landsat and UNEP-GRID stations which are planned to become direct readout stations during the EOS timeframe. This system would be implemented as a pilot program to support the Tropical Rainfall Measuring Mission (TRMM). This will be a joint NASA-FEMA-University-Industry project.

  5. Lattice models of directed and semiflexible polymers in anisotropic environment

    NASA Astrophysics Data System (ADS)

    Haydukivska, K.; Blavatska, V.

    2015-10-01

    We study the conformational properties of polymers in the presence of extended columnar defects of parallel orientation. Two classes of macromolecules are considered: the so-called partially directed polymers with preferred orientation along the direction of the external stretching field, and semiflexible polymers. We work within the framework of lattice models: partially directed self-avoiding walks (PDSAWs) and biased self-avoiding walks (BSAWs). Our numerical analysis of PDSAWs reveals that the competition between the stretching field and the anisotropy caused by the presence of extended defects leads to the existence of three characteristic length scales in the system. At each fixed concentration of disorder we find a transition point where the influence of extended defects is exactly counterbalanced by the stretching field. Numerical simulations of BSAWs in an anisotropic environment reveal an increase of polymer stiffness. In particular, the persistence length of semiflexible polymers increases in the presence of disorder.

  6. Modeling abiotic processes of aniline in water-saturated soils

    SciTech Connect

    Fabrega-Duque, J.R.; Jafvert, C.T.; Li, H.; Lee, L.S.

    2000-05-01

    The long-term interactions of aromatic amines with soils are important in defining the fate and transport of these compounds in the environment. Abiotic loss of aniline from the aqueous phase to the soil phase occurs with an initial rapid loss due to reversible mass transfer processes, followed by a slow loss due to irreversible reactions. A kinetic model describing these processes in water-saturated soils was developed and evaluated. The model assumes that instantaneous equilibrium occurs for the following reversible processes: (1) acid dissociation of the protonated organic base (BH+) in the aqueous phase; (2) ion exchange between inorganic divalent cations (D2+ = Ca2+ + Mg2+) on the soil and the protonated organic base; and (3) partitioning of the nonionic species of aniline (B_aq) to soil organic carbon. The model assumes that irreversible loss of aniline occurs through reaction of B_aq with irreversible sites (C_ir) on the soil. A kinetic rate constant, k_ir, and the total concentration of irreversible sites, C_T, were employed as adjustable model parameters. The model was evaluated with measured mass distributions of aniline between water and five soils ranging in pH (4.4-7.3), at contact times ranging from 2 to 1,600 h. Some experiments were performed at different soil mass to water volume ratios. A good fit was obtained with a single value of k_ir for all soils, pH values, and soil-water ratios. To accurately predict soil-water distributions at contact times <24 h, mass transfer of the neutral species to the soil was modeled as a kinetic process, again assuming that ion exchange processes are instantaneous.
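
    A heavily simplified numerical sketch of this kind of kinetics is given below. It is not the authors' model: the pH-dependent speciation, ion exchange and organic-carbon partitioning are lumped into a single assumed linear partition coefficient, and only the irreversible reaction of aqueous aniline with a finite pool of sites is integrated in time. All parameter values are invented.

```python
# Hedged sketch: instantaneous reversible partitioning plus slow irreversible loss.
from scipy.integrate import solve_ivp

Kd = 2.0      # L/kg, lumped reversible partition coefficient (assumed)
m_s = 0.05    # kg soil
V_w = 0.02    # L water
k_ir = 1e-3   # L/(mmol h), irreversible rate constant (assumed)

def rhs(t, y):
    mobile_b, c_ir = y                     # reversibly held aniline (mmol), irreversible sites (mmol/L)
    b_aq = mobile_b / (V_w + Kd * m_s)     # instantaneous-equilibrium aqueous concentration
    rate = k_ir * b_aq * c_ir              # second-order irreversible reaction
    return [-rate * V_w, -rate]

sol = solve_ivp(rhs, (0, 1600), [0.5, 1.0], t_eval=[2, 24, 200, 1600])
print(sol.y[0].round(4))   # reversibly held aniline remaining at the sampled contact times
```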

  7. Emerge - A Python environment for the modeling of subsurface transfers

    NASA Astrophysics Data System (ADS)

    Lopez, S.; Smai, F.; Sochala, P.

    2014-12-01

    The simulation of subsurface mass and energy transfers often relies on specific codes that were mainly developed in compiled languages, which usually ensure computational efficiency at the expense of relatively long development times and relatively rigid software. Even if a very detailed, possibly graphical, user interface is developed, the core numerical aspects are rarely accessible and the smallest modification will always need a compilation step. Thus, user-defined physical laws or alternative numerical schemes may be relatively difficult to use. Over the last decade, Python has emerged as a popular and widely used language in the scientific community. Several libraries already exist for the pre- and post-processing of input and output files for reservoir simulators (e.g. pytough). Development times in Python are considerably reduced compared to compiled languages, and programs can be easily interfaced with libraries written in compiled languages, including several comprehensive numerical libraries that provide sequential and parallel solvers (e.g. PETSc, Trilinos…). The core objective of the Emerge project is to explore the possibility of developing a modeling environment entirely in Python. Consequently, we are developing an open Python package with the classes/objects necessary to express, discretize and solve the physical problems encountered in the modeling of subsurface transfers. We rely heavily on Python to have a convenient and concise way of manipulating potentially complex concepts with a few lines of code and a high level of abstraction. The result aims to be a friendly numerical environment targeting both numerical engineers and physicists or geoscientists, with the possibility to quickly specify and handle geometries, arbitrary meshes, spatially or temporally varying properties, PDE formulations, boundary conditions…
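
    As a small taste of the kind of workflow such a package targets, the snippet below solves a 1-D transient diffusion problem with plain numpy. It is purely illustrative and does not use the Emerge API; it simply shows how properties, boundary conditions and the time loop remain visible and editable in a few lines of Python.

```python
# Explicit finite-difference solution of du/dt = D d2u/dx2 on a 1-D domain (illustrative only).
import numpy as np

nx, L = 101, 10.0                 # grid points and domain length (m)
dx = L / (nx - 1)
D = np.full(nx, 1e-2)             # uniform diffusivity; a varying field is just an array edit away
u = np.zeros(nx)

dt = 0.4 * dx**2 / D.max()        # explicit stability limit
for _ in range(5000):
    u[1:-1] += dt * D[1:-1] * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u[0], u[-1] = 1.0, 0.0        # Dirichlet boundary conditions re-imposed each step

print(u[::20].round(3))           # coarse profile of the solution
```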

  8. Iron and steel industry process model

    SciTech Connect

    Sparrow, F.T.; Pilati, D.; Dougherty, T.; McBreen, E.; Juang, L.L.

    1980-01-01

    The iron and steel industry process model depicts expected energy-consumption characteristics of the iron and steel industry and ancillary industries for the next 25 years by means of a process model of the major steps in steelmaking, from ore mining and scrap recycling to the final finishing of carbon, alloy, and stainless steel into steel products such as structural steel, slabs, plates, tubes, and bars. Two plant types are modeled: fully integrated mills and mini-mills. User-determined inputs into the model are as follows: projected energy and materials prices; projected costs of capacity expansion and replacement; energy-conserving options, both operating modes and investments; the internal rate of return required on investment; and projected demand for finished steel. Nominal input choices in the model for the inputs listed above are as follows: National Academy of Sciences Committee on Nuclear and Alternative Energy Systems Demand Panel nominal energy-price projections for oil, gas, distillates, residuals, and electricity and 1975 actual prices for materials; actual 1975 costs; new technologies added; 15% after taxes; and 1975 actual demand with 1.5%/y growth. The model reproduces the base-year (1975) actual performance of the industry; then, given the above nominal input choices, it projects modes of operation and capacity expansion that minimize the cost of meeting the given final demands for each of 5 years, each year being the midpoint of a 5-year interval. The output of the model includes the following: total energy use and intensity (Btu/ton) by type, by process, and by time period; energy conservation options chosen; utilization rates for existing capacity; capital-investment decisions for capacity expansion.
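
    The core of such a model is a cost-minimisation problem: choose operating levels that meet final steel demand at least cost, subject to capacity limits. The toy example below illustrates only that idea with two invented production routes and invented numbers; the real model covers many processes, energy carriers and a 25-year horizon.

```python
# Toy cost-minimisation: meet demand from an integrated mill and a mini-mill (invented numbers).
from scipy.optimize import linprog

cost = [420.0, 380.0]                   # $/ton produced by [integrated mill, mini-mill]
A_ub = [[1, 0], [0, 1]]                 # individual capacity limits (tons)
b_ub = [80_000, 40_000]
A_eq = [[1, 1]]                         # total production must equal finished-steel demand
b_eq = [100_000]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None), (0, None)])
print(res.x, round(res.fun))            # optimal production mix and minimised cost
```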

  9. Development of a comprehensive weld process model

    SciTech Connect

    Radhakrishnan, B.; Zacharia, T.; Paul, A.

    1997-05-01

    This cooperative research and development agreement (CRADA) between Concurrent Technologies Corporation (CTC) and Lockheed Martin Energy Systems (LMES) combines CTC's expertise in the welding area and that of LMES to develop computer models and simulation software for welding processes. This development is of significant impact to the industry, including materials producers and fabricators. The main thrust of the research effort was to develop a comprehensive welding simulation methodology. A substantial amount of work has been done by several researchers to numerically model several welding processes. The primary drawback of most of the existing models is the lack of sound linkages between the mechanistic aspects (e.g., heat transfer, fluid flow, and residual stress) and the metallurgical aspects (e.g., microstructure development and control). A comprehensive numerical model which can be used to elucidate the effect of welding parameters/conditions on the temperature distribution, weld pool shape and size, solidification behavior, and microstructure development, as well as stresses and distortion, does not exist. It was therefore imperative to develop a comprehensive model which would predict all of the above phenomena during welding. The CRADA built upon an already existing three-dimensional (3-D) welding simulation model developed by LMES, which is capable of predicting weld pool shape and the temperature history in 3-D single-pass welds. However, the model does not account for multipass welds, microstructural evolution, distortion and residual stresses. Additionally, the model requires large resources of computing time, which limits its use for practical applications. To overcome this, CTC and LMES have developed through this CRADA the comprehensive welding simulation model described above.

  10. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    SciTech Connect

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  11. A Mixed-Model Quantitative Trait Loci (QTL) Analysis for Multiple-Environment Trial Data Using Environmental Covariables for QTL-by-Environment Interactions, With an Example in Maize

    PubMed Central

    Boer, Martin P.; Wright, Deanne; Feng, Lizhi; Podlich, Dean W.; Luo, Lang; Cooper, Mark; van Eeuwijk, Fred A.

    2007-01-01

    Complex quantitative traits of plants as measured on collections of genotypes across multiple environments are the outcome of processes that depend in intricate ways on genotype and environment simultaneously. For a better understanding of the genetic architecture of such traits as observed across environments, genotype-by-environment interaction should be modeled with statistical models that use explicit information on genotypes and environments. The modeling approach we propose explains genotype-by-environment interaction by differential quantitative trait locus (QTL) expression in relation to environmental variables. We analyzed grain yield and grain moisture for an experimental data set composed of 976 F5 maize testcross progenies evaluated across 12 environments in the U.S. corn belt during 1994 and 1995. The strategy we used was based on mixed models and started with a phenotypic analysis of multi-environment data, modeling genotype-by-environment interactions and associated genetic correlations between environments, while taking into account intraenvironmental error structures. The phenotypic mixed models were then extended to QTL models via the incorporation of marker information as genotypic covariables. A majority of the detected QTL showed significant QTL-by-environment interactions (QEI). The QEI were further analyzed by including environmental covariates into the mixed model. Most QEI could be understood as differential QTL expression conditional on longitude or year, both consequences of temperature differences during critical stages of the growth. PMID:17947443

  12. Thermal modeling of an epoxy encapsulation process

    SciTech Connect

    Baca, R.G.; Schutt, J.A.

    1991-01-01

    The encapsulation of components is a widely used process at Sandia National Laboratories for packaging components to withstand structural loads. Epoxy encapsulants are also used for their outstanding dielectric strength characteristics. The production of high voltage assemblies requires the encapsulation of ceramic and electrical components (such as transformers). Separation of the encapsulant from internal contact surfaces or voids within the encapsulant itself in regions near the mold base has caused high voltage breakdown failures during production testing. In order to understand the failure mechanisms, a methodology was developed to predict both the thermal response and gel front progression of the epoxy during the encapsulation process. A thermal model constructed with PATRAN Plus (1) and solved with the P/THERMAL (2) analysis system was used to predict the thermal response of the encapsulant. This paper discusses the incorporation of an Arrhenius kinetics model into Q/TRAN (2) to model the complex volumetric heat generation of the epoxy during the encapsulation process. As the epoxy begins to cure, it generates heat and shrinks. The total cure time of the encapsulant (transformation from a viscous liquid to solid) is dependent on both the initial temperature and the entire temperature history. Because the rate of cure is temperature dependent, the cure rate accelerates with a temperature increase and, likewise, the cure rate is quenched if the temperature is reduced. The temperature and conversion predictions compared well against experimental data. The thermal simulation results were used to modify the temperature cure process of the encapsulant and improve production yields.
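
    Arrhenius-type cure kinetics of the sort referred to above can be sketched with a single ordinary differential equation, dα/dt = A·exp(-Ea/RT)·(1 - α)^n, whose rate (and hence exothermic heat release) depends on the temperature history. The parameter values below are invented for illustration; this is not the Q/TRAN implementation.

```python
# Illustrative Arrhenius cure kinetics under a simple ramp-and-hold temperature schedule.
import numpy as np
from scipy.integrate import solve_ivp

A_pre, Ea, n_order = 1.0e6, 60e3, 1.5           # pre-exponential (1/s), activation energy (J/mol), order
R = 8.314                                        # gas constant, J/(mol K)

def temperature(t):
    return 300.0 + 40.0 * min(t / 3600.0, 1.0)   # ramp 40 K over the first hour, then hold

def rhs(t, y):
    alpha = y[0]
    remaining = max(1.0 - alpha, 0.0)            # uncured fraction
    return [A_pre * np.exp(-Ea / (R * temperature(t))) * remaining ** n_order]

sol = solve_ivp(rhs, (0.0, 6 * 3600.0), [0.0], t_eval=np.linspace(0.0, 6 * 3600.0, 7))
for t, a in zip(sol.t, sol.y[0]):
    print(f"t = {t / 3600:4.1f} h   degree of cure = {a:4.2f}")
```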

  13. Modeling and simulation of plasma processing equipment

    NASA Astrophysics Data System (ADS)

    Kim, Heon Chang

    Currently, plasma processing technology is utilized in a wide range of applications including advanced Integrated Circuit (IC) fabrication. Traditionally, plasma processing equipment has been empirically designed and optimized at great expense of development time and cost. This research proposes the development of a first principle based, multidimensional plasma process simulator with the aim of enhancing the equipment design procedure. The proposed simulator accounts for nonlinear interactions among various plasma chemistry and physics, neutral chemistry and transport, and dust transport phenomena. A three moment modeling approach is employed that shows good predictive capabilities at reasonable computational expense. For numerical efficiency, various versions of explicit and implicit Essentially Non-Oscillatory (ENO) algorithms are employed. For the rapid evaluation of time-periodic steady-state solutions, a feedback control approach is employed. Two dimensional simulation results of capacitively coupled rf plasmas show that ion bombardment uniformity can be improved through simulation based design of the plasma process. Through self-consistent simulations of an rf triode, it is also shown that effects of secondary rf voltage and frequency on ion bombardment energy can be accurately captured. These results prove that scaling relations among important process variables can be identified through the three moment modeling and simulation approach. Through coupling of the plasma model with a neutral chemistry and transport model, spatiotemporal distributions of both charged and uncharged species, including metastables, are predicted for an oxygen plasma. Furthermore, simulation results also verify the existence of a double layer in this electronegative plasma. Through Lagrangian simulation of dust in a plasma reactor, it is shown that small particles accumulate near the center and the radial sheath boundary depending on their initial positions while large

  14. Solidification modeling of continuous casting process

    NASA Astrophysics Data System (ADS)

    Lerner, V. S.; Lerner, Y. S.

    2005-04-01

    The aim of the present work was to utilize a new systematic mathematical-informational approach based on informational macrodynamics (IMD) to model and optimize the casting process, taking as an example horizontal continuous casting (HCC). The IMD model takes into account the interrelated thermal, diffusion, kinetic, hydrodynamic, and mechanical effects that are essential for the given casting process. The optimum technological process parameters are determined by the simultaneous solution of problems of identification and optimal control. The control functions of the synthesized optimal model are found from the extremum of the entropy functional having a particular sense of an integrated assessment of the continuous cast bar physicochemical properties. For the physical system considered, the IMD structures of the optimal model are connected with controllable equations of nonequilibrium thermodynamics. This approach was applied to the HCC of ductile iron, and the results were compared with experimental data and numerical simulation. Good agreement was confirmed between the predicted and practical data, as well as between new and traditional methods.

  15. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  16. Model-Based Analysis of Cell Cycle Responses to Dynamically Changing Environments

    PubMed Central

    Seaton, Daniel D; Krishnan, J

    2016-01-01

    Cell cycle progression is carefully coordinated with a cell’s intra- and extracellular environment. While some pathways have been identified that communicate information from the environment to the cell cycle, a systematic understanding of how this information is dynamically processed is lacking. We address this by performing dynamic sensitivity analysis of three mathematical models of the cell cycle in Saccharomyces cerevisiae. We demonstrate that these models make broadly consistent qualitative predictions about cell cycle progression under dynamically changing conditions. For example, it is shown that the models predict anticorrelated changes in cell size and cell cycle duration under different environments independently of the growth rate. This prediction is validated by comparison to available literature data. Other consistent patterns emerge, such as widespread nonmonotonic changes in cell size down generations in response to parameter changes. We extend our analysis by investigating glucose signalling to the cell cycle, showing that known regulation of Cln3 translation and Cln1,2 transcription by glucose is sufficient to explain the experimentally observed changes in cell cycle dynamics at different glucose concentrations. Together, these results provide a framework for understanding the complex responses the cell cycle is capable of producing in response to dynamic environments. PMID:26741131

  17. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
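
    The statistical complexity Cμ cited above is the Shannon entropy of the stationary distribution over the ε-machine's causal states. The snippet below illustrates that computation for the standard two-state Golden Mean process; it is a generic textbook example, not drawn from the paper.

```python
# Statistical complexity of a two-state epsilon-machine (Golden Mean process).
import numpy as np

# Rows: causal states A, B. From A: emit 1 (stay in A) or 0 (go to B) with probability 1/2 each;
# from B: emit 1 and return to A with probability 1.
T = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# Stationary distribution = left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
pi = pi / pi.sum()

C_mu = -np.sum(pi * np.log2(pi))
print(pi.round(3), round(C_mu, 3))   # approx. [0.667 0.333] and 0.918 bits
```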

  18. Vertical distribution, migration rates, and model comparison of actinium in a semi-arid environment.

    PubMed

    McClellan, Y; August, R A; Gosz, J R; Gann, S; Parmenter, R R; Windsor, M

    2006-01-01

    Vertical soil characterization and migration of radionuclides were investigated at four radioactively contaminated sites on Kirtland Air Force Base (KAFB), New Mexico to determine the vertical downward migration of radionuclides in a semi-arid environment. The surface soils (0-15 cm) were intentionally contaminated with Brazilian sludge (containing thorium-232 and other radionuclides) approximately 40 years ago, in order to simulate the conditions resulting from a nuclear weapons accident. Site grading consisted of manually raking or machine disking the sludge. The majority of the radioactivity was found in the top 15 cm of soil, with retention ranging from 69 to 88%. Two models, a compartment diffusion model and leach rate model, were evaluated to determine their capabilities and limitations in predicting radionuclide behavior. The migration rates of actinium were calculated with the diffusion compartment and the leach rate models for all sites, and ranged from 0.009 to 0.1 cm/yr increasing with depth. The migration rates calculated with the leach rate models were similar to those using the diffusion compartment model and did not increase with depth (0.045-0.076, 0.0 cm/yr). The research found that the physical and chemical properties governing transport processes of water and solutes in soil provide a valid radionuclide transport model. The evaluation also showed that the physical model has fewer limitations and may be more applicable to this environment. PMID:16243414

  19. Digraph reliability model processing advances and applications

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.; Patterson-Hine, F. A.

    1993-01-01

    This paper describes a new algorithm, called SourceDoubls, which efficiently solves for singletons and doubletons of a digraph reliability model. Compared with previous methods, the SourceDoubls algorithm provides up to a two order of magnitude reduction in the amount of time required to solve large digraph models. This significant increase in model solution speed allows complex digraphs containing thousands of nodes to be used as knowledge bases for real time automated monitoring and diagnosis applications. Currently, an application to provide monitoring and diagnosis of the Space Station Freedom Data Management System is under development at NASA/Ames Research Center and NASA/Johnson Space Center. This paper contains an overview of this system and provides details of how it will use digraph models processed by the SourceDoubls algorithm to accomplish its task.

  20. Qualitative simulation for process modeling and control

    NASA Technical Reports Server (NTRS)

    Dalle Molle, D. T.; Edgar, T. F.

    1989-01-01

    A qualitative model is developed for a first-order system with a proportional-integral controller without precise knowledge of the process or controller parameters. Simulation of the qualitative model yields all of the solutions to the system equations. In developing the qualitative model, a necessary condition for the occurrence of oscillatory behavior is identified. Initializations that cannot exhibit oscillatory behavior produce a finite set of behaviors. When the phase-space behavior of the oscillatory behavior is properly constrained, these initializations produce an infinite but comprehensible set of asymptotically stable behaviors. While the predictions include all possible behaviors of the real system, a class of spurious behaviors has been identified. When limited numerical information is included in the model, the number of predictions is significantly reduced.
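
    A conventional numerical counterpart to this qualitative model is a first-order process dy/dt = (-y + K·u)/τ under PI control u = Kp·e + Ki·∫e dt. The sketch below (illustrative parameter values, not from the paper) reproduces the two behaviour classes the qualitative simulation distinguishes: an oscillatory and a monotone approach to the setpoint, depending on the gains.

```python
# First-order plant with PI control: gains decide between oscillatory and monotone responses.
import numpy as np

def simulate(K=1.0, tau=2.0, Kp=1.0, Ki=1.5, setpoint=1.0, dt=0.01, t_end=40.0):
    y, integral, history = 0.0, 0.0, []
    for _ in np.arange(0.0, t_end, dt):
        e = setpoint - y
        integral += e * dt
        u = Kp * e + Ki * integral           # PI control law
        y += dt * (-y + K * u) / tau         # first-order process
        history.append(y)
    return np.array(history)

oscillatory = simulate(Ki=1.5)   # complex closed-loop poles: overshoot and ringing
monotone = simulate(Ki=0.1)      # real poles: no overshoot
print(round(oscillatory.max(), 3), round(monotone.max(), 3))
```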

  1. Elements of a Process-based Model of Leaf Photosynthesis

    NASA Astrophysics Data System (ADS)

    Noe, S. M.; Giersch, C.; Schnitzler, J.; Steinbrecher, R.

    2003-12-01

    Process-based modelling of photosynthesis requires appropriate description of leaf photosynthesis. Essential aspects are stomatal conductance and the CO2 assimilation proper, both as affected by the environment. Here we propose and analyse a photosynthesis model with two variables (stomatal conductance (gs) and the CO2 partial pressure inside the leaf (pi)) for the stomatal part and five variables to model the Calvin cycle intermediates. The actual stomatal conductance is calculated via a target function G(I, ΔVP) which describes the effects of light (I) and vapour pressure deficit (ΔVP). CO2 fixation is modelled as a sink term for pi so that a differential equation for pi is derived which greatly simplifies explicit modelling of the Calvin cycle. A plausibility check of the model employing sinusoidal time courses for I and ΔVP is carried out. Using field data, the model is shown to produce a reasonable fit to data sets collected for oak leaves. Preliminary modelling results indicate that the Calvin cycle intermediates and ATP are also in acceptable agreement with experimental data. The model is intended for use as a base module of the isoprene emission model (SIM-BIM), which is a subproject of the BEWA2000 project within the national joint research project AFO2000 (Atmosphaeren Forschungsprogramm 2000).
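
    A heavily simplified sketch of the stomatal part is shown below, not the authors' model: internal CO2 partial pressure pi gains CO2 through the stomata at a rate gs·(ca - pi) and loses it to an assimilation sink, while gs relaxes towards a target G(I, ΔVP) driven by light and vapour pressure deficit. Every functional form and parameter value here is an invented placeholder.

```python
# Illustrative two-variable stomata/internal-CO2 sketch (all functions and numbers assumed).
import numpy as np
from scipy.integrate import solve_ivp

ca = 40.0                                      # ambient CO2 partial pressure (Pa)

def G(I, dvp):
    """Target conductance: opens with light I, closes with vapour pressure deficit."""
    return 0.3 * I / (I + 200.0) * np.exp(-0.05 * dvp)

def assimilation(pi):
    return 20.0 * pi / (pi + 30.0)             # saturating CO2 response (placeholder)

def rhs(t, y, I, dvp):
    gs, pi = y
    dgs = (G(I, dvp) - gs) / 600.0             # conductance relaxes on a ~10 minute timescale
    dpi = gs * (ca - pi) - 0.01 * assimilation(pi)
    return [dgs, dpi]

sol = solve_ivp(rhs, (0.0, 3600.0), [0.05, 35.0], args=(800.0, 10.0),
                t_eval=np.linspace(0.0, 3600.0, 5))
print(sol.y.round(3))   # gs and pi approaching a light- and VPD-dependent steady state
```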

  2. Designing a Collaborative Problem Solving Environment for Integrated Water Resource Modeling

    SciTech Connect

    Thurman, David A.; Cowell, Andrew J.; Taira, Randal Y.; Frodge, Jonathan

    2004-06-14

    We report on our approach for designing a collaborative problem solving environment for hydrologists, water quality planners and natural resource managers, all roles within a natural resource management agency and stakeholders in an integrated water resource management process. We describe our approach in the context of the Integrated Water Resource Modeling System (IWRMS), under development by Pacific Northwest National Laboratory for the Department of Natural Resources and Parks in King County, Washington. This system will integrate a collection of water resource models (watersheds, rivers, lakes, estuaries) to provide the ability to address water, land use, and other natural resource management decisions and scenarios, with the goal of developing an integrated modeling capability to address future land use and resource management scenarios and provide scientific support to decision makers. Here, we discuss the five-step process used to ascertain the (potentially opposing) needs and interests of stakeholders and provide results and summaries from our experiences. The results of this process guide user interface design efforts to create a collaborative problem solving environment supporting multiple users with differing scientific backgrounds and modeling needs. We conclude with a discussion of participatory interface design methods used to encourage stakeholder involvement and acceptance of the system as well as the lessons learned to date.

  3. Upgrading Preschool Environment in a Swedish Municipality: Evaluation of an Implementation Process.

    PubMed

    Altin, Carolina; Kvist Lindholm, Sofia; Wejdmark, Mats; Lättman-Masch, Robert; Boldemann, Cecilia

    2015-07-01

    Redesigning outdoor preschool environments may favorably affect multiple factors relevant to health and reach many children. Cross-sectional studies in various landscapes at different latitudes have explored the characteristics of preschool outdoor environments, considering the play potential triggering combined physical activity and sun-protective behavior due to space, vegetation, and topography. Criteria were pinpointed for upgrading preschool outdoor environments for multiple health outcomes, to be applied by local governments in charge of public preschools. Purposeful land use policies and administrative management of outdoor land use may serve to monitor the quality of preschool outdoor environments (upgrading and planning). This study evaluates the process of implementing routines for upgrading outdoor preschool environments in a medium-sized municipality, Sweden, 2008-2011, using qualitative and quantitative analysis. Recorded written material (logs and protocols) related to the project was processed using thematic analysis. Quantitative data (m² of flat/multileveled surface, overgrown/bare surface, and fraction of free visible sky) were analyzed to assess the impact of implementation (surface, topography, greenery integrated in play). The preschool outdoor environments were upgraded accordingly. The quality of implementation was assessed using the policy streams approach. Though long-term impact remains to be confirmed, the process seems to have changed work routines in the interior management for purposeful upgrading of preschool outdoor environments. The aptitude and applicability of inexpensive methods for assessing, selecting, and upgrading preschool land at various latitudes, climates, and outdoor play policies (including gender aspects and staff policies) should be further discussed, as well as the compilation of data for monitoring and evaluation. PMID:25589022

  4. Marketing the use of the space environment for the processing of biological and pharmaceutical materials

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The perceptions of U.S. biotechnology and pharmaceutical companies concerning the potential use of the space environment for the processing of biological substances was examined. Physical phenomena that may be important in space-base processing of biological materials are identified and discussed in the context of past and current experiment programs. The capabilities of NASA to support future research and development, and to engage in cooperative risk sharing programs with industry are discussed. Meetings were held with several biotechnology and pharmaceutical companies to provide data for an analysis of the attitudes and perceptions of these industries toward the use of the space environment. Recommendations are made for actions that might be taken by NASA to facilitate the marketing of the use of the space environment, and in particular the Space Shuttle, to the biotechnology and pharmaceutical industries.

  5. Modeling Manufacturing Processes to Mitigate Technological Risk

    SciTech Connect

    Allgood, G.O.; Manges, W.W.

    1999-10-24

    An economic model is a tool for determining the justifiable cost of new sensors and subsystems with respect to value and operation. This process balances the R and D costs against the expense of maintaining current operations and allows for a method to calculate economic indices of performance that can be used as control points in deciding whether to continue development or suspend actions. The model can also be used as an integral part of an overall control loop utilizing real-time process data from the sensor groups to make production decisions (stop production and repair machine, continue and warn of anticipated problems, queue for repairs, etc.). This model has been successfully used and deployed in the CAFE Project. The economic model was one of seven (see Fig. 1) elements critical in developing an investment strategy. It has been successfully used in guiding the R and D activities on the CAFE Project, suspending activities on three new sensor technologies, and continuing development of two others. The model has also been used to justify the development of a new prognostic approach for diagnosing machine health using COTS equipment and a new algorithmic approach.

  6. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important task in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruiting of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is feasible. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859
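
    The similarity measure described above (Euclidean distance passed through a Gaussian) and the recruitment rule can be sketched in a few lines. The code below is an illustration with invented parameter values, not the authors' implementation; the threshold plays the role of the firing rate threshold (FRT) and sigma loosely corresponds to the firing-field adjustment factor (AFFF).

```python
# Gaussian-of-Euclidean-distance similarity and a simple recruit-or-fire rule (illustrative).
import numpy as np

def similarity(descriptor, template, sigma=0.5):
    d = np.linalg.norm(np.asarray(descriptor, float) - np.asarray(template, float))
    return np.exp(-d**2 / (2.0 * sigma**2))

place_cells = []      # each visual place cell stores a landmark-descriptor template
THRESHOLD = 0.6       # stands in for the firing rate threshold (FRT)

def perceive(descriptor):
    """Return (cell index, firing level); recruit a new VPC if nothing matches well enough."""
    scores = [similarity(descriptor, cell) for cell in place_cells]
    if not scores or max(scores) < THRESHOLD:
        place_cells.append(np.asarray(descriptor, float))
        return len(place_cells) - 1, 1.0
    return int(np.argmax(scores)), max(scores)

for view in ([0.0, 0.0], [0.1, 0.2], [3.0, 3.1], [3.1, 2.9]):
    print(perceive(view))
```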

  7. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important task in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruiting of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is feasible. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859

  8. Antibiotic Resistance in Listeria Species Isolated from Catfish Fillets and Processing Environment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The susceptibility of 221 Listeria spp. (86 Listeria monocytogenes, 41 Listeria innocua and 94 Listeria seeligeri-Listeria welshimeri-Listeria ivanovii) isolated from catfish fillets and processing environment to 15 antibiotics was determined. Listeria isolates were analysed by disc-diffusion assay...

  9. Students' Expectations of the Learning Process in Virtual Reality and Simulation-Based Learning Environments

    ERIC Educational Resources Information Center

    Keskitalo, Tuulikki

    2012-01-01

    Expectations for simulations in healthcare education are high; however, little is known about healthcare students' expectations of the learning process in virtual reality (VR) and simulation-based learning environments (SBLEs). This research aims to describe first-year healthcare students' (N=97) expectations regarding teaching, studying, and…

  10. Synchronous Collaboration Competencies in Web-Conferencing Environments--Their Impact on the Learning Process

    ERIC Educational Resources Information Center

    Bower, Matt

    2011-01-01

    Based on a three-semester design-based research study examining learning and teaching in a web-conferencing environment, this article identifies types of synchronous collaboration competencies and reveals their influence on learning processes. Four levels of online collaborative competencies were observed--operational, interactional, managerial,…

  11. Journey into the Problem-Solving Process: Cognitive Functions in a PBL Environment

    ERIC Educational Resources Information Center

    Chua, B. L.; Tan, O. S.; Liu, W. C.

    2016-01-01

    In a PBL environment, learning results from learners engaging in cognitive processes pivotal in the understanding or resolution of the problem. Using Tan's cognitive function disc, this study examines the learner's perceived cognitive functions at each stage of PBL, as facilitated by the PBL schema. The results suggest that these learners…

  12. Investigation of the Relationship between Learning Process and Learning Outcomes in E-Learning Environments

    ERIC Educational Resources Information Center

    Yurdugül, Halil; Menzi Çetin, Nihal

    2015-01-01

    Problem Statement: Learners can access and participate in online learning environments regardless of time and geographical barriers. This brings up the umbrella concept of learner autonomy that contains self-directed learning, self-regulated learning and the studying process. Motivation and learning strategies are also part of this umbrella…

  13. A Virtual Environment for Process Management. A Step by Step Implementation

    ERIC Educational Resources Information Center

    Mayer, Sergio Valenzuela

    2003-01-01

    This paper presents a virtual organizational environment conceived as the integration of three computer programs: a manufacturing simulation package, business process automation (workflow) software, and business intelligence (Balanced Scorecard) software. It was created as a supporting tool for teaching IE, and its purpose is to give…

  14. Corpora Processing and Computational Scaffolding for a Web-Based English Learning Environment: The CANDLE Project

    ERIC Educational Resources Information Center

    Liou, Hsien-Chin; Chang, Jason S; Chen, Hao-Jan; Lin, Chih-Cheng; Liaw, Meei-Ling; Gao, Zhao-Ming; Jang, Jyh-Shing Roger; Yeh, Yuli; Chuang, Thomas C.; You, Geeng-Neng

    2006-01-01

    This paper describes the development of an innovative web-based environment for English language learning with advanced data-driven and statistical approaches. The project uses various corpora, including a Chinese-English parallel corpus ("Sinorama") and various natural language processing (NLP) tools to construct effective English learning tasks…

  15. Collaborative Learning Processes in an Asynchronous Environment: An Analysis through Discourse and Social Networks

    ERIC Educational Resources Information Center

    Tirado, Ramon; Aguaded, Ignacio; Hernando, Angel

    2011-01-01

    This article analyses an experience in collaborative learning in an asynchronous writing environment through discussion forums on a WebCt platform of the University of Huelva's virtual campus, and was part of an innovative teaching project in 2007-08. The main objectives are to describe the processes of collaborative knowledge construction and the…

  16. Self-Processes and Learning Environment as Influences in the Development of Expertise in Instructional Design

    ERIC Educational Resources Information Center

    Ge, Xun; Hardre, Patricia L.

    2010-01-01

    A major challenge for learning theories is to illuminate how particular kinds of learning experiences and environments promote the development of expertise. Research has been conducted into novice-expert differences in various domains, but few studies have examined the processes involved in learners' expertise development. In an attempt to…

  17. Multiscale numerical modeling of levee breach processes

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Akkerman, I.; Bazilevs, Y.

    2010-12-01

    One of the dominant failure modes of levees during flood and storm surge events is erosion-based breach formation due to high velocity flow over the back (land-side) slope. Modeling the breaching process numerically is challenging due to both physical and geometric complexity that develops and evolves during the overtopping event. The surface water flows are aerated and sediment-laden mixtures in the supercritical and turbulent regimes. The air/water free surface may undergo perturbations on the same order as the depth or even topological change (breaking). Likewise the soil/fluid interface is characterized by evolving headcuts, which are essentially moving discontinuities in the soil surface elevation. The most widely used models of levee breaching are nevertheless based on depth-integrated models of flow, sediment transport, and bed morphology. In this work our objective is to explore models with less restrictive modeling assumptions, which have become computationally tractable due to advances in both numerical methods and high-performance computing hardware. In particular, we present formulations of fully three-dimensional flow, transport, and morphological evolution for overtopping and breaching processes and apply recently developed finite element and level set methods to solve the governing equations for relevant test problems.

  18. Glacier lake outburst floods - modelling process chains

    NASA Astrophysics Data System (ADS)

    Schaub, Yvonne; Huggel, Christian; Haeberli, Wilfried

    2013-04-01

    New lakes are forming in high-mountain areas all over the world due to glacier recession. Often they will be located below steep, destabilized flanks and are therefore exposed to impacts from rock-/ice-avalanches. Several events worldwide are known, where an outburst flood has been triggered by such an impact. In regions such as in the European Alps or in the Cordillera Blanca in Peru, where valley bottoms are densely populated, these far-travelling, high-magnitude events can result in major disasters. For appropriate integral risk management it is crucial to gain knowledge on how the processes (rock-/ice-avalanches - impact waves in lake - impact on dam - outburst flood) interact and how the hazard potential related to corresponding process chains can be assessed. Research in natural hazards so far has mainly concentrated on describing, understanding, modeling or assessing single hazardous processes. Some of the above mentioned individual processes are quite well understood in their physical behavior and some of the process interfaces have also been investigated in detail. Multi-hazard assessments of the entire process chain, however, have only recently become subjects of investigations. Our study aims at closing this gap and providing suggestions on how to assess the hazard potential of the entire process chain in order to generate hazard maps and support risk assessments. We analyzed different types of models (empirical, analytical, physically based) for each process regarding their suitability for application in hazard assessments of the entire process chain based on literature. Results show that for rock-/ice-avalanches, dam breach and outburst floods, only numerical, physically based models are able to provide the required information, whereas the impact wave can be estimated by means of physically based or empirical assessments. We demonstrate how the findings could be applied with the help of a case study of a recent glacier lake outburst event at Laguna

  19. Characterizing Pluto's plasma environment through multifluid MHD modelling

    NASA Astrophysics Data System (ADS)

    Hale, J. M.; Paty, C. S.

    2013-12-01

    We will report on preliminary results from simulations of the Hadean magnetosphere using a refined version of the global multifluid MHD model which has been successfully used to simulate numerous planetary systems, including Ganymede [Paty et al., 2008], Pluto [Harnett et al., 2005], Saturn [Kidder et al., 2012], and Titan [Snowden et al., 2011a,b], among others. This initial study focuses on exploring the exospheric and solar wind parameter space local to Pluto. We explore multiple system geometries including a simulation in which Pluto has no ionosphere, as appears to be the case due to freezing when Pluto resides at apoapsis, as well as several scenarios with different ionospheric and exospheric densities. Ionospheric densities are based on chemical modeling reported in Krasnopolsky and Cruikshank [1999] and solar wind conditions are based on system geometry at periapsis, apoapsis, and at the time of the New Horizons system flyby. We examine the role of the ionosphere and exosphere in determining the location and structure of the bow shock, as well as characterizing the impact of the variability of solar wind pressure and magnetic field throughout Pluto's orbit. This work supports the characterization of the magnetospheric environment of the Pluto system in preparation for the New Horizons encounter in 2015.

  20. A New Fractal Model of Chromosome and DNA Processes

    NASA Astrophysics Data System (ADS)

    Bouallegue, K.

    Dynamic chromosome structure remains unknown. Can fractals and chaos be used as new tools to model, identify and generate a structure of chromosomes? Fractals and chaos offer a rich environment for exploring and modeling the complexity of nature. In a sense, fractal geometry is used to describe, model, and analyze the complex forms found in nature. Fractals have also been widely used not only in biology but also in medicine. To this effect, a fractal is considered an object that displays self-similarity under magnification and can be constructed using a simple motif (an image repeated on ever-reduced scales). It is worth noting that identifying which of the proposed models a chromosome belongs to has become a challenge. Several different models (hierarchical coiling, folded fiber, and radial loop) have been proposed for the mitotic chromosome, but a dynamic model has not yet been reached. This paper is an attempt to solve topological problems involved in the modeling of chromosome and DNA processes. By combining the fractal Julia process and a numerical dynamical system, we have finally arrived at four main points. First, we have developed not only a model of the chromosome but also a model of mitosis and one of meiosis. Equally important, we have identified the centromere position through the numerical model captured below. More importantly, in this paper, we have discovered the processes of the cell divisions of both mitosis and meiosis. All in all, the results show that this work could have a strong impact on the welfare of humanity and could lead to cures for genetic diseases.
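
    For readers unfamiliar with the fractal Julia process mentioned above, the quadratic Julia iteration is simply z → z² + c repeated until the orbit escapes. The escape-time sketch below is a generic illustration of that iteration, not the authors' chromosome model.

```python
# Generic escape-time rendering of a quadratic Julia set, z -> z^2 + c (illustration only).
import numpy as np

def julia_escape_time(c=-0.8 + 0.156j, size=60, max_iter=50, bound=2.0):
    xs = np.linspace(-1.5, 1.5, size)
    ys = np.linspace(-1.5, 1.5, size)
    grid = np.zeros((size, size), dtype=int)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            z = complex(x, y)
            k = 0
            while abs(z) <= bound and k < max_iter:
                z = z * z + c
                k += 1
            grid[i, j] = k            # iterations before escape (max_iter means "never escaped")
    return grid

grid = julia_escape_time()
for row in grid[::4]:                 # crude text rendering of the filled Julia set
    print("".join("#" if k == 50 else "." for k in row[::2]))
```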