Science.gov

Sample records for environment process model

  1. Near Field Environment Process Model Report

    SciTech Connect

    R.A. Wagner

    2000-11-14

Waste emplacement and activities associated with construction of a repository system will potentially change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect the distribution of water, increase the kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies, and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and may also change the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near field within the rock mass extending outward from the drift wall.

  2. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies formed to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  3. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, while making simplifications for the human-in-the-loop. However, the human element has a significant impact on the capabilities of network-centric systems. Accounting for the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network that can represent incomplete and uncertain socio-cultural information. We leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer such as node mobility and transmission parameters. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  4. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both the large- and small-message limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
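As a hedged illustration of the two-limit idea above (not the paper's actual equations), one two-parameter curve whose asymptotes reproduce a pure-latency limit t0 for small messages and a pure-bandwidth limit m/r_inf for large ones is a hyperbola branch; the serial reduction rule below (latencies add, per-byte costs add) is likewise an assumption chosen only to be exact in both limits:

```python
import math

def service_time(m, t0, r_inf):
    """Service time (s) of one communication block for an m-byte message.

    A two-parameter hyperbola whose asymptotes match the two measurable
    limits: t -> t0 as m -> 0 (latency) and t -> m / r_inf as m -> inf
    (bandwidth). Illustrative form, not the paper's fitted function.
    """
    return math.sqrt(t0 ** 2 + (m / r_inf) ** 2)

def reduce_serial(blocks):
    """Collapse CBs traversed in sequence into one (t0, r_inf) pair.

    Exact in both limits: latencies add, and per-byte costs add so the
    slowest link bounds the asymptotic transfer rate.
    """
    t0 = sum(b[0] for b in blocks)
    r_inf = 1.0 / sum(1.0 / b[1] for b in blocks)
    return t0, r_inf
```

For example, two links of (100 us, 10 MB/s) and (200 us, 5 MB/s) in series reduce to 300 us of latency and an asymptotic rate of about 3.3 MB/s.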

  5. Modeling critical zone processes in intensively managed environments

    NASA Astrophysics Data System (ADS)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand from high-resolution modeling at lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run over large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. 
The decreased microbial activity reduces nitrification rates, resulting in relatively lower nitrate

  6. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

The aim of the present study is to conceptualize the approaches displayed for model validation and the thought processes employed during the mathematical modeling process carried out in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  7. A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Rosu, Serban

    2011-09-01

This article presents a new approach to some fundamental techniques for solving dynamic programming problems using functional equations. We analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.

  8. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  9. Modeling of autonomous problem solving process by dynamic construction of task models in multiple tasks environment.

    PubMed

    Ohigashi, Yu; Omori, Takashi

    2006-10-01

Traditional reinforcement learning (RL) supposes a complex but single task to be solved. When an RL agent faces a task similar to one already learned, it must re-learn the task from the beginning because it does not reuse past learning results. This is the problem of quick action learning, which is the foundation of decision making in the real world. In this paper, we consider agents that can solve a set of mutually similar tasks in a multiple-tasks environment, where various problems are encountered one after another, and propose an action-learning technique that quickly solves similar tasks by reusing previously learned knowledge. In our method, a model-based RL uses a task model constructed by combining primitive local predictors for predicting task and environmental dynamics. To evaluate the proposed method, we performed a computer simulation using a simple ping-pong game with variations.
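The reuse idea can be sketched as a library of cached dynamics predictors: when a new task arrives, reuse the cached model whose one-step prediction error on recent transitions is lowest, and learn a new one only if none fits. All names and the mean-absolute-error criterion are hypothetical stand-ins for the paper's combination of primitive local predictors:

```python
class TaskModelLibrary:
    """Hypothetical sketch: cache per-task dynamics models and reuse the
    one that best predicts recently observed transitions."""

    def __init__(self, error_threshold=0.1):
        self.models = []  # callables: model(state, action) -> predicted next state
        self.error_threshold = error_threshold

    def error(self, model, transitions):
        """Mean absolute one-step prediction error on (s, a, s_next) tuples."""
        return sum(abs(model(s, a) - s_next)
                   for s, a, s_next in transitions) / len(transitions)

    def select_or_add(self, transitions, new_model_factory):
        """Reuse the best cached model if it fits; otherwise learn a new one."""
        if self.models:
            best = min(self.models, key=lambda m: self.error(m, transitions))
            if self.error(best, transitions) <= self.error_threshold:
                return best  # reuse: no re-learning from scratch
        model = new_model_factory()
        self.models.append(model)
        return model
```

A second encounter with dynamics the library already explains then skips re-learning entirely, which is the quick-action-learning behavior the abstract describes.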

  10. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

Data Assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new, generic open source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of COSTA, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007], and those of DATools from the former WL|Delft Hydraulics [El Serafy et al. 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. References: El Serafy G.Y., H. Gerritsen, S. Hummel, A.H. Weerts, A.E. Mynett and M. Tanaka (2007), Application of data assimilation in portable operational forecasting systems - the DATools assimilation environment, Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. Van Velzen and Verlaan (2007), COSTA: a problem solving environment for data assimilation applied for hydrodynamical modelling, Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793. Application of generic data assimilation tools (DATools) for flood
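The interface-based design can be sketched as follows. The class and method names here are illustrative assumptions, not OpenDA's actual API, and the nudging scheme is a toy stand-in for its filtering and calibration algorithms; the point is that the algorithm is written only against abstract model and observation interfaces, so any time-stepping model can be plugged in:

```python
from abc import ABC, abstractmethod

class StochModel(ABC):
    """Any time-stepping process model the assimilation algorithm can drive."""
    @abstractmethod
    def compute(self, target_time): ...
    @abstractmethod
    def get_state(self): ...
    @abstractmethod
    def set_state(self, state): ...

class Observations(ABC):
    """Any source of observed values at a given time."""
    @abstractmethod
    def values_at(self, time): ...

def nudging_step(model, obs, time, gain=0.5):
    """Toy assimilation written only against the interfaces: advance the
    model to `time`, then relax its state toward the observations."""
    model.compute(time)
    x = model.get_state()
    y = obs.values_at(time)
    model.set_state([xi + gain * (yi - xi) for xi, yi in zip(x, y)])
```

Swapping a 2D water-level model for a groundwater model then requires only a new `StochModel` subclass, not a new algorithm, which is the portability the abstract emphasizes.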

  11. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2006-01-01

The global dynamics of the ionized and neutral gases in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have already been done using the magnetohydrodynamics (MHD) and the electrodynamics approaches. One of the major results of recent simplified two-fluid model simulations [Saur, J., Neubauer, F.M., Strobel, D.F., Summers, M.E., 2002. J. Geophys. Res. 107 (SMP5), 1-18] was the production of the double-peak structure in the magnetic field signature of the Io flyby. This could not be explained before by standard MHD models. In this paper, we present a hybrid simulation for Io with kinetic ions and fluid electrons. This method employs a fluid description for electrons and neutrals, whereas for ions a particle approach is used. We also take into account charge-exchange and photoionization processes and solve self-consistently for electric and magnetic fields. Our model may provide a much more accurate description of the ion dynamics than previous approaches and allows us to account for the realistic anisotropic ion velocity distribution that cannot be captured in fluid simulations with isotropic temperatures. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper. Comparison with the Galileo Io flyby results shows that this approach provides an accurate physical basis for the interaction and can therefore naturally reproduce all the observed salient features.
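The particle-ion half of such a hybrid scheme is commonly advanced with the standard Boris algorithm. The sketch below shows only that piece; the fluid electron/neutral equations, charge exchange, and the self-consistent field solve are omitted, and this is an illustration of the kinetic-ion approach rather than the authors' code:

```python
import numpy as np

def boris_push(v, E, B, q_over_m, dt):
    """One Boris step for a kinetic (particle) ion in given E and B fields.

    Half electric kick, magnetic rotation, half electric kick; with E = 0
    the rotation conserves the particle speed exactly, which is why the
    scheme preserves anisotropic velocity distributions well.
    """
    v_minus = v + 0.5 * q_over_m * dt * E          # first half electric kick
    t = 0.5 * q_over_m * dt * B                    # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)        # magnetic rotation
    return v_plus + 0.5 * q_over_m * dt * E        # second half electric kick
```

In a full hybrid code this push is applied to every macro-ion each step, while the electrons are advanced as a fluid and the fields are updated from the resulting moments.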

  13. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have been done using the MHD and the electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the double-peak structure in the magnetic field signature of the Io flyby, which could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs a fluid description for electrons and neutrals, whereas for ions multilevel (drift-kinetic and particle) approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution that cannot be captured in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.

  14. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
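A minimal sketch of GP regression as used for such a calibration surrogate, mapping exposure-condition inputs (concentration, temperature, humidity) to sensor response with a predictive variance that can drive the batch-sequential design. The RBF kernel and its hyperparameters are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def rbf(A, B, length=1.0, var=1.0):
    """Squared-exponential kernel between row-vector inputs A (n,d) and B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length ** 2)

def gp_posterior(X, y, Xstar, noise=1e-4, **kern):
    """GP posterior mean and variance at test inputs Xstar.

    The mean is the calibrated response surface; the variance is the
    uncertainty used to choose where to sample the next calibration batch.
    """
    K = rbf(X, X, **kern) + noise * np.eye(len(X))
    Ks = rbf(X, Xstar, **kern)
    Kss = rbf(Xstar, Xstar, **kern)
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)
```

In a design loop one would evaluate `gp_posterior` on candidate exposure conditions and add the highest-variance points to the next calibration batch.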

  15. Gaussian process based modeling and experimental design for sensor calibration in drifting environments.

    PubMed

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2015-09-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor's response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP's inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method.

  16. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization

    SciTech Connect

    Wright, David L.

    2004-12-01

    Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization EMSP Project 86992 Progress Report as of 9/2004.

  17. Mathematical Modelling of Thermal Process to Aquatic Environment with Different Hydrometeorological Conditions

    PubMed Central

    Issakhov, Alibek

    2014-01-01

This paper presents a mathematical model of the thermal process from a thermal power plant to the aquatic environment of the reservoir-cooler, which is located in the Pavlodar region, 17 km to the north-east of Ekibastuz town. The thermal process in the reservoir-cooler under different hydrometeorological conditions is considered; it is solved using the three-dimensional Navier-Stokes equations and the temperature equation for an incompressible flow in a stratified medium. A numerical method based on the projection method divides the problem into three stages. At the first stage, it is assumed that the transfer of momentum occurs only by convection and diffusion; the intermediate velocity field is computed by the fractional steps method. At the second stage, a three-dimensional Poisson equation is solved by the Fourier method in combination with the tridiagonal matrix method (Thomas algorithm). Finally, at the third stage, the transfer is assumed to occur only due to the pressure gradient. The numerical method captures the basic laws of the hydrothermal processes, which are approximated qualitatively and quantitatively depending on the different hydrometeorological conditions. PMID:24991644
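The three stages can be sketched on a doubly periodic 2D grid, with an FFT standing in for the paper's Fourier/Thomas pressure solver. This is a deliberately simplified stand-in for the 3D stratified problem, shown only to make the fractional-step structure concrete:

```python
import numpy as np

def projection_step(u, v, dt, dx, nu=1e-3):
    """One fractional step of a projection method on a doubly periodic grid."""
    def ddx(f): return (np.roll(f, -1, 1) - np.roll(f, 1, 1)) / (2 * dx)
    def ddy(f): return (np.roll(f, -1, 0) - np.roll(f, 1, 0)) / (2 * dx)
    def lap(f): return (np.roll(f, -1, 0) + np.roll(f, 1, 0) +
                        np.roll(f, -1, 1) + np.roll(f, 1, 1) - 4 * f) / dx ** 2
    # Stage 1: momentum transfer by convection and diffusion only.
    us = u + dt * (-u * ddx(u) - v * ddy(u) + nu * lap(u))
    vs = v + dt * (-u * ddx(v) - v * ddy(v) + nu * lap(v))
    # Stage 2: pressure Poisson equation lap(p) = div(u*, v*)/dt, solved
    # spectrally with the modified wavenumbers of the central differences
    # so the corrected field is discretely divergence-free.
    rhs = (ddx(us) + ddy(vs)) / dt
    n = u.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k)
    denom = -(np.sin(kx * dx) ** 2 + np.sin(ky * dx) ** 2) / dx ** 2
    mask = np.abs(denom) < 1e-12
    denom[mask] = 1.0
    phat = np.fft.fft2(rhs) / denom
    phat[mask] = 0.0
    p = np.real(np.fft.ifft2(phat))
    # Stage 3: transfer due to the pressure gradient only.
    return us - dt * ddx(p), vs - dt * ddy(p)
```

After stage 3 the discrete divergence of the velocity field is zero to round-off, which is the defining property of the projection.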

  18. Modelling Dust Processing and Evolution in Extreme Environments as seen by Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Bocchio, Marco

    2014-09-01

The main goal of my PhD study is to understand the dust processing that occurs during the mixing between the galactic interstellar medium and the intracluster medium. This process is of particular interest in violent phenomena such as galaxy-galaxy interactions or the "Ram Pressure Stripping" due to the infall of a galaxy towards the cluster centre. Initially, I focus my attention on the problem of dust destruction and heating processes, revisiting the models available in the literature. I focus particularly on the cases of extreme environments such as a hot coronal-type gas (e.g., IGM, ICM, HIM) and supernova-generated interstellar shocks. Under these conditions small grains are destroyed on short timescales and large grains are heated by collisions with fast electrons, making the dust spectral energy distribution very different from what is observed in the diffuse ISM. In order to test our models I apply them to the case of an interacting galaxy, NGC 4438. Herschel data of this galaxy indicate the presence of dust with a higher-than-expected temperature. With a multi-wavelength analysis on a pixel-by-pixel basis we show that this hot dust seems to be embedded in a hot ionised gas, therefore undergoing both collisional heating and small-grain destruction. Furthermore, I focus on the long-standing conundrum of the dust destruction and dust formation timescales in the Milky Way. Based on the destruction efficiency in interstellar shocks, previous estimates led to a dust lifetime shorter than the typical timescale for dust formation in AGB stars. Using a recent dust model and an updated dust processing model we re-evaluate the dust lifetime in our Galaxy. Finally, I turn my attention to the phenomenon of "Ram Pressure Stripping". The galaxy ESO 137-001 represents one of the best cases to study this effect. Its long H2 tail embedded in a hot and ionised tail raises questions about its possible stripping from the galaxy or formation downstream in the tail. Based on

  19. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  20. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  1. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  2. The Conceptualization of the Mathematical Modelling Process in Technology-Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Güzel, Esra Bukova

    2017-01-01

    The aim of the study is to conceptualize the technology-aided mathematical modelling process in the frame of cognitive modelling perspective. The grounded theory approach was adopted in the study. The research was conducted with seven groups consisting of nineteen prospective mathematics teachers. The data were collected from the video records of…

  3. A two-dimensional B implantation model for semiconductor process simulation environments

    NASA Astrophysics Data System (ADS)

    Klein, K. M.; Park, C.; Morris, S.; Yang, S.-H.; Tasch, A. F.

    1993-06-01

    A computationally efficient semi-empirical model has been developed for modeling two-dimensional distributions of boron implanted into single-crystal silicon. This model accurately and efficiently models the depth profiles and lateral doping profiles under a masking edge for implantations as a function of dose, tilt angle, rotation angle, orientation of the masking edge, and masking layer thickness, in addition to energy. This new two-dimensional model is based on the dual-Pearson model [A.F. Tasch et al., J. Electrochem. Soc. 136 (1989) 810] for one-dimensional dopant depth distributions, which provides an accurate method of modeling the depth profile based on approximately 1000 SIMS profiles, and the UT-MARLOWE Monte Carlo ion implantation simulation code [K.M. Klein et al., IEEE Trans. Electron Devices ED-39 (1992) 1614], which provides well-modeled lateral dopant profiles. Combining depth profile and lateral profile information from these two models allows this new model to be both accurate and computationally efficient, making it suitable for use in semiconductor process modeling codes.
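The dual-distribution idea can be sketched as a weighted sum of a shallow randomly scattered component and a deeper channeling component. In this illustration Gaussians stand in for the Pearson frequency functions that the actual model fits (range, straggle, skewness, kurtosis) per implant condition, so the parameter values below are hypothetical:

```python
import math

def gauss(x, mu, sigma):
    """Normalized Gaussian standing in for a fitted Pearson frequency function."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def dual_profile(depth, dose, ratio, p_random, p_channel):
    """Dual-distribution depth profile (dopant concentration vs depth).

    `ratio` weights the randomly scattered component against the deeper
    channeling component; the profile integrates to the implanted dose.
    """
    mu1, s1 = p_random
    mu2, s2 = p_channel
    return dose * (ratio * gauss(depth, mu1, s1) +
                   (1 - ratio) * gauss(depth, mu2, s2))
```

In the full model the weight and the shape parameters of each component vary with energy, dose, tilt and rotation, which is what the ~1000 SIMS profiles calibrate.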

  4. Erosion and sedimentation models in New Zealand: spanning scales, processes and environments

    NASA Astrophysics Data System (ADS)

    Elliott, Sandy; Oehler, Francois; Derose, Ron

    2010-05-01

Erosion and sedimentation are of keen interest in New Zealand due to pasture loss in hill areas, damage to infrastructure, loss of stream conveyance, and ecological impacts in estuarine and coastal areas. Management of these impacts requires prediction of the rates, locations, and timing of erosion and transport across a range of scales, and prediction of the response to intervention measures. A range of models has been applied in New Zealand to address these requirements, including: empirical models for the location and probability of occurrence of shallow landslides; empirical national-scale sediment load models with spatial and temporal downscaling; dynamic field-scale sheet erosion models upscaled and linked to estuarine deposition models, including assessment of climate change and the effects of urbanisation; detailed (20 m) physically based distributed dynamic models applied at catchment scale; and provision of GIS-based decision support tools. Despite these advances, considerable work is required to provide the right information at the right scale. Remaining issues include: linking control measures described at the scale of implementation (parts of hillslopes, reaches) to catchment-scale outcomes, which entails fine spatial resolution and large computational demands; the ability to predict key processes such as bank and gully-head erosion; representation of the remobilisation of sediment stores associated with the response to land clearance; the ability to represent episodic or catastrophic erosion processes alongside relatively continuous processes such as sheet flow in a single model; and prediction of sediment concentrations and clarity under normal flow conditions. In this presentation we describe a variety of models and their application in New Zealand, summarise the models in terms of scales, complexity and uses, and outline approaches to resolving the remaining difficulties.

  5. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught students but rather experienced and learned by students. This…

  7. Kinetic Modeling of Biogeochemical Processes in Subsurface Environments: Coupling Transport, Microbial Metabolism and Geochemistry

    NASA Astrophysics Data System (ADS)

    Wang, Y.

    2002-12-01

    Microbial reactions play an important role in regulating pore water chemistry (e.g., pH and Eh) as well as secondary mineral distribution in many subsurface systems and therefore directly control trace metal migration and recycling in those systems. In this paper, we present a multicomponent kinetic model that explicitly accounts for the coupling of microbial metabolism, microbial population dynamics, advective/dispersive transport of chemical species, aqueous speciation, and mineral precipitation/dissolution in porous geologic media. A modification to the traditional microbial growth kinetic equation is proposed, to account for the likely achievement of quasi-steady state biomass accumulations in natural environments. A scale dependence of microbial reaction rates is derived based on both field observations and the scaling analysis of reactive transport equations. As an example, we use the model to simulate a subsurface contaminant migration scenario, in which a water flow containing both uranium and a complexing organic ligand is recharged into an oxic carbonate aquifer. The model simulation shows that Mn and Fe oxyhydroxides may vary significantly along a flow path. The simulation also shows that uranium (VI) can be reduced and therefore immobilized in the anoxic zone created by microbial degradation. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy (US DOE) under Contract DE-AC04-94AL85000.
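As a toy illustration of the coupling between microbial growth kinetics and substrate consumption described above, here is a minimal explicit-Euler sketch; the logistic biomass cap merely stands in for the paper's quasi-steady-state modification, and all parameter values are assumed, not taken from the model.

```python
# Hedged sketch: Monod substrate uptake with a logistic biomass cap standing
# in for the paper's quasi-steady-state biomass limitation. All parameter
# values are illustrative assumptions.
def step(S, B, dt, mu_max=1.0, Ks=0.5, Y=0.4, B_max=2.0):
    """Advance substrate S and biomass B by one explicit-Euler step."""
    mu = mu_max * S / (Ks + S)               # Monod specific growth rate
    growth = mu * B * (1.0 - B / B_max)      # growth shuts off as B -> B_max
    S_new = max(S - (growth / Y) * dt, 0.0)  # substrate consumed per growth
    B_new = B + growth * dt
    return S_new, B_new

S, B = 10.0, 0.1
for _ in range(1000):                        # integrate 10 time units
    S, B = step(S, B, dt=0.01)
# biomass plateaus near B_max while substrate is drawn down
```

In a full reactive-transport model this local kinetic step would be coupled to advection/dispersion and aqueous speciation at every grid cell, which is where the scale dependence discussed in the abstract enters.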

  8. Modeled near-field environment porosity modifications due to coupled thermohydrologic and geochemical processes

    SciTech Connect

    Glassley, W. E.; Nitao, J. J.

    1998-10-30

    Heat deposited by waste packages in nuclear waste repositories can modify rock properties by instigating mineral dissolution and precipitation along hydrothermal flow pathways. Modeling this reactive transport requires coupling fluid flow to permeability changes resulting from dissolution and precipitation. Modification of the NUFT thermohydrologic (TH) code package to account for this coupling in a simplified geochemical system has been used to model the time-dependent change in porosity, permeability, matrix and fracture saturation, and temperature in the vicinity of waste-emplacement drifts, using conditions anticipated for the potential Yucca Mountain repository. The results show, within a few hundred years, dramatic porosity reduction approximately 10 m above emplacement drifts. Most of this reduction is attributed to deposition of solute load at the boiling front, although some of it also results from decreasing temperature along the flow path. The actual distribution of the nearly sealed region is sensitive to the time-dependent characteristics of the thermal load imposed on the environment and suggests that the geometry of the sealed region can be engineered by managing the waste-emplacement strategy.
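The porosity-permeability coupling at the heart of this kind of simulation can be illustrated with a generic power-law closure; this is a common assumption in reactive-transport codes and is not the specific relation used in NUFT.

```python
# Hedged sketch: power-law porosity-permeability coupling, a common closure
# in reactive-transport codes (illustrative; not the specific NUFT relation).
def permeability(k0, phi0, phi, n=3):
    """Scale reference permeability k0 as porosity departs from phi0."""
    return k0 * (phi / phi0) ** n

def precipitate(phi, dV_solid):
    """Reduce porosity by the volume fraction of newly deposited mineral."""
    return max(phi - dV_solid, 0.0)

phi0, k0 = 0.10, 1e-15      # assumed reference porosity and permeability (m^2)
phi = phi0
for _ in range(5):           # five equal deposition episodes
    phi = precipitate(phi, 0.01)
k = permeability(k0, phi0, phi)
# halving porosity cuts permeability by a factor of 2**3 = 8
```

The cubic exponent makes even modest solute deposition at the boiling front a strong flow barrier, consistent with the near-sealing behavior the abstract reports.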

  9. Parallel processing optimization strategy based on MapReduce model in cloud storage environment

    NASA Astrophysics Data System (ADS)

    Cui, Jianming; Liu, Jiayi; Li, Qiuyan

    2017-05-01

    Many cloud storage systems package files for transmission only after all packets have been received. In this store-and-forward procedure from the local transmitter to the server, packing and unpacking consume considerable time, and transmission efficiency is low. A new parallel processing algorithm is proposed to optimize the transmission mode: following the MapReduce model, MPI is used to execute the Mapper and Reducer mechanisms in parallel. Simulation experiments on the Hadoop cloud computing platform show that the algorithm not only accelerates the file transfer rate but also shortens the waiting time of the Reducer mechanism. It breaks through the traditional sequential transmission constraint and reduces storage coupling, improving overall transmission efficiency.
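The Mapper/Reducer parallelism the abstract describes can be sketched as follows, with Python threads standing in for the MPI processes; the word-count task and all names are illustrative.

```python
# Hedged sketch: parallel Mapper/Reducer over streamed chunks, with Python
# threads standing in for the MPI processes the abstract describes. The
# word-count task and names are illustrative.
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def mapper(chunk):
    """Map step: per-chunk word counts, runnable as soon as a chunk arrives."""
    counts = {}
    for word in chunk.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def reducer(acc, part):
    """Reduce step: merge one partial result into the accumulator."""
    for key, n in part.items():
        acc[key] = acc.get(key, 0) + n
    return acc

chunks = ["a b a", "b c", "a c c"]           # packets processed incrementally
with ThreadPoolExecutor(max_workers=2) as ex:
    partials = list(ex.map(mapper, chunks))  # mappers run concurrently
total = reduce(reducer, partials, {})
# total == {'a': 3, 'b': 2, 'c': 3}
```

The point of the overlap is that mappers start on early chunks while later chunks are still in flight, rather than waiting for the complete file.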

  10. Process migration in UNIX environments

    NASA Technical Reports Server (NTRS)

    Lu, Chin; Liu, J. W. S.

    1988-01-01

    To support process migration in UNIX environments, the main problem is how to encapsulate the location-dependent features of the system so that a host-independent virtual environment is maintained by the migration handlers on behalf of each migrated process. An object-oriented approach is used to describe the interaction between a process and its environment. More specifically, environmental objects are introduced into UNIX systems to carry out the user-environment interaction. The implementation of the migration handlers is based on both the state consistency criterion and the property consistency criterion.

  11. Marine-hydrokinetic energy and the environment: Observations, modeling, and basic processes

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, Efi; Guala, Michele; Sotiropoulos, Fotis

    2012-03-01

    Research at the Interface of Marine Hydrokinetic Energy and the Environment: A Workshop; Minneapolis, Minnesota, 5-7 October 2011 Marine and hydrokinetic (MHK) energy harvesting technologies convert the kinetic energy of waves and water currents into power to generate electricity. Although these technologies are in early stages of development compared to other renewable technologies, such as solar and wind energy, they offer electricity consumers situated near coastlines or inland rivers an alternative energy technology that can help meet renewable portfolio standards. However, the potential environmental impacts of MHK energy are far from well understood, both in general principles and in site-specific cases. As pressure for new MHK energy licenses builds, accelerated research in providing the scientific understanding of harnessing the natural power of water for renewable energy at a competitive cost and without harming the environment becomes a priority.

  12. Model-based processing for shallow ocean environments: The broadband problem

    SciTech Connect

    Candy, J.V.; Sullivan, E.J.

    1996-01-31

    Most acoustic sources found in the ocean environment are spatially complex and broadband. When propagating in a shallow ocean, these source characteristics complicate the analysis of received acoustic data considerably. The enhancement of broadband acoustic pressure-field measurements using a vertical array is discussed. Here a model-based approach is developed for a broadband source using a normal-mode propagation model.
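A normal-mode propagation model of the kind such a processor is built on can be sketched for the simplest case, an ideal isovelocity waveguide with pressure-release boundaries; the depths, frequency, and sound speed below are illustrative assumptions, not the paper's environment.

```python
# Hedged sketch: normal-mode pressure field for an ideal isovelocity channel
# with pressure-release boundaries; all parameters are illustrative.
import math, cmath

def pressure(r, z, zs, D=100.0, f=50.0, c=1500.0):
    """Sum the propagating modes at range r, depth z, for a source at zs."""
    k = 2.0 * math.pi * f / c
    p = 0j
    m = 1
    while m * math.pi / D < k:             # keep only propagating modes
        kz = m * math.pi / D               # vertical wavenumber of mode m
        kr = math.sqrt(k * k - kz * kz)    # horizontal wavenumber
        p += (math.sin(kz * zs) * math.sin(kz * z)
              * cmath.exp(1j * kr * r) / math.sqrt(kr * r))
        m += 1
    return p

p = pressure(r=5000.0, z=30.0, zs=20.0)    # complex pressure, arbitrary units
```

A model-based processor embeds a forward model like this in an estimator and fits it to the vertical-array data, repeating the mode sum at each frequency for the broadband case.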

  13. Performance Improvement: Applying a Human Performance Model to Organizational Processes in a Military Training Environment

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.; West, Haywood V.; Swiergosz, Matthew J.

    2009-01-01

    This article provides a description and the results of a study that utilized the human performance (HP) model and methods to explore and analyze a training organization. The systemic and systematic practices of the HP model are applicable to military training organizations as well as civilian organizations. Implications of the study for future…

  15. An Efficient Simulation Environment for Modeling Large-Scale Cortical Processing

    PubMed Central

    Richert, Micah; Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L.

    2011-01-01

    We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current or conductance based Izhikevich neuron networks, having spike-timing dependent plasticity and short-term plasticity. It uses a standard network construction interface. The simulator allows for execution on either GPUs or CPUs. The simulator, which is written in C/C++, allows for both fine grain and coarse grain specificity of a host of parameters. We demonstrate the ease of use and computational efficiency of this model by implementing a large-scale model of cortical areas V1, V4, and area MT. The complete model, which has 138,240 neurons and approximately 30 million synapses, runs in real-time on an off-the-shelf GPU. The simulator source code, as well as the source code for the cortical model examples is publicly available. PMID:22007166
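The Izhikevich neuron model the simulator implements has a compact standard form (Izhikevich, 2003); a minimal sketch with the classic regular-spiking parameter values, which are textbook defaults rather than settings from the paper:

```python
# Hedged sketch: the standard Izhikevich (2003) neuron update, the model
# family the simulator implements. Parameters are the classic regular-spiking
# textbook values, not settings from the paper.
def izhikevich(T_ms, I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5):
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(int(T_ms / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                      # spike: record and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

spikes = izhikevich(1000.0)                # 1 s of tonic spiking under drive
```

A GPU simulator of the kind described updates hundreds of thousands of such two-variable neurons in parallel each timestep, which is what makes real-time execution of the 138,240-neuron cortical model feasible.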

  17. Arthropod model systems for studying complex biological processes in the space environment

    NASA Astrophysics Data System (ADS)

    Marco, Roberto; de Juan, Emilio; Ushakov, Ilya; Hernandorena, Arantxa; Gonzalez-Jurado, Juan; Calleja, Manuel; Manzanares, Miguel; Maroto, Miguel; Garesse, Rafael; Reitz, Günther; Miquel, Jaime

    1994-08-01

    Three arthropod systems are discussed in relation to their complementary and potential use in Space Biology. In an upcoming biosatellite flight, Drosophila melanogaster pre-adapted during several months to different g levels will be flown in an automatic device that separates the parental generation from the first and second generations. In the same flight, flies will be exposed to microgravity conditions in an automatic unit in which fly motility can be recorded. In the International Microgravity Laboratory-2, several groups of Drosophila embryos will be grown in space and the motility of a male fly population will be video-recorded. In the Biopan, an ESA exobiology facility that can be flown attached to the exterior of a Russian biosatellite, Artemia dormant gastrulae will be exposed to the space environment on the exterior of the satellite, either under a normal atmosphere or in vacuum. Gastrulae will be separated into hit and non-hit populations. The developmental and aging response of these animals will be studied upon recovery. With these experiments we will be able to establish whether exposure to the space environment influences arthropod development and aging, and to elaborate on some of the cellular mechanisms involved, which should be tested in future experiments.

  18. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part type or individual serial numbers. Relationships between manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining and web based client/server architectures are discussed in the context of composite material manufacturing.
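A minimal sketch of the indexing scheme described above, using SQLite; the table and column names are illustrative, not taken from the paper.

```python
# Hedged sketch: related tables keyed by serial number, as the paper describes
# for correlating process variables with QA measurements (schema and column
# names are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE process (serial TEXT PRIMARY KEY, cure_temp_C REAL)")
con.execute("CREATE TABLE qa (serial TEXT, void_content_pct REAL)")
con.executemany("INSERT INTO process VALUES (?, ?)",
                [("P001", 177.0), ("P002", 182.5)])
con.executemany("INSERT INTO qa VALUES (?, ?)",
                [("P001", 1.2), ("P002", 0.8)])
# Correlate process conditions with quality outcomes via a join on the key.
rows = con.execute(
    "SELECT p.serial, p.cure_temp_C, q.void_content_pct "
    "FROM process p JOIN qa q ON p.serial = q.serial "
    "ORDER BY p.serial").fetchall()
# rows == [('P001', 177.0, 1.2), ('P002', 182.5, 0.8)]
```

The join on the shared serial-number key is what lets cure conditions be correlated with void content across many lots, which is the correlative analysis the paper explores.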

  19. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    predictions and the data must be computed and viewable. It is very enlightening to be able to directly compare these aggregate measures with the model...their strengths, they appear to have been applied only to their initial test data. Their weaknesses may be even more enlightening than their strengths...examine working memory contents directly with neurophysiological tools. This is not yet possible. The other way is to use a model to predict what is

  20. Condensation Processes in Astrophysical Environments

    NASA Technical Reports Server (NTRS)

    Nuth, Joseph A., III; Rietmeijer, Frans J. M.; Hill, Hugh G. M.

    2002-01-01

    Astrophysical systems present an intriguing set of challenges for laboratory chemists. Chemistry occurs in regions considered an excellent vacuum by laboratory standards and at temperatures that would vaporize laboratory equipment. Outflows around Asymptotic Giant Branch (AGB) stars have timescales ranging from seconds to weeks, depending on the distance of the region of interest from the star and on how significant changes in the state variables are defined. The atmospheres in normal stars may only change significantly on several-billion-year timescales. Most laboratory experiments carried out to understand astrophysical processes are not done at conditions that perfectly match the natural suite of state variables or timescales appropriate for natural conditions. Experimenters must make use of simple analog experiments that place limits on the behavior of natural systems, often extrapolating to lower-pressure and/or higher-temperature environments. Nevertheless, we argue that well-conceived experiments will often provide insights into astrophysical processes that are impossible to obtain through models or observations. This is especially true for complex chemical phenomena such as the formation and metamorphism of refractory grains under a range of astrophysical conditions. Data obtained in our laboratory have been surprising in numerous ways, ranging from the composition of the condensates to the thermal evolution of their spectral properties. None of this information could have been predicted from first principles and would not have been credible even if it had.

  1. Modeling microevolution in a changing environment: the evolving quasispecies and the diluted champion process

    NASA Astrophysics Data System (ADS)

    Bianconi, Ginestra; Fichera, Davide; Franz, Silvio; Peliti, Luca

    2011-08-01

    Several pathogens use evolvability as a survival strategy against acquired immunity of the host. Despite their high variability in time, some of them exhibit quite low variability within the population at any given time, a somewhat paradoxical behavior often called the evolving quasispecies. In this paper we introduce a simplified model of an evolving viral population in which the effects of the acquired immunity of the host are represented by the decrease of the fitness of the corresponding viral strains, depending on the frequency of the strain in the viral population. The model exhibits evolving quasispecies behavior in a certain range of its parameters, and suggests how punctuated evolution can be induced by a simple feedback mechanism.
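The feedback mechanism described above (a strain's fitness decreasing with its own frequency in the population) can be sketched with a discrete-time replicator iteration; the linear fitness form, selection strength, and starting frequencies are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: frequency-dependent selection of the kind the model invokes
# (a strain's fitness falls as its own frequency rises). The linear fitness
# form and parameters are illustrative assumptions.
def select(freqs, s=1.0):
    """One discrete generation: reweight strains by fitness 1 - s*f."""
    fitness = [max(1.0 - s * f, 0.0) for f in freqs]
    mean = sum(f * w for f, w in zip(freqs, fitness))
    return [f * w / mean for f, w in zip(freqs, fitness)]

freqs = [0.9, 0.05, 0.05]   # one dominant strain, two rare ones
for _ in range(50):
    freqs = select(freqs)
# the dominant strain is suppressed; frequencies equalize at 1/3 each
```

This captures the immune-pressure feedback qualitatively: whichever strain dominates is selected against, which is what drives the turnover underlying quasispecies-like behavior.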

  2. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  3. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)
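A sketch of the kind of steady-state mass-balancing exercise such a phenomena-oriented environment poses, solved directly; the stream data are made up for illustration.

```python
# Hedged sketch: the kind of steady-state mass-balance exercise such an
# environment poses, solved directly (stream data are made up).
def mixer(streams):
    """Total and component balances for an ideal steady-state mixer.

    streams: iterable of (flow_kg_h, salt_mass_fraction) pairs.
    Returns the outlet flow and outlet salt mass fraction.
    """
    F_out = sum(F for F, _ in streams)
    salt_out = sum(F * x for F, x in streams)
    return F_out, salt_out / F_out

F, x = mixer([(100.0, 0.10), (50.0, 0.40)])
# F == 150.0 kg/h and x == 0.20: salt in (10 + 20 kg/h) equals salt out
```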

  5. LDEF environment modeling updates

    NASA Technical Reports Server (NTRS)

    Gordon, Tim; Rantanen, Ray; Whitaker, Ann F.

    1995-01-01

    An updated gas dynamics model for gas interactions around the LDEF is presented that includes improved scattering algorithms. The primary improvement is more accurate prediction of surface fluxes in the wake region. The code used is the Integrated Spacecraft Environments Model (ISEM). Additionally, initial results of a detailed ISEM prediction model of the Solar Array Passive LDEF Experiment (SAMPLE), A0171, are presented. This model includes details of the A0171 geometry and the outgassing characteristics of the many surfaces on the experiment. The detailed model includes the multiple scattering that exists between the ambient atmosphere, LDEF outgassing, and atomic oxygen erosion products. Predictions are made for gas densities, surface fluxes, and deposition at three different time periods of the LDEF mission.

  6. Modeling Multiphase Coastal and Hydraulic Processes in an Interactive Python Environment with the Open Source Proteus Toolkit

    NASA Astrophysics Data System (ADS)

    Kees, C. E.; Farthing, M. W.; Ahmadia, A. J.; Bakhtyar, R.; Miller, C. T.

    2014-12-01

    Hydrology is dominated by multiphase flow processes, due to the importance of capturing water's interaction with soil and air phases. Unfortunately, many different mathematical model formulations are required to model particular processes and scales of interest, and each formulation often requires specialized numerical methods. The Proteus toolkit is a software package for research on models for coastal and hydraulic processes and improvements in numerics, particularly 3D multiphase processes and parallel numerics. The models considered include multiphase flow, shallow water flow, turbulent free surface flow, and various flow-driven processes. We will discuss the objectives of Proteus and the recent evolution of the toolkit's design, and present examples of how it has been used to construct computational models of multiphase flows for the US Army Corps of Engineers. Proteus is an open source toolkit authored primarily within the US Army Corps of Engineers, and used, developed, and maintained by a small community of researchers in both theoretical modeling and computational methods research. We will discuss how open source and community development practices have played a role in the creation of Proteus.

  7. An Integrated Vehicle Modeling Environment

    NASA Technical Reports Server (NTRS)

    Totah, Joseph J.; Kinney, David J.; Kaneshige, John T.; Agabon, Shane

    1999-01-01

    This paper describes an Integrated Vehicle Modeling Environment for estimating aircraft geometric, inertial, and aerodynamic characteristics, and for interfacing with a high fidelity, workstation based flight simulation architecture. The goals in developing this environment are to aid in the design of next generation intelligent flight control technologies, conduct research in advanced vehicle interface concepts for autonomous and semi-autonomous applications, and provide a value-added capability to the conceptual design and aircraft synthesis process. Results are presented for three aircraft by comparing estimates generated by the Integrated Vehicle Modeling Environment with known characteristics of each vehicle under consideration. The three aircraft are a modified F-15 with moveable canards attached to the airframe, a mid-sized, twin-engine commercial transport concept, and a small, single-engine, uninhabited aerial vehicle. Estimated physical properties and dynamic characteristics are correlated with those known for each aircraft over a large portion of the flight envelope of interest. These results represent the completion of a critical step toward meeting the stated goals for developing this modeling environment.

  8. Quantum process discrimination with information from environment

    NASA Astrophysics Data System (ADS)

    Wang, Yuan-Mei; Li, Jun-Gang; Zou, Jian; Xu, Bao-Ming

    2016-12-01

    In quantum metrology we usually extract information from the reduced probe system but ignore the information lost inevitably into the environment. However, K. Mølmer [Phys. Rev. Lett. 114, 040401 (2015)] showed that the information lost into the environment has an important effect on improving the successful probability of quantum process discrimination. Here we reconsider the model of a driven atom coupled to an environment and distinguish which of two candidate Hamiltonians governs the dynamics of the whole system. We mainly discuss two measurement methods, one of which obtains only the information from the reduced atom state and the other obtains the information from both the atom and its environment. Interestingly, for the two methods the optimal initial states of the atom, used to improve the successful probability of the process discrimination, are different. By comparing the two methods we find that the partial information from the environment is very useful for the discriminations. Project supported by the National Natural Science Foundation of China (Grant Nos. 11274043, 11375025, and 11005008).

  9. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about Genetics Problems Using Virtual Chat

    ERIC Educational Resources Information Center

    Pata, Kai; Sarapuu, Tago

    2006-01-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various terms of reasoning on the learners' problem representation development. Changes in 53 students' problem representations about a genetics issue were analysed while they worked with different…

  10. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  12. The Ecosystem Model: Designing Campus Environments.

    ERIC Educational Resources Information Center

    Western Interstate Commission for Higher Education, Boulder, CO.

    This document stresses the increasing awareness in higher education of the impact student/environment transactions have upon the quality of educational life and details a model and design process for creating a better fit between educational environments and students. The ecosystem model uses an interdisciplinary approach for the make-up of its…

  13. Generalized Environment for Modeling Systems

    SciTech Connect

    2012-02-07

    GEMS is an integrated environment that allows technical analysts, modelers, researchers, etc. to integrate and deploy models and/or decision tools with associated data to the internet for direct use by customers. GEMS does not require that the model developer know how to code or script and therefore delivers this capability to a large group of technical specialists. Customers gain the benefit of being able to execute their own scenarios directly without need for technical support. GEMS is a process that leverages commercial software products with specialized codes that add connectivity and unique functions to support the overall capability. Users integrate pre-existing models with a commercial product and store parameters and input trajectories in a companion commercial database. The model is then exposed into a commercial web environment and a graphical user interface (GUI) is applied by the model developer. Users execute the model through the web based GUI and GEMS manages supply of proper inputs, execution of models, routing of data to models and display of results back to users. GEMS works in layers; the following description is from the bottom up. Modelers create models in the modeling tool of their choice such as Excel, Matlab, or Fortran. They can also use models from a library of previously wrapped legacy codes (models). Modelers integrate the models (or a single model) by wrapping and connecting the models using the Phoenix Integration tool entitled ModelCenter. Using a ModelCenter/SAS plugin (DOE copyright CW-10-08) the modeler gets data from either an SAS or SQL database and sends results back to SAS or SQL. Once the model is working properly, the ModelCenter file is saved and stored in a folder location to which a SharePoint server tool created at INL is pointed. This enables the ModelCenter model to be run from SharePoint. The modeler then goes into Microsoft SharePoint and creates a graphical user interface (GUI) using the ModelCenter WebPart (CW-12

  14. Geospace Environment Modeling Program

    NASA Astrophysics Data System (ADS)

    Dusenbery, Paul B.; Siscoe, George L.

    1992-02-01

    The geospace environment encompasses the highest and largest of the four physical geospheres—lithosphere, hydrosphere, atmosphere, and magnetosphere. Despite its size, its far-reaching structures interconnect and move together in a choreography of organized dynamics, whose complexity is reflected in the intricate movements of the northern lights. The vastness and inaccessibility of geospace, encompassing the plasma environment of the magnetosphere/ionosphere system, and the invisibility of its structures pose great challenges to scientists who want to study its dynamics by obtaining, in effect, video tapes of its globally organized motions. A key component of their strategy is the ability to see nearly all of geospace imaged onto the top of the atmosphere. The geomagnetic field threads the volume of geospace and transmits action, TV-like, from the magnetospheric stage down its lines of force onto the atmospheric screen.

  15. CAUSA - An Environment For Modeling And Simulation

    NASA Astrophysics Data System (ADS)

    Dilger, Werner; Moeller, Juergen

    1989-03-01

    CAUSA is an environment for modeling and simulation of dynamic systems on a quantitative level. The environment provides a conceptual framework including primitives like objects, processes and causal dependencies which allow the modeling of a broad class of complex systems. The facility of simulation allows the quantitative and qualitative inspection and empirical investigation of the behavior of the modeled system. CAUSA is implemented in Knowledge-Craft and runs on a Symbolics 3640.

  16. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment being developed since 1996 and running at LASMEA Laboratory, the Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Throughout the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we are presenting the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities for the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is a 3D face-tracking algorithm from appearance.

  17. Photohadronic Processes in Astrophysical Environments

    NASA Astrophysics Data System (ADS)

    Mücke, A.; Rachen, J. P.; Engel, Ralph; Protheroe, R. J.; Stanev, Todor

    1999-08-01

    We discuss the first applications of our newly developed Monte Carlo event generator SOPHIA to multiparticle photoproduction of relativistic protons with thermal and power-law radiation fields. The measured total cross section is reproduced in terms of excitation and decay of baryon resonances, direct pion production, diffractive scattering, and non-diffractive multiparticle production. Non-diffractive multiparticle production is described using a string fragmentation model. We demonstrate that the widely used `Δ-approximation' for the photoproduction cross section is reasonable only for a restricted set of astrophysical applications. The relevance of this result for cosmic ray propagation through the microwave background and hadronic models of active galactic nuclei and gamma-ray bursts is briefly discussed.
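A back-of-the-envelope sketch (assumed round numbers, unrelated to the SOPHIA code itself) shows the regime the Δ-approximation targets: the proton energy at which a head-on collision with a mean-energy CMB photon excites the Δ(1232) resonance.

```python
# Back-of-the-envelope sketch (assumed round numbers, not the SOPHIA code):
# proton energy at which a head-on collision with a mean-energy CMB photon
# reaches the Delta(1232) resonance -- the regime the 'Δ-approximation' targets.

M_P = 938.272e6    # proton rest energy, eV
M_DELTA = 1232.0e6 # Delta(1232) rest energy, eV
EPS_CMB = 6.35e-4  # mean CMB photon energy, eV (~2.7 kT at 2.725 K)

# Head-on, ultrarelativistic: s = m_p^2 + 4 E_p eps; resonance when s = m_Delta^2
E_res = (M_DELTA**2 - M_P**2) / (4.0 * EPS_CMB)
print(f"{E_res:.2e} eV")  # ~2.5e20 eV, the energy scale of the GZK feature
```

Away from this resonance-dominated regime (e.g. broad power-law photon fields), the multiparticle channels modeled by SOPHIA contribute, which is why the Δ-approximation alone is reasonable only for a restricted set of applications.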

  18. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

The application of the developed property models for the estimation of environment-related properties, and of the uncertainties of the estimated property values, is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes, and allow one to evaluate the effect of uncertainties in estimated property values on the calculated performance of processes, giving useful insight into the quality and reliability of the design of sustainable processes.
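The group-contribution idea underlying such GC+ models can be sketched as a sum of group increments; the groups, values, and linear functional form below are illustrative placeholders, not the published GC+ parameters.

```python
# Minimal group-contribution (GC) sketch: a property is estimated from summed
# group increments. The groups, values, and linear form below are illustrative
# placeholders, NOT the published GC+ parameters.

GROUP_CONTRIB = {"CH3": 0.50, "CH2": 0.30, "OH": 1.20}  # hypothetical values

def gc_estimate(groups, universal_const=1.0):
    """Return A + sum_i n_i * c_i for the molecule's group counts n_i."""
    return universal_const + sum(n * GROUP_CONTRIB[g] for g, n in groups.items())

# 1-propanol: CH3 + 2x CH2 + OH
est = gc_estimate({"CH3": 1, "CH2": 2, "OH": 1})
print(round(est, 2))  # 3.3
```

Because the estimate is linear in the group contributions, uncertainty analysis of the kind the abstract describes propagates the covariance of the fitted group parameters directly through the same sum.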

  19. Modeling of LDEF contamination environment

    NASA Technical Reports Server (NTRS)

    Carruth, M. Ralph, Jr.; Rantanen, Ray; Gordon, Tim

    1993-01-01

    The Long Duration Exposure Facility (LDEF) satellite was unique in many ways. It was a large structure that was in space for an extended period of time and was stable in orientation relative to the velocity vector. There are obvious and well documented effects of contamination and space environment effects on the LDEF satellite. In order to examine the interaction of LDEF with its environment and the resulting effect on the satellite, the Integrated Spacecraft Environments Model (ISEM) was used to model the LDEF-induced neutral environment at several different times and altitudes during the mission.

  20. Autonomous environment modeling by a mobile robot

    NASA Astrophysics Data System (ADS)

    Moutarlier, Philippe

    1991-02-01

Internal geometric representation of the environment is considered. The autonomy of a mobile robot partly relies on its ability to build a reliable representation of its environment. On the other hand, an autonomous environment-building process requires that the model be adapted to planning motions and perception actions. Therefore, the modeling process must be a reversible interface between perception and motion devices and the model itself. Several kinds of models are necessary in order to achieve an autonomous process: sensors give stochastic information on surfaces, navigation needs a free-space representation, and perception planning requires aspect graphs. The functions of stochastic surface modeling, free-space representation, and topological graph computing are presented through the integrated geometric model builder called 'Yaka.' Since all environment data uncertainties are correlated together through the robot location inaccuracy, classical filtering methods are inadequate; a method of computing a linear variance estimator adapted to the problem is proposed. This general formalism is validated by a large number of experiments in which the robot incrementally builds a surface representation of its environment. Free space cannot be deduced directly, at each step, from the surface data provided by the sensors: inaccuracies on object surfaces, uncertainties on the visibility of objects by the sensor, and the possible motion of objects must all be taken into account to build the free space incrementally. Motion and perception planning for autonomous environment modeling are then achieved using this free-space model together with topological location and aspect graphs.
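The point about correlated uncertainties can be made concrete with a toy two-state example (assumed numbers, not the 'Yaka' implementation): a landmark initialized from an uncertain robot pose inherits that pose error, so the joint covariance carries a cross term that naive independent filtering would drop.

```python
import numpy as np

# Toy two-state example of correlated map uncertainty (assumed numbers, not the
# 'Yaka' builder): a landmark initialized from an uncertain robot pose inherits
# the pose error, so the joint covariance carries a cross term.

P_r = 0.5  # robot pose variance
R = 0.1    # range-sensor noise variance
# Landmark x_l = x_r + z  =>  var(x_l) = P_r + R and cov(x_r, x_l) = P_r
P = np.array([[P_r, P_r],
              [P_r, P_r + R]])

# What mapping actually needs is the *relative* position x_l - x_r:
J = np.array([-1.0, 1.0])
var_rel = J @ P @ J            # shared pose error cancels, leaving only R
var_naive = P[0, 0] + P[1, 1]  # "independent" assumption grossly overestimates
print(round(var_rel, 3), round(var_naive, 3))  # 0.1 1.1
```

A filter that ignores the cross term would report the naive variance and either distrust good relative measurements or, worse, become inconsistent, which is the motivation for the correlated variance estimator the abstract describes.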

  1. Biosphere Process Model Report

    SciTech Connect

    J. Schmitt

    2000-05-25

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  2. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall

    PubMed Central

    Bridge, Jack C.; Aylott, Jonathan W.; Brightling, Christopher E.; Ghaemmaghami, Amir M.; Knox, Alan J.; Lewis, Mark P.; Rose, Felicity R.A.J.; Morris, Gavin E.

    2015-01-01

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments. PMID:26275100

  3. Adapting the Electrospinning Process to Provide Three Unique Environments for a Tri-layered In Vitro Model of the Airway Wall.

    PubMed

    Bridge, Jack C; Aylott, Jonathan W; Brightling, Christopher E; Ghaemmaghami, Amir M; Knox, Alan J; Lewis, Mark P; Rose, Felicity R A J; Morris, Gavin E

    2015-07-31

    Electrospinning is a highly adaptable method producing porous 3D fibrous scaffolds that can be exploited in in vitro cell culture. Alterations to intrinsic parameters within the process allow a high degree of control over scaffold characteristics including fiber diameter, alignment and porosity. By developing scaffolds with similar dimensions and topographies to organ- or tissue-specific extracellular matrices (ECM), micro-environments representative to those that cells are exposed to in situ can be created. The airway bronchiole wall, comprised of three main micro-environments, was selected as a model tissue. Using decellularized airway ECM as a guide, we electrospun the non-degradable polymer, polyethylene terephthalate (PET), by three different protocols to produce three individual electrospun scaffolds optimized for epithelial, fibroblast or smooth muscle cell-culture. Using a commercially available bioreactor system, we stably co-cultured the three cell-types to provide an in vitro model of the airway wall over an extended time period. This model highlights the potential for such methods being employed in in vitro diagnostic studies investigating important inter-cellular cross-talk mechanisms or assessing novel pharmaceutical targets, by providing a relevant platform to allow the culture of fully differentiated adult cells within 3D, tissue-specific environments.

  4. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    ERIC Educational Resources Information Center

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  6. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, & Visualization ...

    SciTech Connect

    Powers, Michael H.

    2003-06-01

    The Department of Energy has identified the location and characterization of subsurface contaminants and the characterization of the subsurface as a priority need. Many DOE facilities are in need of subsurface imaging in the vadose and saturated zones. This includes (1) the detection and characterization of metal and concrete structures, (2) the characterization of waste pits (for both contents and integrity) and (3) mapping the complex geological/hydrological framework of the vadose and saturated zones. The DOE has identified ground penetrating radar (GPR) as a method that can non-invasively map transportation pathways and vadose zone heterogeneity. An advanced GPR system and advanced subsurface modeling, processing, imaging, and inversion techniques can be directly applied to several DOE science needs in more than one focus area and at many sites. Needs for enhanced subsurface imaging have been identified at Hanford, INEEL, SRS, ORNL, LLNL, SNL, LANL, and many other sites. In fact, needs for better subsurface imaging probably exist at all DOE sites. However, GPR performance is often inadequate due to increased attenuation and dispersion when soil conductivities are high. Our objective is to extend the limits of performance of GPR by improvements to both hardware and numerical computation. The key features include (1) greater dynamic range through real time digitizing, receiver gain improvements, and high output pulser, (2) modified, fully characterized antennas with sensors to allow dynamic determination of the changing radiated waveform, (3) modified deconvolution and depth migration algorithms exploiting the new antenna output information, (4) development of automatic full waveform inversion made possible by the known radiated pulse shape.
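Item (3), deconvolution exploiting a known radiated waveform, can be sketched with a standard water-level (regularized spectral division) scheme. This is a generic textbook method with toy data, not the project's actual algorithm.

```python
import numpy as np

# Water-level (regularized spectral division) deconvolution -- a generic
# textbook scheme with toy data, sketching how a known radiated waveform can be
# removed from a GPR trace (not the project's actual algorithm).

wavelet = np.array([0.0, 1.0, -0.5, 0.1])   # assumed known source pulse
reflect = np.zeros(64)
reflect[10], reflect[30] = 1.0, -0.7        # two subsurface reflectors
trace = np.convolve(reflect, wavelet)[:64]  # synthetic recorded trace

W = np.fft.rfft(wavelet, 64)
T = np.fft.rfft(trace, 64)
eps = 1e-3 * np.max(np.abs(W)) ** 2         # water level stabilizes the division
est = np.fft.irfft(T * np.conj(W) / (np.abs(W) ** 2 + eps), 64)
print(int(np.argmax(est)), int(np.argmin(est)))  # reflectors recovered at 10, 30
```

The water level `eps` is what keeps the division stable when the wavelet spectrum is weak, which matters in high-loss soils where attenuation and dispersion thin out the usable bandwidth.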

  7. Computer-Based Modeling Environments

    DTIC Science & Technology

    1988-12-01

and Kernighan), CAMPS (Lucas and Mitra), GAMS (Bisschop and Meeraus), LINGO (Cunningham and Schrage), LPL (Hurlimann and...times; and Vo, which describes the integration approach used by a UNIX-based analytical modeling environment at AT&T Bell Laboratories called...platform such as UNIX, as ANALYTICOL does (Childs and Meacham). Or one might build a modeling environment around a suitable, and probably relational

  8. Scalable Networked Information Processing Environment (SNIPE)

    SciTech Connect

Fagg, G.E.; Moore, K.; Dongarra, J.J.; Geist, A.

    1997-11-01

    SNIPE is a metacomputing system that aims to provide a reliable, secure, fault tolerant environment for long term distributed computing applications and data stores across the global Internet. This system combines global naming and replication of both processing and data to support large scale information processing applications leading to better availability and reliability than currently available with typical cluster computing and/or distributed computer environments.

  9. Modeling the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2006-01-01

    There has been a renaissance of interest in space radiation environment modeling. This has been fueled by the growing need to replace long time standard AP-9 and AE-8 trapped particle models, the interplanetary exploration initiative, the modern satellite instrumentation that has led to unprecedented measurement accuracy, and the pervasive use of Commercial off the Shelf (COTS) microelectronics that require more accurate predictive capabilities. The objective of this viewgraph presentation was to provide basic understanding of the components of the space radiation environment and their variations, review traditional radiation effects application models, and present recent developments.

  10. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2009-01-01

ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework which will support the Coastal Modeling Framework...developed in this project. (http://www.cactuscode.org/Development/alpaca) CyberTools: This NSF/BOR-funded project is developing a

  11. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2008-01-01

http://www.cactuscode.org/Development/xirel) ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework...which will support the Coastal Modeling Framework developed in this project. (http://www.cactuscode.org/Development/alpaca) CyberTools: This NSF/BOR

  12. A Process and Environment Aware Sierra/SolidMechanics Cohesive Zone Modeling Capability for Polymer/Solid Interfaces

    SciTech Connect

    Reedy, E. D.; Chambers, Robert S.; Hughes, Lindsey Gloe; Kropka, Jamie Michael; Stavig, Mark E.; Stevens, Mark J.

    2015-09-01

The performance and reliability of many mechanical and electrical components depend on the integrity of polymer-to-solid interfaces. Such interfaces are found in adhesively bonded joints, encapsulated or underfilled electronic modules, protective coatings, and laminates. The work described herein was aimed at improving Sandia's finite element-based capability to predict interfacial crack growth by 1) using a high-fidelity nonlinear viscoelastic material model for the adhesive in fracture simulations, and 2) developing and implementing a novel cohesive zone fracture model that generates a mode-mixity dependent toughness as a natural consequence of its formulation (i.e., generates the observed increase in interfacial toughness with increasing crack-tip interfacial shear). Furthermore, molecular dynamics simulations were used to study fundamental material/interfacial physics so as to develop a fuller understanding of the connection between molecular structure and failure. Also reported are test results that quantify how joint strength and interfacial toughness vary with temperature.
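A cohesive zone model of this kind is built around a traction-separation law; a generic bilinear form (textbook shape with illustrative parameters, not Sandia's calibrated model) looks like:

```python
# Generic bilinear traction-separation law, the basic ingredient of a cohesive
# zone model (textbook form; parameter values are illustrative, not Sandia's
# calibrated model).

def traction(delta, delta0=0.01, delta_f=0.1, t_max=50.0):
    """Traction vs crack opening delta: linear ramp to t_max at delta0, then
    linear softening to zero at delta_f (full separation)."""
    if delta <= 0.0:
        return 0.0
    if delta < delta0:
        return t_max * delta / delta0           # elastic loading of the interface
    if delta < delta_f:
        return t_max * (delta_f - delta) / (delta_f - delta0)  # damage/softening
    return 0.0  # crack faces fully separated

# Fracture toughness = area under the curve = 0.5 * t_max * delta_f
G_c = 0.5 * 50.0 * 0.1
print(round(traction(0.005), 2), round(traction(0.055), 2), G_c)  # 25.0 25.0 2.5
```

A mode-mixity dependent model like the one described above would, in effect, make `t_max` and `G_c` functions of the ratio of shear to normal opening rather than constants.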

  13. Modeling the Adoption Process of the Flight Training Synthetic Environment Technology (FTSET) in the Turkish Army Aviation (TUAA)

    DTIC Science & Technology

    2006-12-01

new technologies is usually explained by the Diffusion of Innovations Model and its S-shaped growth patterns. French sociologist Gabriel Tarde ...Dohme, and Robert T. Nullmeyer, "Optimizing Simulator-Aircraft Mix for US Army Initial Entry Rotary Wing Training," Technical Report 1092 (March 1999): 6...T. Nullmeyer, Optimizing Simulator-Aircraft Mix for US Army Initial Entry Rotary Wing Training (US Army Research Institute for the Behavioral and

  14. Modeling the growth and constraints of thermophiles and biogeochemical processes in deep-sea hydrothermal environments (Invited)

    NASA Astrophysics Data System (ADS)

    Holden, J. F.; Ver Eecke, H. C.; Lin, T. J.; Butterfield, D. A.; Olson, E. J.; Jamieson, J.; Knutson, J. K.; Dyar, M. D.

    2010-12-01

    and contain an abundance of Fe(III) oxide and sulfate minerals, especially on surfaces of pore spaces. Hyperthermophilic iron reducers attach to iron oxide particles via cell wall invaginations and pili and reduce the iron through direct contact. The iron is reduced to magnetite, possibly with a maghemite intermediate. Thus iron reducers could outcompete methanogens in low H2, mildly reducing habitats such as Endeavour. Unlike strain JH146, respiration rates per cell were highest near the optimal growth temperature for the iron reducer Hyperthermus strain Ro04 and decreased near the temperature limits for growth. This study highlights the need to model microbe-metal interactions and improve respiration estimates from pure cultures to refine our in situ bioenergetic and habitat models.

  15. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Allen, Christopher; Chu, S. Reynold

    2008-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements and thus provide a safe and habitable acoustic environment for the crews, and to validate developed models via building physical mockups and conducting acoustic measurements.

  16. Electronic materials processing and the microgravity environment

    NASA Technical Reports Server (NTRS)

    Witt, A. F.

    1988-01-01

    The nature and origin of deficiencies in bulk electronic materials for device fabrication are analyzed. It is found that gravity generated perturbations during their formation account largely for the introduction of critical chemical and crystalline defects and, moreover, are responsible for the still existing gap between theory and experiment and thus for excessive reliance on proprietary empiricism in processing technology. Exploration of the potential of reduced gravity environment for electronic materials processing is found to be not only desirable but mandatory.

  17. Teaching Process Writing in an Online Environment

    ERIC Educational Resources Information Center

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  19. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gases, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.
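Hyporheic exchange of this kind is often quantified with a transient-storage formulation, in which the channel and storage-zone concentrations relax toward each other at a first-order exchange rate. A minimal sketch with assumed coefficients:

```python
# Transient-storage sketch of stream/hyporheic exchange (generic formulation
# with assumed coefficients): channel concentration C and storage-zone
# concentration Cs relax toward each other at a first-order exchange rate.

alpha = 0.05      # exchange coefficient, 1/s (assumed)
area_ratio = 2.0  # channel/storage cross-sectional area ratio A/As (assumed)
dt = 0.1          # time step, s
C, Cs = 1.0, 0.0  # e.g. a solute pulse in the channel, clean storage zone

for _ in range(2000):  # forward-Euler integration of the coupled ODEs
    dC = -alpha * (C - Cs)                # dC/dt  = -alpha (C - Cs)
    dCs = alpha * area_ratio * (C - Cs)   # dCs/dt = alpha (A/As) (C - Cs)
    C, Cs = C + dt * dC, Cs + dt * dCs

print(round(C, 3), round(Cs, 3))  # both relax to the same equilibrium, ~0.667
```

Full stream models add advection, dispersion, and reaction terms to the channel equation; the exchange term above is what lets the storage zone delay and transform solutes relative to the free stream.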

  20. Distributed System Modeling Environment (DSME)

    DTIC Science & Technology

    1990-07-01

    34 Simulation tools, such as the Internetted System Modeling (ISM) system; * Distributed operating systems, such as Cronus and A1I)ha; • Distributed...RADC/COTD in this area is the Cronus distributed operating system. Cronus provides an architecture and tools for building and operating distributed...applications on a diverse set of machines. Cronus is more accurately identified as a distributed computing environment, since its role as a distributed

  1. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
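The buffer-time idea behind such a model can be caricatured in one equation (all numbers assumed; the actual CEPR model tracks the full ECLSS physics): after loss of pressure control, an exponential leak gives a finite, computable time before a hazard threshold is crossed, rather than instant LOC.

```python
import math

# Toy sketch of the cabin-buffer idea (all numbers assumed; the real model
# tracks full ECLSS physics): after loss of pressure control, a small leak
# gives exponential decay P(t) = P0 * exp(-k t), so the crew has a finite,
# computable time before pressure crosses a hazard threshold.

P0 = 101.3       # initial cabin pressure, kPa
P_hazard = 55.0  # assumed hazard threshold, kPa
k = 0.01         # effective leak rate constant, 1/min (assumed)

t_buffer = math.log(P0 / P_hazard) / k  # time to reach the threshold, minutes
print(round(t_buffer, 1), "minutes of buffer time")  # ~61 minutes
```

Even this crude estimate shows why the instant-LOC assumption is conservative: an hour of buffer can be enough for an abort from orbit.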

  2. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints on a production process imposed by the lunar or Martian environment on the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.

  3. Course Material Model in A&O Learning Environment.

    ERIC Educational Resources Information Center

    Levasma, Jarkko; Nykanen, Ossi

    One of the problematic issues in the content development for learning environments is the process of importing various types of course material into the environment. This paper describes a method for importing material into the A&O open learning environment by introducing a material model for metadata recognized by the environment. The first…

  4. Galactic cosmic radiation environment models

    NASA Astrophysics Data System (ADS)

    Badhwar, G. D.; O'Neill, P. M.; Troung, A. G.

    2001-02-01

Models of the radiation environment in free space and in near-Earth orbits are required to estimate the radiation dose to astronauts for Mars, Space Shuttle, and International Space Station missions, and to estimate the rate of single-event upsets and latch-ups in electronic devices. Accurate knowledge of the environment is critical for the design of optimal shielding during both the cruise phase and for a habitat on Mars or the Moon. Measurements of the energy spectra of galactic cosmic rays (GCR) have been made for nearly four decades. In the last decade, models have been constructed that can predict the energy spectrum of any GCR nucleus to an accuracy of better than 25%. Fresh and more accurate measurements have been made in the last year, and these can lead to more accurate models. Improvements in these models can be made in determining the local interstellar spectra and in predicting the level of solar modulation; it is the coupling of the two that defines a GCR model. This paper reviews two of the more widely used models and compares their predictions with new proton and helium data from the Alpha Magnetic Spectrometer (AMS), and with spectra of beryllium through iron in the ~40 to 500 MeV/n range acquired by the Advanced Composition Explorer (ACE) during the 1997-98 solar minimum. Regression equations relating the IMP-8 helium count rate to the solar modulation deceleration parameter calculated from the Climax neutron monitor rate have been developed and may lead to improvements in the predictive capacity of the models.
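The coupling of a local interstellar spectrum (LIS) to a solar-modulation level is commonly handled with the force-field approximation; the sketch below uses a toy power-law LIS and illustrative φ values, not the models reviewed above.

```python
# Sketch of the force-field approximation for solar modulation of GCR protons
# (generic method; the toy LIS and phi values are illustrative, not the models
# reviewed above).

M_P = 0.938  # proton rest energy, GeV

def lis(E_kin):
    """Toy local interstellar spectrum vs kinetic energy (GeV): a power law in
    total energy, standing in for a fitted LIS."""
    return 1.0e4 * (E_kin + M_P) ** -2.7

def modulated(E_kin, phi):
    """Force-field: flux at 1 AU from the LIS evaluated at E_kin + phi (for
    protons Z/A = 1, so the potential shift in GeV equals phi in GV)."""
    E_lis = E_kin + phi
    factor = (E_kin * (E_kin + 2 * M_P)) / (E_lis * (E_lis + 2 * M_P))
    return factor * lis(E_lis)

f_min = modulated(0.5, phi=0.4)   # near solar minimum (weak modulation)
f_max = modulated(0.5, phi=1.0)   # near solar maximum (strong modulation)
print(f_min > f_max)  # True: stronger modulation suppresses low-energy flux
```

Regression relations like the IMP-8/Climax one mentioned above supply the time series of φ that drives this kind of calculation.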

  5. The process-based stand growth model Formix 3-Q applied in a GIS environment for growth and yield analysis in a tropical rain forest.

    PubMed

    Ditzer, T.; Glauner, R.; Förster, M.; Köhler, P.; Huth, A.

    2000-03-01

    Managing tropical rain forests is difficult because few long-term field data on forest growth and the impact of harvesting disturbance are available. Growth models may provide a valuable tool for managers of tropical forests, particularly if applied to the extended forest areas of up to 100,000 ha that typically constitute the so-called forest management units (FMUs). We used a stand growth model in a geographic information system (GIS) environment to simulate tropical rain forest growth at the FMU level. We applied the process-based rain forest growth model Formix 3-Q to the 55,000 ha Deramakot Forest Reserve (DFR) in Sabah, Malaysia. The FMU was considered to be composed of single and independent small-scale stands differing in site conditions and forest structure. Field data, which were analyzed with a GIS, comprised a terrestrial forest inventory, site and soil analyses (water, nutrients, slope), the interpretation of aerial photographs of the present vegetation and topographic maps. Different stand types were determined based on a classification of site quality (three classes), slopes (four classes), and present forest structure (four strata). The effects of site quality on tree allometry (height-diameter curve, biomass allometry, leaf area) and growth (increment size) are incorporated into Formix 3-Q. We derived allometric relations and growth factors for different site conditions from the field data. Climax forest structure at the stand level was shown to depend strongly on site conditions. Simulated successional pattern and climax structure were compared with field observations. Based on the current management plan for the DFR, harvesting scenarios were simulated for stands on different sites. The effects of harvesting guidelines on forest structure and the implications for sustainable forest management at Deramakot were analyzed. Based on the stand types and GIS analysis, we also simulated undisturbed regeneration of the logged-over forest in the DFR at

  6. Microbial processes in fractured rock environments

    NASA Astrophysics Data System (ADS)

    Kinner, Nancy E.; Eighmy, T. Taylor; Mills, M.; Coulburn, J.; Tisa, L.

Little is known about the types and activities of microbes in fractured rock environments, but recent studies in a variety of bedrock formations have documented the presence of a diverse array of prokaryotes (Eubacteria and Archaea) and some protists. The prokaryotes appear to live in both diffusion-dominated microfractures and larger, more conductive open fractures. Some of the prokaryotes are associated with the surfaces of the host rock and mineral precipitates, while other planktonic forms are floating/moving in the groundwater filling the fractures. Studies indicate that the surface-associated and planktonic communities are distinct, and their importance in microbially mediated processes occurring in the bedrock environment may vary, depending on the availability of electron donors/acceptors and nutrients needed by the cells. In general, abundances of microbes are low compared with other environments, because of the paucity of these substances that are transported into the deeper subsurface where most bedrock occurs, unless there is significant pollution with an electron donor. To obtain a complete picture of the microbes present and their metabolic activity, it is usually necessary to sample formation water from specific fractures (versus open boreholes), and fracture surfaces (i.e., cores). Transport of the microbes through the major fracture pathways can be rapid, but may be quite limited in the microfractures. Very low abundances of small (2-3 μm) flagellated protists, which appear to prey upon planktonic bacteria, have been found in a bedrock aquifer. Much more research is needed to expand the understanding of all microbial processes in fractured rock environments.

  7. Modeling excessive nutrient loading in the environment.

    PubMed

    Reckhow, K H; Chapra, S C

    1999-01-01

    Models addressing excessive nutrient loading in the environment originated over 50 years ago with the simple nutrient concentration thresholds proposed by Sawyer (1947. Fertilization of lakes by agricultural and urban drainage. New Engl. Water Works Assoc. 61, 109-127). Since then, models have improved due to progress in modeling techniques and technology as well as enhancements in scientific knowledge. Several of these advances are examined here. Among the recent approaches in modeling techniques we review are error propagation, model confirmation, generalized sensitivity analysis, and Bayesian analysis. In the scientific arena and process characterization, we focus on advances in surface water modeling, discussing enhanced modeling of organic carbon, improved hydrodynamics, and refined characterization of sediment diagenesis. We conclude with some observations on future needs and anticipated developments.
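
    Among the techniques this review names, error propagation is easy to illustrate. The sketch below is a toy Monte Carlo propagation through a simple steady-state lake phosphorus balance of the Vollenweider type, P = L / (qs + vs); the function names, input distributions and parameter values are illustrative assumptions, not taken from the reviewed models.

```python
import random

def steady_state_phosphorus(L, qs, vs):
    """Toy Vollenweider-type steady state: P = L / (qs + vs), with areal
    load L (mg m^-2 yr^-1), areal water load qs (m yr^-1) and apparent
    settling velocity vs (m yr^-1). Values are illustrative assumptions."""
    return L / (qs + vs)

def propagate_error(n=10000, seed=42):
    """Monte Carlo error propagation: sample uncertain inputs, push each
    sample through the model, and summarize the output distribution."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        L = rng.gauss(1000.0, 150.0)   # assumed loading uncertainty
        vs = rng.gauss(10.0, 2.0)      # assumed settling-velocity uncertainty
        qs = 15.0                      # treated as well known
        outputs.append(steady_state_phosphorus(L, qs, vs))
    outputs.sort()
    mean = sum(outputs) / n
    p05, p95 = outputs[int(0.05 * n)], outputs[int(0.95 * n)]
    return mean, p05, p95

mean_p, p05, p95 = propagate_error()
```

    The 5th-95th percentile band is the kind of uncertainty statement that simple threshold models could not provide.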

  8. Space environment and lunar surface processes, 2

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1982-01-01

    The top few millimeters of a surface exposed to space represent a physically and chemically active zone with properties different from those of a surface in the environment of a planetary atmosphere. To meet the need for a quantitative synthesis of the various processes contributing to the evolution of the surfaces of the Moon, Mercury, the asteroids, and similar bodies (exposure to solar wind, solar flare particles, galactic cosmic rays, heating from solar radiation, and meteoroid bombardment), the MESS 2 computer program was developed. This program differs from earlier work in that the surface processes are broken down as a function of size scale and treated in three dimensions with good resolution on each scale. The results obtained apply to the development of soil near the surface and are based on lunar conditions. Parameters can be adjusted to describe asteroid regoliths and other space-related bodies.

  9. Probing protein environment in an enzymatic process: All-electron quantum chemical analysis combined with ab initio quantum mechanical/molecular mechanical modeling of chorismate mutase.

    PubMed

    Ishida, Toyokazu

    2008-09-28

    In this study, we investigated the electronic character of protein environment in enzymatic processes by performing all-electron QM calculations based on the fragment molecular orbital (FMO) method. By introducing a new computational strategy combining all-electron QM analysis with ab initio QM/MM modeling, we investigated the details of molecular interaction energy between a reactive substrate and amino acid residues at a catalytic site. For a practical application, we selected the chorismate mutase catalyzed reaction as an example. Because the computational time required to perform all-electron QM reaction path searches was very large, we employed the ab initio QM/MM modeling technique to construct reliable reaction profiles and performed all-electron FMO calculations for the selected geometries. The main focus of the paper is to analyze the details of electrostatic stabilization, which is considered to be the major feature of enzymatic catalyses, and to clarify how the electronic structure of proteins is polarized in response to the change in electron distribution of the substrate. By performing interaction energy decomposition analysis from a quantum chemical viewpoint, we clarified the relationship between the location of amino acid residues on the protein domain and the degree of electronic polarization of each residue. In particular, in the enzymatic transition state, Arg7, Glu78, and Arg90 are highly polarized in response to the delocalized electronic character of the substrate, and as a result, a large amount of electrostatic stabilization energy is stored in the molecular interaction between the enzyme and the substrate and supplied for transition state stabilization.

  10. Probing protein environment in an enzymatic process: All-electron quantum chemical analysis combined with ab initio quantum mechanical/molecular mechanical modeling of chorismate mutase

    NASA Astrophysics Data System (ADS)

    Ishida, Toyokazu

    2008-09-01

    In this study, we investigated the electronic character of protein environment in enzymatic processes by performing all-electron QM calculations based on the fragment molecular orbital (FMO) method. By introducing a new computational strategy combining all-electron QM analysis with ab initio QM/MM modeling, we investigated the details of molecular interaction energy between a reactive substrate and amino acid residues at a catalytic site. For a practical application, we selected the chorismate mutase catalyzed reaction as an example. Because the computational time required to perform all-electron QM reaction path searches was very large, we employed the ab initio QM/MM modeling technique to construct reliable reaction profiles and performed all-electron FMO calculations for the selected geometries. The main focus of the paper is to analyze the details of electrostatic stabilization, which is considered to be the major feature of enzymatic catalyses, and to clarify how the electronic structure of proteins is polarized in response to the change in electron distribution of the substrate. By performing interaction energy decomposition analysis from a quantum chemical viewpoint, we clarified the relationship between the location of amino acid residues on the protein domain and the degree of electronic polarization of each residue. In particular, in the enzymatic transition state, Arg7, Glu78, and Arg90 are highly polarized in response to the delocalized electronic character of the substrate, and as a result, a large amount of electrostatic stabilization energy is stored in the molecular interaction between the enzyme and the substrate and supplied for transition state stabilization.

  11. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  13. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to model using a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in command-line or graphical user interface mode. Most components of Delta Shell are developed in the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.
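
    The first example above (coupling a rainfall-runoff model to a river flow model) can be sketched as a simple feed-forward, time-stepped coupling. This is not the Delta Shell API; the two toy models and their parameters are invented for illustration.

```python
class RainfallRunoff:
    """Toy linear-reservoir rainfall-runoff model (illustrative only)."""
    def __init__(self, k=0.3):
        self.storage, self.k = 0.0, k

    def step(self, rainfall):
        self.storage += rainfall
        runoff = self.k * self.storage   # release a fixed fraction
        self.storage -= runoff
        return runoff

class RiverFlow:
    """Toy routing model: discharge relaxes toward its lateral inflow."""
    def __init__(self, alpha=0.5):
        self.discharge, self.alpha = 0.0, alpha

    def step(self, inflow):
        self.discharge += self.alpha * (inflow - self.discharge)
        return self.discharge

def run_coupled(rain_series):
    """Feed-forward coupling: at each step, runoff output becomes river input."""
    rr, river = RainfallRunoff(), RiverFlow()
    return [river.step(rr.step(rain)) for rain in rain_series]

hydrograph = run_coupled([10.0, 10.0, 0.0, 0.0, 0.0])
```

    An integration framework's job is essentially to manage this exchange of outputs and inputs between components at each time step, for arbitrary model pairs.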

  14. Microbial consortia in meat processing environments

    NASA Astrophysics Data System (ADS)

    Alessandria, V.; Rantsiou, K.; Cavallero, M. C.; Riva, S.; Cocolin, L.

    2017-09-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The description of the microbial consortia in the meat processing environment is important since it is a first step in understanding possible routes of product contamination. Furthermore, it may contribute to the development of sanitation programs for effective pathogen removal. The purpose of this study was to characterize the microbiota in the environment of meat processing plants: the microbiota of three different meat plants was studied by both traditional and molecular methods (PCR-DGGE) in two different periods. Different levels of contamination emerged between the three plants as well as between the two sampling periods. Conventional methods of killing free-living bacteria through antimicrobial agents and disinfection are often ineffective against bacteria within a biofilm. The use of gas-discharge plasmas can potentially offer a good alternative to conventional sterilization methods. A further aim of this study was to measure the effectiveness of Atmospheric Pressure Plasma (APP) surface treatments against bacteria in biofilms. Biofilms produced by three different L. monocytogenes strains on a stainless steel surface were subjected to three different conditions (power, exposure time) of APP. Our results showed that most culturable cells are inactivated after plasma exposure, but RNA analysis by qPCR highlighted the entry of the cells into the viable but nonculturable (VBNC) state, confirming the hypothesis that cells are damaged after plasma treatment but, at first, still remain alive. Understanding the effects of APP on L. monocytogenes biofilms can improve the development of sanitation programs that use APP for effective pathogen removal.

  15. Control of the aseptic processing environment.

    PubMed

    Frieben, W R

    1983-11-01

    Methods used by industry with applications to hospital pharmacy for maintaining an aseptic environment in production of sterile pharmaceutical products are discussed. A major source of product contamination is airborne microorganisms. The laminar-airflow workbench with a high-efficiency particulate air filter provides an ultraclean environment for preparation of sterile products. However, the workbench does not guarantee sterility of products and is not effective if not properly installed and maintained or if the operator uses poor aseptic technique. The laminar-airflow workbench should be tested for leaks, airflow velocity, and airflow patterns when installed, and the workbench should be checked periodically thereafter. The workbench should be placed in a cleanroom where traffic and air disturbances that might affect the laminar airflow are eliminated. A major source of airborne microbial contamination in cleanrooms is people. Personnel movement through an area and presence of personnel without lint-free, nonshedding protective garments increase the levels of microbial contaminants in an area. The transport of nonsterile products (bottles, boxes, paper products) into a cleanroom should be minimized. The cleanroom itself should be sanitized and should be immaculate. Microbial or particulate monitoring should be conducted in the cleanroom using a quantitative method, and corrective-action limits should be set. Hospital pharmacists should examine industrial sterile-processing techniques and apply them to the preparation of sterile products.

  16. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
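
    The hypercube topology mentioned above has a convenient binary structure: in a d-dimensional hypercube, each of the 2^d nodes is linked to the d nodes whose binary labels differ in exactly one bit. A minimal sketch (function names are our own):

```python
def hypercube_neighbors(node, dim):
    """Neighbors of `node` in a dim-dimensional hypercube: flip one bit."""
    return [node ^ (1 << d) for d in range(dim)]

def hamming(a, b):
    """Number of bit positions in which labels a and b differ."""
    return bin(a ^ b).count("1")
```

    Message-passing routing on a hypercube exploits this: a message travels from source to destination by correcting one differing bit per hop, so any two of the 2^d nodes are at most d hops apart.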

  17. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, Shao-sheng R.; Allen, Christopher S.

    2009-01-01

    carried out by acquiring octave band microphone data simultaneously at ten fixed locations throughout the mockup. SPLs (Sound Pressure Levels) predicted by our SEA model match well with measurements for our CM mockup, with a more complicated shape. Additionally in FY09, background NC noise (Noise Criterion) simulation and MRT (Modified Rhyme Test) were developed and performed in the mockup to determine the maximum noise level in CM habitable volume for fair crew voice communications. Numerous demonstrations of simulated noise environment in the mockup and associated SIL (Speech Interference Level) via MRT were performed for various communities, including members from NASA and Orion prime-/sub-contractors. Also, a new HSIR (Human-Systems Integration Requirement) for limiting pre- and post-landing SIL was proposed.

  18. Gaussian Process Morphable Models.

    PubMed

    Luthi, Marcel; Gerig, Thomas; Jud, Christoph; Vetter, Thomas

    2017-08-14

    Models of shape variations have become a central component for the automated analysis of images. An important class of shape models are point distribution models (PDMs). These models represent a class of shapes as a normal distribution of point variations, whose parameters are estimated from example shapes. Principal component analysis (PCA) is applied to obtain a low-dimensional representation of the shape variation in terms of the leading principal components. In this paper, we propose a generalization of PDMs, which we refer to as Gaussian Process Morphable Models (GPMMs). We model the shape variations with a Gaussian process, which we represent using the leading components of its Karhunen-Loève expansion. To compute the expansion, we make use of an approximation scheme based on the Nyström method. The resulting model can be seen as a continuous analog of a standard PDM. However, while for PDMs the shape variation is restricted to the linear span of the example data, with GPMMs we can define the shape variation using any Gaussian process. For example, we can build shape models that correspond to classical spline models and thus do not require any example data. Furthermore, Gaussian processes make it possible to combine different models. For example, a PDM can be extended with a spline model, to obtain a model that incorporates learned shape characteristics but is flexible enough to explain shapes that cannot be represented by the PDM.
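
    The point distribution model that GPMMs generalize can be sketched in a few lines: estimate a mean shape and the leading PCA modes from example shapes, then synthesize new shapes from mode coefficients. The landmark data below are synthetic and the function names are our own; this illustrates only the standard PDM, not the Gaussian-process machinery of the paper.

```python
import numpy as np

def build_pdm(shapes, n_components):
    """Point distribution model: mean shape plus leading PCA modes.
    `shapes` is (n_examples, 2 * n_points) of flattened landmark coords."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data matrix yields the principal components
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    stddevs = s[:n_components] / np.sqrt(len(shapes) - 1)
    return mean, vt[:n_components], stddevs

def synthesize(mean, modes, stddevs, coeffs):
    """Shape for standardized coefficients b: x = mean + sum_i b_i s_i phi_i."""
    return mean + (np.asarray(coeffs) * stddevs) @ modes

rng = np.random.default_rng(0)
square = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])  # 4 landmarks
examples = square + 0.05 * rng.standard_normal((20, 8))
mean, modes, stddevs = build_pdm(examples, n_components=2)
new_shape = synthesize(mean, modes, stddevs, [1.0, -0.5])
```

    In a GPMM, the empirical covariance implied by these modes is replaced by an arbitrary Gaussian-process kernel, approximated through the leading terms of its Karhunen-Loève expansion, so shape variation need not be confined to the span of the examples.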

  19. Does microbial community structure matter for predicting ecosystem function? Use of statistical models to examine relationships between the environment, community and processes

    NASA Astrophysics Data System (ADS)

    Nemergut, D.; Graham, E. B.

    2014-12-01

    Microorganisms control all major biogeochemical cycles, yet the importance of microbial community structure for ecosystem function is widely debated. Indeed, few nutrient cycling models directly account for variation in community structure, leading some researchers to speculate that this information could provide important and missing explanatory power to predict ecosystem function. However, if variation in environmental variables strongly correlates with variation in microbial community composition, then information on microbial community composition may not improve models. Here, we use a data synthesis approach to ask when and where information on the microbial community matters for predictions of ecosystem function. We collated data from approximately 100 different studies and used statistical approaches to ask if models with data on microbial community composition significantly improved models of ecosystem function based on environmental data alone. We found that only 25% of models of ecosystem processes were significantly improved with the addition of data on microbial community composition. Specifically, we found that for phylogenetically broad processes, diversity indicators yielded more significant increases in explanatory power than abundance data. Our results also demonstrate that for phylogenetically narrow processes, qPCR data on functional genes yielded higher explanatory power than for broad processes. Further, we found that all types of data on microbial community composition explained more variation in obligate processes compared to facultative processes. Overall, our results suggest that trait distributions both within communities and within individuals affect the relative importance of microbial community composition for explaining ecosystem function.
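
    The comparison the authors describe (does community data improve a model built on environmental data alone?) can be illustrated with a toy regression on synthetic data; the variable names and effect sizes are invented for the sketch, not drawn from the synthesized studies.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept term."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 200
env = rng.standard_normal(n)          # e.g. a soil moisture gradient
diversity = rng.standard_normal(n)    # e.g. a community diversity index
# Synthetic process rate driven by both environment and community
rate = 2.0 * env + 1.0 * diversity + 0.5 * rng.standard_normal(n)

r2_env_only = r_squared(env[:, None], rate)
r2_with_community = r_squared(np.column_stack([env, diversity]), rate)
```

    The synthesis asks, study by study, whether the gain from the second model over the first is statistically significant; in their collation it was for only about a quarter of the ecosystem processes examined.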

  20. Gene-Environment Interplay in Twin Models

    PubMed Central

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718

  1. Near-field environment/processes working group summary

    SciTech Connect

    Murphy, W.M.

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas on July 22-25, 1991. The working group concentrated on the subject of the near-field environment to geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. This group also discussed the application of modelling of performance-related processes.

  2. Radiology interpretation process modeling.

    PubMed

    Noumeir, Rita

    2006-04-01

    Information and communication technology in healthcare promises optimized patient care while ensuring efficiency and cost-effectiveness. However, the promised results are not yet achieved; the healthcare process requires analysis and radical redesign to achieve improvements in care quality and productivity. Healthcare process reengineering is thus necessary and involves modeling its workflow. Even though the healthcare process is very large and not very well modeled yet, its sub-processes can be modeled individually, providing fundamental pieces of the whole model. In this paper, we are interested in modeling the radiology interpretation process that results in generating a diagnostic radiology report. This radiology report is an important clinical element of the patient healthcare record and assists in healthcare decisions. We present the radiology interpretation process by identifying its boundaries and by positioning it on the large healthcare process map. Moreover, we discuss an information data model and identify roles, tasks and several information flows. Furthermore, we describe standard frameworks to enable radiology interpretation workflow implementations between heterogeneous systems.

  3. Processing Conditions, Rice Properties, Health and Environment

    PubMed Central

    Roy, Poritosh; Orikasa, Takahiro; Okadome, Hiroshi; Nakamura, Nobutaka; Shiina, Takeo

    2011-01-01

    Rice is the staple food for nearly two-thirds of the world’s population. The food components and environmental load of rice depend on the rice form, which results from different processing conditions. Brown rice (BR), germinated brown rice (GBR) and partially-milled rice (PMR) contain more health-beneficial food components than well-milled rice (WMR). Although the arsenic concentration in cooked rice depends on the cooking method, parboiled rice (PBR) seems to be relatively prone to arsenic contamination compared with untreated rice, if contaminated water is used for parboiling and cooking. A change in consumption patterns from PBR to untreated (non-parboiled) rice, and from WMR to PMR or BR, may conserve about 43–54 million tons of rice and reduce the risk from arsenic contamination in arsenic-prone areas. This study also reveals that a change in rice consumption patterns would not only supply more food components but also reduce environmental loads. A switch in production and consumption patterns would improve food security where food grains are scarce, provide more health-beneficial food components, possibly prevent some diseases, and ease the burden on the Earth. However, motivation and awareness of the environment and health, and even a nominal incentive, may be required for such a switch, which may help in building a sustainable society. PMID:21776212

  4. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Reconstruction is therefore often supported by general rules for the perpendicularity and parallelism which are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules such as symmetry and repetition of e.g. room sizes and corridor widths. In the context of the reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which originally were developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
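
    The core of a Lindenmayer system is parallel string rewriting, which fits in a few lines. Below, the classic two-symbol algae grammar shows the mechanics, and a hypothetical corridor/room rule (our own invention, not the paper's grammar) hints at how repetition in indoor layouts can be expressed.

```python
def lsystem(axiom, rules, iterations):
    """Parallel string rewriting: every symbol is replaced on each pass;
    symbols without a rule are copied unchanged (terminals)."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's classic algae grammar: string lengths follow Fibonacci.
algae = lsystem("A", {"A": "AB", "B": "A"}, 5)

# A hypothetical layout grammar: a corridor C keeps growing while
# spawning terminal rooms R on both sides.
layout = lsystem("C", {"C": "RCR"}, 3)
```

    A 2D indoor model additionally needs a geometric interpretation of the symbols (room widths, corridor lengths, turns), analogous to the turtle-graphics interpretation used for plants.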

  5. Design of Training Systems Utility Assessment: The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment.

    ERIC Educational Resources Information Center

    Duffy, Larry R.

    The report summarizes the results of a field test conducted for the purpose of determining the utility to Naval training of the Systems Capabilities/Requirements and Resources (SCRR) and the Training Process Flow (TPF) computer-based mathematical models. Basic descriptions of the SCRR and the TPF and their development are given. Training…

  6. Eolian Modeling System: Predicting Windblown Dust Hazards in Battlefield Environments

    DTIC Science & Technology

    2011-05-03

    Final report for "Eolian Modeling System (EMS): Predicting Windblown Sand and Dust Hazards in Battlefield Environments." The objectives of the research were to develop numerical models for predicting windblown sand and dust hazards in battlefield environments and to understand the implications of eolian transport for environmental processes such as soil and desert pavement formation.

  7. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
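
    The idea of an environment that exercises a component can be sketched as a depth-bounded driver that enumerates every sequence of environment actions and checks a safety invariant after each step. This is a toy illustration of the concept only, not BEG's actual mechanism (BEG generates Java environments from behavior specifications and implementation abstractions).

```python
from itertools import product

class BoundedBuffer:
    """Hypothetical component under analysis: holds at most `cap` items."""
    def __init__(self, cap=2):
        self.items, self.cap = [], cap

    def put(self, x):
        if len(self.items) < self.cap:   # silently drop when full
            self.items.append(x)

    def get(self):
        return self.items.pop() if self.items else None

def explore(depth=4):
    """Depth-bounded environment: drive the component with every possible
    sequence of environment actions, checking a safety invariant."""
    violations = []
    for seq in product(("put", "get"), repeat=depth):
        buf = BoundedBuffer()
        for action in seq:
            buf.put(1) if action == "put" else buf.get()
            if len(buf.items) > buf.cap:     # invariant: never overfull
                violations.append(seq)
                break
    return violations

violations = explore()
```

    A real environment generator works at the level of program abstraction rather than brute-force enumeration, but the role is the same: close an open system so that a model checker can explore its behaviors.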

  8. Microwave sintering process model.

    PubMed

    Peng, Hu; Tinga, W R; Sundararaj, U; Eadie, R L

    2003-01-01

    In order to simulate and optimize the microwave sintering of silicon nitride and tungsten carbide/cobalt toolbits, a microwave sintering process model has been built. A cylindrical sintering furnace was used containing a heat-insulating layer, a susceptor layer, and an alumina tube containing the green toolbit parts between parallel, electrically conductive graphite plates. Dielectric and absorption properties of the silicon nitride green parts, the tungsten carbide/cobalt green parts, and an oxidizable susceptor material were measured using perturbation and waveguide transmission methods. Microwave absorption data were measured over a temperature range from 20 degrees C to 800 degrees C. These data were then used in the microwave process model, which assumed plane-wave propagation along the radial direction and included the microwave reflection at each interface between the materials and the microwave absorption in the bulk materials. Heat transfer between the components inside the cylindrical sintering furnace was also included in the model. The simulated heating process data for both silicon nitride and tungsten carbide/cobalt samples closely follow the experimental data. By varying the physical parameters of the sintering furnace model, such as the thickness of the susceptor layer, the thickness of the alumina tube wall, the sample load volume and the graphite plate mass, the model predicts their effects, which is helpful in optimizing those parameters in the industrial sintering process.
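
    The radial plane-wave part of such a model can be sketched as a single-pass power balance: at each interface a fraction of the power is reflected, and in each bulk layer power decays exponentially with the absorption coefficient. The layer values below are invented placeholders, and multiple internal reflections are deliberately ignored in this sketch.

```python
import math

def single_pass_power(layers, p_in=1.0):
    """Track forward microwave power through a layer stack.
    Each layer is (reflectance, alpha, thickness): a fraction is reflected
    at the entry interface, then power decays as exp(-alpha * thickness)
    in the bulk. Returns (transmitted_power, absorbed_per_layer)."""
    p = p_in
    absorbed = []
    for reflectance, alpha, thickness in layers:
        p *= (1.0 - reflectance)              # interface reflection loss
        p_after = p * math.exp(-alpha * thickness)
        absorbed.append(p - p_after)          # dissipated as heat here
        p = p_after
    return p, absorbed

# Placeholder stack: insulation, susceptor, alumina tube wall, green part
stack = [(0.05, 0.1, 0.02),    # insulation: nearly transparent
         (0.10, 30.0, 0.005),  # susceptor: strongly absorbing
         (0.08, 0.5, 0.01),    # alumina wall
         (0.15, 8.0, 0.02)]    # green part
p_out, absorbed = single_pass_power(stack)
```

    The per-layer absorbed power is what a heat-transfer model would then take as its volumetric source terms.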

  9. Model for a Healthy Work Environment.

    PubMed

    Blevins, Jamie

    2016-01-01

    The Healthy Work Environment (HWE) Model, considered a model of standards of professional behaviors, was created to help foster an environment that is happy, healthy, realistic, and feasible. The model focuses on areas of PEOPLE and PRACTICE, where each letter of these words identifies core, professional qualities and behaviors to foster an environment amenable and conducive to accountability for one's behavior and action. Each of these characteristics is supported from a Christian, biblical perspective. The HWE Model provides a mental and physical checklist of what is important in creating and sustaining a healthy work environment in education and practice.

  10. Foam process models.

    SciTech Connect

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A.; Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production-level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion, as it creates the motion of the foam. This continuum-level model uses a homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  11. Mesoscale modeling of the severe thunderstorm environment

    NASA Technical Reports Server (NTRS)

    Koch, Steven E.

    1988-01-01

    The abilities of limited-area mesoscale models to provide accurate predictions of the environment of midlatitude severe thunderstorms and the possible feedback effects of the storms upon their environment are reviewed. Mesoalpha-scale models, mesobeta-scale models, and terrain-induced mesoscale systems are discussed. The importance of the initial state and of model numerics and physics is examined. It is found that mesoscale models must be run locally if they are to be used for short-range forecasting.

  12. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running user interfaces, and data exchange and synchronization tools. PMID:16779049

  13. Students' mental models of the environment

    NASA Astrophysics Data System (ADS)

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-02-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively analyzed in order to identify students' mental models of the environment. The second phase of analysis involved the statistical testing of the identified mental models. From this analysis four mental models emerged: Model 1, the environment as a place where animals/plants live - a natural place; Model 2, the environment as a place that supports life; Model 3, the environment as a place impacted or modified by human activity; and Model 4, the environment as a place where animals, plants, and humans live. The dominant mental model was Mental Model 1. Yet, a greater frequency of urban students than suburban and rural students held Mental Model 3. The implications for environmental science education are explored.

  14. Radiolysis Process Model

    SciTech Connect

    Buck, Edgar C.; Wittman, Richard S.; Skomurski, Frances N.; Cantrell, Kirk J.; McNamara, Bruce K.; Soderquist, Chuck Z.

    2012-07-17

    Assessing the performance of spent (used) nuclear fuel in a geological repository requires quantification of time-dependent phenomena that may influence its behavior on a time scale of up to millions of years. A high-level waste repository environment will be a dynamic redox system because of the time-dependent generation of radiolytic oxidants and reductants and the corrosion of Fe-bearing canister materials. One major difference between used fuel and natural analogues, including unirradiated UO2, is the intense radiolytic field. The radiation emitted by used fuel can produce radiolysis products in the presence of water vapor or a thin film of water (including OH• and H• radicals, O2-, eaq, H2O2, H2, and O2) that may increase the waste form degradation rate and change radionuclide behavior. H2O2 is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment. The most sensitive parameters have been identified with respect to predictions of a radiolysis model under typical conditions. As compared with the full model of about 100 reactions, it was found that only 30-40 of the reactions are required to determine [H2O2] to one part in 10^-5 and to preserve most of the predictions for major species. This allows a systematic approach for model simplification and offers guidance in designing experiments for validation.
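The kind of balance the reduced reaction set must preserve can be illustrated with a deliberately tiny toy version of a radiolysis network: constant radiolytic production of H2O2 plus first-order decomposition, integrated with forward Euler. The rates and the two-reaction network are assumptions for illustration only, not the ~100-reaction model of the report:

```python
def integrate_h2o2(g_prod=1e-9, k_dec=1e-4, dt=10.0, t_end=2e5):
    """Forward-Euler integration of a two-reaction toy radiolysis balance:
    zeroth-order radiolytic production of H2O2 at rate g_prod [mol/L/s]
    and first-order decomposition with rate constant k_dec [1/s].
    Returns the H2O2 concentration [mol/L] at time t_end. Illustrative only."""
    c, t = 0.0, 0.0
    while t < t_end:
        c += dt * (g_prod - k_dec * c)  # d[H2O2]/dt = production - decomposition
        t += dt
    return c
```

The analytic steady state is [H2O2]_ss = g_prod / k_dec = 1e-5 mol/L here; a reduced reaction set is acceptable precisely when it reproduces such quasi-steady concentrations of the major species to the stated tolerance.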

  15. Students' Mental Models of the Environment

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-01-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively…

  17. Optimal mutation rates in dynamic environments: The Eigen model

    NASA Astrophysics Data System (ADS)

    Ancliff, Mark; Park, Jeong-Man

    2011-03-01

    We consider the Eigen quasispecies model with a dynamic environment. For an environment with sharp-peak fitness in which the most-fit sequence moves by k spin-flips each period T, we find an asymptotic stationary state in which the quasispecies population changes regularly according to the periodic environmental change. From this stationary state we estimate the maximum and the minimum mutation rates for a quasispecies to survive under the changing environment and calculate the optimum mutation rate that maximizes the population growth. Interestingly, we find that the optimum mutation rate in the Eigen model is lower than that in the Crow-Kimura model, and at their optimum mutation rates the corresponding mean fitness in the Eigen model is lower than that in the Crow-Kimura model, suggesting that a mutation process which occurs in parallel to the replication process, as in the Crow-Kimura model, gives an adaptive advantage under a changing environment.
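The setup in this abstract can be made concrete with a small numerical sketch of the Eigen model under a moving sharp peak: binary sequences of length L, coupled replication-mutation each generation, and a master sequence that jumps by k flips every T steps. The sequence length, fitness values, and schedule below are illustrative assumptions, not the paper's parameters:

```python
import random

L = 6           # sequence length (small, for illustration)
A_PEAK = 10.0   # sharp-peak fitness of the master sequence (assumption)
A_BACK = 1.0    # background fitness

def step(x, mu, peak):
    """One discrete generation of the Eigen model: replication weighted by
    fitness, mutation with per-site error rate mu (Hamming-distance kernel),
    then normalization. Returns the new population vector and the mean
    fitness of the pre-mutation population."""
    n = len(x)
    qd = [(mu ** d) * ((1 - mu) ** (L - d)) for d in range(L + 1)]
    y = [0.0] * n
    for j in range(n):
        w = (A_PEAK if j == peak else A_BACK) * x[j]
        for i in range(n):
            y[i] += w * qd[bin(i ^ j).count("1")]
    s = sum(y)  # equals the mean fitness, since mutation conserves population
    return [v / s for v in y], s

def mean_fitness(mu, k=1, T=10, periods=20):
    """Time-averaged mean fitness when the peak moves by k flips every T steps."""
    x = [1.0 / (2 ** L)] * (2 ** L)
    peak, rng, total, count = 0, random.Random(0), 0.0, 0
    for t in range(periods * T):
        if t and t % T == 0:
            for _ in range(k):          # flip k random sites of the master
                peak ^= 1 << rng.randrange(L)
        x, f = step(x, mu, peak)
        total += f
        count += 1
    return total / count
```

Scanning `mean_fitness` over mu reproduces the qualitative picture in the abstract: very small mu cannot track the moving peak, very large mu delocalizes the quasispecies, and an intermediate optimum maximizes growth.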

  18. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  19. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  20. Environment, Motivation, and the Composing Process.

    ERIC Educational Resources Information Center

    Smith, Ron

    Recognizing the differences between reading and writing is as important as recognizing their similarities for improving current methods of teaching composition. Environment and motivation are two areas in which these differences are most noticeable. Since motivation is a preexisting quality that can only be fostered and not implanted, environment…

  1. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Chu, S. Reynold; Allen, Chris

    2009-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.

  2. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  3. The Integrated Model Development Environment

    DTIC Science & Technology

    1994-02-01

    The Integrated Model Development Environment (IMDE) was designed to support the Productivity Improvements in Simulation Modeling (PRISM) project. The objective of PRISM is to enhance the Air...

  4. Optimal mutation rates in dynamic environments: The Eigen model

    NASA Astrophysics Data System (ADS)

    Ancliff, Mark; Park, Jeong-Man

    2010-08-01

    We consider the Eigen quasispecies model with a dynamic environment. For an environment with sharp-peak fitness in which the most-fit sequence moves by k spin-flips each period T, we find an asymptotic stationary state in which the quasispecies population changes regularly according to the periodic environmental change. From this stationary state we estimate the maximum and the minimum mutation rates for a quasispecies to survive under the changing environment and calculate the optimum mutation rate that maximizes the population growth. Interestingly, we find that the optimum mutation rate in the Eigen model is lower than that in the Crow-Kimura model, and at their optimum mutation rates the corresponding mean fitness in the Eigen model is lower than that in the Crow-Kimura model, suggesting that a mutation process which occurs in parallel to the replication process, as in the Crow-Kimura model, gives an adaptive advantage under a changing environment.

  5. Thermal modeling environment for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos

    2010-07-01

    In a previous study we had presented a summary of the TMT Aero-Thermal modeling effort to support thermal seeing and dynamic loading estimates. In this paper a summary of the current status of Computational Fluid Dynamics (CFD) simulations for TMT is presented, with the focus shifted in particular towards the synergy between CFD and the TMT Finite Element Analysis (FEA) structural and optical models, so that the thermal and consequent optical deformations of the telescope can be calculated. To minimize thermal deformations and mirror seeing the TMT enclosure will be air conditioned during day-time to the expected night-time ambient temperature. Transient simulations with closed shutter were performed to investigate the optimum cooling configuration and power requirements for the standard telescope parking position. A complete model of the observatory on Mauna Kea was used to calculate night-time air temperature inside the enclosure (along with velocity and pressure) for a matrix of given telescope orientations and enclosure configurations. Generated records of temperature variations inside the air volume of the optical paths are also fed into the TMT thermal seeing model. The temperature and heat transfer coefficient outputs from both models are used as input surface boundary conditions in the telescope structure and optics FEA models. The results are parameterized so that sequential records several days long can be generated and used by the FEA model to estimate the observing spatial and temporal temperature range of the structure and optics.

  6. Combustion Processes in the Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Huggett, Clayton

    1969-01-01

    The aerospace environment introduces new and enhanced fire hazards because the special atmosphere employed may increase the frequency and intensity of fires, because the confinement associated with aerospace systems adversely affects the dynamics of fire development and control, and because the hostile external environments limit fire control and rescue operations. Oxygen enriched atmospheres contribute to the fire hazard in aerospace systems by extending the list of combustible fuels, increasing the probability of ignition, and increasing the rates of fire spread and energy release. A system for classifying atmospheres according to the degree of fire hazard, based on the heat capacity of the atmosphere per mole of oxygen, is suggested. A brief exploration of the dynamics of chamber fires shows that such fires will exhibit an exponential growth rate and may grow to dangerous size in a very short time. Relatively small quantities of fuel and oxygen can produce a catastrophic fire in a closed chamber.

  7. Improvement of pre- and post-processing environments of the dynamic two-dimensional reservoir model CE-QUAL-W2 based on GIS.

    PubMed

    Ha, S R; Bae, G J; Park, D H; Cho, J H

    2003-01-01

    An Environmental Information System (EIS) coupled with a Geographic Information System (GIS) and water quality models is developed to improve the pre- and post-data processing functions of CE-QUAL-W2. Since the accuracy of the geometric data for a diverse water body has a great effect on water quality variables such as velocity, kinetic reactions, and the horizontal and vertical momentum, preparing the bathymetry information has been considered a difficult issue for modellers who intend to use the model. For identifying Cross Section and Profile Information (CSPI), which precisely captures the hydraulic features and geographical configuration of a waterway, an automated CSPI extraction program has been developed using the Avenue language of the PC Arc/View package. The program consists of three major steps: (1) getting the digital depth map of a waterway using GIS techniques; (2) creating a CSPI data set of segments in each branch using the program for CE-QUAL-W2 bathymetry input; (3) selecting the optimal set of bathymetry input by which the calculated water volume meets the observed volume of the water body. Through these approaches, it is clear that the model simulation results in terms of water quality as well as reservoir hydraulics rely upon the accuracy of the bathymetry information.
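Step (3) of the workflow above, checking a candidate bathymetry against the observed water volume, can be sketched in a few lines. The segment/layer representation here (each segment as a length plus a list of layer widths, uniform layer height) is a simplified stand-in for the CE-QUAL-W2 bathymetry input, and the function names are hypothetical:

```python
def reservoir_volume(segments, layer_height):
    """Water volume [m^3] implied by a simplified CE-QUAL-W2-style bathymetry.
    Each segment is (length [m], [layer widths [m], top to bottom]); every
    cell contributes length * width * layer_height."""
    return sum(length * sum(widths) * layer_height
               for length, widths in segments)

def pick_bathymetry(candidates, observed_volume, layer_height):
    """Choose the candidate bathymetry whose computed volume best matches
    the observed volume of the water body."""
    return min(candidates,
               key=lambda segs: abs(reservoir_volume(segs, layer_height)
                                    - observed_volume))
```

In practice the candidate sets would come from the GIS-derived CSPI extraction, and the volume comparison closes the loop between the digital depth map and the model's segment geometry.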

  8. Benchmarking Ionizing Space Environment Models

    NASA Astrophysics Data System (ADS)

    Bourdarie, S.; Inguimbert, C.; Standarovski, D.; Vaillé, J.-R.; Sicard-Piet, A.; Falguere, D.; Ecoffet, R.; Poivey, C.; Lorfèvre, E.

    2017-08-01

    In-flight feedback data, such as displacement damage doses, ionizing doses, and cumulative single event upsets (SEUs), are collected on board various space vehicles and compared to predictions performed with: 1) proton measurements from spectrometer data on board the same spacecraft, if any, and 2) proton spectra predicted by the legacy AP8min model and by the AP9 and Onera Proton Altitude Low models. When an accurate representation of the 3-D spacecraft shielding as well as appropriate ground calibrations are considered in the calculations, such comparisons provide powerful metrics to investigate engineering model accuracy. To describe >30 MeV trapped proton fluxes, the AP8min model is found to provide predictions closer to observations than AP9 V1.30.001 (mean and perturbed mean).

  9. Probability Model for Designing Environment Condition

    NASA Astrophysics Data System (ADS)

    Lubis, Iswar; Nasution Mahyuddin, K. M.

    2017-01-01

    Transport equipment has the potential to contribute to environmental pollution, and this pollution affects the welfare of the environment. Thus, the capacity of the environment needs to be raised to block the pollution. A design based on probability models of the determining factors in the environment should therefore be a concern. This paper aims to reveal scenarios as a starting point for expressing the clues and, based on surveys, to provide the ability to choose a design.

  10. Engineered Barrier System: Physical and Chemical Environment Model

    SciTech Connect

    D. M. Jolley; R. Jarek; P. Mariner

    2004-02-09

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  11. Mountains and man. A study of process and environment

    SciTech Connect

    Price, L.W.

    1986-01-01

    This book explores the processes and features of mountain environments: glaciers, snow and avalanches, landforms, weather and climate, vegetation, soils, and wildlife. The effects of latitudinal position on these processes and features are analyzed.

  12. Virtual Research Environments for Natural Hazard Modelling

    NASA Astrophysics Data System (ADS)

    Napier, Hazel; Aldridge, Tim

    2017-04-01

    The Natural Hazards Partnership (NHP) is a group of 17 collaborating public sector organisations providing a mechanism for co-ordinated advice to government and agencies responsible for civil contingency and emergency response during natural hazard events. The NHP has set up a Hazard Impact Model (HIM) group tasked with modelling the impact of a range of UK hazards with the aim of delivery of consistent hazard and impact information. The HIM group consists of 7 partners initially concentrating on modelling the socio-economic impact of 3 key hazards - surface water flooding, land instability and high winds. HIM group partners share scientific expertise and data within their specific areas of interest including hydrological modelling, meteorology, engineering geology, GIS, data delivery, and modelling of socio-economic impacts. Activity within the NHP relies on effective collaboration between partners distributed across the UK. The NHP are acting as a use case study for a new Virtual Research Environment (VRE) being developed by the EVER-EST project (European Virtual Environment for Research - Earth Science Themes: a solution). The VRE is allowing the NHP to explore novel ways of cooperation including improved capabilities for e-collaboration, e-research, automation of processes and e-learning. Collaboration tools are complemented by the adoption of Research Objects, semantically rich aggregations of resources enabling the creation of uniquely identified digital artefacts resulting in reusable science and research. Application of the Research Object concept to HIM development facilitates collaboration, by encapsulating scientific knowledge in a shareable format that can be easily shared and used by partners working on the same model but within their areas of expertise. This paper describes the application of the VRE to the NHP use case study. It outlines the challenges associated with distributed partnership working and how they are being addressed in the VRE. 

  13. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  14. Sanitation in the Shell Egg Processing Environment

    USDA-ARS?s Scientific Manuscript database

    In the past, most of the regulations regarding egg processing are concerned with quality rather than safety. Hazard Analysis and Critical Control Point (HACCP) will be required by retailers or by the federal government. GMPs (Good Manufacturing Practices) and SSOPs (Sanitation Standard Operating P...

  15. Advanced deformation process modeling

    SciTech Connect

    Kocks, U.F.; Embury, J.D.; Beaudoin, A.J.; Dawson, P.R.; MacEwen, S.R.; Mecking, H.J.

    1997-08-01

    Progress was made in achieving a comprehensive and coherent description of material behavior in deformation processing. The materials included were metals, alloys, intermetallic compounds, arbitrary lattice structure, and metal matrix composites. Aspects of behavior modeled included kinetics of flow and strain hardening, as well as recrystallization and the various anisotropies of strength and compliance. Highlights include a new prediction of the limiting strength of materials at high temperature, a new understanding of the generation of new grain boundaries during forming operations, and a quantitatively verified computer simulation of texture development and the resulting behavioral anisotropies.

  16. Preface. Forest ecohydrological processes in a changing environment.

    Treesearch

    Xiaohua Wei; Ge Sun; James Vose; Kyoichi Otsuki; Zhiqiang Zhang; Keith Smetterm

    2011-01-01

    The papers in this issue are a selection of the presentations made at the second International Conference on Forests and Water in a Changing Environment. This special issue ‘Forest Ecohydrological Processes in a Changing Environment’ covers the topics regarding the effects of forest, land use and climate changes on ecohydrological processes across forest stand,...

  17. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

    This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
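The three components enumerated above can be summed in a one-function sketch of an absorbed environmental heat flux budget. This is not STEM itself: the parameter values, the flat-plate view-factor treatment, and the function name are illustrative assumptions for a generic Earth-facing spacecraft surface:

```python
def environmental_flux(solar=1367.0, albedo=0.3, olr=240.0, view_factor=0.85,
                       absorptivity=0.25, emissivity=0.85, in_sun=True):
    """Total environmental heat flux [W/m^2] absorbed by a spacecraft surface,
    summing the three components: (1) direct solar, (2) Earth-reflected
    shortwave (albedo), and (3) Earth-emitted longwave (OLR). Shortwave terms
    scale with the surface's solar absorptivity, the longwave term with its
    infrared emissivity; both Earth terms scale with a single view factor.
    All numbers here are illustrative, not STEM design values."""
    q_solar = absorptivity * solar if in_sun else 0.0
    q_albedo = absorptivity * albedo * solar * view_factor if in_sun else 0.0
    q_olr = emissivity * olr * view_factor
    return q_solar + q_albedo + q_olr
```

Note that in eclipse only the OLR term survives, which is why longwave emission sets the cold-case floor while solar plus albedo dominate the hot case.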

  18. Rock fracture processes in chemically reactive environments

    NASA Astrophysics Data System (ADS)

    Eichhubl, P.

    2015-12-01

    Rock fracture is traditionally viewed as a brittle process involving damage nucleation and growth in a zone ahead of a larger fracture, resulting in fracture propagation once a threshold loading stress is exceeded. It is now increasingly recognized that coupled chemical-mechanical processes influence fracture growth in a wide range of subsurface conditions that include igneous, metamorphic, and geothermal systems, and diagenetically reactive sedimentary systems with possible applications to hydrocarbon extraction and CO2 sequestration. Fracture processes aided or driven by chemical change can affect the onset of fracture, fracture shape and branching characteristics, and fracture network geometry, thus influencing the mechanical strength and flow properties of rock systems. We are investigating two fundamental modes of chemical-mechanical interaction associated with fracture growth: 1. Fracture propagation may be aided by chemical dissolution or hydration reactions at the fracture tip, allowing fracture propagation under subcritical stress loading conditions. We are evaluating effects of environmental conditions on critical (fracture toughness KIc) and subcritical (subcritical index) fracture properties using double torsion fracture mechanics tests on shale and sandstone. Depending on rock composition, the presence of reactive aqueous fluids can increase or decrease KIc and/or the subcritical index. 2. Fracture may be concurrent with distributed dissolution-precipitation reactions in the host rock beyond the immediate vicinity of the fracture tip. Reconstructing the fracture opening history recorded in crack-seal fracture cement of deeply buried sandstone, we find that fracture length growth and fracture opening can be decoupled, with a phase of initial length growth followed by a phase of dominant fracture opening. This suggests that mechanical crack-tip failure processes, possibly aided by chemical crack-tip weakening, and distributed

  19. Process Architecture in a Multimodel Environment

    DTIC Science & Technology

    2008-03-01


  20. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  1. Float-zone processing in a weightless environment

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.

    1976-01-01

    The results are reported of investigations to: (1) test the validity of analyses that set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near-weightless environment; (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow visualization techniques applied to a dimensionally scaled model of the Si melt; (3) revise the estimates of the economic impact of space-produced Si crystal grown by the float-zone method on the U.S. electronics industry; and (4) devise a rational plan for future work related to crystal growth phenomena wherein the low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.

  2. Modular process modeling for OPC

    NASA Astrophysics Data System (ADS)

    Keck, M. C.; Bodendorf, C.; Schmidtling, T.; Schlief, R.; Wildfeuer, R.; Zumpe, S.; Niehoff, M.

    2007-03-01

    Modular OPC modeling, describing the mask, optics, resist, and etch processes separately, is an approach to keep the effort for OPC manageable. By exchanging single modules of a modular OPC model, a fast response to process changes during process development is possible. At the same time, effort can be reduced, since only single modular process steps have to be re-characterized as input for OPC modeling as the process is adjusted and optimized. Commercially available OPC tools for full-chip processing typically make use of semi-empirical models. The goal of our work is to investigate to what extent these OPC tools can be applied for modeling of single process steps as separate modules. For an advanced gate-level process, we analyze the modeling accuracy over different process conditions (focus and dose) when combining models for each process step (optics, resist, and etch) from different single processes into a model describing the total process.

  3. Understanding the Impact of Virtual World Environments on Social and Cognitive Processes in Learning

    ERIC Educational Resources Information Center

    Zhang, Chi

    2009-01-01

    Researchers in information systems and technology-mediated learning have begun to examine how virtual world environments can be used in learning and how they enable learning processes and enhance learning outcomes. This research examined learning processes in a virtual world learning environment (VWLE). A research model of VWLE effects on learning…

  4. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning, or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or, in the case of telerobots, as an interface between the human operator and the distant robot. A robot operating in a known, restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with changes in the environment and to allow the exploration of entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine and update, or generate, a 3-D volumetric model of an environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with three possible attributes: Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given; the quality of the results shows great promise for dealing with noisy input data. Performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
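
    The voxel scheme described above (cubic cells labeled Void, Full, or Unknown, refined from dense range data) can be sketched in a few lines. The sparse-grid representation, the update rule, and all names below are illustrative assumptions for this sketch, not the authors' algorithm.

```python
from enum import Enum

class Cell(Enum):
    UNKNOWN = 0   # never observed
    VOID = 1      # a range ray passed through: free space
    FULL = 2      # a range ray terminated here: occupied

class VoxelGrid:
    """Minimal sparse 3-D occupancy grid updated from range readings."""
    def __init__(self):
        self.cells = {}   # (x, y, z) -> Cell; absent voxels are UNKNOWN

    def get(self, v):
        return self.cells.get(v, Cell.UNKNOWN)

    def integrate_ray(self, voxels_along_ray, hit):
        """Mark traversed voxels VOID and, if the ray struck a surface,
        mark the terminal voxel FULL."""
        *free, last = voxels_along_ray
        for v in free:
            if self.get(v) is Cell.UNKNOWN:
                self.cells[v] = Cell.VOID
        if hit:
            self.cells[last] = Cell.FULL
        elif self.get(last) is Cell.UNKNOWN:
            self.cells[last] = Cell.VOID

grid = VoxelGrid()
# A simulated range reading along the x-axis that hits a surface at (2, 0, 0).
grid.integrate_ray([(0, 0, 0), (1, 0, 0), (2, 0, 0)], hit=True)
```

    A real system would fuse repeated, noisy observations probabilistically rather than overwrite labels, but the three-state voxel bookkeeping is the same.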

  5. Processing Motion Signals in Complex Environments

    NASA Technical Reports Server (NTRS)

    Verghese, Preeti

    2000-01-01

    Motion information is critical for human locomotion and scene segmentation. Currently we have excellent neurophysiological models that are able to predict human detection and discrimination of local signals. Local motion signals are insufficient by themselves to guide human locomotion and to provide information about depth, object boundaries and surface structure. My research is aimed at understanding the mechanisms underlying the combination of motion signals across space and time. A target moving on an extended trajectory amidst noise dots in Brownian motion is much more detectable than the sum of signals generated by independent motion energy units responding to the trajectory segments. This result suggests that facilitation occurs between motion units tuned to similar directions, lying along the trajectory path. We investigated whether the interaction between local motion units along the motion direction is mediated by contrast. One possibility is that contrast-driven signals from motion units early in the trajectory sequence are added to signals in subsequent units. If this were the case, then units later in the sequence would have a larger signal than those earlier in the sequence. To test this possibility, we compared contrast discrimination thresholds for the first and third patches of a triplet of sequentially presented Gabor patches, aligned along the motion direction. According to this simple additive model, contrast increment thresholds for the third patch should be higher than thresholds for the first patch. The lack of a measurable effect on contrast thresholds for these various manipulations suggests that the pooling of signals along a trajectory is not mediated by contrast-driven signals.
Instead, these results are consistent with models that propose that the facilitation of trajectory signals is achieved by a second-level network that chooses the strongest local motion signals and combines them if they occur in a spatio-temporal sequence consistent

  7. MATLAB/Simulink analytic radar modeling environment

    NASA Astrophysics Data System (ADS)

    Esken, Bruce L.; Clayton, Brian L.

    2001-09-01

    Analytic radar models are simulations based on abstract representations of the radar, the RF environment through which radar signals propagate, and the reflections produced by targets, clutter and multipath. These models have traditionally been developed in FORTRAN and have evolved over the last 20 years into efficient and well-accepted codes. However, current models are limited in two primary areas. First, by the nature of algorithm-based analytical models, they can be difficult for non-programmers to understand and equally difficult to modify or extend. Second, there is strong interest in re-using these models to support higher-level weapon-system and mission-level simulations. To address these issues, a model development approach has been demonstrated which utilizes the MATLAB/Simulink graphical development environment. Because the MATLAB/Simulink environment graphically represents model algorithms, thus providing visibility into the model, algorithms can be easily analyzed and modified by engineers and analysts with limited software skills. In addition, software tools have been created that provide for the automatic code generation of C++ objects. These objects are created with well-defined interfaces enabling them to be used by modeling architectures external to the MATLAB/Simulink environment. The approach utilized is generic and can be extended to other engineering fields.

  8. Improving Ground Penetrating Radar Imaging in High Loss Environments by Coordinated System Development, Data Processing, Numerical Modeling, and Visualization Methods with Applications to Site Characterization

    SciTech Connect

    Wright, David L.

    2003-06-01

    The Department of Energy has identified the location and characterization of subsurface contaminants and the characterization of the subsurface as a priority need. Many DOE facilities are in need of subsurface imaging in the vadose and saturated zones. This includes (1) the detection and characterization of metal and concrete structures, (2) the characterization of waste pits (for both contents and integrity) and (3) mapping the complex geological/hydrological framework of the vadose and saturated zones. The DOE has identified ground penetrating radar (GPR) as a method that can non-invasively map transportation pathways and vadose zone heterogeneity. An advanced GPR system and advanced subsurface modeling, processing, imaging, and inversion techniques can be directly applied to several DOE science needs in more than one focus area and at many sites. Needs for enhanced subsurface imaging have been identified at Hanford, INEEL, SRS, ORNL, LLNL, SNL, LANL, and many other sites. In fact, needs for better subsurface imaging probably exist at all DOE sites. However, GPR performance is often inadequate due to increased attenuation and dispersion when soil conductivities are high.

  9. A Formal Environment Model for Multi-Agent Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Paulo Salem; de Melo, Ana C. V.

    Multi-agent systems are employed to model complex systems which can be decomposed into several interacting pieces called agents. In such systems, agents exist, evolve and interact within an environment. In this paper we present a model for the specification of such environments. This Environment Model for Multi-Agent Systems (EMMAS), as we call it, defines both structural and dynamic aspects of environments. Structurally, EMMAS connects agents by a social network, in which the link between agents is specified as the capability that one agent has to act upon another. Dynamically, EMMAS provides operations that can be composed together in order to create a number of different environmental situations and to respond appropriately to agents' actions. These features are founded on a mathematical model that we provide and that defines rigorously what constitutes an environment. Formality is achieved by employing the π-calculus process algebra in order to give the semantics of this model. This allows, in particular, a simple characterization of the evolution of the environment structure. Moreover, owing to this formal semantics, it is possible to perform formal analyses on environments thus described. For the sake of illustration, a concrete example of environment specification using EMMAS is also given.
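
    The structural idea above (a social network whose links record the capability of one agent to act upon another, plus operations that change the environment dynamically) can be caricatured in a few lines. The class and method names are invented for illustration and carry none of EMMAS's formal π-calculus semantics.

```python
class Environment:
    """Toy EMMAS-style environment: a social network whose edges record
    which actions one agent is capable of performing on another."""
    def __init__(self):
        self.capabilities = set()   # (actor, target, action) triples

    def allow(self, actor, target, action):
        """Structural operation: add a capability edge to the network."""
        self.capabilities.add((actor, target, action))

    def act(self, actor, target, action):
        """Dynamic operation: an action succeeds only along an existing edge."""
        if (actor, target, action) not in self.capabilities:
            raise PermissionError(f"{actor} cannot {action} {target}")
        return (actor, action, target)

env = Environment()
env.allow("a1", "a2", "greet")
result = env.act("a1", "a2", "greet")
```

    In EMMAS proper, such operations are composable and their semantics is given in the π-calculus, which is what makes formal analysis of the environment's evolution possible.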

  10. The Educational Process in the Emerging Information Society: Conditions for the Reversal of the Linear Model of Education and the Development of an Open Type Hybrid Learning Environment.

    ERIC Educational Resources Information Center

    Anastasiades, Panagiotes S.; Retalis, Simos

    The introduction of communications and information technologies in the area of education tends to create a totally different environment, which is marked by a change of the teacher's role and a transformation of the basic components that make up the meaning and content of the learning procedure as a whole. It could be said that, despite any…

  11. Liberty High School Transition Project: Model Process for Assimilating School, Community, Business, Government and Service Groups of the Least Restrictive Environment for Nondisabled and Disabled.

    ERIC Educational Resources Information Center

    Grimes, Michael K.

    The panel presentation traces the development of and describes the operation of a Brentwood (California) project to prepare approximately 75 severely disabled individuals, ages 12-22, to function in the least restrictive recreation/leisure, vocational, and general community environments. Transition Steering Committee developed such project…

  12. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  13. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine-sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in a machine-sensible form. The purpose of this report is to finally complete the documentation for AE-8 so that users can understand its construction and see comparisons of the model with the new data used, as well as with the AE-4 model.

  14. An Interactive Programming Environment For Integrated Signal-Symbol Processing

    NASA Astrophysics Data System (ADS)

    Upton, Richard A.; Lynch, Denis

    1987-06-01

    This paper describes an interactive programming environment and tools designed to facilitate the rapid implementation, testing and evaluation of algorithms and systems for image processing, image understanding, and 2- and 3-D graphics processing. The environment, termed Scope, is Lisp-based, resides on a Symbolics 36xx Lisp machine, and provides a tightly-coupled interface between the Symbolics Lisp machine and a Pixar 2D Image Computer. In particular, the environment provides an integrated set of utilities for program development and program maintenance based on the Symbolics Genera operating system. In addition, a wide range of near-real-time image and symbolic operations are provided, and a variety of image and symbolic representations are supported. The environment is specifically designed to facilitate crosstalk between numeric and symbolic data representations and processes. This paper discusses the major features of the environment and their use in developing and investigating selected image understanding capabilities.

  15. Use and perception of the environment: cultural and developmental processes

    Treesearch

    Martin M. Chemers; Irwin Altman

    1977-01-01

    This paper presents a "social systems" orientation for integrating the diverse aspects of environment, culture, and individual behavior. It suggests that a wide range of variables, including the physical environment, cultural and social processes, environmental perceptions and cognitions, behavior, and products of behavior, are connected in a complex,...

  16. Biofilms in the poultry production and processing environment

    USDA-ARS?s Scientific Manuscript database

    The chapter conveys the importance of biofilm study in the environment of the poultry production and processing industries. Implications for food safety and security are established for sites of occurrences and causes of biofilm formation in poultry environments. Regulations and testing methods th...

  17. Advanced modeling environment for developing and testing FES control systems.

    PubMed

    Davoodi, R; Brown, I E; Loeb, G E

    2003-01-01

    Realistic models of neuromusculoskeletal systems can provide a safe and convenient environment for the design and evaluation of controllers for functional electrical stimulation (FES) prior to clinical trials. We have developed a set of integrated musculoskeletal modeling tools to facilitate the model building process. Simulink models of musculoskeletal systems are created using two software packages developed in our laboratory, Musculoskeletal Modeling in Simulink (MMS) and Virtual Muscle, in addition to one software package available commercially, SIMM (Musculographics Inc., USA). MMS converts anatomically accurate musculoskeletal models generated by SIMM into Simulink blocks. It also removes run-time constraints on kinetic simulations in SIMM, and allows the development of complex musculoskeletal models without writing a line of code. Virtual Muscle builds realistic Simulink models of muscles responding to either natural recruitment or FES. Models of sensorimotor control systems can be developed using various Matlab (Mathworks Inc., USA) toolboxes and integrated easily with these musculoskeletal blocks in the graphical environment of Simulink.

  18. The national operational environment model (NOEM)

    NASA Astrophysics Data System (ADS)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

    The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying possible pressure (leverage) points, in support of the Commander, for resolving forecasted instabilities, and by ranking sensitivities in a list for each leverage point and response. The NOEM can be used to assist decision makers, analysts and researchers in understanding the inner workings of a region or nation state, the consequences of implementing specific policies, and the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of modules (e.g. economic, security and social well-being pieces such as critical infrastructure) completed, along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment, primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in the pursuit of eventual integration and balance of populace needs/demands within their respective operational environment and the capacity to meet those demands. 
In this paper we will provide an overview of the NOEM, the need for and a description of its main components

  19. The dynamic radiation environment assimilation model (DREAM)

    SciTech Connect

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L; Chen, Yue; Henderson, Michael G; Friedel, Reiner H

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
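
    The assimilation step at the heart of a Kalman-filter approach like DREAM's blends a model forecast with a noisy observation, weighted by their variances. A minimal scalar sketch follows; the dynamics, noise values, and observation stream are illustrative assumptions, not DREAM's radiation-belt physics.

```python
def kalman_step(x, P, z, F=1.0, Q=0.01, R=0.1):
    """One predict/update cycle of a scalar Kalman filter.

    x, P : prior state estimate and its variance
    z    : new observation (e.g. an electron-flux measurement)
    F    : state-transition coefficient (the model dynamics)
    Q, R : process- and observation-noise variances
    """
    # Predict: propagate the state with the model and inflate uncertainty.
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: blend forecast and observation via the Kalman gain.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

# Assimilate a short stream of noisy observations of a quantity near 1.0.
x, P = 0.0, 1.0
for z in [1.0, 1.1, 0.9, 1.05]:
    x, P = kalman_step(x, P, z)
```

    Each observation pulls the estimate toward the data and shrinks its variance, which is how assimilation lets a physics model track satellite measurements.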

  20. GREENSCOPE: Sustainable Process Modeling

    EPA Science Inventory

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  2. Securing Provenance of Distributed Processes in an Untrusted Environment

    NASA Astrophysics Data System (ADS)

    Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi

    Recently, there is much concern about the provenance of distributed processes, that is about the documentation of the origin and the processes to produce an object in a distributed system. The provenance has many applications in the forms of medical records, documentation of processes in the computer systems, recording the origin of data in the cloud, and also documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of the correct nodes, additions of fake nodes and edges, and unauthorized accesses to the sensitive nodes and edges. In this paper, we propose an integrity mechanism for provenance graph using the digital signature involving three parties: the process executors who are responsible in the nodes' creation, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS) that records the number of nodes stored by the provenance owner. We show that the mechanism can detect the integrity problem in the provenance graph, namely unauthorized and malicious “authorized” updates even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS only needs a very minimal storage (linear with the number of the provenance owners). To protect the confidentiality and for an efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect the provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure
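
    The integrity scheme described above (signed nodes plus a trusted counter that exposes silent deletions) can be illustrated as follows. An HMAC stands in for the executors' public-key signatures, and all names and the counter API are simplifying assumptions for the sketch, not the paper's protocol.

```python
import hashlib
import hmac
import json

KEY = b"executor-secret"   # stand-in for a process executor's signing key

def sign_node(node_id, payload, parents):
    """Bind a node to its content and its parent edges (HMAC as a
    stand-in for a real digital signature)."""
    body = json.dumps({"id": node_id, "payload": payload,
                       "parents": sorted(parents)}, sort_keys=True)
    return hmac.new(KEY, body.encode(), hashlib.sha256).hexdigest()

class ProvenanceStore:
    def __init__(self):
        self.nodes = {}      # node_id -> (payload, parents, tag)
        self.tcs_count = 0   # the TCS keeps just one integer per owner

    def record(self, node_id, payload, parents=()):
        tag = sign_node(node_id, payload, parents)
        self.nodes[node_id] = (payload, tuple(parents), tag)
        self.tcs_count += 1

    def verify(self):
        """Detect tampered nodes and silent deletions."""
        if len(self.nodes) != self.tcs_count:
            return False     # a recorded node has disappeared
        return all(hmac.compare_digest(tag, sign_node(nid, p, list(par)))
                   for nid, (p, par, tag) in self.nodes.items())

store = ProvenanceStore()
store.record("raw", "sensor dump")
store.record("clean", "filtered data", parents=["raw"])
ok_before = store.verify()
del store.nodes["raw"]       # simulate a malicious deletion
ok_after = store.verify()
```

    Because each tag covers a node's parent edges, edge tampering is caught by signature checks, while the counter catches wholesale deletion, which signatures alone cannot.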

  3. Designing user models in a virtual cave environment

    SciTech Connect

    Brown-VanHoozer, S.; Hudson, R.; Gokhale, N.

    1995-12-31

    In this paper, the results of a first study into the use of virtual reality for human-factors studies and for the design of simple and complex models of control systems, components, and processes are described. The objective was to design a model in a virtual environment that would reflect more characteristics of the user's mental model of a system and fewer of the designer's. The technology of a CAVE™ virtual environment and the methodology of Neuro-Linguistic Programming were employed in this study.

  4. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios.

  5. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    NASA Astrophysics Data System (ADS)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. This model is based on the continuous-improvement Plan-Do-Check-Act cycle, and it intends to integrate environmental, risk-prevention, and ethical aspects, as well as the management of research, development, and innovation projects, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21, and 166002.

  6. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through quantitative description of the complicated pollution processes of a whole watershed system and its parts, such a model can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied in China and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models, as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trends and application prospects of watershed water environment pollution models were discussed.

  7. A Case Study on the Processes of Academic Advising in a School-Centric Environment

    ERIC Educational Resources Information Center

    Dickson, Thomas

    2014-01-01

    This study examined the processes of academic advisement in a school-centric university environment utilizing the O'Banion Model of Academic Advising (1972) as a baseline for theoretical comparison. The primary research question sought to explore if the O'Banion Model of Academic Advising, a dominant theory of advisement processes, was still…

  9. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  11. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires can lead to disaster and massive destruction, and the management and disposal of building fires has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of a building fire scene were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES), and the indoor crowd) were implemented, and the relationships between the elements were discussed as well. Finally, following the theory and framework of VGE, the technology of a building fire scene system with VGE was designed, covering the data environment, the model environment, the expression environment, and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  12. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  13. Design of Training Systems Utility Assessment. The Training Process Flow and System Capabilities/Requirements and Resources Models Operating in the TRAPAC Environment

    DTIC Science & Technology

    1976-05-01

    TAEG Report No. 33, Training Analysis and Evaluation Group: DOTS Utility Assessment, covering the Training Process Flow and System Capabilities/Requirements and Resources models. Alfred F. Smode, Ph.D., Director, Training Analysis & Evaluation Group.

  14. A model for a seamless user environment

    SciTech Connect

    Stevens, D.F.

    1989-07-01

    The seamless user environment appears to be an idea whose time has come. It has long been a dream of information technology visionaries, but only recently have we begun to see the emergence of the infrastructure and products necessary to the realization of that dream. Work is progressing on several aspects of an overall seamless environment, but there is at present no generally accepted architectural framework around which to structure a discussion about what such an entity might be. As is often the case when evocative vocabulary is applied to complex realities or (as in this case) potentialities, it is rare to find two interpretations that are more than approximately equivalent. The model described in this note is intended to provide a foundation for one possible interpretation. Before moving to the user environment, however, let us look briefly at the user-system interface, wherein lie the origins of many of the seams we wish to eliminate or conceal.

  15. THE IMPORTANCE OF CONCURRENT MONITORING AND MODELING FOR UNDERSTANDING MERCURY EXPOSURE IN THE ENVIRONMENT

    EPA Science Inventory

    Understanding the cycling processes governing mercury exposure in the environment requires sufficient process-based modeling and monitoring data. Monitoring provides ambient concentration data for specific sample times and locations. Modeling provides a tool for investigating the...

  17. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and entire factories and enterprises in one seamless simulation environment.

  19. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done: actual lived work. Multitasking, informal assistance, and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to…

  20. Models of Cognition in Distributed Learning Environments

    DTIC Science & Technology

    2010-07-13

    Models of Cognition in Distributed Learning Environments. Performing organization: Institute for Defense Analyses, 4850 Mark Center Dr., Alexandria, VA 22311. (The record consists only of fragments of the standard report documentation page.)

  1. Modeling Primary Atomization Processes

    DTIC Science & Technology

    2007-11-02

    (The record consists of fragments of the report's references and text, including G. I. Taylor's "Generation of Ripples by Wind Blowing Over a Viscous Fluid"; a boundary-element study by Xiaoshi Jin of particle orientation caused by fountain flow in injection molding; and a note that, unlike HTPB, PE is a thermoplastic commonly produced via continuous extrusion from a die, so PE grains could be produced this way.)

  2. A model environment for outer zone electrons

    NASA Technical Reports Server (NTRS)

    Singley, G. W.; Vette, J. I.

    1972-01-01

    A brief morphology of outer zone electrons is given to illustrate the nature of the phenomena that we are attempting to model. This is followed by a discussion of the data processing that was done with the various data received from the experimenters before incorporating it into the data base from which this model was ultimately derived. The details of the derivation are given, and several comparisons of the final model with the various experimental measurements are presented.

  3. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; NeegaardParker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose rate dependent conductivity processes required to evaluate radiation dose and internal (bulk) charging processes in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft in a solar, near-polar orbit provide the particle data over a range of heliospheric latitudes used to derive the environment that can be used for radiation and charging environments for both high inclination 0.5 AU Solar Polar Imager mission and the 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  4. SPARX, a new environment for Cryo-EM image processing.

    PubMed

    Hohn, Michael; Tang, Grant; Goodyear, Grant; Baldwin, P R; Huang, Zhong; Penczek, Pawel A; Yang, Chao; Glaeser, Robert M; Adams, Paul D; Ludtke, Steven J

    2007-01-01

    SPARX (single particle analysis for resolution extension) is a new image processing environment with a particular emphasis on transmission electron microscopy (TEM) structure determination. It includes a graphical user interface that provides a complete graphical programming environment with a novel data/process-flow infrastructure, an extensive library of Python scripts that perform specific TEM-related computational tasks, and a core library of fundamental C++ image processing functions. In addition, SPARX relies on the EMAN2 library and cctbx, the open-source computational crystallography library from PHENIX. The design of the system is such that future inclusion of other image processing libraries is a straightforward task. The SPARX infrastructure intelligently handles retention of intermediate values, even those inside programming structures such as loops and function calls. SPARX and all dependencies are free for academic use and available with complete source.

  5. Observations of chemical processing in the circumstellar environment

    NASA Technical Reports Server (NTRS)

    Mundy, L. G.; McMullin, J. P.; Blake, G. A.

    1995-01-01

    High resolution interferometer and single-dish observations of young, deeply embedded stellar systems reveal a complex chemistry in the circumstellar environments of low to intermediate mass stars. Depletions of gas-phase molecules, grain mantle evaporation, and shock interactions actively drive chemical processes in different regions around young stars. We present results for two systems, IRAS 05338-0624 and NGC 1333 IRAS 4, to illustrate the behavior found and to examine the physical processes at work.

  7. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  9. Processing of remote sensing information in cooperative intelligent grid environment

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Ma, Hongchao; Zhong, Liang

    2008-12-01

    To raise the intelligence level and improve the cooperative ability of the grid, this paper proposes an agent-oriented middleware, which is applied to the traditional OGSA architecture to compose a new architecture named CIG (Cooperative Intelligent Grid). The paper expounds the types of cooperative processing of remote sensing data, the architecture of CIG, and how cooperation is implemented in the CIG environment.

  10. Chemical Process Modeling and Control.

    ERIC Educational Resources Information Center

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  11. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  13. Pupils' Problem-Solving Processes in a Complex Computerized Learning Environment.

    ERIC Educational Resources Information Center

    Suomala, Jyrki; Alajaaski, Jarkko

    2002-01-01

    Describes a study that examined fifth-grade Finnish pupils' problem-solving processes in a LEGO/Logo technology-based learning environment. Results indicate that learning model and gender account for group differences in problem solving processes, and are interpreted as supporting the validity of discovery learning. (Author/LRW)

  14. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  15. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  17. MOVIE - A Software Environment For Modeling Complex Adaptive Systems

    NASA Astrophysics Data System (ADS)

    Furmanski, Wojtek; Fox, Geoffrey J.

    1990-02-01

    We discuss here the basic elements of the new software system for large scale computation, MOVIE (Metashell based Object oriented Visual Interactive Environment), recently designed and implemented at Caltech within the Caltech Concurrent Computation Program. From the research perspective, the goal of the MOVIE project is to create a simulation environment for modeling complex systems, with a focus on computational structures capable of adapting and acting "intelligently", such as ensembles of image processing, early vision, neural network and Artificial Intelligence modules, integrated in the form of "neural robots". The high-level MOVIE model, based on a portable communication and computation protocol, is suitable for large scale "intelligence engineering": modeling such systems in a distributed, heterogeneous multicomputer environment and porting successful implementations to dedicated massively parallel hardware. From the software engineering point of view, the MOVIE model offers a platform for unifying elements of contemporary computing such as networking, windowing, parallelism, number crunching and symbolic processing. The basic idea, borrowed from Sun NeWS, is to use an appropriately extended PostScript as the unifying language. The MOVIE extension aims at promoting PostScript to a general-purpose, high-level, object-oriented language with a high-performance, user-expandable computational sector, fully compatible with the Adobe model for 2D graphics and the Sun X11/NeWS model for windowing and multitasking.

  18. Modeling nuclear processes by Simulink

    SciTech Connect

    Rashid, Nahrul Khair Alang Md

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. The effort expended to solve these equations analytically or numerically consumes time and distracts attention from the objectives of the modelling itself. This paper presents the use of Simulink, a MATLAB toolbox widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the effect of delayed neutrons, the reactor point-kinetics equations with delayed-neutron groups, and the effect of temperature feedback are used as examples.
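
    The point-kinetics equations with delayed neutrons mentioned in this abstract can also be integrated directly outside Simulink; below is a minimal forward-Euler sketch in Python with a single effective delayed-neutron group (the parameter values and function name are illustrative assumptions, not taken from the paper):

```python
# Point reactor kinetics with one effective delayed-neutron group:
#   dn/dt = ((rho - beta)/Lambda) * n + lam * C
#   dC/dt = (beta/Lambda) * n - lam * C
# integrated with a simple forward-Euler scheme.

def point_kinetics(rho, beta=0.0065, lam=0.08, Lambda=1e-4,
                   n0=1.0, t_end=1.0, dt=1e-5):
    """Return relative neutron density n(t_end) for a constant reactivity step rho."""
    n = n0
    C = beta * n0 / (lam * Lambda)   # equilibrium precursor concentration
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dn * dt
        C += dC * dt
        t += dt
    return n

# Zero reactivity leaves the population at equilibrium; a positive step
# below prompt critical (rho < beta) gives a prompt jump, then slow growth.
print(point_kinetics(0.0))
print(point_kinetics(0.001))
```

    The precursor term is what makes the reactor controllable: without it, the prompt time constant Lambda alone would govern the response.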

  20. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  1. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-07

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.

  2. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  4. Models of the Reading Process.

    PubMed

    Rayner, Keith; Reichle, Erik D

    2010-11-01

    Reading is a complex skill involving the orchestration of a number of components. Researchers often talk about a "model of reading" when talking about only one aspect of the reading process (for example, models of word identification are often referred to as "models of reading"). Here, we review prominent models that are designed to account for (1) word identification, (2) syntactic parsing, (3) discourse representations, and (4) how certain aspects of language processing (e.g., word identification), in conjunction with other constraints (e.g., limited visual acuity, saccadic error, etc.), guide readers' eyes. Unfortunately, it is the case that these various models addressing specific aspects of the reading process seldom make contact with models dealing with other aspects of reading. Thus, for example, the models of word identification seldom make contact with models of eye movement control, and vice versa. While this may be unfortunate in some ways, it is quite understandable in other ways because reading itself is a very complex process. We discuss prototypical models of aspects of the reading process in the order mentioned above. We do not review all possible models, but rather focus on those we view as being representative and most highly recognized.

  5. Three-dimensional environment models from airborne laser radar data

    NASA Astrophysics Data System (ADS)

    Soderman, Ulf; Ahlberg, Simon; Elmqvist, Magnus; Persson, Asa

    2004-09-01

    Detailed 3D environment models for visualization and computer based analyses are important in many defence and homeland security applications, e.g. crisis management, mission planning and rehearsal, damage assessment, etc. The high resolution data from airborne laser radar systems for 3D sensing provide an excellent source of data for obtaining the information needed for many of these models. To utilise the 3D data provided by the laser radar systems, however, efficient methods for data processing and environment model construction need to be developed. In this paper we will present some results on the development of laser data processing methods, including methods for data classification, bare earth extraction, 3D reconstruction of buildings, and identification of single trees and estimation of their position, height, canopy size and species. We will also show how the results can be used for the construction of detailed 3D environment models for military modelling and simulation applications. The methods use data from discrete return airborne laser radar systems and digital cameras.
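
    As a toy illustration of the bare earth extraction step mentioned in this abstract (not the authors' actual method; the grid size, function name and sample points are assumptions), a common first pass over a laser point cloud is to keep only the lowest return in each grid cell:

```python
def lowest_return_per_cell(points, cell=2.0):
    """Crude bare-earth seed filter: keep the lowest (x, y, z) laser return
    in each cell-by-cell metre grid cell of the point cloud."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in lowest or z < lowest[key][2]:
            lowest[key] = (x, y, z)
    return list(lowest.values())

# Two returns fall in the same cell: a canopy hit at 12 m is discarded
# in favour of the ground hit at 1 m below it.
pts = [(0.5, 0.5, 12.0), (1.0, 1.2, 1.0), (5.0, 5.0, 0.8)]
print(lowest_return_per_cell(pts))
```

    Real bare-earth algorithms refine such seed points iteratively (e.g. by slope or surface-fitting criteria); this sketch only shows the gridding idea.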

  6. Modelling of CWS combustion process

    NASA Astrophysics Data System (ADS)

    Rybenko, I. A.; Ermakova, L. A.

    2016-10-01

    The paper considers the combustion process of coal water slurry (CWS) drops. A physico-chemical process scheme consisting of several independent parallel-sequential stages is proposed. This scheme of the drop combustion process is supported by particle size distribution tests and stereomicroscopic analysis of the combustion products. The results of mathematical modelling and optimization of stationary regimes of CWS combustion are provided. The modelling solves the problem of determining the possible equilibrium composition of products that can be obtained from CWS combustion at different temperatures.

  7. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many respects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and for automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is considerable. Most such standards relate to the production and characterization of nanostructures; however, there are no ISO documents concerning the behaviour of nanomaterials in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under space environment impact. The document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate methods (or a group of methods) for computer modeling of the various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes the description of possible…

  8. Kinetic Modeling of Microbiological Processes

    SciTech Connect

    Liu, Chongxuan; Fang, Yilin

    2012-08-26

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research best suited to petroleum and environmental biotechnologies.
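
    The Monod-type kinetics discussed above can be sketched in a few lines. The rate law itself is standard, but every parameter value below is illustrative rather than taken from the article:

```python
# Monod kinetics: specific growth rate saturates with substrate concentration.
def monod_rate(s, mu_max=0.5, k_s=2.0):
    """Specific growth rate (1/h) at substrate concentration s (mg/L)."""
    return mu_max * s / (k_s + s)

def simulate(biomass=0.1, substrate=10.0, yield_coeff=0.4, dt=0.01, t_end=24.0):
    """Forward-Euler integration of the coupled biomass/substrate ODEs."""
    t = 0.0
    while t < t_end:
        growth = monod_rate(substrate) * biomass * dt   # new biomass this step
        biomass += growth
        substrate = max(0.0, substrate - growth / yield_coeff)
        t += dt
    return biomass, substrate
```

    At these illustrative settings the substrate is exhausted within the 24 h window, so the final biomass approaches the initial biomass plus the yield coefficient times the substrate consumed.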

  9. Social Models: Blueprints or Processes?

    ERIC Educational Resources Information Center

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  11. Performance of redundant disk array organizations in transaction processing environments

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. K.; Saab, Daniel G.

    1993-01-01

    A performance evaluation is conducted for two redundant disk-array organizations in a transaction-processing environment, relative to the performance of both mirrored disk organizations and organizations using neither striping nor redundancy. The proposed parity-striping alternative to striping with rotated parity is shown to furnish rapid recovery from failure at the same low storage cost without interleaving the data over multiple disks. Both noncached systems and systems using a nonvolatile cache at the controller are considered.
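
    The recovery property that parity-based organizations rely on can be illustrated in a few lines: the parity block is the bytewise XOR of the data blocks, so any single lost block is the XOR of the survivors and the parity. This is a generic RAID-style parity sketch, not the paper's specific parity-striping layout:

```python
def parity(blocks):
    """Bytewise XOR of equal-length data blocks (the parity block)."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

def rebuild(surviving_blocks, parity_block):
    """Recover the single missing block: XOR of survivors and parity."""
    return parity(surviving_blocks + [parity_block])

# Three data "disks" plus one parity disk; disk 1 is then lost and rebuilt.
data = [b"disk0b00", b"disk1b11", b"disk2b22"]
p = parity(data)
recovered = rebuild([data[0], data[2]], p)
```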

  12. Random Walks and Branching Processes in Correlated Gaussian Environment

    NASA Astrophysics Data System (ADS)

    Aurzada, Frank; Devulder, Alexis; Guillotin-Plantard, Nadine; Pène, Françoise

    2017-01-01

    We study persistence probabilities for random walks in correlated Gaussian random environment investigated by Oshanin et al. (Phys Rev Lett, 110:100602, 2013). From the persistence results, we can deduce properties of critical branching processes with offspring sizes geometrically distributed with correlated random parameters. More precisely, we obtain estimates on the tail distribution of its total population size, of its maximum population, and of its extinction time.
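
    The persistence question studied above can be explored with a Monte Carlo sketch. For simplicity the environment below is i.i.d. Gaussian rather than correlated, and all parameters are invented, so this only illustrates the setup, not the paper's results:

```python
import math
import random

def persistence_probability(n_sites=200, t_max=100, n_trials=2000,
                            sigma=0.5, seed=1):
    """Monte Carlo estimate of P(walk stays non-negative up to t_max)."""
    rng = random.Random(seed)
    # One frozen Gaussian environment, mapped to right-step probabilities
    # through a logistic link (i.i.d. here, unlike the correlated case studied).
    env = [rng.gauss(0.0, sigma) for _ in range(n_sites)]
    p_right = [1.0 / (1.0 + math.exp(-g)) for g in env]
    survived = 0
    for _ in range(n_trials):
        x = 1  # start just right of the origin
        for _ in range(t_max):
            x += 1 if rng.random() < p_right[x % n_sites] else -1
            if x < 0:       # persistence fails once the walk goes negative
                break
        else:
            survived += 1
    return survived / n_trials
```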

  13. Combustion modeling for experimentation in a space environment

    NASA Technical Reports Server (NTRS)

    Berlad, A. L.

    1974-01-01

    The merits of combustion experimentation in a space environment are assessed, and the impact of such experimentation on current theoretical models is considered. It is noted that combustion theory and experimentation for less than normal gravitational conditions are incomplete, inadequate, or nonexistent. Extensive and systematic experimentation in a space environment is viewed as essential for more adequate and complete theoretical models of such processes as premixed flame propagation and extinction limits, premixed flame propagation in droplet and particle clouds, ignition and autoignition in premixed combustible media, and gas jet combustion of unpremixed reactants. Current theories and models in these areas are described, and some combustion studies that can be undertaken in the Space Shuttle Program are proposed, including crossed molecular beam, turbulence, and upper pressure limit (of gases) studies.

  14. Distributed collaborative environments for 21st century modeling and simulation

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2001-09-01

    Distributed collaboration is an emerging technology that will significantly change how modeling and simulation is employed in 21st century organizations. Modeling and simulation (M&S) is already an integral part of how many organizations conduct business and, in the future, will continue to spread throughout government and industry enterprises and across many domains from research and development to logistics to training to operations. This paper reviews research that is focusing on the open standards agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. A distributed collaborative environment is the underlying infrastructure that makes communication between diverse simulations and other assets possible and manages the overall flow of a simulation-based experiment. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities employ M&S.

  15. NG6: Integrated next generation sequencing storage and processing environment

    PubMed Central

    2012-01-01

    Background Next generation sequencing platforms are now well established in sequencing centres and some laboratories. Upcoming smaller scale machines such as the 454 junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In such a context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. Results We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies,…) and, on the other hand, a secured web site giving access to the results. The connected user will be able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended and new ones can be added. Ergatis is used as a workflow building, running and monitoring system. The analyses can be run locally or in a cluster environment using Sun Grid Engine. Conclusions NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data. PMID:22958229

  16. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    NASA Astrophysics Data System (ADS)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. After a brief account of the theoretical concepts underlying the digital factory, its communicative features are illustrated. Practical aspects of interpersonal communication were analyzed from a human-oriented perspective in a pilot project at Volkswagen AG in Wolfsburg. Within the process analysis a modeling method was developed that makes it possible to visualize interpersonal communication and its human-oriented attributes within a technically focused workflow. Based on the results of a survey developed for communication analysis and on the process models produced with the method, it was possible to structure the processes in a way suitable for humans and to obtain a positive effect on the communication processes.

  17. MODELING WIND TURBINES IN THE GRIDLAB-D SOFTWARE ENVIRONMENT

    SciTech Connect

    Fuller, J.C.; Schneider, K.P.

    2009-01-01

    In recent years, the rapid expansion of wind power has resulted in a need to more accurately model the effects of wind penetration on the electricity infrastructure. GridLAB-D is a new simulation environment developed for the U.S. Department of Energy (DOE) by the Pacific Northwest National Laboratory (PNNL), in cooperation with academic and industrial partners. GridLAB-D was originally written and designed to help integrate end-use smart grid technologies, and it is currently being expanded to include a number of other technologies, including distributed energy resources (DER). The specific goal of this project is to create a preliminary wind turbine generator (WTG) model for integration into GridLAB-D. As wind power penetration increases, models are needed to accurately study the effects of increased penetration; this project is a beginning step at examining these effects within the GridLAB-D environment. Aerodynamic, mechanical and electrical power models were designed to simulate the process by which mechanical power is extracted by a wind turbine and converted into electrical energy. The process was modeled using historic atmospheric data, collected over a period of 30 years, as the primary energy input. This input was then combined with preliminary models for synchronous and induction generators. Additionally, basic control methods were implemented, using either constant power factor or constant power modes. The model was then compiled into the GridLAB-D simulation environment, and the power outputs were compared against manufacturers’ data; a variation of the IEEE 4 node test feeder was then used to examine the model’s behavior. Results showed the designs were sufficient for a prototype model and provided output power similar to the available manufacturers’ data. The prototype model is designed as a template for the creation of new modules, with turbine-specific parameters to be added by the user.
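
    The aerodynamic power-extraction step described above is commonly modeled with the textbook relation P = ½ρACpv³, clipped at rated output and bounded by cut-in/cut-out speeds. The sketch below uses that generic relation with invented turbine parameters; it is not GridLAB-D's actual WTG model:

```python
import math

def turbine_power(v, rho=1.225, radius=40.0, cp=0.4, rated_kw=1500.0,
                  v_cut_in=3.0, v_cut_out=25.0):
    """Electrical power (kW) for hub-height wind speed v (m/s)."""
    if v < v_cut_in or v > v_cut_out:
        return 0.0                      # turbine parked or feathered
    area = math.pi * radius ** 2        # rotor swept area (m^2)
    p_kw = 0.5 * rho * area * cp * v ** 3 / 1000.0
    return min(p_kw, rated_kw)          # generator limits output at rated power
```

    A power curve produced this way can be compared point-by-point against a manufacturer's published curve, which is essentially the validation step the abstract describes.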

  18. CAD tool environment for MEMS process design support

    NASA Astrophysics Data System (ADS)

    Schmidt, T.; Wagener, A.; Popp, J.; Hahn, K.; Bruck, R.

    2005-07-01

    MEMS fabrication processes are characterized by numerous usable process steps, materials and effects for fabricating the intended microstructure. Up to now, CAD support in this domain has concentrated mainly on structural design (e.g. simulation programs on an FEM basis). These tools often assume fixed interfaces to the fabrication process, such as material parameters or design rules. Taking into account that MEMS design requires structural design (defining the lateral 2-dim shapes) concurrently with process design (responsible for the third dimension), it turns out that technology interfaces consisting only of sets of static data are no longer sufficient. For successful design flows in these areas it is necessary to incorporate a higher degree of process-related data; a broader interface between process configuration on the one side and application design on the other appears to be needed. This paper proposes a novel approach: a process management system that allows the specification of processes for specific applications. The system is based on a dedicated database environment able to store and manage all process-related design constraints linked to the fabrication process data itself. The interdependencies between application-specific processes and all stages of the design flow are discussed, and the complete software system PRINCE, which meets the requirements of this new approach, is introduced. Based on a concurrent design methodology presented at the beginning of the paper, a system is presented that supports application-specific process design. The paper highlights the incorporated tools and the present status of the software system. A complete configuration of an Si thin-film process example demonstrates the usage of PRINCE.

  19. Science Process Evaluation Model. Monograph.

    ERIC Educational Resources Information Center

    Small, Larry

    The goal of this monograph is to explain the evaluation program designed by Schaumburg Community Consolidated District 54, Schaumburg, Illinois. It discusses the process used in the development of the model, the product, the implications for classroom teachers, and the effects of using an evaluation to assess science process skills. The process…

  20. Modified Process Reduces Porosity when Soldering in Reduced Gravity Environments

    NASA Technical Reports Server (NTRS)

    Watson, Kevin; Struk, Peter; Pettegrew, Richard; Downs, Robert; Haylett, Daniel

    2012-01-01

    A modified process yields lower levels of internal porosity for solder joints produced in reduced-gravity environments. The process incorporates both alternative materials and a modified procedure. The process provides the necessary cleaning action to enable effective bonding of the applied solder alloy with the materials to be joined. The modified process incorporates a commercially available liquid flux that is applied to the solder joint before heating with the soldering iron. It is subsequently heated with the soldering iron to activate the cleaning action of the flux and to evaporate most of the flux, followed by application of solder alloy in the form of commercially available solid solder wire (containing no flux). Continued heating ensures adequate flow of the solder alloy around and onto the materials to be joined. The final step is withdrawal of the soldering iron to allow alloy solidification and cooling of the solder joint.

  1. Modeling Extracellular Matrix Reorganization in 3D Environments

    PubMed Central

    Harjanto, Dewi; Zaman, Muhammad H.

    2013-01-01

    Extracellular matrix (ECM) remodeling is a key physiological process that occurs in a number of contexts, including cell migration, and is especially important for cellular form and function in three-dimensional (3D) matrices. However, there have been few attempts to computationally model how cells modify their environment in a manner that accounts for both cellular properties and the architecture of the surrounding ECM. To this end, we have developed and validated a novel model to simulate matrix remodeling that explicitly defines cells in a 3D collagenous matrix. In our simulation, cells can degrade, deposit, or pull on local fibers, depending on the fiber density around each cell. The cells can also move within the 3D matrix. Different cell phenotypes can be modeled by varying key cellular parameters. Using the model we have studied how two model cancer cell lines, of differing invasiveness, modify matrices with varying fiber density in their vicinity by tracking the metric of fraction of matrix occupied by fibers. Our results quantitatively demonstrate that in low density environments, cells deposit more collagen to uniformly increase fibril fraction. On the other hand, in higher density environments, the less invasive model cell line reduced the fibril fraction as compared to the highly invasive phenotype. These results show good qualitative and quantitative agreement with existing experimental literature. Our simulation is therefore able to function as a novel platform to provide new insights into the clinically relevant and physiologically critical process of matrix remodeling by helping identify critical parameters that dictate cellular behavior in complex native-like environments. PMID:23341900
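
    The density-dependent remodeling rule described above (deposit collagen where the local fibril fraction is low, degrade it where the fraction is high) can be caricatured as follows. The voxel grid, target fraction, and rates are invented for illustration and are not the paper's calibrated values:

```python
import random

def remodel(grid, steps=500, target=0.3, rate=0.01, seed=0):
    """grid maps a 3D voxel to its fibril fraction in [0, 1]; a cell visits
    random voxels, depositing collagen below the target density and
    degrading it above."""
    rng = random.Random(seed)
    voxels = list(grid)
    for _ in range(steps):
        v = rng.choice(voxels)                        # cell visits a voxel
        if grid[v] < target:
            grid[v] = min(1.0, grid[v] + rate)        # deposit fibers
        else:
            grid[v] = max(0.0, grid[v] - rate)        # degrade fibers
    return grid
```

    Run on a sparse matrix the rule raises the mean fibril fraction, and on a dense matrix it lowers it, mirroring the qualitative behavior reported for the two model cell lines.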

  2. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  4. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects

    SciTech Connect

    Holmes, K. John; Graham, Judith A.; McKone, Thomas; Whipple, Chris

    2008-06-01

    Computational models support environmental regulatory activities by providing the regulator an ability to evaluate available knowledge, assess alternative regulations, and provide a framework to assess compliance. But all models face inherent uncertainties, because human and natural systems are always more complex and heterogeneous than can be captured in a model. Here we provide a summary discussion of the activities, findings, and recommendations of the National Research Council's Committee on Regulatory Environmental Models, a committee funded by the US Environmental Protection Agency to provide guidance on the use of computational models in the regulatory process. Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the "life cycle" of a regulatory model with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than for non-regulatory models.

  5. A spatially structured metapopulation model within a stochastic environment.

    PubMed

    Smith, Andrew G

    2017-09-01

    Populations often exist, either by choice or by external pressure, in a fragmented way, referred to as a metapopulation. Typically, the dynamics accounted for within metapopulation models are assumed to be static. For example, patch occupancy models often assume that the colonisation and extinction rates do not change, while spatially structured models often assume that the rates of births, deaths and migrations do not depend on time. While some progress has been made when these dynamics are changing deterministically, less is known when the changes are stochastic. It can be quite common that the environment a population inhabits determines how these dynamics change over time. Changes to this environment can have a large impact on the survival probability of a population and such changes will often be stochastic. The typical metapopulation model allows for catastrophes that could eradicate most, if not all, individuals on an entire patch. It is this type of phenomenon that this article addresses. A Markov process is developed that models the number of individuals on each patch within a metapopulation. An approximation for the original model is presented in the form of a piecewise-deterministic Markov process and the approximation is analysed to present conditions for extinction.
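
    A simple discrete-time caricature of the patch dynamics described above: per-capita births, deaths and migration on each patch, plus rare catastrophes that eradicate a whole patch. This is not the article's piecewise-deterministic approximation, and all rates are invented:

```python
import random

def step(patches, birth=0.1, death=0.08, migrate=0.02,
         catastrophe=0.01, rng=None):
    """One time step: per-capita births/deaths/migration, then a possible
    patch-wide catastrophe. patches is a list of population counts."""
    rng = rng or random.Random()
    new = list(patches)
    n_patches = len(patches)
    for i, n in enumerate(patches):
        births = sum(rng.random() < birth for _ in range(n))
        deaths = sum(rng.random() < death for _ in range(n))
        movers = sum(rng.random() < migrate for _ in range(n))
        new[i] += births - deaths - movers
        for _ in range(movers):
            new[rng.randrange(n_patches)] += 1   # migrant picks a patch
        if rng.random() < catastrophe:
            new[i] = 0                           # catastrophe empties patch i
        new[i] = max(0, new[i])
    return new
```

    Iterating this map and recording how often the total population hits zero gives a crude Monte Carlo analogue of the extinction conditions the article derives analytically.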

  6. A network-oriented business modeling environment

    NASA Astrophysics Data System (ADS)

    Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia

    The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.

  7. Model-based description of environment interaction for mobile robots

    NASA Astrophysics Data System (ADS)

    Borghi, Giuseppe; Ferrari, Carlo; Pagello, Enrico; Vianello, Marco

    1999-01-01

    We consider a mobile robot that attempts to accomplish a task by reaching a given goal, and interacts with its environment through a finite set of actions and observations. The interaction between robot and environment is modeled by Partially Observable Markov Decision Processes (POMDPs). The robot takes its decisions in the presence of uncertainty about the current state by maximizing the reward gained during interactions with the environment. It is able to self-locate in the environment by collecting action and perception histories during navigation. To make the state estimation more reliable, we introduce additional information into the model without adding new states and without discretizing the considered measures. Thus, we also associate with the state transition probabilities a continuous metric given through the mean and the variance of some significant sensor measurements suitable to be kept in continuous form, such as odometric measurements, showing that even such unreliable data can supply a great deal of information to the robot. The overall control system of the robot is structured as a two-level layered architecture, where the low level implements several collision avoidance algorithms, while the upper level takes care of the navigation problem. In this paper, we concentrate on how to use POMDP models at the upper level.
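
    The POMDP state estimation mentioned above rests on the standard belief update: push the current belief through the transition model for the chosen action, then re-weight by the observation likelihood and normalize. The two-state model below is invented for illustration and is not the paper's robot model:

```python
def belief_update(belief, T, O, a, o):
    """Posterior belief after action a and observation o.
    belief[s], T[a][s][s2], O[a][s2][o] are plain nested lists."""
    n = len(belief)
    new = [0.0] * n
    for s2 in range(n):
        predicted = sum(belief[s] * T[a][s][s2] for s in range(n))
        new[s2] = O[a][s2][o] * predicted   # re-weight by observation model
    z = sum(new)
    return [b / z for b in new] if z > 0 else new

# Tiny two-state, one-action, two-observation model (invented numbers):
# the robot mostly stays in its state, and sensors are moderately informative.
T = [[[0.9, 0.1], [0.1, 0.9]]]
O = [[[0.8, 0.2], [0.3, 0.7]]]
posterior = belief_update([0.5, 0.5], T, O, a=0, o=0)
```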

  8. Memory processes and motor control in extreme environments.

    PubMed

    Newman, D J; Lathan, C E

    1999-08-01

    Cognitive-performance and motor-performance activities in multi-task, high-workload environments were assessed during astronaut performance in space flight and in isolation. Data was collected in microgravity on the International Micro-gravity Laboratory (IML) space shuttle mission (STS-42), and the Canadian Astronaut Program Space Unit Life Simulation (CAPSULS) mission offered an ideal opportunity to collect data for individuals in extreme isolation to complement the space flight data using similar hardware, software, and experimental protocols. The mental workload and performance experiment (MWPE) was performed during the IML-1 space flight mission, and the memory processes and motor control (MEMO) experiment was performed during the CAPSULS isolation mission. In both experiments, short-term exhaustive memory and fine motor control associated with human-computer interaction was studied. Memory processes were assessed using a Sternberg-like exhaustive memory search containing 1, 2, 4, or 7 letters. Fine motor control was assessed using velocity-controlled (joystick) and position-controlled (trackball) computer input devices to acquire targets as displayed on a computer screen. Subjects repeated the tasks under two conditions that tested perceptual motor adaptation strategies: 1) During adaptation to the microgravity environment; and 2) While wearing left-right reversing prism goggles during the CAPSULS mission. Both conditions significantly degraded motor performance but not cognitive performance. The data collected during both the MEMO experiment and the MWPE experiments enhance the knowledge base of human interface technology for human performance in extreme environments.
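
    The Sternberg-like exhaustive search described above predicts that mean reaction time grows roughly linearly with memory set size. A least-squares fit of that linear model is sketched below; the set sizes 1, 2, 4 and 7 come from the experiment, but the reaction-time values are hypothetical:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

set_sizes = [1, 2, 4, 7]          # letters held in memory (as in the task)
rt_ms = [430, 470, 550, 670]      # hypothetical mean reaction times (ms)
intercept, slope = fit_line(set_sizes, rt_ms)
# slope estimates the per-item comparison time of the exhaustive search
```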

  9. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level and methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure which is itself the output of the model.
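
    The pool-and-flow structure described above (photosynthesis filling a leaf pool, translocation to stem and root pools, with losses to respiration and draws for growth) can be sketched as simple forward-Euler bookkeeping. Every coefficient below is invented for illustration:

```python
def grow(hours=24, par=1.0, photo_eff=0.5, transfer=0.1,
         respiration=0.02, growth=0.05):
    """Hourly bookkeeping of leaf/stem/root carbohydrate pools."""
    pools = {"leaf": 1.0, "stem": 0.5, "root": 0.5}
    dry_matter = {"leaf": 0.0, "stem": 0.0, "root": 0.0}
    for _ in range(hours):
        pools["leaf"] += photo_eff * par            # photosynthesis input
        flux = transfer * pools["leaf"]             # translocation out of leaf
        pools["leaf"] -= flux
        pools["stem"] += 0.5 * flux
        pools["root"] += 0.5 * flux
        for organ in pools:
            pools[organ] *= 1.0 - respiration       # respiration loss
            used = growth * pools[organ]            # draw for growth
            pools[organ] -= used
            dry_matter[organ] += used               # accumulated dry matter
    return pools, dry_matter
```

    The returned dry_matter dictionary is the analogue of the model's output: accumulated dry matter partitioned among leaves, stems, and roots.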

  10. Development of the Delta Shell as an integrated modeling environment

    NASA Astrophysics Data System (ADS)

    Donchyts, Gennadii; Baart, Fedor; Jagers, Bert

    2010-05-01

    Many engineering problem require the use of multiple numerical models from multiple disciplines. For example the use of river model for flow calculation coupled with groundwater model and rainfall-runoff model. These models need to be setup, coupled, run, results need to be visualized, input and output data need to be stored. For some of these steps a software or standards already exist, but there is a need for an environment allowing to perform all these steps.The goal of the present work is to create a modeling environment where models from different domains can perform all the sixe steps: setup, couple, run, visualize, store. This presentation deals with the different problems which arise when setting up a modelling framework, such as terminology, numerical aspects as well as the software development issues which arise. In order to solve these issues we use Domain Driven Design methods, available open standards and open source components. While creating an integrated modeling environment we have identified that a separation of the following domains is essential: a framework allowing to link and exchange data between models; a framework allowing to integrate different components of the environment; graphical user interface; GIS; hybrid relational and multi-dimensional data store; discipline-specific libraries: river hydrology, morphology, water quality, statistics; model-specific components Delta Shell environment which is the basis for several products such as HABITAT, SOBEK and the future Delft3D interface. It implements and integrates components covering the above mentioned domains by making use of open standards and open source components. Different components have been developed to fill in gaps. For exchaning data with the GUI an object oriented scientific framework in .NET was developed within Delta Shell somewhat similar to the JSR-275. For the GIS domain several OGC standards were used such as SFS, WCS and WFS. For storage the CF standard together with

  11. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we will attempt to show some methods in which user interaction in a virtual reality environment can be visualized and how this can allow us to gain greater insight into the process of interaction/learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  12. Float zone processing in a weightless environment. [Si crystals

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Strong, P. F.; Rudenberg, G.; Kronauer, R.

    1974-01-01

    Results are given for investigations into: (1) the physical limits which set the maximum practical diameters of Si crystals that can be processed by the float-zone method in a near weightless environment, and (2) the economic impact of large, space-produced Si crystals on the electronics industry. The stability of the melt is evaluated. Heat transfer and fluid flow within the melt as dependent on the crystal size and the degree and type of rotation imparted to the melt are studied. Methods of utilizing the weightless environment for the production of large, stress-free Si crystals of uniform composition are proposed. The economic effect of large size Si crystals, their potential applications, likely utilization and cost advantages in LSI, integrated circuits, and power devices are also evaluated. Foreseeable advantages of larger diameter wafers of good characteristics and the possibilities seen for greater perfection resulting from stress-free growth are discussed.

  13. ISLE (Image and Signal Processing LISP Environment) reference manual

    SciTech Connect

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  14. Meteoroid Environment Modeling: the Meteoroid Engineering Model and Shower Forecasting

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.

    2017-01-01

The meteoroid environment is often divided conceptually into meteor showers plus a sporadic background component. The sporadic complex poses the bulk of the risk to spacecraft, but showers can produce significant short-term enhancements of the meteoroid flux. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. Both MEM and the forecast are used by multiple manned spaceflight projects in their meteoroid risk evaluations, and both tools are being revised to incorporate recent meteor velocity, density, and timing measurements. MEM describes the sporadic meteoroid complex and calculates the flux, speed, and directionality of the meteoroid environment relative to a user-supplied spacecraft trajectory, taking the spacecraft's motion into account. MEM is valid in the inner solar system and offers near-Earth and cis-lunar environments. While the current version of MEM offers a nominal meteoroid environment corresponding to a single meteoroid bulk density, the next version, MEM R3, will offer both flux uncertainties and a density distribution in addition to a revised near-Earth environment. We have updated the near-Earth meteor speed distribution and have made the first determination of uncertainty in this distribution. We have also derived a meteor density distribution from the work of Kikwaya et al. (2011). The annual meteor shower forecast takes the form of a report and data tables that can be used in conjunction with an existing MEM assessment. Fluxes are typically quoted to a constant limiting kinetic energy in order to comport with commonly used ballistic limit equations. For the 2017 annual forecast, the MEO substantially revised the list of showers and their characteristics using 14 years of meteor flux measurements from the Canadian Meteor Orbit Radar (CMOR).
Defunct or insignificant showers were removed and the temporal profiles of many showers

  15. Process modeling - Its history, current status, and future

    NASA Astrophysics Data System (ADS)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems in the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and an example is given of the cross-coupling of forging-process variances. The implementation of process modeling, CAE, and computer simulation is found to reduce the costs and time associated with technological development when incorporated judiciously.

  16. Physical processes affecting the sedimentary environments of Long Island Sound

    USGS Publications Warehouse

    Signell, R.P.; Knebel, H.J.; List, J.H.; Farris, A.S.

    1997-01-01

    A modeling study was undertaken to simulate the bottom tidal-, wave-, and wind-driven currents in Long Island Sound in order to provide a general physical oceanographic framework for understanding the characteristics and distribution of seafloor sedimentary environments. Tidal currents are important in the funnel-shaped eastern part of the Sound, where a strong gradient of tidal-current speed was found. This current gradient parallels the general westward progression of sedimentary environments from erosion or non-deposition, through bedload transport and sediment sorting, to fine-grained deposition. Wave-driven currents, meanwhile, appear to be important along the shallow margins of the basin, explaining the occurrence of relatively coarse sediments in regions where tidal currents alone are not strong enough to move sediment. Finally, westerly wind events are shown to locally enhance bottom currents along the axial depression of the Sound, providing a possible explanation for the relatively coarse sediments found in the depression despite tide- and wave-induced currents below the threshold of sediment movement. The strong correlation between the near-bottom current intensity based on the model results and the sediment response as indicated by the distribution of sedimentary environments provides a framework for predicting the long-term effects of anthropogenic activities.
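
    The comparison between near-bottom current intensity and a threshold of sediment movement can be sketched with a quadratic drag law; the drag coefficient and critical shear stress below are illustrative assumptions, not values from this study.

```python
# Sketch: compare near-bottom current speed to a sediment-movement
# threshold via a quadratic drag law (tau = rho * Cd * u**2).
# RHO, CD, and tau_crit are illustrative values, not those used in the
# Long Island Sound study.

RHO = 1025.0   # seawater density, kg/m^3
CD = 0.0025    # dimensionless bottom drag coefficient (assumed)

def bed_shear_stress(u):
    """Bottom shear stress (Pa) from near-bottom current speed u (m/s)."""
    return RHO * CD * u * u

def sediment_mobile(u, tau_crit=0.1):
    """True if the flow exceeds an assumed critical shear stress (Pa)."""
    return bed_shear_stress(u) > tau_crit

for u in (0.1, 0.3, 0.7):
    print(u, round(bed_shear_stress(u), 4), sediment_mobile(u))
```

    Under these assumed numbers, a 0.1 m/s current leaves the bed immobile while a 0.3 m/s current exceeds the threshold, which mirrors the spatial contrast the study draws between quiet and energetic parts of the basin.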

  17. Meteoroid Environment Modeling: The Meteoroid Engineering Model and Shower Forecasting

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.

    2017-01-01

    The meteoroid environment is often divided conceptually into meteor showers and the sporadic meteor background. It is commonly but incorrectly assumed that meteoroid impacts primarily occur during meteor showers; instead, the vast majority of hazardous meteoroids belong to the sporadic complex. Unlike meteor showers, which persist for a few hours to a few weeks, sporadic meteoroids impact the Earth's atmosphere and spacecraft throughout the year. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. The sporadic complex, despite its year-round activity, is not isotropic in its directionality. Instead, the apparent points of origin, or radiants, of sporadic meteoroids are organized into groups called "sources". The speed, directionality, and size distribution of these sporadic sources are modeled by the Meteoroid Engineering Model (MEM), which is currently in its second major release version (MEMR2) [Moorhead et al., 2015]. MEM provides the meteoroid flux relative to a user-provided spacecraft trajectory; it provides the total flux as well as the flux per angular bin, speed interval, and on specific surfaces (ram, wake, etc.). Because the sporadic complex dominates the meteoroid flux, MEM is the most appropriate model to use in spacecraft design. Although showers make up a small fraction of the meteoroid environment, they can produce significant short-term enhancements of the meteoroid flux. Thus, it can be valuable to consider showers when assessing risks associated with vehicle operations that are brief in duration. To assist with such assessments, the MEO issues an annual forecast that reports meteor shower fluxes as a function of time and compares showers with the time-averaged total meteoroid flux. This permits missions to do quick assessments of the increase in risk posed by meteor showers.
Section II describes MEM in more detail and describes our current efforts

  19. Hybrid Models for Trajectory Error Modelling in Urban Environments

    NASA Astrophysics Data System (ADS)

    Angelatsa, E.; Parés, M. E.; Colomina, I.

    2016-06-01

    This paper tackles the first step of any strategy aiming to improve the trajectory of terrestrial mobile mapping systems in urban environments. We present an approach to model the error of terrestrial mobile mapping trajectories, combining deterministic and stochastic models. Because of the specific urban environment, the deterministic component will be modelled with non-continuous functions composed of linear shifts, drifts or polynomial functions. In addition, we introduce a stochastic error component to model the residual noise of the trajectory error function. The first step in error modelling requires knowing the actual trajectory error values for several representative environments. In order to determine the trajectory errors as accurately as possible, (almost) error-free reference trajectories should be estimated using non-semantic features extracted from a sequence of images collected with the terrestrial mobile mapping system and from a full set of ground control points. Once the references are estimated, they are used to determine the actual errors in the terrestrial mobile mapping trajectory. The rigorous analysis of these data sets allows us to characterize the errors of a terrestrial mobile mapping system for a wide range of environments. This information will be of great use in future campaigns to improve the results of 3D point cloud generation. The proposed approach has been evaluated using real data. The data originate from a mobile mapping campaign over an urban and controlled area of Dortmund (Germany), with harmful GNSS conditions. The mobile mapping system, which includes two laser scanners and two cameras, was mounted on a van and driven around a controlled area for about three hours. The results show the suitability of decomposing the trajectory error into non-continuous deterministic and stochastic components.
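
    A minimal sketch of the hybrid idea described above: fit a per-segment shift and drift (the non-continuous deterministic part) by least squares, and take the residual scatter as the stochastic part. The synthetic data, segment boundary, and noise level are invented for illustration; none of this reproduces the paper's actual estimation.

```python
# Hybrid error sketch: non-continuous deterministic part (per-segment
# shift + drift) plus a stochastic residual. The jump at t = 50 stands
# in for, e.g., a GNSS outage in an urban canyon.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 100.0, 501)

# Synthetic "true" trajectory error with a discontinuity at t = 50,
# observed through white noise.
true_err = np.where(t < 50.0, 0.2 + 0.01 * t, -0.5 + 0.03 * t)
obs = true_err + rng.normal(0.0, 0.05, t.size)

def fit_segment(ts, es):
    """Least-squares shift (intercept) and drift (slope) for one segment."""
    A = np.column_stack([np.ones_like(ts), ts])
    coef, *_ = np.linalg.lstsq(A, es, rcond=None)
    return coef  # [shift, drift]

seg1 = fit_segment(t[t < 50.0], obs[t < 50.0])
seg2 = fit_segment(t[t >= 50.0], obs[t >= 50.0])

# Stochastic component: standard deviation of the residuals.
pred = np.where(t < 50.0, seg1[0] + seg1[1] * t, seg2[0] + seg2[1] * t)
sigma = np.std(obs - pred)
print(seg1, seg2, round(float(sigma), 3))
```

    The recovered shifts and drifts track the true per-segment values, and the residual standard deviation recovers the injected noise level, which is the separation the paper's deterministic-plus-stochastic decomposition relies on.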

  20. Geant4 models for space radiation environment.

    NASA Astrophysics Data System (ADS)

    Ivantchenko, Anton; Nieminen, Petteri; Incerti, Sebastien; Santin, Giovanni; Ivantchenko, Vladimir; Grichine, Vladimir; Allison, John

    The space radiation environment includes a wide variety of particles, from electrons to heavy ions. In order to correctly predict the dose received by astronauts and devices, the simulation models must have good applicability and produce accurate results from 10 MeV/u up to 10 GeV/u, where the most hazardous particles are present in the spectra. Appropriate models should also provide a good description of electromagnetic interactions down to very low energies (10 eV/u - 10 MeV/u) for understanding the damage mechanisms due to long-term low doses. Predictions of biological dose during long interplanetary journeys also need models for hadronic interactions of energetic heavy ions extending to higher energies (10 GeV/u - 100 GeV/u, and possibly up to 1 TeV/u). Geant4 is a powerful toolkit which in some areas well surpasses the needs of space radiation studies, while in other areas it is being developed and/or validated to properly cover the modelling requirements outlined above. Our activities in ESA projects deal with the research and development of both Geant4 hadronic and electromagnetic physics. Recently the scope of verification tests and benchmarks has been extended. Hadronic tests and benchmarks run proton, pion, and ion interactions with matter at various energies. In the Geant4 hadronic sub-libraries, the most accurate cross sections have been identified and selected as defaults for all particle types relevant to space applications. Significant developments were carried out for ion/ion interaction models. These now allow one to perform Geant4 simulations for all particle types and energies relevant to space applications. For the validation of ion models, the hadronic testing suite for ion interactions was significantly extended. In this work the results of benchmarking versus data in a wide energy range for projectile protons and ions will be shown and discussed. Here we show results of the test runs and their precision. Recommendations for Geant4

  1. Modeling of acetone biofiltration process

    SciTech Connect

    Hsiu-Mu Tang; Shyh-Jye Hwang; Wen-Chuan Wang

    1996-12-31

    The objective of this research was to investigate the kinetic behavior of the biofiltration process for the removal of acetone, which was used as a model compound for highly water-soluble gas pollutants. A mathematical model was developed by taking into account diffusion and biodegradation of acetone and oxygen in the biofilm, mass transfer resistance in the gas film, and flow pattern of the bulk gas phase. The simulated results obtained by the proposed model indicated that mass transfer resistance in the gas phase was negligible for this biofiltration process. Analysis of the relative importance of various rate steps indicated that the overall acetone removal process was primarily limited by the oxygen diffusion rate. 11 refs., 6 figs., 1 tab.
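
    The diffusion limitation identified above can be illustrated with the classical flat-biofilm effectiveness factor, assuming a first-order simplification of the kinetics (the paper's model couples acetone and oxygen with more general rate terms; all parameter values here are invented).

```python
# Flat-biofilm effectiveness factor under a first-order kinetic
# approximation: eta = tanh(phi) / phi, with Thiele modulus
# phi = L * sqrt(k / D). Parameter values below are illustrative only.
import math

def thiele_modulus(L, k, D):
    """L: biofilm thickness (m), k: first-order rate (1/s), D: diffusivity (m^2/s)."""
    return L * math.sqrt(k / D)

def effectiveness(phi):
    """Fraction of the intrinsic kinetic rate achieved in a flat biofilm."""
    return math.tanh(phi) / phi if phi > 0 else 1.0

phi = thiele_modulus(L=200e-6, k=0.05, D=1e-9)
print(round(phi, 2), round(effectiveness(phi), 3))
```

    An effectiveness factor well below one, as in this example, is the quantitative signature of a removal process limited by diffusion into the biofilm rather than by the gas-film resistance.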

  2. Modeling Production Plant Forming Processes

    SciTech Connect

    Rhee, M; Becker, R; Couch, R; Li, M

    2004-09-22

    Engineering has simulation tools and experience in modeling forming processes. Y-12 personnel have expressed interest in validating our tools and experience against their manufacturing process activities such as rolling, casting, and forging. We have demonstrated numerical capabilities in a collaborative DOE/OIT project with ALCOA that is nearing successful completion. The goal was to use ALE3D to model Alcoa's slab rolling process in order to demonstrate a computational tool that would allow Alcoa to define a rolling schedule that would minimize the probability of ingot fracture, thus reducing waste and energy consumption. The effort is intended to lead to long-term collaboration with Y-12 and perhaps involvement with other components of the weapons production complex. Using simulations to aid in the design of forming processes can: decrease time to production; reduce forming trials and associated expenses; and guide development of products with greater uniformity and less scrap.

  3. An Overview of NASA's Orbital Debris Environment Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    Using updated measurement data, analysis tools, and modeling techniques; the NASA Orbital Debris Program Office has created a new Orbital Debris Environment Model. This model extends the coverage of orbital debris flux throughout the Earth orbit environment, and includes information on the mass density of the debris as well as the uncertainties in the model environment. This paper will give an overview of this model and its implications for spacecraft risk analysis.

  4. Learning Environment, Learning Process, Academic Outcomes and Career Success of University Graduates

    ERIC Educational Resources Information Center

    Vermeulen, Lyanda; Schmidt, Henk G.

    2008-01-01

    This study expands on literature covering models on educational productivity, student integration and effectiveness of instruction. An expansion of the literature concerning the impact of higher education on workplace performance is also covered. Relationships were examined between the quality of the academic learning environment, the process of…

  5. Water related environment modelling on Mars.

    PubMed

    Kereszturi, Akos

    2004-01-01

    During human Mars exploration, because of the lack of time, astronauts need fast methods for the interpretation of unexpected observations, methods which give them flexibility and new, important targets. With in-situ modelling it is possible to get information on various past and present processes at the same location over a far wider spectrum than would be realized even during a long mission. This work summarizes the potential technical requirements and benefits of such modelling. Based on a simple estimation, with a 300 kg package and 1-10% of the working time of 1-2 astronauts at the same location, they can get plenty of new and important information about past and present Mars as a whole. With the proposed five test groups, astronauts will be able to make better and new kinds of interpretations of observations, and find better targets and methods during the same mission.

  6. Mathematical modeling of biomass fuels formation process.

    PubMed

    Gaska, Krzysztof; Wandrasz, Andrzej J

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels causes significant savings resulting from partial replacement of fossil fuels, and reduction of environmental pollution resulting directly from the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with algorithms that convert these data based on a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a reference point for the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, under the assumed constraints and decision variables of the task.
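
    The blending optimization described above can be sketched as a small linear program. This sketch uses SciPy's linprog rather than the paper's modified simplex implementation, and the component costs, calorific values, and chlorine limits are entirely hypothetical.

```python
# Illustrative fuel-blending LP: choose mass fractions of three
# waste-derived components to minimize cost, subject to a minimum
# calorific value and a maximum chlorine content. All numbers invented.
from scipy.optimize import linprog

cost = [0.02, 0.05, 0.01]   # cost per kg of each component
cal  = [18.0, 25.0, 12.0]   # calorific value, MJ/kg
cl   = [0.8, 0.2, 1.5]      # chlorine content, %

# Minimize cost subject to: fractions sum to 1, blend calorific value
# >= 16 MJ/kg, blend chlorine <= 1.0 % (A_ub @ x <= b_ub form).
res = linprog(
    c=cost,
    A_ub=[[-v for v in cal], cl],
    b_ub=[-16.0, 1.0],
    A_eq=[[1.0, 1.0, 1.0]],
    b_eq=[1.0],
    bounds=[(0.0, 1.0)] * 3,
)
print(res.x, round(res.fun, 4))
```

    As with any LP, the optimum lands on a vertex of the feasible region; here the chlorine constraint binds, so the cheapest blend mixes the two extreme components and skips the expensive middle one.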

  7. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  8. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    NASA Astrophysics Data System (ADS)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

    The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high-speed modeling environments, which offer great potential in this regard as they allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special-purpose accelerators capable of high-speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment uses an inverse method to conduct the parameter tuning. Using an inverse method reduces the amount of parameter space that must be searched to produce a set of parameters that describes the system well. Goodness of fit is determined by comparison to reference storm responses. To obtain reference storm responses, we collected data from a series of data loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled response to the reference responses, the model parameters can be tuned to quantitatively characterize the geometry, and thus the response, of the karst system.
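
    As a toy stand-in for the structured model described above, the sketch below runs an explicit finite-difference diffusion solve on a 1-D grid; each cell update depends only on its neighbors, which is what makes such stencils map naturally onto one-thread-per-cell GPGPU kernels. Plain NumPy is used here; the actual karst model, geometry, and parameters are not reproduced.

```python
# Explicit FTCS finite-difference step for 1-D diffusion, the kind of
# neighbor-only stencil that parallelizes cell-per-thread on a GPGPU.
import numpy as np

def diffuse(u, alpha, dx, dt, steps):
    """Explicit FTCS update; stable only for alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable"
    u = u.copy()
    for _ in range(steps):
        u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

u0 = np.zeros(101)
u0[50] = 1.0  # initial pulse, e.g. a tracer slug in a conduit
u1 = diffuse(u0, alpha=1.0, dx=1.0, dt=0.25, steps=200)
print(round(float(u1.sum()), 6))  # interior mass approximately conserved
```

    The stability bound on alpha*dt/dx**2 is the usual cost of the explicit scheme; it is also why throughput matters, since many small steps are needed, and why a fast (e.g. GPGPU) solver helps an inverse method evaluate many candidate parameter sets.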

  9. Comparison of nucleation processes in different extrasolar planetary environments

    NASA Astrophysics Data System (ADS)

    Patzer, A. B. C.; Gebauer, S.

    The formation of solid particles and/or liquid droplets has significant effects on the dynamic, thermal, and chemical structure of the planetary environments in which they are formed. In particular, the appearance of such objects depends on the character and distribution of atmospheric condensates. For example, the first (preliminary) classification of extrasolar giant planets (Sudarsky et al., 2003, ApJ, 588, 1121) is based on the likely condensates in their atmospheres. One of the most important steps in the study of how condensates form is to investigate the nucleation processes from the gas phase under the prevailing atmospheric conditions. In this contribution, different nucleation processes in atmospheres of giant planets - ranging from extrasolar 'hot Jupiters' to gas giants of the Solar System - are discussed. Selected applications are presented and compared.

  10. The concepts of energy, environment, and cost for process design

    SciTech Connect

    Abu-Khader, M.M.; Speight, J.G.

    2004-05-01

    The process industries (specifically, energy and chemicals) are characterized by a variety of reactors and reactions to bring about successful process operations. The design of energy-related and chemical processes and their evolution is a complex process that determines the competitiveness of these industries, as well as their environmental impact. Thus, we have developed an Enviro-Energy Concept designed to facilitate sustainable industrial development. The Complete Onion Model represents a complete methodology for chemical process design and illustrates all of the requirements to achieve the best possible design within the accepted environmental standards. Currently, NOx emissions from industrial processes continue to receive maximum attention; therefore, the problem of NOx emissions from industrial sources such as power stations and nitric acid plants is considered. Selective Catalytic Reduction (SCR) is one of the most promising and effective commercial technologies. It is considered the Best Available Control Technology (BACT) for NOx reduction. The solution of the NOx emissions problem is either through modifying the chemical process design and/or installing an end-of-pipe technology. The degree of integration between the process design and the installed technology plays a critical role in the capital cost evaluation. Therefore, integrating process units and then optimizing the design has a vital effect on the total cost. Both the environmental regulations and the cost evaluation are the boundary constraints of the optimum solution.

  11. An Instructional Method for the AutoCAD Modeling Environment.

    ERIC Educational Resources Information Center

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  12. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise; that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  13. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate and produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine. The reliable result produced by OEE can then be used to propose a suitable corrective action. Many published papers mention the purpose and benefits of OEE, covering the "what" and "why" factors. However, the "how" factor has not yet been revealed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring and then improving machine performance.
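
    The OEE measurement that such a framework builds on can be stated compactly as the standard Availability x Performance x Quality decomposition. Below is a minimal sketch; the shift figures are invented for illustration and are not from the paper.

```python
# Standard OEE decomposition: OEE = Availability * Performance * Quality.

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Example shift (invented): 480 min planned, 420 min actually running,
# 0.5 min ideal cycle time, 700 parts produced, 665 of them good.
value = oee(480.0, 420.0, 0.5, 700, 665)
print(round(value, 3))  # prints 0.693
```

    Separating the three factors is what makes OEE actionable: a low availability points at downtime, low performance at slow cycles or minor stops, and low quality at scrap and rework.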

  14. Planning: The Participatory Process Model.

    ERIC Educational Resources Information Center

    McDowell, Elizabeth V.

    The participatory planning process model developed by Peirce Junior College is described in this paper. First, the rationale for shifting from a traditional authoritarian style of institutional leadership to a participatory style which encourages a broader concern for the institution and lessens morale problems is offered. The development of a new…

  15. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects the processing parameters pressure, temperature, and time have on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism explaining the phenomenon by which the plies bond to themselves. Theoretical predictions from the Reptation Theory between autohesive strength and contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.
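
    The reptation-theory relation mentioned above can be sketched minimally: autohesive strength is commonly taken to grow as the fourth root of contact time until the reptation time is reached, after which the interface has full strength. The numbers below are illustrative, not the paper's data.

```python
# Reptation-theory scaling: degree of autohesion ~ (t / t_r)**(1/4),
# capped at 1 once the contact time t reaches the reptation time t_r.
# t_r itself is strongly temperature dependent, which is why temperature
# dominates over time in practice. Values below are illustrative.

def autohesion_degree(t, t_r):
    """Fraction of ultimate bond strength after contact time t."""
    return min((t / t_r) ** 0.25, 1.0)

# Doubling the contact time raises strength by only 2**0.25, about 19%.
for t in (0.1, 0.5, 1.0, 2.0):
    print(t, round(autohesion_degree(t, t_r=1.0), 3))
```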

  16. Biogeochemical processes in model estuaries

    NASA Astrophysics Data System (ADS)

    Church, Thomas M.

    Sixty researchers met to evaluate the effects of global change on estuaries and to improve estuarine modeling at the Second International Symposium on the Biogeochemistry of Model Estuaries, held April 15-19, 1991, at Jekyll Island, Ga. The importance of successful sampling in evaluating chemical fluxes and establishing records of estuarine change was articulated, as was the need for tracer tools for improved modeling. The symposium was sponsored by the National Science Foundation, National Oceanic and Atmospheric Administration, and the Department of Energy.Participants discussed particles and sedimentology, trace elements and metals, organic chemistry, and nutrient cycling of estuarine processes. Four days of presentations were followed by a half-day of discussion on advances in these topics and the overall goal of assessing estuarine processes in global change. What follows is a synopsis of this discussion.

  17. AMBA/D: a new programming environment for image processing

    NASA Astrophysics Data System (ADS)

    Roth, Karl N.; Hufnagl, Peter; Wolf, Guenter

    1992-04-01

    Recent practice in image processing is dominated by heuristic methods used to design practical, relevant algorithms. To ensure high efficiency in the design process, the communication between user and computer should be as direct as possible. An interactive software system for image processing is required to fulfill this demand. Interpreter-based systems with high interactivity available on the software market have the drawback of low operation speed. In AMBA/D we combine the performance of a compiler-based system with the interactivity of an interpreter system. The AMBA/D system is an interactive programming environment with integrated facilities to create, compile, execute, and debug programs. AMBA/D combines a compiled language, direct execution, and a programming concept with a collection of high-level image processing procedures. The design of a special compiler language was necessary because existing computer languages like FORTRAN, C, etc., do not fulfill our requirement of interactivity. The system runs on an IBM-compatible personal computer and can be used with different types of commercially available frame grabbers.

  18. Family Environment and Cognitive Development: Twelve Analytic Models

    ERIC Educational Resources Information Center

    Walberg, Herbert J.; Marjoribanks, Kevin

    1976-01-01

    The review indicates that refined measures of the family environment and the use of complex statistical models increase the understanding of the relationships between socioeconomic status, sibling variables, family environment, and cognitive development. (RC)

  19. Group Modeling in Social Learning Environments

    ERIC Educational Resources Information Center

    Stankov, Slavomir; Glavinic, Vlado; Krpan, Divna

    2012-01-01

    Students' collaboration while learning could provide better learning environments. Collaboration assumes social interactions which occur in student groups. Social theories emphasize positive influence of such interactions on learning. In order to create an appropriate learning environment that enables social interactions, it is important to…

  1. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall lines or other convective systems that are strongly forced by fronts or other lifting mechanisms. Less strongly forced systems are difficult to model. The next topic in this colloquium was measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown in measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The next subject was the use of satellite data to provide mesoscale moisture and wind analyses. Recent activities in modeling moist processes in mesoscale systems were then reported; these modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next, the effects of spatial resolution on moisture budgets were discussed, in particular the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The colloquium concluded with modeling of scale interaction processes.

  2. Exploring Undergraduate Students' Mental Models of the Environment: Are They Related to Environmental Affect and Behavior?

    ERIC Educational Resources Information Center

    Liu, Shu-Chiu; Lin, Huann-shyang

    2015-01-01

    A draw-and-explain task and questionnaire were used to explore Taiwanese undergraduate students' mental models of the environment and whether and how they relate to their environmental affect and behavioral commitment. We found that students generally held incomplete mental models of the environment, focusing on objects rather than on processes or…

  4. A Conceptual Model of Training Transfer that Includes the Physical Environment

    ERIC Educational Resources Information Center

    Hillsman, Terron L.; Kupritz, Virginia W.

    2007-01-01

    The study presents the physical environment as an emerging factor impacting training transfer and proposes to position this variable in the Baldwin and Ford (1988) model of the training transfer process. The amended model positions workplace design, one element of the physical environment, as a part of organizational context in the work…

  5. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    This report describes the research and analysis performed, software developed, and hardware/software recommendations made during 1992 in development of the PC-based data acquisition system supporting Welding Process Modeling and Control. The Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, the system is not intended to be used for welding process control.

  6. Modeling Low-temperature Geochemical Processes

    NASA Astrophysics Data System (ADS)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide range of applications, from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm = 1.01325 bar = 101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction (redox) transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reactions involving biotic interactions; and photoreactions. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, over a large range of scales, from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. Recognition of the strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for
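    One of the listed processes, mineral dissolution and precipitation, is typically diagnosed with a saturation index, SI = log10(IAP/Ksp), where IAP is the ion activity product. The sketch below is a minimal illustration of that calculation; the ion activities and the calcite Ksp are assumed round numbers for demonstration, not values from this record:

```python
import math

def saturation_index(iap, ksp):
    """Saturation index SI = log10(IAP/Ksp); SI > 0 means supersaturated,
    SI < 0 undersaturated, SI = 0 at equilibrium."""
    return math.log10(iap / ksp)

# Illustrative (hypothetical) ion activities for calcite, CaCO3:
a_ca = 1e-3                 # activity of Ca2+
a_co3 = 1e-5                # activity of CO3^2-
ksp_calcite = 10 ** -8.48   # a commonly cited value near 25 °C

iap = a_ca * a_co3
si = saturation_index(iap, ksp_calcite)
print(round(si, 2))  # 0.48 -> mildly supersaturated with respect to calcite
```

    A speciation code such as PHREEQC performs the same comparison after first solving for the free-ion activities in a full multicomponent solution.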

  7. Modeling the cometary environment using a fluid approach

    NASA Astrophysics Data System (ADS)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and the in-situ measurements by a handful of space missions reveal that the cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in the interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environment, they are computationally much more efficient than alternatives. In the simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for some computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool to study the dynamics of the cometary coma with different production rates and heliocentric distances. 
The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependencies of production rate

  8. Workflows for microarray data processing in the Kepler environment

    PubMed Central

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. 
These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  9. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  10. Modelling the Neutral Atmosphere and Plasma Environment of Saturn

    NASA Technical Reports Server (NTRS)

    Richardson, John D.; Jurac, S.; Johnson, R.; McGrath, M.

    2005-01-01

    The first year of this contract has resulted in two publications with the P.I. and co-I Jurac as lead authors and two publications where these team members are co-authors. These papers discuss modeling work undertaken in preparation for Cassini; the goal was to summarize our current best knowledge of the ion and neutral sources and distributions. One of the major goals of this project is to improve models of the plasma and neutral environment near Saturn. The paper "A self-consistent model of plasma and neutrals at Saturn: Neutral cloud morphology" [Jurac and Richardson, 2005] presents results on the neutral clouds near Saturn using a model which for the first time treats the ions and neutrals self-consistently. We also for the first time include a directly sputtered H source. The Voyager and HST observations are used as model constraints. The neutral source is adjusted to give a good match to the HST observations of OH. For this initial run the ion parameters from Richardson et al. are used; charge exchange with ions is a major neutral loss process. The neutral profile derived from the model is then used in a model of plasma transport and chemistry (with the plasma diffusion rate the only free parameter). This model gives new values of the ion composition, which are then fed back into the neutral model. This iteration continues until the values converge.

  11. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species and to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter

  12. A Process for Technology Prioritization in a Competitive Environment

    NASA Technical Reports Server (NTRS)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment. The In-Space Propulsion Technology (ISPT) project is used to exemplify the process. The ISPT project focuses on mid-level Technology Readiness Levels (TRLs) for development, TRLs 4 through 6 (i.e., Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and create ISPT technology development schedules based on these dates. There is a minimum of 4 years between flight and the pacing mission. The ISPT Project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection Roadmaps. The timeframes of the pacing missions' technologies are shown for various types of propulsion. A pacing mission in the near future serves to increase the priority for funding. Adaptations were made when budget reductions precluded the total implementation of the plan.

  13. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes the experimental data for amorphous aggregation to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.
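    The abstract does not give the model's rate equations, so as a hedged stand-in the sketch below uses a classical second-order (Smoluchowski-type) coagulation law to show how a single rate constant relates monomer loss to a turbidity-like observable; the parameter values are arbitrary:

```python
def monomer_fraction(t, k, m0=1.0):
    """Second-order coagulation dm/dt = -k*m**2, whose solution is
    m(t) = m0 / (1 + k*m0*t)."""
    return m0 / (1.0 + k * m0 * t)

def turbidity(t, k, tau_max=1.0, m0=1.0):
    """Toy turbidity proxy: rises toward tau_max as monomer is consumed
    by growing aggregates."""
    return tau_max * (1.0 - monomer_fraction(t, k, m0) / m0)

# With k = 0.5 and m0 = 1, half the monomer is gone at t = 1/(k*m0) = 2.
print(round(monomer_fraction(2.0, 0.5), 3), round(turbidity(2.0, 0.5), 3))
```

    Fitting such a curve to measured turbidity traces is what yields the rate constants the abstract mentions; the published model additionally extracts shape parameters of the aggregates.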

  14. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
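    Turk and Pentland's work is commonly associated with projecting face images onto a low-dimensional subspace of principal components ("eigenfaces") and recognizing by nearest neighbor in that subspace. The sketch below illustrates that idea under stated assumptions: random vectors stand in for real images, and the component count is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for a face database: 10 "images" of 64 pixels each.
faces = rng.normal(size=(10, 64))

mean_face = faces.mean(axis=0)
centered = faces - mean_face

# Principal components ("eigenfaces") via SVD of the centered data.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:5]                  # keep the top 5 components

weights = centered @ eigenfaces.T    # each face as a 5-d weight vector

def identify(probe):
    """Return the index of the stored face whose subspace weights are
    nearest (Euclidean) to the probe's."""
    w = (probe - mean_face) @ eigenfaces.T
    return int(np.argmin(np.linalg.norm(weights - w, axis=1)))

print(identify(faces[3]))  # a stored face matches itself -> 3
```

    The distance from a probe to the subspace itself can additionally be thresholded to decide whether the probe is a face at all, which is how the same machinery supports detection as well as identification.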

  15. Self-assembly processes in the prebiotic environment

    PubMed Central

    Deamer, David; Singaram, Sara; Rajamani, Sudha; Kompanichenko, Vladimir; Guggenheim, Stephen

    2006-01-01

    An important question guiding research on the origin of life concerns the environmental conditions where molecular systems with the properties of life first appeared on the early Earth. An appropriate site would require liquid water, a source of organic compounds, a source of energy to drive polymerization reactions and a process by which the compounds were sufficiently concentrated to undergo physical and chemical interactions. One such site is a geothermal setting, in which organic compounds interact with mineral surfaces to promote self-assembly and polymerization reactions. Here, we report an initial study of two geothermal sites where mixtures of representative organic solutes (amino acids, nucleobases, a fatty acid and glycerol) and phosphate were mixed with high-temperature water in clay-lined pools. Most of the added organics and phosphate were removed from solution with half-times measured in minutes to a few hours. Analysis of the clay, primarily smectite and kaolin, showed that the organics were adsorbed to the mineral surfaces at the acidic pH of the pools, but could subsequently be released in basic solutions. These results help to constrain the range of possible environments for the origin of life. A site conducive to self-assembly of organic solutes would be an aqueous environment relatively low in ionic solutes, at an intermediate temperature range and neutral pH ranges, in which cyclic concentration of the solutes can occur by transient dry intervals. PMID:17008220
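    The reported removal half-times suggest roughly exponential loss of solutes from solution. Assuming first-order kinetics (a simplification not stated in the abstract, and the numbers below are illustrative), the fraction still dissolved follows directly from the half-time:

```python
def fraction_remaining(t, half_time):
    """First-order removal: fraction of dissolved solute left after time t,
    given the observed half-time (same time units for both)."""
    return 0.5 ** (t / half_time)

# With an assumed 30-minute half-time, three-quarters of the solute has
# been adsorbed to the clay after one hour.
print(fraction_remaining(60.0, 30.0))  # 0.25
```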

  16. Construction material processed using lunar simulant in various environments

    NASA Technical Reports Server (NTRS)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high-temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 °C are attainable using this heating method.

  17. Self-assembly processes in the prebiotic environment.

    PubMed

    Deamer, David; Singaram, Sara; Rajamani, Sudha; Kompanichenko, Vladimir; Guggenheim, Stephen

    2006-10-29

    An important question guiding research on the origin of life concerns the environmental conditions where molecular systems with the properties of life first appeared on the early Earth. An appropriate site would require liquid water, a source of organic compounds, a source of energy to drive polymerization reactions and a process by which the compounds were sufficiently concentrated to undergo physical and chemical interactions. One such site is a geothermal setting, in which organic compounds interact with mineral surfaces to promote self-assembly and polymerization reactions. Here, we report an initial study of two geothermal sites where mixtures of representative organic solutes (amino acids, nucleobases, a fatty acid and glycerol) and phosphate were mixed with high-temperature water in clay-lined pools. Most of the added organics and phosphate were removed from solution with half-times measured in minutes to a few hours. Analysis of the clay, primarily smectite and kaolin, showed that the organics were adsorbed to the mineral surfaces at the acidic pH of the pools, but could subsequently be released in basic solutions. These results help to constrain the range of possible environments for the origin of life. A site conducive to self-assembly of organic solutes would be an aqueous environment relatively low in ionic solutes, at an intermediate temperature range and neutral pH ranges, in which cyclic concentration of the solutes can occur by transient dry intervals.

  18. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Meroni, A.; Bahr, T.

    2013-05-01

    Having access to SAR data can be highly important and critical, especially for disaster mapping. Updating a GIS with contemporary information from SAR data makes it possible to deliver a reliable set of geospatial information to advance civilian operations, e.g., search and rescue missions. We therefore present in this paper the operational processing of SAR data within a GIS environment for rapid disaster mapping. This is exemplified by the November 2010 flash flood in the Veneto region, Italy. A series of COSMO-SkyMed acquisitions was processed in ArcGIS® using a single-sensor, multi-mode, multi-temporal approach. The relevant processing steps were combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, which can be accessed both via a desktop and a server environment.

  19. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    NASA Technical Reports Server (NTRS)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including dynamic discovery of new services as they were continually added. A prototype example was built, and while it showed promise, a major disadvantage was that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, its use was limited by the difficulty of calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change, multiple data processing services were added, including coordinate transformations and subsetting of data. A collaboration with the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data and then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level, easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  20. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with, and impacts to, natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out because soil-plant-atmosphere interactions are difficult to simulate effectively. This paper introduces a simplified yet realistic model, a combination of existing surface runoff and ecohydrology models, designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation, and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.
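    As a rough sketch of how such fluxes interact, the single-bucket daily water balance below partitions precipitation between an impervious fraction and a soil store that supplies evapotranspiration. The structure and parameter names are illustrative assumptions, not the paper's model:

```python
def daily_balance(storage, precip, pet, capacity, impervious_frac):
    """One step of a toy urban bucket model.

    storage         current soil water store (mm)
    precip          daily precipitation (mm)
    pet             potential evapotranspiration demand (mm)
    capacity        maximum soil water store (mm)
    impervious_frac fraction of the surface that sheds water directly

    Returns (new_storage, runoff, actual_et).
    """
    runoff = precip * impervious_frac            # impervious surfaces shed directly
    storage += precip * (1.0 - impervious_frac)  # the rest infiltrates
    overflow = max(0.0, storage - capacity)      # saturation excess also runs off
    storage -= overflow
    et = min(pet, storage)                       # ET limited by available water
    storage -= et
    return storage, runoff + overflow, et

print(daily_balance(10.0, 5.0, 2.0, 20.0, 0.5))  # (10.5, 2.5, 2.0)
```

    Even this toy version shows the paper's central mechanism: increasing the impervious fraction shifts water from evapotranspiration to runoff.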

  1. The role of an electromagnetic environment model in spectrum management

    NASA Astrophysics Data System (ADS)

    Feller, A. H.

    1981-04-01

    The role of an electromagnetic (EM) environment model in spectrum management is developed. Spectrum management is traced from electromagnetic compatibility (EMC) considerations in international agreements through related domestic law to the fundamental spectrum management procedures: allocation, allotment, and assignment. The need for a model of the EM environment is derived from the requirements of allocation, allotment, and assignment proceedings. Data elements required to support an EM environment model for spectrum management purposes are reviewed. An outline and derivation of a general EM environment model is given. The ways systems respond to the EM environment are cataloged and reviewed so that specific applications of an EM environment model are readily apparent. Applications and limitations of current models are discussed.

  2. IFR fuel cycle process equipment design environment and objectives

    SciTech Connect

    Rigg, R.H.

    1993-03-01

    Argonne National Laboratory (ANL) is refurbishing the hot cell facility originally constructed with the EBR-II reactor. When refurbishment is complete, the facility will demonstrate the complete fuel cycle for current-generation high-burnup metallic fuel elements. These are sodium-bonded, stainless-steel-clad fuel pins of U-Zr or U-Pu-Zr composition, typical of the fuel type proposed for a future Integral Fast Reactor (IFR) design. To the extent possible, the process equipment is being built at full commercial scale, and the facility is being modified to incorporate current DOE facility design requirements and modern remote maintenance principles. The current regulatory and safety environment has affected the design of the fuel fabrication equipment, most of which will be described in greater detail in subsequent papers in this session.

  4. Enhancement of the triple alpha process in hot, dense environments

    NASA Astrophysics Data System (ADS)

    Beard, Mary; Austin, Sam M.; Cyburt, Richard

    2016-09-01

    The triple alpha process plays a particularly important role in nuclear astrophysics, bridging the A=5 and A=8 stability gaps to produce 12C. The reaction proceeds via the 0+ (Hoyle) resonance at 7.65 MeV in 12C, at a rate proportional to the radiative width of the state. For sufficiently hot and dense environments, the rate of the triple alpha reaction is significantly enhanced by hadronic inelastic scattering that de-excites the Hoyle state. We present theoretical calculations for the enhancement of the triple alpha rate based on inelastic n, p and alpha cross sections. For comparable densities, neutrons play the largest role. Supported by NSCL PHY11-02511 and JINA-CEE PHY14-30152.

  5. A Process Study of the Development of Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.

    2014-05-01

    In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research, and to foster global scientific communities. Our study utilizes a process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO, and on analysis of project documents and online resources. These sources are hand-tagged to identify events related to the thematic tracks, yielding a narrative of the project. Results demonstrate how the event series of an organization can be reconstructed through traditional methods augmented by virtual sources.

  6. Theoretical Models of Astrochemical Processes

    NASA Technical Reports Server (NTRS)

    Charnley, Steven

    2009-01-01

    Interstellar chemistry provides a natural laboratory for studying exotic species and processes at densities, temperatures, and reaction rates that are difficult or impractical to address in the laboratory. Thus, many chemical reactions considered too slow by the standards of terrestrial chemistry can be observed and modeled. Current proposals concerning the nature and chemistry of complex interstellar organic molecules will be described. Catalytic reactions on grain surfaces can, in principle, lead to a large variety of species, and this has motivated many laboratory and theoretical studies. Gas-phase processes may also build large species in molecular clouds. Future laboratory data and computational tools needed to construct accurate chemical models of various astronomical sources to be observed by Herschel and ALMA will be outlined.

  7. Mesoscopic Modeling of Reactive Transport Processes

    NASA Astrophysics Data System (ADS)

    Kang, Q.; Chen, L.; Deng, H.

    2012-12-01

    Reactive transport processes involving precipitation and/or dissolution are pervasive in geochemical, biological and engineered systems. Typical examples include self-assembled patterns such as Liesegang rings or bands, cones of stalactites in limestone caves, biofilm growth in aqueous environments, formation of mineral deposits in boilers and heat exchangers, uptake of toxic metal ions from polluted water by calcium carbonate, and mineral trapping of CO2. Compared to experimental studies, a numerical approach enables a systematic study of the reaction kinetics, mass transport, and mechanisms of nucleation and crystal growth, and hence provides a detailed description of reactive transport processes. In this study, we enhance a previously developed lattice Boltzmann pore-scale model by taking into account the nucleation process, and develop a mesoscopic approach to simulate reactive transport processes involving precipitation and/or dissolution of solid phases. The model is then used to simulate the formation of Liesegang precipitation patterns and investigate the effects of gel on the morphology of the precipitates. It is shown that this model can capture the porous structures of the precipitates and can account for the effects of the gel concentration and material. A wide range of precipitation patterns is predicted under different gel concentrations, including regular bands, treelike patterns, and, for the first time with numerical models, transition patterns from regular bands to treelike patterns. The model is also applied to study the effect of secondary precipitate on the dissolution of primary mineral. Several types of dissolution and precipitation processes are identified based on the morphology and structures of the precipitates and on the extent to which the precipitates affect the dissolution of the primary mineral. Finally the model is applied to study the formation of pseudomorphs. It is demonstrated for the first time by numerical simulation that a

  8. Diagnostic Modeling of PAMS VOC Observation on Regional Scale Environment

    NASA Astrophysics Data System (ADS)

    Chen, S.; Liu, T.; Chen, T.; Ou Yang, C.; Wang, J.; Chang, J. S.

    2008-12-01

    While a number of gas-phase chemical mechanisms, such as CBM-Z, RADM2, and SAPRC-07, have been successful in studying gas-phase atmospheric chemical processes, they all use lumped organic species to varying degrees. The Photochemical Assessment Monitoring Stations (PAMS) network has been in use for over ten years, and yet it is not clear how the detailed organic species measured by PAMS compare to the lumped model species under regional-scale transport and chemistry interactions. By developing a detailed mechanism specifically for the PAMS organics and embedding this diagnostic model within a regional-scale transport and chemistry model, we can directly compare PAMS observations with regional-scale model simulations. We modified one regional-scale chemical transport model (the Taiwan Air Quality Model, TAQM) by adding a submodel with a chemical mechanism for the interactions of the 56 species observed by PAMS. This submodel calculates the time evolution of these 56 PAMS species within the environment established by TAQM. It is assumed that TAQM can simulate the overall regional-scale environment, including the impact of regional-scale transport and the time evolution of oxidants and radicals. We can therefore scale these influences to the PAMS organic species and study their time evolution with their species-specific source functions, meteorological transport, and chemical interactions. Model simulations of each species are compared with PAMS hourly surface measurements. A case study in a metropolitan area in central Taiwan showed that with wind speeds lower than 3 m/s, when the meteorological simulation is comparable with observation, the simulated diurnal pattern of each species agrees well with the PAMS data. It is found that for many observations meteorological transport is an influence and that local emissions of specific species must be represented correctly. At this time there are still species that cannot be modeled properly. We suspect this is mostly due to lack of information on local

  9. Influence of global climatic processes on environment The Arctic seas

    NASA Astrophysics Data System (ADS)

    Kholmyansky, Mikhael; Anokhin, Vladimir; Kartashov, Alexandr

    2016-04-01

    One of the most pressing problems of the present is the change in the environment of Arctic regions under the influence of global climatic processes. The authors, as a result of work carried out in different areas of the Russian Arctic, have obtained materials characterizing the intensity of these processes. Complex investigations have been carried out in the water areas and coastal zones of the White, Barents, Kara and East Siberian seas, and on lake water areas of the subarctic region, from 1972 to the present. The investigations include hydrophysical and cryological observations, direct temperature measurements, analysis of drill data, electrometric determination of the parameters of the frozen zone, lithodynamic and geochemical determinations, geophysical borehole logging, and the study of glaciers on the basis of visual observations and analysis of photographs. The data obtained allow changes in the temperature of the water layer, the deposits and the near-bottom atmosphere to be estimated for the last 25 years. On average these amount to 0.38 °C for sea water, 0.23 °C for unconsolidated deposits and 0.72 °C for the atmosphere. Under the influence of temperature changes in the hydrosphere and lithosphere of the shelf, the characteristics of the cryolithic zone are changing: an increase in the depth of the roof of the cryolithic zone is observed over most of the studied water area. The recent rapid rise in the temperature of the ice-rich rocks composing the coast has led to avalanche-like thermo-denudation and to the delivery to the sea of a quantity of material three times exceeding the 1978 level. Warming also causes an appreciable retreat of the boundaries of the Arctic glacial covers. Our monitoring measurements show an increase in the oxygen content of the benthic zone, which is connected with a reduction in the overall salinity of the waters due to fresh water arriving from ice thawing. This, in turn, leads to changes in the biogenic part of the ecosystem.

  10. Morpheus: a user-friendly modeling environment for multiscale and multicellular systems biology.

    PubMed

    Starruß, Jörn; de Back, Walter; Brusch, Lutz; Deutsch, Andreas

    2014-05-01

    Morpheus is a modeling environment for the simulation and integration of cell-based models with ordinary differential equations and reaction-diffusion systems. It allows rapid development of multiscale models in biological terms and mathematical expressions rather than programming code. Its graphical user interface supports the entire workflow from model construction and simulation to visualization, archiving and batch processing.

  11. The Virtual Cell Modeling and Simulation Software Environment

    PubMed Central

    Moraru, Ion I.; Schaff, James C.; Slepchenko, Boris M.; Blinov, Michael; Morgan, Frank; Lakshminarayana, Anuradha; Gao, Fei; Li, Ye; Loew, Leslie M.

    2009-01-01

    The Virtual Cell (VCell; http://vcell.org/) is a problem solving environment, built on a central database, for analysis, modeling and simulation of cell biological processes. VCell integrates a growing range of molecular mechanisms, including reaction kinetics, diffusion, flow, membrane transport, lateral membrane diffusion and electrophysiology, and can associate these with geometries derived from experimental microscope images. It has been developed and deployed as a web-based, distributed, client-server system, with more than a thousand world-wide users. VCell provides a separation of layers (core technologies and abstractions) representing biological models, physical mechanisms, geometry, mathematical models and numerical methods. This separation clarifies the impact of modeling decisions, assumptions, and approximations. The result is a physically consistent, mathematically rigorous, spatial modeling and simulation framework. Users create biological models and VCell will automatically (i) generate the appropriate mathematical encoding for running a simulation, and (ii) generate and compile the appropriate computer code. Both deterministic and stochastic algorithms are supported for describing and running non-spatial simulations; a full partial differential equation solver using the finite volume numerical algorithm is available for reaction-diffusion-advection simulations in complex cell geometries including 3D geometries derived from microscope images. Using the VCell database, models and model components can be reused and updated, as well as privately shared among collaborating groups, or published. Exchange of models with other tools is possible via import/export of SBML, CellML, and MATLAB formats. Furthermore, curation of models is facilitated by external database binding mechanisms for unique identification of components and by standardized annotations compliant with the MIRIAM standard. VCell is now open source, with its native model encoding language

  12. Study of cache performance in distributed environment for data processing

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Šumbera, Michal

    2014-06-01

    Processing data in a distributed environment has found application in many fields of science (Nuclear and Particle Physics (NPP), astronomy, and biology, to name only a few). Efficiently transferring data between sites is an essential part of such processing. The implementation of caching strategies in data transfer software and tools, such as the Reasoner for Intelligent File Transfer (RIFT) being developed in the STAR collaboration, can significantly decrease network load and waiting time by reusing knowledge of data provenance, as well as data placed in the transfer cache, to further expand the availability of sources for files and data-sets. Although a great variety of caching algorithms is known, a study is needed to evaluate which one delivers the best performance for data access under realistic demand patterns. Records of access to the complete data-sets of NPP experiments were analyzed and used as input for computer simulations. Series of simulations were run to estimate the possible cache hits and cache hits per byte for known caching algorithms. The simulations were done for caches of different sizes within the interval 0.001-90% of the complete data-set and a low-watermark within 0-90%. Records of data access were taken from several experiments and over different time intervals in order to validate the results. In this paper, we discuss different data caching strategies, from canonical algorithms to hybrid cache strategies, present the results of our simulations for the various algorithms, and identify the best algorithm in the context of physics data analysis in NPP. While the results of these studies have been implemented in RIFT, they can also be used when setting up a cache in any other computational workflow (Cloud processing, for example) or when managing data storage with partial replicas of the entire data-set.
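The kind of trace-driven cache simulation the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the RIFT code: it replays a synthetic access trace through an LRU cache and reports the hit fraction; the trace and capacity are invented for demonstration.

```python
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Replay an access trace through an LRU cache and return the hit fraction.

    trace: iterable of file identifiers; capacity: max number of cached files.
    """
    cache = OrderedDict()
    hits = 0
    for item in trace:
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used item
            cache[item] = True
    return hits / len(trace)

# A skewed synthetic trace: file 0 is requested far more often than the rest.
trace = [0, 1, 0, 2, 0, 3, 0, 4, 0, 1, 0, 2]
print(lru_hit_rate(trace, capacity=2))  # 5 hits out of 12 accesses
```

Swapping the eviction rule (e.g. FIFO or LFU) and replaying real access logs, as the study does, allows the algorithms to be compared on equal footing.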

  13. Modeling climate related feedback processes

    SciTech Connect

    Elzen, M.G.J. den; Rotmans, J. )

    1993-11-01

    In order to assess their impact, the feedbacks which at present can be quantified reasonably are built into the Integrated Model to Assess the Greenhouse Effect (IMAGE). Unlike previous studies, this study describes the scenario- and time-dependent role of biogeochemical feedbacks. A number of simulation experiments are performed with IMAGE to project climate changes. Besides estimates of their absolute importance, the relative importance of individual biogeochemical feedbacks is considered by calculating the gain for each feedback process. This study focuses on feedback processes in the carbon cycle and the methane (semi-)cycle. Modeled feedbacks are then used to balance the past and present carbon budget. This results in substantially lower projections for atmospheric carbon dioxide than the Intergovernmental Panel on Climate Change (IPCC) estimates. The difference is approximately 18% from the 1990 level for the IPCC "Business-as-Usual" scenario. Furthermore, the IPCC's "best guess" value of the CO2 concentration in the year 2100 falls outside the uncertainty range estimated with our balanced modeling approach. For the IPCC "Business-as-Usual" scenario, the calculated total gain of the feedbacks within the carbon cycle appears to be negative, a result of the dominant role of the fertilization feedback. This study also shows that if temperature feedbacks on methane emissions from wetlands, rice paddies, and hydrates do materialize, methane concentrations might be increased by 30% by 2100. 70 refs., 17 figs., 7 tabs.
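The gain calculation mentioned in the abstract can be illustrated with the standard feedback formalism: if each feedback i contributes a gain g_i, the total response is amplified by f = 1/(1 - sum(g_i)), so a negative total gain (as found here for the carbon cycle) damps the initial response. The numeric gains below are illustrative, not values from IMAGE.

```python
def feedback_factor(gains):
    """Amplification factor f = 1 / (1 - sum(g_i)) for individual gains g_i.

    f > 1 means net amplification; f < 1 means net damping of the response.
    """
    g = sum(gains)
    assert g < 1.0, "total gain must be below 1 for a stable response"
    return 1.0 / (1.0 - g)

# Hypothetical example: a positive temperature feedback partially offset by
# a stronger negative fertilization feedback, giving a net negative gain.
print(feedback_factor([0.10, -0.25]))  # f < 1: net damping
```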

  14. Periglacial process and Pleistocene environment in northern China

    SciTech Connect

    Guo Xudong; Liu Dongsheng ); Yan Fuhua )

    1991-03-01

    At the present time, five kinds of periglacial phenomena have been identified: ice wedges, periglacial involutions, congelifolds, congeliturbations, and loess dunes. From the stratigraphical and geochronological data, the periglacial process is divided into six stages. (1) Guanting periglacial stage, characterized by the congeliturbative deposits that developed in the early Pleistocene Guanting loess-like formation; paleomagnetic dating gives 2.43 Ma B.P. (2) Yanchi periglacial stage, characterized by the congelifold that developed in the middle Pleistocene Yanchi Lishi loess formation; paleomagnetic dating gives 0.50 Ma B.P. (3) Zhaitang periglacial stage (II), characterized by the periglacial involutions that developed in the lower-middle Pleistocene Lishi loess formation; paleomagnetic dating gives 0.30 Ma B.P. (4) Zhaitang periglacial stage (I), characterized by the ice (soil) wedge that developed in the upper-middle Pleistocene Lishi loess formation; paleomagnetic dating gives 0.20 Ma B.P. (5) Qiansangyu periglacial stage (II), characterized by the ice (sand) wedges that developed in the late Pleistocene Malan loess formation; paleomagnetic dating gives 0.13 Ma B.P. (6) Qiansangyu periglacial stage (I), characterized by the ice (soil) wedge that developed in the late Pleistocene Malan loess-like formation; thermoluminescent dating gives 0.018 Ma B.P. Spore-pollen analysis shows that a savannah-steppe environment prevailed in northern China during the Pleistocene periglacial periods. These fossilized periglacial phenomena indicate a rather arid and windy periglacial environment with a mean annual temperature estimated to be some 12-15 °C colder than at present.

  15. A Collaborative Model for Ubiquitous Learning Environments

    ERIC Educational Resources Information Center

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  17. Modeling Stem Cell Induction Processes

    PubMed Central

    Grácio, Filipe; Cabral, Joaquim; Tidor, Bruce

    2013-01-01

    Technology for converting human cells to pluripotent stem cells using induction processes has the potential to revolutionize regenerative medicine. However, the production of these so-called iPS cells is still quite inefficient and may be dominated by stochastic effects. In this work we build mass-action models of the core regulatory elements controlling stem cell induction and maintenance. The models include not only the network of transcription factors NANOG, OCT4, and SOX2, but also important epigenetic regulatory features of DNA methylation and histone modification. We show that the network topology reported in the literature is consistent with the observed experimental behavior of bistability and inducibility. Based on simulations of stem cell generation protocols, and in particular focusing on changes in epigenetic cellular states, we show that cooperative and independent reaction mechanisms have experimentally identifiable differences in the dynamics of reprogramming, and we analyze such differences and their biological basis. It has been argued that stochastic and elite models of stem cell generation represent distinct fundamental mechanisms. Work presented here suggests an alternative possibility: that they represent differences in the amount of information we have about the distribution of cellular states before and during reprogramming protocols. We show further that unpredictability and variation in reprogramming decrease as the cell progresses along the induction process, and that identifiable groups of cells with elite-seeming behavior can come about by a stochastic process. Finally, we show how different mechanisms and kinetic properties impact the prospects of improving the efficiency of iPS cell generation protocols. PMID:23667423
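The bistability that such mass-action models exhibit can be illustrated with a minimal one-variable sketch: a hypothetical self-activating transcription factor with Hill kinetics, where the steady state reached depends on the initial expression level. Parameters and the network reduction are illustrative assumptions, not taken from the paper's NANOG/OCT4/SOX2 model.

```python
def simulate(x0, production=1.0, basal=0.02, K=0.5, n=4, decay=1.0,
             dt=0.01, steps=20000):
    """Euler integration of dx/dt = basal + production*x^n/(K^n + x^n) - decay*x.

    A self-activating factor; for these parameters the system is bistable,
    so the steady state depends on the initial level x0.
    """
    x = x0
    for _ in range(steps):
        dx = basal + production * x**n / (K**n + x**n) - decay * x
        x += dt * dx
    return x

low = simulate(x0=0.0)   # settles at the low expression state
high = simulate(x0=1.0)  # settles at the high expression state
print(low, high)
```

The two attractors play the role of the somatic-like and pluripotent-like states; in the full stochastic model, noise lets individual cells cross the unstable threshold between them, which is where the stochastic-versus-elite distinction arises.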

  18. Collective Properties of a Transcription Initiation Model Under Varying Environment.

    PubMed

    Hu, Yucheng; Lowengrub, John S

    2016-01-01

    The dynamics of gene transcription is tightly regulated in eukaryotes. Recent experiments have revealed various kinds of transcriptional dynamics, such as RNA polymerase II pausing, which involve regulation at the transcription initiation stage; the choice of regulation pattern is closely related to the physiological functions of the target gene. Here we consider a simplified model of transcription initiation, a process including the assembly of the transcription complex and the pausing and releasing of RNA polymerase II. Focusing on collective behaviors at the population level, we explore the potential regulatory functions this model can offer. These functions include fast and synchronized response to environmental change, or long-term memory about the transcriptional status. As a proof of concept we also show that, by selecting different control mechanisms, cells can adapt to different environments. These findings may help us better understand the design principles of transcriptional regulation.
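A toy Monte Carlo sketch shows why a pre-paused polymerase can give the fast, synchronized population response the abstract mentions. The two-step structure (complex assembly, then pause release) and the rate constants below are illustrative assumptions, not the paper's model.

```python
import random

def first_transcript_times(n_cells, k_assembly, k_release, pre_paused, seed=1):
    """Time to first transcript for each cell after an inducing signal.

    Without pausing, a cell must complete complex assembly and then polymerase
    release (two exponential waiting times); with a pre-paused polymerase only
    the release step remains, giving a faster, tighter response.
    """
    rng = random.Random(seed)
    times = []
    for _ in range(n_cells):
        t = 0.0
        if not pre_paused:
            t += rng.expovariate(k_assembly)  # transcription complex assembly
        t += rng.expovariate(k_release)       # pause release / initiation
        times.append(t)
    return times

naive = first_transcript_times(1000, k_assembly=0.2, k_release=2.0, pre_paused=False)
paused = first_transcript_times(1000, k_assembly=0.2, k_release=2.0, pre_paused=True)
print(sum(naive) / len(naive), sum(paused) / len(paused))
```

Because the slow assembly step is already done, the paused population responds with a much smaller mean delay and less cell-to-cell spread, i.e. a more synchronized response.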

  19. Modeling pellet impact drilling process

    NASA Astrophysics Data System (ADS)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling of the pellet impact drilling process, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  20. Shuttle measured contaminant environment and modeling for payloads. Preliminary assessment of the space telescope environment in the shuttle bay

    NASA Technical Reports Server (NTRS)

    Scialdone, J. J.

    1983-01-01

    A baseline gaseous and particulate environment of the Shuttle bay was developed based on the various measurements made during the first four flights of the Shuttle. The environment is described by the time-dependent pressure, density, scattered molecular fluxes, and column densities, including the transient effects of water dumps, engine firings, and opening and closing of the bay doors. The particulate conditions in the ambient environment and on surfaces were predicted as a function of mission time based on the available data. This basic Shuttle environment, when combined with the outgassing and particulate contributions of the payloads, can provide a description of the environment of a payload in the Shuttle bay. As an example of this application, the environment of the Space Telescope in the bay, which may be representative of the environment of several payloads, was derived. Among the many findings obtained in the process of modeling the environment, one is that the payload environment in the bay is not substantially different or more objectionable than the self-generated environment of a large payload or spacecraft. It is, however, more severe during ground facility operations, during the first 15 to 20 hours of the flight, during and for a short period after water is dumped overboard, and while the reaction control engines are being fired.

  1. Causal Model Progressions as a Foundation for Intelligent Learning Environments.

    DTIC Science & Technology

    1987-11-01

    By Barbara Y. White and John R. Frederiksen. This report describes the architecture of a new type of learning environment that incorporates features of microworlds and of intelligent tutoring systems. The environment is based on causal model progressions. The design principles underlying the creation of one type of causal model are then given (for zero-order models of electrical circuit behavior).

  2. Atmospheric Processing of Methylglyoxal and Glyoxal in Aqueous Environments

    NASA Astrophysics Data System (ADS)

    Axson, J. L.; Vaida, V.

    2009-12-01

    Methylglyoxal is a known oxidation product of biogenic and anthropogenic VOCs, observed in field studies and incorporated into atmospheric models. While the gas-phase chemistry of this compound is fairly well understood, its modeled concentration and role in SOA formation remain controversial. Methylglyoxal, like other aldehydes and ketones, is hydrated in the presence of water to form alcohols. Once the alcohol forms, it is likely to partition into clouds and aerosols because of its tendency to form intermolecular hydrogen bonds. This study evaluates the water-mediated equilibrium between methylglyoxal and its hydrates in the gas phase. The results can be used to understand the atmospheric fate and processing of this and similar organics. Our experimental approach employs Fourier transform spectroscopy to characterize gas-phase reagents and products as a function of relative humidity and to obtain an equilibrium constant between methylglyoxal and its hydrate.
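The role of the equilibrium constant can be sketched numerically. Assuming the simple one-step hydration MG + H2O <-> diol, so that [diol]/[MG] = K * p(H2O), the hydrated fraction rises with relative humidity. The value of K and the vapor-pressure figure below are illustrative, not the measured results of this study.

```python
def hydrated_fraction(K, p_water):
    """Fraction of the gas-phase carbonyl present as the hydrate (diol),
    given the equilibrium ratio [diol]/[MG] = K * p_water.

    K is in atm^-1 and p_water in atm; both are illustrative values.
    """
    ratio = K * p_water
    return ratio / (1.0 + ratio)

# Water vapor pressure rises with relative humidity, shifting the
# equilibrium toward the hydrate.
for rh in (0.1, 0.5, 0.9):
    p_water = rh * 0.031  # ~saturation vapor pressure of water at 25 C, in atm
    print(rh, hydrated_fraction(K=50.0, p_water=p_water))
```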

  3. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    forging effect of the shoulder. The energy balance at the boundary of the plastic region with the environment required that energy flow away from the boundary in both radial directions. One resolution to this problem may be to introduce a time dependency into the process model, allowing the energy flow to oscillate across this boundary. Finally, experimental measurements are needed to verify the concepts used here and to aid in improving the model.

  4. Arctic mosses govern below-ground environment and ecosystem processes.

    PubMed

    Gornall, J L; Jónsdóttir, I S; Woodin, S J; Van der Wal, R

    2007-10-01

    Mosses dominate many northern ecosystems, and their presence is integral to soil thermal and hydrological regimes which, in turn, dictate important ecological processes. Drivers such as climate change and increasing herbivore pressure affect the moss layer; thus, assessment of the functional role of mosses in determining soil characteristics is essential. Field manipulations conducted in high arctic Spitsbergen (78 degrees N), creating shallow (3 cm), intermediate (6 cm) and deep (12 cm) moss layers over the soil surface, had an immediate impact on soil temperature in terms of both average temperatures and amplitude of fluctuations. In soil under deep moss, temperature was substantially lower and organic layer thaw occurred 4 weeks later than in other treatment plots; the growing season for vascular plants was thereby reduced by 40%. Soil moisture was also reduced under deep moss, reflecting the influence of local heterogeneity in moss depth, over and above the landscape-scale topographic control of soil moisture. Data from field and laboratory experiments show that moss-mediated effects on the soil environment influenced microbial biomass and activity, resulting in warmer and wetter soil under thinner moss layers containing more plant-available nitrogen. In arctic ecosystems, which are limited by soil temperature, growing season length and nutrient availability, spatial and temporal variation in the depth of the moss layer has significant repercussions for ecosystem function. Evidence from our mesic tundra site shows that any disturbance causing a reduction in the depth of the moss layer will alleviate temperature and moisture constraints and therefore profoundly influence a wide range of ecosystem processes, including nutrient cycling and energy transfer.

  5. Development, validation and application of numerical space environment models

    NASA Astrophysics Data System (ADS)

    Honkonen, Ilja

    2013-10-01

    Currently the majority of space-based assets are located inside the Earth's magnetosphere where they must endure the effects of the near-Earth space environment, i.e. space weather, which is driven by the supersonic flow of plasma from the Sun. Space weather refers to the day-to-day changes in the temperature, magnetic field and other parameters of the near-Earth space, similarly to ordinary weather which refers to changes in the atmosphere above ground level. Space weather can also cause adverse effects on the ground, for example, by inducing large direct currents in power transmission systems. The performance of computers has been growing exponentially for many decades and as a result the importance of numerical modeling in science has also increased rapidly. Numerical modeling is especially important in space plasma physics because there are no in-situ observations of space plasmas outside of the heliosphere and it is not feasible to study all aspects of space plasmas in a terrestrial laboratory. With the increasing number of computational cores in supercomputers, the parallel performance of numerical models on distributed memory hardware is also becoming crucial. This thesis consists of an introduction, four peer reviewed articles and describes the process of developing numerical space environment/weather models and the use of such models to study the near-Earth space. A complete model development chain is presented starting from initial planning and design to distributed memory parallelization and optimization, and finally testing, verification and validation of numerical models. A grid library that provides good parallel scalability on distributed memory hardware and several novel features, the distributed cartesian cell-refinable grid (DCCRG), is designed and developed. DCCRG is presently used in two numerical space weather models being developed at the Finnish Meteorological Institute. The first global magnetospheric test particle simulation based on the

  6. Exactly constructing model of quantum mechanics with random environment

    SciTech Connect

    Gevorkyan, A. S.

    2010-02-15

    Dissipation and decoherence, interaction with random media, continuous measurements and many other complicated problems of open quantum systems are a result of the interaction of a quantum system with a random environment. Mathematically, these problems are described in terms of complex probabilistic processes (CPP). Note that a CPP satisfies a stochastic differential equation (SDE) of Langevin-Schrödinger (L-Sch) type, and is defined on the extended space R^1 ⊗ R_{γ}, where R^1 and R_{γ} are the Euclidean and the functional spaces, respectively. For simplicity, the model of a 1D quantum harmonic oscillator (QHO) with a stochastic environment is considered. On the basis of orthogonal CPP, the method of the stochastic density matrix (SDM) is developed. By the SDM method, the thermodynamical potentials, such as the nonequilibrium entropy and the energy of the 'ground state', are constructed in closed form. Expressions for the uncertainty relations and the Wigner function, depending on the interaction constant between the 1D QHO and the environment, are obtained.

  7. Gravity Modeling for Variable Fidelity Environments

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2006-01-01

    Aerospace simulations can model worlds, such as the Earth, with differing levels of fidelity. The simulation may represent the world as a plane, a sphere, an ellipsoid, or a high-order closed surface. The world may or may not rotate. The user may select lower fidelity models based on computational limits, a need for simplified analysis, or comparison to other data. However, the user will also wish to retain a close semblance of behavior to the real world. The effects of gravity on objects are an important component of modeling real-world behavior. Engineers generally equate the term gravity with the observed free-fall acceleration. However, free-fall acceleration is not the same for all observers. To observers on the surface of a rotating world, free-fall acceleration is the sum of gravitational attraction and the centrifugal acceleration due to the world's rotation. On the other hand, free-fall acceleration equals gravitational attraction to an observer in inertial space. Surface-observed simulations (e.g. aircraft), which use non-rotating world models, may choose to model observed free-fall acceleration as the gravity term; such a model actually combines gravitational attraction with centrifugal acceleration due to the Earth's rotation. However, this modeling choice invites confusion as one evolves the simulation to higher fidelity world models or adds inertial observers. Care must be taken to model gravity in concert with the world model to avoid denigrating the fidelity of modeling observed free fall. The paper will go into greater depth on gravity modeling and the physical disparities and synergies that arise when coupling specific gravity models with world models.
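    The distinction the abstract draws between gravitational attraction and observed free fall can be sketched numerically. This is a minimal sketch assuming a spherical, uniformly rotating Earth; the constants and the function name are illustrative, not from the paper.

```python
import math

G_M = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R = 6.371e6           # mean Earth radius, m (spherical-Earth assumption)
OMEGA = 7.2921159e-5  # Earth's rotation rate, rad/s

def observed_free_fall(lat_deg: float) -> float:
    """Free-fall acceleration seen by a surface observer on a rotating,
    spherical Earth: gravitational attraction minus the radial component
    of the centrifugal acceleration at the given geocentric latitude."""
    lat = math.radians(lat_deg)
    g_attraction = G_M / R**2                   # inertial-observer value
    centrifugal = OMEGA**2 * R * math.cos(lat)  # points away from spin axis
    # Only the outward (radial) component reduces observed free fall.
    return g_attraction - centrifugal * math.cos(lat)
```

    At the pole the centrifugal term vanishes and the two notions of gravity coincide; at the equator they differ by about 0.034 m/s^2, which is exactly the discrepancy a non-rotating world model must fold into its gravity term.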

  8. Collapse models and perceptual processes

    NASA Astrophysics Data System (ADS)

    Carlo Ghirardi, Gian; Romano, Raffaele

    2014-04-01

    Theories including a collapse mechanism were proposed several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merit derives from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can in principle be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow one at least to put precise limits on the parameters characterizing the modifications of the evolution equation. Here we briefly mention some of the recent investigations in this direction, while concentrating our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures performed with standard macroscopic devices are discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. By discussing a toy model in detail, we make it plausible that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  9. Indoor environment modeling for interactive robot security application

    NASA Astrophysics Data System (ADS)

    Jo, Sangwoo; Shahab, Qonita M.; Kwon, Yong-Moo; Ahn, Sang Chul

    2006-10-01

    This paper presents a simple and easy-to-use method for obtaining a 3D textured model. To express reality, the 3D models must be integrated with real scenes. Most other 3D modeling methods use two data-acquisition devices: one for capturing the 3D model and another for obtaining realistic textures, typically a 2D laser range-finder and a conventional camera, respectively. Our algorithm consists of building a measurement-based 2D metric map acquired by the laser range-finder, texture acquisition/stitching, and texture-mapping onto the corresponding 3D model. The algorithm is implemented with a laser sensor for obtaining the 2D/3D metric map and two cameras for gathering texture. Our geometric 3D model consists of planes that model the floor and walls; the geometry of the planes is extracted from the 2D metric-map data. Textures for the floor and walls are generated from images captured by two 1394 cameras with wide field-of-view angles. Image stitching and image cutting are used to generate textured images corresponding to the 3D model. The algorithm is applied to two cases: a corridor, and a four-walled space such as a room of a building. The generated 3D map model of the indoor environment is expressed in VRML format and can be viewed in a web browser with a VRML plug-in. The proposed algorithm can be applied to a 3D model-based remote surveillance system through the WWW.
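    The plane-based geometry described above, walls extracted from a 2D metric map, can be sketched by vertically extruding each 2D wall segment into a 3D quad. The function below is an illustrative assumption, not the paper's implementation.

```python
# Each wall in the 2D metric map is a line segment (x1, y1)-(x2, y2);
# extruding it vertically by the room height yields a 3D wall quad.
def extrude_wall(x1, y1, x2, y2, height):
    """Return the four 3D corner points of a wall plane, ordered so the
    quad can be texture-mapped directly (u along the wall, v upward)."""
    return [
        (x1, y1, 0.0),      # bottom-left
        (x2, y2, 0.0),      # bottom-right
        (x2, y2, height),   # top-right
        (x1, y1, height),   # top-left
    ]

# A 4 m wall segment from the map, extruded to a 2.5 m ceiling.
quad = extrude_wall(0.0, 0.0, 4.0, 0.0, 2.5)
```

    The stitched texture image would then be mapped onto each quad using the same corner ordering.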

  10. Information-educational environment with adaptive control of learning process

    NASA Astrophysics Data System (ADS)

    Modjaev, A. D.; Leonova, N. M.

    2017-01-01

    In recent years a new scientific branch connected with management of the social sphere, called "social cybernetics", has been developing intensively. Within this branch, the theory and methods of social sphere management are being formed. Considerable attention is paid to management carried out directly in real time. However, the solution of such management tasks is largely constrained by the absence, or insufficiently deep study, of the relevant sections of management theory and methods. The article discusses the use of cybernetic principles in solving control problems in social systems. Applied to educational activities, a model of composite interrelated objects representing the behaviour of students at various stages of the educational process is introduced. Statistical processing of experimental data obtained during the actual learning process is carried out. When the number of features used is increased, additionally taking into account the degree and nature of variability in students' current progress during various types of studies, new properties of student grouping are discovered. L-clusters were identified, reflecting the behaviour of learners with similar characteristics during lectures. It was established that the characteristics of the clusters contain information about the dynamics of learners' behaviour, allowing them to be used in additional lessons. Ways of solving the problem of adaptive control based on the identified dynamic characteristics of the learners are outlined.

  11. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  13. NUMERICAL MODELING OF FINE SEDIMENT PHYSICAL PROCESSES.

    USGS Publications Warehouse

    Schoellhamer, David H.

    1985-01-01

    Fine sediment in channels, rivers, estuaries, and coastal waters undergoes several physical processes, including flocculation, floc disruption, deposition, bed consolidation, and resuspension. This paper presents a conceptual model and reviews mathematical models of these physical processes. Several general fine-sediment models that simulate some of these processes are reviewed. These general models do not directly simulate flocculation and floc disruption, but the conceptual model and existing functions are shown to adequately model these two processes for one set of laboratory data.

  14. Understanding Fundamental Material Degradation Processes in High Temperature Aggressive Chemomechanical Environments

    SciTech Connect

    Stubbins, James; Gewirth, Andrew; Sehitoglu, Huseyin; Sofronis, Petros; Robertson, Ian

    2014-01-16

    The objective of this project is to develop a fundamental understanding of the mechanisms that limit materials durability for very high-temperature applications. Current design limitations are based on material strength and corrosion resistance. This project will characterize the interactions of high-temperature creep, fatigue, and environmental attack in structural metallic alloys of interest for the very high-temperature gas-cooled reactor (VHTR) or Next–Generation Nuclear Plant (NGNP) and for the associated thermo-chemical processing systems for hydrogen generation. Each of these degradation processes presents a major materials design challenge on its own, but in combination, they can act synergistically to rapidly degrade materials and limit component lives. This research and development effort will provide experimental results to characterize creep-fatigue-environment interactions and develop predictive models to define operation limits for high-temperature structural material applications. Researchers will study individually and in combination creep-fatigue-environmental attack processes in Alloys 617, 230, and 800H, as well as in an advanced Ni-Cr oxide dispersion strengthened steel (ODS) system. For comparison, the study will also examine basic degradation processes in nichrome (Ni-20Cr), which is a basis for most high-temperature structural materials, as well as many of the superalloys. These materials are selected to represent primary candidate alloys, one advanced developmental alloy that may have superior high-temperature durability, and one model system on which basic performance and modeling efforts can be based. The research program is presented in four parts, which all complement each other. The first three are primarily experimental in nature, and the last will tie the work together in a coordinated modeling effort. The sections are (1) dynamic creep-fatigue-environment process, (2) subcritical crack processes, (3) dynamic corrosion – crack

  15. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered on three focal points. One is a taxonomy of impacts that describes the indicators and provides absolute scales for their evaluation. Setting best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is advancing definitions of the data needs for the many indicators of the taxonomy. Each of the indicators has specific data that are necessary for its calculation; values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  16. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects u that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.

  17. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    PubMed Central

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970
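    The Kronecker-product construction of the genetic covariance described in this abstract can be sketched with NumPy. The sizes, marker matrix, and between-environment correlation matrix below are illustrative, not taken from the CIMMYT data sets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 10 genotypes scored with 50 markers, evaluated in 3 environments.
n_geno, n_markers, n_env = 10, 50, 3
X = rng.standard_normal((n_geno, n_markers))

# GBLUP-style genomic relationship kernel: G = XX' / n_markers.
G = X @ X.T / n_markers

# Assumed genetic correlation matrix between environments (illustrative).
E = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])

# Covariance of the genetic effects u across all genotype-environment
# combinations: the Kronecker product of the two matrices, whose (i, j)
# block is E[i, j] * G.
K_u = np.kron(E, G)   # shape (n_env * n_geno, n_env * n_geno)
```

    Swapping G for a Gaussian kernel of marker distances gives the GK variant; the second model in the abstract adds a further random effect f with its own covariance on top of K_u.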

  18. Interoperation Modeling for Intelligent Domotic Environments

    NASA Astrophysics Data System (ADS)

    Bonino, Dario; Corno, Fulvio

    This paper introduces an ontology-based model for domotic device inter-operation. Starting from a previously published ontology (DogOnt), a refactoring and extension is described that allows explicit representation of device capabilities, states, and commands, and supports abstract modeling of device inter-operation.

  19. [Applying analytical hierarchy process to assess eco-environment quality of Heilongjiang province].

    PubMed

    Li, Song; Qiu, Wei; Zhao, Qing-liang; Liu, Zheng-mao

    2006-05-01

    The analytical hierarchy process (AHP) was adopted to study the index system for an eco-province, and an index system was set up for eco-province construction. The comparison matrices were constructed on the basis of experts' questionnaires, and MATLAB 6.5 was used to compute the weights of the indices. The general environment quality index model was used to grade environmental quality and to assess the progress of eco-province construction in Heilongjiang province. The results indicate that it is feasible to apply the AHP to quantitatively assess ecological environmental quality province-wide. The ecological environment quality of Heilongjiang province has improved markedly since the beginning of eco-province construction.
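    The AHP weighting step described above, deriving index weights from a pairwise comparison matrix, can be sketched as a principal-eigenvector computation with Saaty's consistency check. The 3×3 matrix below is illustrative, not from the study's questionnaires.

```python
import numpy as np

# Illustrative pairwise comparison matrix on Saaty's 1-9 scale;
# a[i, j] states how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# AHP weights: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
RI = 0.58                       # Saaty's random index for n = 3
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / RI                    # CR < 0.1 indicates acceptable consistency
```

    A CR above 0.1 would signal that the experts' pairwise judgments are too inconsistent and the questionnaire should be revisited before the weights are used.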

  20. Network Modeling and Simulation Environment (NEMSE)

    DTIC Science & Technology

    2012-07-01

  1. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for processing of spatial data which integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud capabilities. The main idea is to create a flexible working environment for research groups that can be scaled according to the required data volume and computing power while keeping infrastructure costs to a minimum. It is based on the "single window" principle, which combines data access via geoportal functionality, processing capabilities and communication between researchers. Using this innovative software environment, the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D visualization, and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. This approach will make it possible to organize research and present results at a new technological level, providing more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing distributed spatial information, for which we suggest implementing the user interface as an advanced front-end, e.g., for a virtual globe system.

  2. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    SciTech Connect

    Krosel, S.M.; Milner, E.J.

    1982-01-01

    Illustrates the application of predictor-corrector integration algorithms developed for the digital parallel processing environment. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, inter-processor communication and algorithm startup are also discussed. 10 references.

  3. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
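    A predictor-corrector step of the kind this abstract refers to can be sketched serially; the mapping onto parallel processors studied in the paper is not reproduced here. This is a generic second-order Adams-Bashforth/Adams-Moulton pair, not necessarily the exact algorithm of the study.

```python
import math

def pc_step(f, t, y, f_prev, h):
    """One second-order predictor-corrector step for y' = f(t, y):
    Adams-Bashforth-2 predictor, Adams-Moulton (trapezoid) corrector.
    f_prev is f evaluated at the previous step."""
    fn = f(t, y)
    y_pred = y + h * (1.5 * fn - 0.5 * f_prev)      # predictor
    y_corr = y + h * 0.5 * (fn + f(t + h, y_pred))  # corrector
    return y_corr, fn

# Integrate y' = -y, y(0) = 1 with a fixed step; exact solution is exp(-t).
f = lambda t, y: -y
h, t, y = 0.01, 0.0, 1.0
f_prev = f(t - h, math.exp(h))   # bootstrap with the exact previous value
for _ in range(100):
    y, f_prev = pc_step(f, t, y, f_prev, h)
    t += h
# y now approximates exp(-1)
```

    The step-size effect the abstract mentions appears directly here: halving h roughly quarters the global error of this second-order pair.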

  4. Modelling Three-Dimensional Sound Propagation in Wedge Environments

    NASA Astrophysics Data System (ADS)

    Austin, Melanie Elizabeth

    Ocean environments with sloped seafloors can give rise to sound paths that do not remain in a constant plane of propagation. Numerical modelling of sound fields in such environments requires the use of computer models that fully account for out-of-plane sound propagation effects. The inclusion of these three-dimensional effects can be computationally intensive, and the effects are often neglected in computer sound propagation codes. The current state of the art in sound propagation modelling has seen the development of models that can fully account for out-of-plane sound propagation. Such a model has been implemented in this research to provide acoustic consultants JASCO Applied Sciences with an important tool for environmental noise impact assessment in complicated marine environments. The model is described and validation results are shown for benchmark test cases. The model is also applied to study three-dimensional propagation effects in measured data from a realistic ocean environment. Particular analysis techniques assist in the interpretation of the modelled sound field for this physical test environment, providing new insight into the characteristics of the test environment.

  5. LIGHT-INDUCED PROCESSES AFFECTING ENTEROCOCCI IN AQUATIC ENVIRONMENTS

    EPA Science Inventory

    Fecal indicator bacteria such as enterococci have been used to assess contamination of freshwater and marine environments by pathogenic microorganisms. Various past studies have shown that sunlight plays an important role in reducing concentrations of culturable enterococci and ...

  7. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers greater possibilities for high-power switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state-resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  8. Modeling Obscurants in an Urban Environment

    DTIC Science & Technology

    2007-12-01

    For turbulent cascades over the inertial subrange of the atmosphere, the Hurst parameter H = 1/3; for uncorrelated Brownian motion, H = 1/2.

  9. A new security model for collaborative environments

    SciTech Connect

    Agarwal, Deborah; Lorch, Markus; Thompson, Mary; Perry, Marcia

    2003-06-06

    Prevalent authentication and authorization models for distributed systems provide for the protection of computer systems and resources from unauthorized use. The rules and policies that drive the access decisions in such systems are typically configured up front and require trust establishment before the systems can be used. This approach does not work well for computer software that moderates human-to-human interaction. This work proposes a new model for trust establishment and management in computer systems supporting collaborative work. The model supports the dynamic addition of new users to a collaboration with very little initial trust placed in their identity, and supports the incremental building of trust relationships through endorsements from established collaborators. It also recognizes the strength of a user's authentication when making trust decisions. By mimicking the way humans build trust naturally, the model can support a wide variety of usage scenarios. Its particular strength lies in its support for ad-hoc and dynamic collaborations and ubiquitous access to a Computer Supported Collaboration Workspace (CSCW) system from locations with varying levels of trust and security.
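    The endorsement-based trust accumulation described above might be sketched as follows. The scoring formula, the weights, and the authentication-strength factors are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch: a newcomer starts with minimal base trust, each
# endorsement from an established collaborator adds trust discounted by
# the endorser's own trust level, and the total is scaled by the
# strength of the newcomer's authentication.
AUTH_STRENGTH = {"password": 0.5, "x509": 1.0}  # illustrative factors

def trust_level(endorsements, auth_method, base=0.05, cap=1.0):
    """endorsements: list of (endorser_trust, endorsement_weight) pairs."""
    score = base
    for endorser_trust, weight in endorsements:
        score += endorser_trust * weight
    return min(cap, score * AUTH_STRENGTH[auth_method])

# A newcomer endorsed by two established members, strongly authenticated.
t = trust_level([(0.9, 0.3), (0.8, 0.2)], "x509")
```

    The same endorsements under a weaker authentication method yield a lower level, reflecting the model's point that authentication strength should factor into trust decisions.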

  10. Integrated Modeling of the Battlespace Environment

    DTIC Science & Technology

    2010-10-01

    representation of the environmental conditions that might impact a DoD mission. Environmental processes interact on multiple time scales, and many such

  11. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  13. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  15. Cupola Furnace Computer Process Model

    SciTech Connect

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings made annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing (electric) furnaces and the ability to melt less expensive metallic scrap. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace optimally. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency; repeating the process iteratively leads to near-optimum operating conditions within just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an ''Expert System'' to permit optimization in real time, and has been combined with ''neural network'' programs to effect very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems can be found in the ''Cupola Handbook'', Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  16. Duties Performed by Department Chairmen in Holland's Model Environments

    ERIC Educational Resources Information Center

    Smart, John C.

    1976-01-01

    Results of this study demonstrate that department chairmen of academic departments, classified according to Holland's model environments, devote significantly different amounts of time to selected dimensions of their job and that these differences are generally consistent with the psychological resemblances among these environments. (Author/DEP)

  17. The Quality of Home Environment in Brazil: An Ecological Model

    ERIC Educational Resources Information Center

    de Oliveira, Ebenezer A.; Barros, Fernando C.; Anselmi, Luciana D. da Silva; Piccinini, Cesar A.

    2006-01-01

    Based on Bronfenbrenner's (1999) ecological perspective, a longitudinal, prospective model of individual differences in the quality of home environment (Home Observation for Measurement of the Environment--HOME) was tested in a sample of 179 Brazilian children and their families. Perinatal measures of family socioeconomic status (SES) and child…

  18. Adaptive User Model for Web-Based Learning Environment.

    ERIC Educational Resources Information Center

    Garofalakis, John; Sirmakessis, Spiros; Sakkopoulos, Evangelos; Tsakalidis, Athanasios

    This paper describes the design of an adaptive user model and its implementation in an advanced Web-based Virtual University environment that encompasses combined and synchronized adaptation between educational material and well-known communication facilities. The Virtual University environment has been implemented to support a postgraduate…

  19. User behavioral model in hypertext environment

    NASA Astrophysics Data System (ADS)

    Moskvin, Oleksii M.; Sailarbek, Saltanat; Gromaszek, Konrad

    2015-12-01

    Users traversing Internet resources play an important role, yet in practice their activity is rarely taken into account by resource owners when adjusting and optimizing hypertext structure. An optimal hypertext structure lets users locate pages of interest, which are the goals of their informational search, more quickly. This paper presents a model that analyzes the behavior of the user audience in order to infer their goals within a particular hypertext segment and to find optimal routes to those goals in terms of route length and informational value. The proposed model applies chiefly to systems that evaluate hypertext networks and optimize their referential structure for faster information retrieval.
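
    The paper's model is not reproduced in this abstract, but the "optimal route in terms of route length" part can be illustrated with a uniform-cost search over a hypertext link graph. The graph and page names below are invented; scoring by informational value would require per-page weights the abstract does not give.

```python
import heapq

# Illustrative sketch only: shortest route (fewest link hops) through a
# hypothetical hypertext graph, the length half of the abstract's criterion.

def best_route(graph, start, goal):
    """Uniform-cost search over link hops. Returns (hops, route) or None."""
    heap = [(0, [start])]
    seen = set()
    while heap:
        hops, route = heapq.heappop(heap)
        node = route[-1]
        if node == goal:
            return hops, route
        if node in seen:
            continue
        seen.add(node)
        for nxt in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(heap, (hops + 1, route + [nxt]))
    return None

site = {"home": ["news", "docs"], "news": ["archive"],
        "docs": ["archive", "faq"], "archive": ["faq"]}
print(best_route(site, "home", "faq"))  # → (2, ['home', 'docs', 'faq'])
```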

  20. Opponent Modeling in Interesting Adversarial Environments

    DTIC Science & Technology

    2008-07-01

    the range of this random variable is 55 42. In a match, even a fraction of a small bet per hand is significant in differentiating performance, we... bets per hand would the learning agent have beat the non-learning agent. Furthermore, if we know that our target opponent O uses a fixed policy that does...We calculate the performance bounds for several classes of models in two domains: high card draw with simultaneous betting and a new simultaneous

  1. Integrated approaches to the application of advanced modeling technology in process development and optimization

    SciTech Connect

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  2. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations, however, are very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical
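
    The caloric-balance step (crop calories feed people and their animals; the remainder bounds potential population) reduces to simple arithmetic. The sketch below uses entirely invented numbers, not values from the paper, to show the shape of that calculation.

```python
# Illustrative caloric-balance step: potential population from crop calories.
# All parameter values are assumptions for illustration, not from the paper.

def potential_population(yield_kcal_per_ha, area_ha, animal_share,
                         kcal_per_person_yr=9.1e5):
    """Potential population supportable by a harvest, after a fixed
    fraction of calories goes to domesticated animals."""
    total = yield_kcal_per_ha * area_ha
    human_share = total * (1.0 - animal_share)  # rest feeds herd/draft animals
    return human_share / kcal_per_person_yr

# e.g. 2 million kcal/ha over 1000 ha, with 30% of calories going to animals
print(int(potential_population(2.0e6, 1000, 0.3)))  # → 1538
```

    In the paper's framework this number would then be further constrained by labor requirements and land limitations before becoming a land-use scenario.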

  3. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several available models should be selected as the standard. Most of these discussions at the ISO WG4 meetings, conferences, etc., have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a ‘process-based’ standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which not only could have quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than necessarily a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used must not only be clearly and unambiguously defined but also be subject to peer review. If a model meets all of these requirements, then it is acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences but only some of them; most importantly, however, it allows something which so far has been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date one of the problems (if not the major one) in comparing the results of the various different SEPE

  4. Process modeling and industrial energy use

    SciTech Connect

    Howe, S O; Pilati, D A; Sparrow, F T

    1980-11-01

    How the process models developed at BNL are used to analyze industrial energy use is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Case study results from the pulp and paper model illustrate how process models can be used to analyze a variety of issues. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for energy end-use modeling and conservation analysis. Information on the current status of industry models at BNL is tabulated.

  5. ARTEMIS. Ares Real Time Environment for Modeling, Integrating, and Simulation

    NASA Technical Reports Server (NTRS)

    Walker, David; Hughes, Ryan

    2009-01-01

    This slide presentation reviews the ARTEMIS (Ares Real Time Environment for Modeling, Integration, and Simulation) for Ares hardware testing. It includes information on the ARTEMIS organization, SIL architecture, and I/O layer.

  6. Modeling Small-Scale Nearshore Processes

    NASA Astrophysics Data System (ADS)

    Slinn, D.; Holland, T.; Puleo, J.; Puleo, J.; Hanes, D.

    2001-12-01

    In recent years advances in high performance computing have made it possible to gain new qualitative and quantitative insights into the behavior and effects of coastal processes using high-resolution physical-mathematical models. The Coastal Dynamics program at the U.S. Office of Naval Research under the guidance of Dr. Thomas Kinder has encouraged collaboration between modelers, theoreticians, and field and laboratory experimentalists and supported innovative modeling efforts to examine a wide range of nearshore processes. An area of emphasis has been small-scale, time-dependent, turbulent flows, such as the wave bottom boundary layer, breaking surface waves, and the swash zone and their effects on shoaling waves, mean currents, and sediment transport that integrate to impact the long-term and large-scale response of the beach system to changing environmental conditions. Examples of small-scale modeling studies supported by CD-321 related to our work include simulation of the wave bottom boundary layer. Under mild wave field conditions the seabed forms sand ripples and simulations demonstrate that the ripples cause increases in the bed friction, the kinetic energy dissipation rates, the boundary layer thickness, and turbulence in the water column. Under energetic wave field conditions the ripples are sheared smooth and sheet flow conditions can predominate, causing the top few layers of sand grains to move as a fluidized bed, making large aggregate contributions to sediment transport. Complementary models of aspects of these processes have been developed simultaneously in various directions (e.g., Jenkins and Hanes, JFM 1998; Drake and Calantoni, 2001; Trowbridge and Madsen, JGR, 1984). Insight into near-bed fluid-sediment interactions has also been advanced using Navier-Stokes based models of swash events. Our recent laboratory experiments at the Waterways Experiment Station demonstrate that volume-of-fluid models can predict salient features of swash uprush

  7. UWB Radio Link Modeling for Multipath Environment

    NASA Astrophysics Data System (ADS)

    Uguen, B.; Talom, F. T.; Chassay, G.

    Ultra-wideband (UWB) systems have the potential to provide very high data rates over short distances. This paper details the construction of the received signal from a pulse waveform on the transmission line at the transmitter to the received signal at the receiver allowing a fine analysis of antennas impact on UWB systems. This is used for deterministic modeling based on ray tracing (RT) and geometrical optics (GO). The paper presents the way of sizing the pulse amplitude in order to satisfy a given power spectral density (PSD) mask for a given modulation. The proposed approach is demonstrated to be coherent with classical power link budget.

  8. Analytical modeling of satellites in geosynchronous environment

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1980-01-01

    Experiences with surface charging of geosynchronous satellites are reviewed and mechanisms leading to discharges on satellite surfaces are considered. It was found that the large differential voltages between the surface and the substrate required to produce massive laboratory discharges do not occur on satellites in space. Analytical modeling predictions supported by dielectric charging data from P78-2, SCATHA (Spacecraft Charging at High Altitudes) flight results are discussed. Ungrounded insulator areas, buried charge layers (due to mid-energy range particles), and positive differential voltages (where structure voltages are less negative than surrounding dielectric surface voltages) are considered as possible mechanisms producing satellite charge up.

  9. Exascale Co-design for Modeling Materials in Extreme Environments

    SciTech Connect

    Germann, Timothy C.

    2014-07-08

    Computational materials science has provided great insight into the response of materials under extreme conditions that are difficult to probe experimentally. For example, shock-induced plasticity and phase transformation processes in single-crystal and nanocrystalline metals have been widely studied via large-scale molecular dynamics simulations, and many of these predictions are beginning to be tested at advanced 4th generation light sources such as the Advanced Photon Source (APS) and Linac Coherent Light Source (LCLS). I will describe our simulation predictions and their recent verification at LCLS, outstanding challenges in modeling the response of materials to extreme mechanical and radiation environments, and our efforts to tackle these as part of the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx). ExMatEx has initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. We anticipate that we will be able to exploit hierarchical, heterogeneous architectures to achieve more realistic large-scale simulations with adaptive physics refinement, and are using tractable application scale-bridging proxy application testbeds to assess new approaches and requirements. Such current scale-bridging strategies accumulate (or recompute) a distributed response database from fine-scale calculations, in a top-down rather than bottom-up multiscale approach.

  10. Sensitivity of UO2 Stability in a Reducing Environment on Radiolysis Model Parameters

    SciTech Connect

    Wittman, Richard S.; Buck, Edgar C.

    2012-09-01

    Results for a radiolysis model sensitivity study of radiolytically produced H2O2 are presented as they relate to Spent (or Used) Light Water Reactor uranium oxide (UO2) nuclear fuel (UNF) oxidation in a low oxygen environment. The model builds on previous reaction kinetic studies to represent the radiolytic processes occurring at the nuclear fuel surface. Hydrogen peroxide (H2O2) is the dominant oxidant for spent nuclear fuel in an O2-depleted water environment.
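
    The kind of kinetic balance such a radiolysis model resolves, radiolytic production of H2O2 against first-order loss, has a simple closed form worth noting. The rate constants below are invented placeholders, not parameters from the study.

```python
import math

# Toy sketch of a radiolytic production/consumption balance:
#   dC/dt = g_rate - k_loss * C
# with invented constants; the actual model couples many such reactions.

def h2o2_concentration(g_rate, k_loss, t, c0=0.0):
    """Analytic solution: C relaxes toward the steady state g_rate / k_loss."""
    c_ss = g_rate / k_loss
    return c_ss + (c0 - c_ss) * math.exp(-k_loss * t)

# production 1e-9 mol/(L*s), first-order loss 1e-4 /s -> steady state 1e-5 mol/L
print(h2o2_concentration(1e-9, 1e-4, t=1e6))  # → 1e-05
```

    The steady-state value g_rate / k_loss is the quantity whose sensitivity to the rate parameters a study like this one probes.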

  11. Analog modelling of obduction processes

    NASA Astrophysics Data System (ADS)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. 
Displacements, together with along-strike and across-strike internal deformation in all

  12. Use of terrestrial laser scanning (TLS) for monitoring and modelling of geomorphic processes and phenomena at a small and medium spatial scale in Polar environment (Scott River — Spitsbergen)

    NASA Astrophysics Data System (ADS)

    Kociuba, Waldemar; Kubisz, Waldemar; Zagórski, Piotr

    2014-05-01

    The application of Terrestrial Laser Scanning (TLS) for precise modelling of land relief and quantitative estimation of spatial and temporal transformations can contribute to better understanding of catchment-forming processes. Experimental field measurements utilising the 3D laser scanning technology were carried out within the Scott River catchment located in the NW part of the Wedel Jarlsberg Land (Spitsbergen). The measurements concerned the glacier-free part of the Scott River valley floor with a length of 3.5 km and width from 0.3 to 1.5 km and were conducted with a state-of-the-art medium-range stationary laser scanner, a Leica Scan Station C10. A complex set of measurements of the valley floor were carried out from 86 measurement sites interrelated by the application of 82 common 'target points'. During scanning, from 5 to 19 million measurements were performed at each of the sites, and a point-cloud constituting a 'model space' was obtained. By merging individual 'model spaces', a Digital Surface Model (DSM) of the Scott River valley was obtained, with a co-registration error not exceeding ± 9 mm. The accuracy of the model permitted precise measurements of dimensions of landforms of varied scales on the main valley floor and slopes and in selected sub-catchments. The analyses verified the efficiency of the measurement system in Polar meteorological conditions of Spitsbergen in mid-summer.

  13. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the basis throughout software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
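
    The paper's CMMI-specific model is not given in this abstract, but its generic core, computing pairwise correlations of process-element scores across assessed projects, can be sketched directly. The practice names and scores below are invented.

```python
import math

# Illustrative sketch: Pearson correlation of two process elements' assessment
# scores across projects. The paper's actual model is CMMI-specific; the data
# here are invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# rows: five assessed projects; columns: scores for two hypothetical practices
req_mgmt  = [2, 3, 4, 4, 5]
proj_plan = [1, 3, 3, 5, 5]
print(round(pearson(req_mgmt, proj_plan), 2))  # → 0.89
```

    A high correlation would flag that an improvement plan targeting one practice should account for the other, which is the kind of dependency the paper argues current plans overlook.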

  14. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process model and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward way to converge these models.

  15. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture.

    PubMed

    Rooney, Kevin K; Condia, Robert J; Loschky, Lester C

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  16. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture

    PubMed Central

    Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one’s fist at arm’s length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  17. Simulation model of clastic sedimentary processes

    SciTech Connect

    Tetzlaff, D.M.

    1987-01-01

    This dissertation describes SEDSIM, a computer model that simulates erosion, transport, and deposition of clastic sediments by free-surface flow in natural environments. SEDSIM is deterministic and is applicable to sedimentary processes in rivers, deltas, continental shelves, submarine canyons, and turbidite fans. The model is used to perform experiments in clastic sedimentation. Computer experimentation is limited by the computing power available, but is free from the scaling problems associated with laboratory experiments. SEDSIM responds to information provided to it at the outset of a simulation experiment, including topography, subsurface configuration, physical parameters of fluid and sediment, and characteristics of sediment sources. Extensive computer graphics are incorporated in SEDSIM. The user can display the three-dimensional geometry of simulated deposits in the form of successions of contour maps, perspective diagrams, vector plots of current velocities, and vertical sections of any azimuth orientation. The sections show both sediment age and composition. SEDSIM works realistically with processes involving channel shifting and topographic changes. Example applications include simulation of an ancient submarine canyon carved into a Cretaceous sequence in the National Petroleum Reserve in Alaska, known mainly from seismic sections, and a sequence of Tertiary age in the Golden Meadow oil field of Louisiana, known principally from well logs.

  18. Time of arrival through interacting environments: Tunneling processes

    NASA Astrophysics Data System (ADS)

    Aoki, Ken-Ichi; Horikoshi, Atsushi; Nakamura, Etsuko

    2000-08-01

    We discuss the propagation of wave packets through interacting environments. Such environments generally modify the dispersion relation or shape of the wave function. To study such effects in detail, we define the distribution function PX(T), which describes the arrival time T of a packet at a detector located at point X. We calculate PX(T) for wave packets traveling through a tunneling barrier and find that our results actually explain recent experiments. We compare our results with Nelson's stochastic interpretation of quantum mechanics and resolve a paradox previously apparent in Nelson's viewpoint about the tunneling time.
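
    The abstract does not define PX(T) explicitly. One standard construction, offered here only as a plausible sketch and not necessarily the authors' definition, normalizes the quantum probability current J at the detector position X over all arrival times:

```latex
P_X(T) = \frac{J(X,T)}{\int_0^{\infty} J(X,t)\,dt},
\qquad
J(x,t) = \frac{\hbar}{m}\,\operatorname{Im}\!\left[\psi^{*}(x,t)\,\frac{\partial \psi(x,t)}{\partial x}\right]
```

    Under this reading, a barrier between source and detector reshapes the wave packet and hence shifts and distorts PX(T), which is the tunneling-time effect the abstract compares against experiment.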

  19. Electric discharge processes in the ISS plasma environment

    NASA Astrophysics Data System (ADS)

    Tverdokhlebova, E. M.; Korsun, A. G.; Gabdullin, F. F.; Karabadzhak, G. F.

    We consider the behaviour of the electric discharges which can be initiated between constructional elements of the International Space Station (ISS) due to the electric field of high-voltage solar arrays (HVSA). The characteristics of the ISS plasma environment are evaluated taking into account the influence of space ionizing fluxes, the Earth's magnetic field, and the HVSA's electric field. We also present the formulation of the space experiment "Plasma-ISS", whose aim is to investigate, using optical emission characteristics, the parameters of the ISS plasma environment formed during operation of the onboard engines and other plasma sources.

  20. Model of the home food environment pertaining to childhood obesity.

    PubMed

    Rosenkranz, Richard R; Dzewaltowski, David A

    2008-03-01

    The home food environment can be conceptualized as overlapping interactive domains composed of built and natural, sociocultural, political and economic, micro-level and macro-level environments. Each type and level of environment uniquely contributes influence through a mosaic of determinants depicting the home food environment as a major setting for shaping child dietary behavior and the development of obesity. Obesity is a multifactorial problem, and the home food environmental aspects described here represent a substantial part of the full environmental context in which a child grows, develops, eats, and behaves. The present review includes selected literature relevant to the home food environment's influence on obesity with the aim of presenting an ecologically informed model for future research and intervention in the home food environment.

  1. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
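
    LEGEND's actual pair-wise algorithm is not spelled out in this abstract, so the sketch below shows only the generic Monte Carlo idea behind such a projection step: draw a Bernoulli outcome per conjunction pair so that the expected number of simulated breakups matches the accumulated collision probability. The pair count, probabilities, and timestep are invented.

```python
import random

# Illustrative Monte Carlo collision step (NOT LEGEND's actual algorithm):
# each object pair collides in the projection step with probability
# p_per_year * dt_years, so the expected event count matches the
# accumulated probability.

def expected_collisions(pair_probs, dt_years, rng):
    """pair_probs: per-year collision probability for each object pair."""
    events = 0
    for p_per_year in pair_probs:
        if rng.random() < p_per_year * dt_years:  # Bernoulli draw per pair
            events += 1
    return events

rng = random.Random(42)                 # seeded for reproducibility
pairs = [1e-3] * 10000                  # 10,000 pairs at 1e-3 per year each
print(expected_collisions(pairs, 5.0, rng))  # expectation is 50 events
```

    A real projection would recompute the pair probabilities as orbits evolve, which is where the "random sampling in time" refinement mentioned above comes in.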

  2. Prediction under change: invariant model parameters in a varying environment

    NASA Astrophysics Data System (ADS)

    Schymanski, S. J.; Or, D.; Roderick, M. L.; Sivapalan, M.

    2012-04-01

    Hydrological understanding is commonly synthesised into complex mechanistic models, some of which have become "as inscrutable as nature itself" (Harte, 2002). Parameters for most models are estimated from past observations. This may result in an ill-posed problem with associated "equifinality" (Beven, 1993), in which the information content in calibration data is insufficient for distinguishing a suitable parameter set among all possible sets. Consequently, we are unable to identify the "correct" parameter set that produces the right results for the right reasons. Incorporation of new process knowledge into a model adds new parameters that exacerbate the equifinality problem. Hence improved process understanding has not necessarily translated into improved models nor contributed to better predictions. Prediction under change confronts us with additional challenges: 1. Varying boundary conditions: Projections into the future can no longer be guided by observations in the past to the same degree as they could when the boundary conditions were considered stationary. 2. Ecohydrological adaptation: Common model parameters related to vegetation properties (e.g. canopy conductance, rooting depths) cannot be assumed invariant, as vegetation dynamically adapts to its environment. 3. No analog conditions for model evaluation: Climate change and in particular rising atmospheric CO2 concentrations will lead to conditions that cannot be found anywhere on Earth at present. Therefore it is doubtful whether the ability of a hydrological model to reproduce the past is indicative of its trustworthiness for predicting the future. We propose that optimality theory can help address some of the above challenges. Optimality theory submits that natural systems self-optimise to attain certain goal functions (or "objective functions"). Optimality principles allow an independent prediction of system properties that would otherwise require direct observations or calibration.
The resulting

  3. NoteCards: A Multimedia Idea Processing Environment.

    ERIC Educational Resources Information Center

    Halasz, Frank G.

    1986-01-01

    NoteCards is a computer environment designed to help people work with ideas by providing a set of tools for a variety of specific activities, which can range from sketching on the back of an envelope to formally representing knowledge. The basic framework of this hypermedia system is a semantic network of electronic notecards connected by…

  4. Autism and Digital Learning Environments: Processes of Interaction and Mediation

    ERIC Educational Resources Information Center

    Passerino, Liliana M.; Santarosa, Lucila M. Costi

    2008-01-01

    Using a socio-historical perspective to explain social interaction and taking advantage of information and communication technologies (ICTs) currently available for creating digital learning environments (DLEs), this paper seeks to redress the absence of empirical data concerning technology-aided social interaction between autistic individuals. In…

  5. Intelligent sensing in dynamic environments using markov decision process.

    PubMed

    Nanayakkara, Thrishantha; Halgamuge, Malka N; Sridhar, Prasanna; Madni, Asad M

    2011-01-01

    In a network of low-powered wireless sensors, it is essential to capture as many environmental events as possible while still preserving the battery life of the sensor node. This paper focuses on a real-time learning algorithm to extend the lifetime of a sensor node to sense and transmit environmental events. A common method that is generally adopted in ad-hoc sensor networks is to periodically put the sensor nodes to sleep. The purpose of the learning algorithm is to couple the sensor's sleeping behavior to the natural statistics of the environment so that it can be in optimal harmony with changes in the environment: the sensors can sleep when the environment is steady and stay awake when it is turbulent. This paper presents theoretical and experimental validation of a reward-based learning algorithm that can be implemented on an embedded sensor. The key contribution of the proposed approach is the design and implementation of a reward function that satisfies a trade-off between the above two mutually contradicting objectives, and a linear critic function to approximate the discounted sum of future rewards in order to perform policy learning.
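
A heavily simplified sketch of the idea described above (not the paper's algorithm): a reward trades off captured events against energy spent awake, and a linear critic tracks the discounted return via a TD(0) update. All constants, the drifting activity signal, and the sleep policy rule are invented for illustration:

```python
import random

def simulate(steps=2000, alpha=0.1, gamma=0.9, seed=1):
    """Toy sensor-sleep learner: reward = events captured - energy cost,
    with a linear critic V(s) = w0 + w1 * activity updated by TD(0)."""
    rng = random.Random(seed)
    w = [0.0, 0.0]                      # linear critic weights over [1, activity]
    sleep_frac = 0.5                    # current policy: fraction of time asleep
    activity = 0.2                      # environment event rate (slowly drifting)
    for _ in range(steps):
        activity = min(1.0, max(0.0, activity + rng.gauss(0, 0.02)))
        awake = 1.0 - sleep_frac
        reward = awake * activity - 0.3 * awake   # capture minus energy cost
        # TD(0) update of the linear value estimate (next state approximated
        # by the same features, a deliberate simplification)
        v = w[0] + w[1] * activity
        delta = reward + gamma * v - v
        w[0] += alpha * delta
        w[1] += alpha * delta * activity
        # crude policy step: sleep more when the environment is steady/quiet
        sleep_frac = min(0.9, max(0.1, 1.0 - activity))
    return sleep_frac, w

print(simulate())
```

The real contribution in the paper is the shape of the reward function and its embedded implementation; the sketch only shows how a linear critic and a sleep policy fit together.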

  6. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  8. Unveiling Hidden Unstructured Regions in Process Models

    NASA Astrophysics Data System (ADS)

    Polyvyanyy, Artem; García-Bañuelos, Luciano; Weske, Mathias

    Process models define allowed process execution scenarios. The models are usually depicted as directed graphs, with gateway nodes regulating the control flow routing logic and with edges specifying the execution order constraints between tasks. While arbitrarily structured control flow patterns in process models complicate model analysis, they also permit creativity and full expressiveness when capturing non-trivial process scenarios. This paper gives a classification of arbitrarily structured process models based on the hierarchical process model decomposition technique. We identify a structural class of models consisting of block structured patterns which, when combined, define complex execution scenarios spanning across the individual patterns. We show that complex behavior can be localized by examining structural relations of loops in hidden unstructured regions of control flow. The correctness of the behavior of process models within these regions can be validated in linear time. These observations allow us to suggest techniques for transforming hidden unstructured regions into block-structured ones.

  9. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2008-01-01

    This report presents a pilot study of an integration of the particle swarm algorithm, social knowledge adaptation, and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups to achieve efficient collective searching behavior in the adaptive environment.
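
The particle swarm metaphor for collective search in a changing environment can be sketched with a generic PSO tracking a hypothetical moving target; the coefficients are conventional PSO defaults, and none of this is the report's multi-agent model:

```python
import random

def pso_track(steps=200, n=20, seed=0):
    """Particles blend personal memory (pbest) and group knowledge (gbest)
    to follow a target that drifts each step, standing in for an adaptive
    environment. All parameters are illustrative."""
    rng = random.Random(seed)
    target = 10.0
    xs = [rng.uniform(-50, 50) for _ in range(n)]
    vs = [0.0] * n
    pbest = list(xs)
    def f(x):                          # search cost: distance to target
        return abs(x - target)
    gbest = min(pbest, key=f)
    for _ in range(steps):
        target += 0.05                 # the environment drifts each step
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)      # group knowledge is re-evaluated
    return gbest, target

gbest, target = pso_track()
print(abs(gbest - target))
```

Because fitness is re-evaluated against the current target, the group's shared best continuously adapts, which is the behavior the swarm metaphor is meant to capture.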

  10. Particle Swarm Based Collective Searching Model for Adaptive Environment

    SciTech Connect

    Cui, Xiaohui; Patton, Robert M; Potok, Thomas E; Treadwell, Jim N

    2007-01-01

    This report presents a pilot study of an integration of the particle swarm algorithm, social knowledge adaptation, and multi-agent approaches for modeling the collective search behavior of self-organized groups in an adaptive environment. The objective of this research is to apply the particle swarm metaphor as a model of social group adaptation for the dynamic environment and to provide insight into and understanding of social group knowledge discovery and strategic searching. A new adaptive environment model, which dynamically reacts to the group's collective searching behaviors, is proposed in this research. The simulations in the research indicate that effective communication between groups is not a necessary requirement for self-organized groups to achieve efficient collective searching behavior in the adaptive environment.

  11. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. 
The model development, input data, sensitivity and validation studies described in

  12. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    SciTech Connect

    E. Sonnenthal

    2001-04-16

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models.
The model development, input data, sensitivity and validation studies described in this AMR are required

  13. Econobiophysics - game of choosing. Model of selection or election process with diverse accessible information

    PubMed Central

    2011-01-01

    We propose several models applicable to both selection and election processes when each selecting or electing subject has access to different information about the objects to choose from. We wrote special software to simulate these processes. We consider both the case in which the environment is neutral (a natural process) and the case in which the environment is involved (a controlled process). PMID:21892959

  14. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    ERIC Educational Resources Information Center

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  16. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
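
The equivalence between partial differential equations and partial difference equations that the abstract relies on can be illustrated with 1-D diffusion, u_t = D u_xx, written as an explicit difference scheme; the grid spacing, time step, and D below are arbitrary illustrative choices (with D*dt/dx^2 <= 0.5 for stability):

```python
def diffuse(u, D=0.1, dx=1.0, dt=1.0, steps=50):
    """Explicit finite-difference step for 1-D diffusion with reflecting
    boundaries: u[i] += (D*dt/dx^2) * (u[i+1] - 2*u[i] + u[i-1])."""
    for _ in range(steps):
        u = [u[i] + D * dt / dx**2 *
             (u[min(i+1, len(u)-1)] - 2*u[i] + u[max(i-1, 0)])
             for i in range(len(u))]
    return u

u0 = [0.0]*10 + [1.0] + [0.0]*10     # initial spike in the middle
u = diffuse(u0)
print(round(sum(u), 6))               # total mass is conserved: 1.0
```

The difference form is exactly the kind of graphical, cell-by-cell update rule the paper argues application specialists can enter directly, while the PDE is its continuous counterpart.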

  17. Supporting Inquiry Processes with an Interactive Learning Environment: Inquiry Island

    NASA Astrophysics Data System (ADS)

    Eslinger, Eric; White, Barbara; Frederiksen, John; Brobst, Joseph

    2008-12-01

    This research addresses the effectiveness of an interactive learning environment, Inquiry Island, as a general-purpose framework for the design of inquiry-based science curricula. We introduce the software as a scaffold designed to support the creation and assessment of inquiry projects, and describe its use in a middle-school genetics unit. Students in the intervention showed significant gains in inquiry skills. We also illustrate the power of the software to gather and analyze qualitative data about student learning.

  18. Inquiry, play, and problem solving in a process learning environment

    NASA Astrophysics Data System (ADS)

    Thwaits, Anne Y.

    What is the nature of art/science collaborations in museums? How do art objects and activities contribute to the successes of science centers? Based on the premise that art exhibitions and art-based activities engage museum visitors in different ways than do strictly factual, information-based displays, I address these questions in a case study that examines the roles of visual art and artists in the Exploratorium, a museum that has influenced exhibit design and professional practice in many of the hands-on science centers in the United States and around the world. The marriage of art and science in education is not a new idea---Leonardo da Vinci and other early polymaths surely understood how their various endeavors informed one another, and some 20th century educators understood the value of the arts and creativity in the learning and practice of other disciplines. When, in 2010, the National Science Teachers Association added an A to the federal government's ubiquitous STEM initiative and turned it into STEAM, art educators nationwide took notice. With a heightened interest in the integration of and collaboration between disciplines comes an increased need for models of best practice for educators and educational institutions. With the intention to understand the nature of such collaborations and the potential they hold, I undertook this study. I made three site visits to the Exploratorium, where I took photos, recorded notes in a journal, interacted with exhibits, and observed museum visitors. I collected other data by examining the institution's website, press releases, annual reports, and fact sheets; and by reading popular and scholarly articles written by museum staff members and by independent journalists. 
    I quickly realized that the Exploratorium was not created in the way that most museums are, and the history of its founding and the ideals of its founder illuminate what was then and continues now to be different about this museum from most others in the

  19. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  20. Active microrheology of a model of the nuclear micromechanical environment

    NASA Astrophysics Data System (ADS)

    Byrd, Henry; Kilfoil, Maria

    2014-03-01

    In order to successfully complete the final stages of chromosome segregation, eukaryotic cells require the motor enzyme topoisomerase II, which can resolve topological constraints between entangled strands of duplex DNA. We created an in vitro model of a close approximation of the nuclear micromechanical environment in terms of DNA mass and entanglement density, and investigated the influence of this motor enzyme on the DNA mechanics. Topoisomerase II is a non-processive ATPase which we found significantly increases the motions of embedded microspheres in the DNA network. Because of this activity, we study the mechanical properties of our model system by active microrheology using optical trapping. We test the limits of the fluctuation dissipation theorem (FDT) under this type of activity by comparing the active microrheology to passive measurements, where thermal motion alone drives the beads. We can relate any departure from FDT to the timescale of topoisomerase II activity in the DNA network. These experiments provide insight into the physical necessity of this motor enzyme in the cell.

  1. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both space flight manned and unmanned payload simulation and training. Significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model to accepting future modification. Results of this effort have suggested that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  2. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  3. Heuristic and Linear Models of Judgment: Matching Rules and Environments

    ERIC Educational Resources Information Center

    Hogarth, Robin M.; Karelaia, Natalia

    2007-01-01

    Much research has highlighted incoherent implications of judgmental heuristics, yet other findings have demonstrated high correspondence between predictions and outcomes. At the same time, judgment has been well modeled in the form of "as if" linear models. Accepting the probabilistic nature of the environment, the authors use statistical tools to…

  4. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
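
A toy sketch of the projection the authors automate, assuming a collaborative process is represented simply as (sender, receiver, message) triples; the roles, messages, and representation are invented for illustration and are far simpler than a real MDA transformation:

```python
def interface_process(collab, enterprise):
    """Project the global interaction view onto one enterprise's role:
    keep only interactions it participates in, tagged send/receive."""
    steps = []
    for sender, receiver, msg in collab:
        if sender == enterprise:
            steps.append(("send", msg, receiver))
        elif receiver == enterprise:
            steps.append(("receive", msg, sender))
    return steps

collab = [("Buyer", "Seller", "PurchaseOrder"),
          ("Seller", "Buyer", "OrderConfirmation"),
          ("Seller", "Carrier", "ShippingRequest")]
print(interface_process(collab, "Buyer"))
# → [('send', 'PurchaseOrder', 'Seller'), ('receive', 'OrderConfirmation', 'Seller')]
```

Because each interface process is derived from the same global model, the send/receive pairs of any two roles match up by construction, which is the interoperability guarantee the method aims at.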

  5. Birth/death process model

    NASA Technical Reports Server (NTRS)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
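
A minimal sketch of such a first-order Markov birth/death model with the input checking the abstract emphasizes; the transition probabilities and defaults are illustrative, not taken from the report:

```python
import random

def birth_death(n0, birth=0.3, death=0.2, steps=1000, seed=7):
    """First-order Markov chain: each step the population gains one
    (birth) or loses one (death) with probabilities depending only on
    the current state; inputs are validated up front."""
    if n0 < 0 or not 0 <= birth + death <= 1:
        raise ValueError("invalid initial population or rates")
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        u = rng.random()
        if u < birth:
            n += 1
        elif u < birth + death and n > 0:
            n -= 1
    return n

print(birth_death(10))
```

With birth > death the chain drifts upward on average; the guard `n > 0` keeps the population non-negative, mirroring the kind of error checking the report describes.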

  6. Mathematical Modeling: A Structured Process

    ERIC Educational Resources Information Center

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  9. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises, the emergence of interaction patterns between firms, and management environments. Agent-based models are the leading approach in this attempt.

  10. Report of the 2014 Programming Models and Environments Summit

    SciTech Connect

    Heroux, Michael; Lethin, Richard

    2016-09-19

    Programming models and environments play essential roles in high performance computing, enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades. The advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make the design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.

  11. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  13. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed.

  14. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article is a study of the differences among business process modelling techniques. For each technique, the definition and the structure are explained. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  15. Campylobacter jejuni survival in a poultry processing plant environment.

    PubMed

    García-Sánchez, Lourdes; Melero, Beatriz; Jaime, Isabel; Hänninen, Marja-Liisa; Rossi, Mirko; Rovira, Jordi

    2017-08-01

    Campylobacteriosis is the most common cause of bacterial gastroenteritis worldwide. Consumption of poultry, especially chicken meat, is considered the most common route for human infection. The aim of this study was to determine whether Campylobacter spp. might persist in the poultry plant environment before and after cleaning and disinfection procedures, and to establish their distribution and genetic relatedness. Over one month, a total of 494 samples from a poultry plant (defeathering machine, evisceration machine, floor, sink, conveyor belt, shackles and broiler meat) were analyzed in order to isolate C. jejuni and C. coli. Results showed that C. jejuni and C. coli prevalence was 94.5% and 5.5%, respectively. Typing techniques such as PFGE and MLST established seven C. jejuni genotypes. Whole genome MLST strongly suggests that highly clonal populations of C. jejuni can survive in adverse environmental conditions, even after cleaning and disinfection, and persist for longer periods than previously thought (at least 21 days) in the poultry plant environment. They might thus act as a source of contamination independently of the contamination level of the flock entering the slaughter line. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Charged Particle Environment Definition for NGST: Model Development

    NASA Technical Reports Server (NTRS)

    Blackwell, William C.; Minow, Joseph I.; Evans, Steven W.; Hardage, Donna M.; Suggs, Robert M.

    2000-01-01

    NGST will operate in a halo orbit about the L2 point, 1.5 million km from the Earth, where the spacecraft will periodically travel through the magnetotail region. There are a number of tools available to calculate the high energy, ionizing radiation particle environment from galactic cosmic rays and from solar disturbances. However, space environment tools are not generally available to provide assessments of the charged particle environment and its variations in the solar wind, magnetosheath, and magnetotail at L2 distances. An engineering-level phenomenology code (LRAD) was therefore developed to facilitate the definition of charged particle environments in the vicinity of the L2 point in support of the NGST program. LRAD contains models tied to satellite measurement data of the solar wind and magnetotail regions. The model provides particle flux and fluence calculations necessary to predict spacecraft charging conditions and the degradation of materials used in the construction of NGST. This paper describes the LRAD environment models for the deep magnetotail (XGSE < -100 Re) and solar wind, and presents predictions of the charged particle environment for NGST.

  17. Genomic Prediction of Genotype × Environment Interaction Kernel Regression Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Soberanis, Víctor; Pérez-Elizalde, Sergio; Pérez-Rodríguez, Paulino; Campos, Gustavo de Los; Montesinos-López, O A; Burgueño, Juan

    2016-11-01

    In genomic selection (GS), genotype × environment interaction (G × E) can be modeled by a marker × environment interaction (M × E). The G × E may be modeled through a linear kernel or a nonlinear (Gaussian) kernel. In this study, we propose using two nonlinear Gaussian kernels: the reproducing kernel Hilbert space with kernel averaging (RKHS KA) and the Gaussian kernel with the bandwidth estimated through an empirical Bayesian method (RKHS EB). We performed single-environment analyses and extended them to account for G × E interaction (GBLUP-G × E, RKHS KA-G × E and RKHS EB-G × E) in wheat and maize data sets. For single-environment analyses of the wheat and maize data sets, RKHS EB and RKHS KA had higher prediction accuracy than GBLUP for all environments. For the wheat data, the RKHS KA-G × E and RKHS EB-G × E models showed up to 60 to 68% superiority over the corresponding single-environment models for pairs of environments with positive correlations. For the wheat data set, the models with Gaussian kernels had accuracies up to 17% higher than that of GBLUP-G × E. For the maize data set, the prediction accuracy of RKHS EB-G × E and RKHS KA-G × E was, on average, 5 to 6% higher than that of GBLUP-G × E. The superiority of the Gaussian kernel models over the linear kernel is due to more flexible kernels that account for small, complex marker main effects and marker-specific interaction effects.
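
    The Gaussian kernels described above can be sketched in a few lines. The following is an illustrative reconstruction, not the authors' code: `gaussian_kernel` builds K_ij = exp(-h * d_ij^2 / q) from a marker matrix (q, the median squared Euclidean distance, is an assumed normalization), and `kernel_average` mimics the kernel-averaging (KA) idea by averaging kernels over a grid of bandwidths (the grid values are illustrative).

    ```python
    import numpy as np

    def gaussian_kernel(X, h):
        """Gaussian kernel K_ij = exp(-h * d_ij^2 / q), where q is the
        median of the off-diagonal squared Euclidean distances."""
        sq = np.sum(X ** 2, axis=1)
        d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
        q = np.median(d2[np.triu_indices_from(d2, k=1)])
        return np.exp(-h * d2 / q)

    def kernel_average(X, bandwidths=(0.25, 0.5, 1.0, 2.0, 4.0)):
        """Kernel averaging (KA): unweighted mean of Gaussian kernels
        computed over a grid of bandwidth values."""
        return np.mean([gaussian_kernel(X, h) for h in bandwidths], axis=0)

    rng = np.random.default_rng(0)
    X = rng.choice([-1.0, 0.0, 1.0], size=(10, 50))  # toy marker matrix
    K = kernel_average(X)
    ```

    In practice such a kernel K replaces the linear genomic relationship matrix inside the mixed-model equations; here only the kernel construction is shown.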

  18. A Hierarchical Process-Dissociation Model

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Lu, Jun; Morey, Richard D.; Sun, Dongchu; Speckman, Paul L.

    2008-01-01

    In fitting the process-dissociation model (L. L. Jacoby, 1991) to observed data, researchers aggregate outcomes across participant, items, or both. T. Curran and D. L. Hintzman (1995) demonstrated how biases from aggregation may lead to artifactual support for the model. The authors develop a hierarchical process-dissociation model that does not…

  19. The Computer-Aided Analytic Process Model. Operations Handbook for the Analytic Process Model Demonstration Package

    DTIC Science & Technology

    1986-01-01

    Research Note 86-06. The Computer-Aided Analytic Process Model: Operations Handbook for the Analytic Process Model Demonstration Package. Ronald G... Keywords: Analytic Process Model; Operations Handbook; Tutorial; Apple; Systems Taxonomy Model; Training System; Bradley Infantry Fighting Vehicle; BIFV... Item 20, Abstract (continued): companion volume "The Analytic Process Model for

  20. Application of Mathematical Models for Different Electroslag Remelting Processes

    NASA Astrophysics Data System (ADS)

    Jiang, Zhou Hua; Yu, Jia; Liu, Fu Bin; Chen, Xu; Geng, Xin

    2017-04-01

    The electroslag remelting (ESR) process has been effectively applied to produce high-grade special steels and superalloys, based on its controllable solidification and chemical refining process. Owing to the difficulty of precise measurement in a high-temperature environment and the excessive expense involved, mathematical models have become more and more attractive for investigating the transport phenomena in the ESR process. In this paper, the numerical models for different ESR processes developed by our lab over the last decade are introduced. The first topic deals with the traditional ESR process, predicting the relationship between operating parameters and the metallurgical parameters of interest. The second topic concerns new ESR technology processes, including ESR with a current-conductive mould (CCM), ESR hollow ingot technology, electroslag casting with liquid metal (ESC LM), and so on. Finally, the numerical simulation of solidification microstructure with a multi-scale model is presented, which reveals the formation mechanism of the microstructure.

  1. Modeling and Performance Simulation of the Mass Storage Network Environment

    NASA Technical Reports Server (NTRS)

    Kim, Chan M.; Sang, Janche

    2000-01-01

    This paper describes the application of modeling and simulation in evaluating and predicting the performance of the mass storage network environment. Network traffic is generated to mimic the realistic pattern of file transfer, electronic mail, and web browsing. The behavior and performance of the mass storage network and a typical client-server Local Area Network (LAN) are investigated by modeling and simulation. Performance characteristics in throughput and delay demonstrate the important role of modeling and simulation in network engineering and capacity planning.
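
    The abstract above reports throughput and delay obtained by simulation. As a minimal illustration of the general approach (not the authors' model), the following sketch simulates an M/M/1 queue with an event-driven loop and estimates the mean time in system, which for a stable queue should approach the analytic value 1/(mu - lambda):

    ```python
    import random

    def mm1_mean_delay(lam, mu, n_jobs=200_000, seed=1):
        """Event-driven M/M/1 queue: Poisson arrivals (rate lam), exponential
        service (rate mu), FIFO. Returns the simulated mean time in system."""
        rng = random.Random(seed)
        t_arrival = 0.0
        t_free = 0.0        # time at which the server next becomes free
        total = 0.0
        for _ in range(n_jobs):
            t_arrival += rng.expovariate(lam)
            start = max(t_arrival, t_free)      # wait if the server is busy
            t_free = start + rng.expovariate(mu)
            total += t_free - t_arrival         # waiting + service time
        return total / n_jobs

    sim = mm1_mean_delay(lam=0.5, mu=1.0)
    exact = 1.0 / (1.0 - 0.5)   # analytic mean time in system, 1/(mu - lam)
    ```

    Real network-traffic generators mix several source types (file transfer, mail, web), but the same event-driven skeleton underlies them.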

  2. Mechanistic Fermentation Models for Process Design, Monitoring, and Control.

    PubMed

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-08-21

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based modeling techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Commercial applications in biomedical processing in the microgravity environment

    NASA Astrophysics Data System (ADS)

    Johnson, Terry C.; Taub, Floyd

    1995-01-01

    A series of studies have shown that a purified cell regulatory sialoglycopeptide (CeReS) that arrests cell division and induces cellular differentiation is fully capable of functionally interacting with target insect and mammalian cells in the microgravity environment. Data from several shuttle missions suggest that the signal transduction events that are known to be associated with CeReS action function as well in microgravity as in ground-based experiments. The molecular events known to be associated with CeReS include an ability to interfere with Ca2+ metabolism, the subsequent alkalinization of cell cytosol, and the inhibition of the phosphorylation of the nuclear protein product encoded by the retinoblastoma (RB) gene. The ability of CeReS to function in microgravity opens a wide variety of applications in space life sciences.

  4. Using process groups to implement failure detection in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1991-01-01

    Agreement on the membership of a group of processes in a distributed system is a basic problem that arises in a wide range of applications. Such groups occur when a set of processes cooperate to perform some task, share memory, monitor one another, subdivide a computation, and so forth. The group membership problem is discussed as it relates to failure detection in asynchronous, distributed systems. A rigorous, formal specification for group membership is presented under this interpretation. A solution is then presented for this problem.

  5. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  6. Dissolving decision making? Models and their roles in decision-making processes and policy at large.

    PubMed

    Zeiss, Ragna; van Egmond, Stans

    2014-12-01

    This article studies the roles that three science-based models play in Dutch policy and decision-making processes. Key to these roles is the interaction between model construction and environment: the political and scientific environments of the models form contexts that shape their roles in policy decision making. Attention is paid to three aspects of the wider context of the models: a) the history of the construction process; b) (changes in) the political and scientific environments; and c) the use in policy processes over longer periods of time. Models are more successfully used when they are constructed in a stable political and scientific environment. Stability and certainty within a scientific field seem to be a key predictor of the usefulness of models for policy making. The economic model is more disputed than the ecology-based model and the model that has its theoretical foundation in physics and chemistry. The roles models play in policy processes are too complex to be considered as straightforward technocratic powers.

  7. [Analytic methods for seed models with genotype x environment interactions].

    PubMed

    Zhu, J

    1996-01-01

    Genetic models with genotype effect (G) and genotype x environment interaction effect (GE) are proposed for analyzing generation means of seed quantitative traits in crops. The total genetic effect (G) is partitioned into seed direct genetic effect (G0), cytoplasm genetic effect (C), and maternal plant genetic effect (Gm). Seed direct genetic effect (G0) can be further partitioned into direct additive (A) and direct dominance (D) genetic components. Maternal genetic effect (Gm) can also be partitioned into maternal additive (Am) and maternal dominance (Dm) genetic components. The total genotype x environment interaction effect (GE) can also be partitioned into direct genetic by environment interaction effect (G0E), cytoplasm genetic by environment interaction effect (CE), and maternal genetic by environment interaction effect (GmE). G0E can be partitioned into direct additive by environment interaction (AE) and direct dominance by environment interaction (DE) genetic components. GmE can also be partitioned into maternal additive by environment interaction (AmE) and maternal dominance by environment interaction (DmE) genetic components. Partitions of genetic components are listed for parents, F1, F2 and backcrosses. A set of parents and their reciprocal F1 and F2 seeds is applicable for efficient analysis of seed quantitative traits. The MINQUE(0/1) method can be used for estimating variance and covariance components. Unbiased estimation of covariance components between two traits can also be obtained by the MINQUE(0/1) method. Random genetic effects in seed models are predictable by the Adjusted Unbiased Prediction (AUP) approach with the MINQUE(0/1) method. The jackknife procedure is suggested for estimating the sampling variances of estimated variance and covariance components and of predicted genetic effects, which can be further used in a t-test for parameters. Unbiasedness and efficiency for estimating variance components and predicting genetic effects are tested by

  8. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers in real-time in virtual environments. The development of the system includes: classification on the conscious/subconscious behaviors and reactions of different people; capturing different motion postures by the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling character's perceptions, modeling character's decision making, modeling character's movements, modeling character's interaction with environment and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas to integrate perception and intelligence into virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human's vision, smell, touch and hearing, the diversity and effects of emotion and personality in decision making. There are three types of software platforms which could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  9. Attributes of Success in a Challenging Information Processing Environment

    DTIC Science & Technology

    2007-09-01

    Organizational theorists choose to deconstruct and analyze organizations in many ways, including in terms of structure and macro-level design features. Contents include: Organization of Thesis; Literature Review; Implications for Organizational Design; Implications for Information Processing Theory; Implications for Using Network...

  10. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
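
    The timed Petri net mentioned above can be sketched minimally. This toy version is an assumption-laden simplification, not the authors' model: it fires one enabled transition at a time (choosing the shortest-delay one, a simple sequential policy) and tracks elapsed time for a two-step software process whose transitions, delays, and token counts are invented for illustration.

    ```python
    def simulate(places, transitions, horizon=100.0):
        """Token game for a tiny timed Petri net: repeatedly fire the
        shortest-delay enabled transition, consuming and producing tokens
        and advancing the clock by that transition's delay."""
        t, fired = 0.0, []
        while t < horizon:
            enabled = [tr for tr in transitions
                       if all(places[p] >= n for p, n in tr["in"].items())]
            if not enabled:
                break
            tr = min(enabled, key=lambda x: x["delay"])
            for p, n in tr["in"].items():
                places[p] -= n
            for p, n in tr["out"].items():
                places[p] += n
            t += tr["delay"]
            fired.append(tr["name"])
        return t, fired, places

    # Toy software process: 3 requirements are designed, then coded.
    places = {"req": 3, "designed": 0, "coded": 0}
    transitions = [
        {"name": "design", "in": {"req": 1}, "out": {"designed": 1}, "delay": 2.0},
        {"name": "code", "in": {"designed": 1}, "out": {"coded": 1}, "delay": 3.0},
    ]
    makespan, fired, final = simulate(places, transitions)
    ```

    The makespan and firing sequence play the role of the "time-to-market" indices the abstract mentions; a real model would attach resources and concurrency to the net.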

  11. Ada COCOMO and the Ada Process Model

    DTIC Science & Technology

    1989-01-01

    language, the use of incremental development, and the use of the Ada process model, capitalizing on the strengths of Ada to improve the efficiency of software development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model. The remainder of this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software

  13. Large urban fire environment. Trends and model city predictions

    SciTech Connect

    Larson, D.A.; Small, R.D.

    1982-01-01

    The urban fire environment that would result from a megaton-yield nuclear weapon burst is considered. The dependence of temperatures and velocities on fire size, burning intensity, turbulence, and radiation is explored, and specific calculations for three model urban areas are presented. In all cases, high velocity fire winds are predicted. The model-city results show the influence of building density and urban sprawl on the fire environment. Additional calculations consider large-area fires with the burning intensity reduced in a blast-damaged urban center.

  14. A Practical Environment to Apply Model-Driven Web Engineering

    NASA Astrophysics Data System (ADS)

    Escalona, Maria Jose; Gutiérrez, J. J.; Morero, F.; Parra, C. L.; Nieto, J.; Pérez, F.; Martín, F.; Llergo, A.

    The application of a model-driven paradigm in the development of Web systems has yielded very good research results. Several research groups are defining metamodels, transformations, and tools which offer a suitable environment, known as model-driven Web engineering (MDWE). However, there are very few practical experiences in real Web system developments using real development teams. This chapter presents a practical environment for MDWE based on the use of NDT (navigational development techniques) and Java Web systems, and it provides a practical evaluation of its application within a real project: Specialized Diraya.

  15. Models for Turbulent Transport Processes.

    ERIC Educational Resources Information Center

    Hill, James C.

    1979-01-01

    Since the statistical theories of turbulence that have developed over the last twenty or thirty years are too abstract and unreliable to be of much use to chemical engineers, this paper introduces the techniques of single point models and suggests some areas of needed research. (BB)

  16. Total Ship Design Process Modeling

    DTIC Science & Technology

    2012-04-30

    Microsoft Project® or Primavera®, and perform process simulations that can investigate risk, cost, and schedule trade-offs. Prior efforts to capture... planning in the face of disruption, delay, and late-changing requirements. ADePT is interfaced with Primavera, the AEC industry's favorite program

  17. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
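
    The index described above can be illustrated with a toy calculation. In this sketch (illustrative only: the model forms, prior weights, parameter ranges, and the additive output y = f_recharge + f_geology are all assumptions, not the paper's setup), each process is a mixture of two competing models with random parameters, and the first-order process sensitivity Var_A(E_B[y | A]) / Var(y) is estimated by nested Monte Carlo:

    ```python
    import random
    import statistics

    def draw(models, rng):
        """One draw of a process: choose a competing model by its prior
        weight, then sample that model's parameter uniformly on [lo, hi]."""
        u, acc = rng.random(), 0.0
        for weight, f, lo, hi in models:
            acc += weight
            if u <= acc:
                return f(rng.uniform(lo, hi))
        raise AssertionError("model weights must sum to 1")

    def process_sensitivity(models_a, models_b, n_outer=2000, n_inner=50, seed=3):
        """First-order process sensitivity of process A for the additive toy
        output y = f_A + f_B: Var_A(E_B[y | A]) / Var(y), by nested Monte Carlo."""
        rng = random.Random(seed)
        cond_means, all_y = [], []
        for _ in range(n_outer):
            a = draw(models_a, rng)                       # fix process A
            ys = [a + draw(models_b, rng) for _ in range(n_inner)]
            cond_means.append(statistics.fmean(ys))       # E_B[y | A]
            all_y.extend(ys)
        return statistics.variance(cond_means) / statistics.variance(all_y)

    # Two competing "recharge" models vs. two "geology" models (all forms,
    # weights, and parameter ranges here are made up for illustration).
    recharge = [(0.5, lambda x: 2.0 * x, 0.0, 1.0), (0.5, lambda x: x ** 2, 0.0, 1.0)]
    geology = [(0.5, lambda x: 0.5 * x, 0.0, 1.0), (0.5, lambda x: x, 0.0, 1.0)]
    ps_recharge = process_sensitivity(recharge, geology)
    ps_geology = process_sensitivity(geology, recharge)
    ```

    Because the index varies the model choice and the parameters together, it captures both sources of uncertainty that the abstract distinguishes.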

  18. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Ye, Ming; Walker, Anthony P.; Chen, Xingyuan

    2017-04-01

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. For demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  19. Multi-Environment Model Estimation for Motility Analysis of Caenorhabditis elegans

    PubMed Central

    Sznitman, Raphael; Gupta, Manaswi; Hager, Gregory D.; Arratia, Paulo E.; Sznitman, Josué

    2010-01-01

    The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays. PMID:20661478
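
    The MOG idea behind MEME can be illustrated with a toy 1-D intensity model (not the authors' implementation): fit a small Gaussian mixture to pixels sampled from the background and from the nematode, then label each pixel by whichever mixture assigns it the higher likelihood. The EM routine, the synthetic image, and the sample regions below are all assumptions made for illustration.

    ```python
    import numpy as np

    def fit_gmm_1d(x, k=2, iters=50, seed=0):
        """Tiny EM fit of a k-component 1-D Gaussian mixture (illustrative)."""
        rng = np.random.default_rng(seed)
        mu = rng.choice(x, k, replace=False)
        var = np.full(k, x.var() + 1e-6)
        w = np.full(k, 1.0 / k)
        for _ in range(iters):
            # E-step: responsibilities of each component for each pixel
            pdf = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
            r = pdf / (pdf.sum(axis=1, keepdims=True) + 1e-300)
            # M-step: update weights, means, and variances
            n = r.sum(axis=0)
            w = n / len(x)
            mu = (r * x[:, None]).sum(axis=0) / n
            var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
        return w, mu, var

    def loglik(x, params):
        w, mu, var = params
        pdf = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        return np.log(pdf.sum(axis=1) + 1e-300)

    # Toy image: dark background with a bright "worm" stripe.
    rng = np.random.default_rng(1)
    img = rng.normal(0.2, 0.05, (40, 40))
    img[18:22, 5:35] = rng.normal(0.8, 0.05, (4, 30))
    bg = fit_gmm_1d(img[:5].ravel())            # background sample region
    fg = fit_gmm_1d(img[18:22, 5:35].ravel())   # nematode sample region
    mask = (loglik(img.ravel(), fg) > loglik(img.ravel(), bg)).reshape(img.shape)
    ```

    The real framework learns richer appearance models from a single annotated frame and then extracts a skeleton from the segmented mask; this sketch shows only the likelihood-comparison step.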

  20. Containerless processing of single crystals in low-G environment

    NASA Technical Reports Server (NTRS)

    Walter, H. U.

    1974-01-01

    Experiments on containerless crystal growth from the melt were conducted during Skylab missions SL3 and SL4 (Skylab Experiment M-560). Six samples of InSb were processed, one of them heavily doped with selenium. The concept of the experiment is discussed and related to general crystal growth methods and their merits as techniques for containerless processing in space. The morphology of the crystals obtained is explained in terms of volume changes associated with solidification and wetting conditions during solidification. All samples exhibit extremely well developed growth facets. Analysis by X-ray topographical methods and chemical etching shows that the crystals are of high structural perfection. Average dislocation density as revealed by etching is of the order of 100 per sq cm; no dislocation clusters could be observed in the space-grown samples. A sequence of striations that is observed in the first half of the selenium-doped sample is explained as being caused by periodic surface breakdown.

  1. Compound Cue Processing in Linearly and Nonlinearly Separable Environments

    ERIC Educational Resources Information Center

    Hoffrage, Ulrich; Garcia-Retamero, Rocio; Czienskowski, Uwe

    2008-01-01

    Take-the-best (TTB) is a fast and frugal heuristic for paired comparison that has been proposed as a model of bounded rationality. This heuristic has been criticized for not taking compound cues into account to predict a criterion, although such an approach is sometimes required to make accurate predictions. By means of computer simulations, it is…

  2. Employing Noisy Environments to Support Quantum Information Processing

    DTIC Science & Technology

    2007-11-02

    Funding number: DAAD19-02-1-0161. Authors: Martin B. Plenio and Susana F. Huelga. Approved for public release; distribution unlimited. ...entanglement dynamics can be achieved in such a system. The results of this work have been published in E. Jané, M.B. Plenio and D. Jonathan, "Quantum

  3. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms running on different types of hardware and written in languages other than that of VHP. The system is of particular interest for applications in which the visualization of remote processes is required, such as robotics for telescience.

  4. Modeling and simulation of membrane process

    NASA Astrophysics Data System (ADS)

    Staszak, Maciej

    2017-06-01

    The article presents different approaches to the mathematical modeling of polymer membranes. Traditional models based on experimental physicochemical correlations, together with balance models, are presented in the first part. Quantum and molecular mechanics models are presented next, as they are increasingly popular for polymer membranes in fuel cells. The first part closes with neural network models, which have found use for different types of processes in polymer membranes. The second part is devoted to models of fluid dynamics. Computational fluid dynamics techniques can be divided into the solving of Navier-Stokes equations and lattice Boltzmann models. Both approaches are presented, with a focus on membrane processes.
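
    A minimal example of the balance-model style mentioned above (purely illustrative; the geometry and units are nondimensional assumptions, not taken from the article) is steady diffusion across a membrane slab with fixed surface concentrations, stepped to steady state by explicit finite differences and checked against the analytic flux D(c0 - cL)/L:

    ```python
    import numpy as np

    # 1-D diffusion across a membrane of unit thickness, explicit scheme.
    D, nx = 1.0, 51
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx ** 2 / D          # stability requires dt <= dx^2 / (2D)
    c = np.zeros(nx)
    c[0], c[-1] = 1.0, 0.0          # upstream / downstream face concentrations
    for _ in range(20_000):         # long enough to reach steady state
        c[1:-1] += D * dt / dx ** 2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    flux = -D * (c[1] - c[0]) / dx  # should approach D * (c0 - cL) / L = 1
    ```

    At steady state the profile is linear and the flux matches the analytic value; Navier-Stokes or lattice Boltzmann treatments replace this scalar balance with full momentum transport.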

  5. Sensitivity of membranes to their environment. Role of stochastic processes.

    PubMed Central

    Offner, F F

    1984-01-01

    Ionic flow through biomembranes often exhibits a sensitivity to the environment that is difficult to explain by classical theory, which usually assumes that the free energy available to change the membrane permeability results from the environmental change acting directly on the permeability control mechanism. This implies, for example, that a change delta V in the trans-membrane potential can produce a maximum free energy change, delta V X q, on a gate (control mechanism) carrying a charge q. The analysis presented here shows that when stochastic fluctuations are considered, under suitable conditions (gate cycle times rapid compared with the field relaxation time within a channel), the change in free energy is limited, not by the magnitude of the stimulus, but by the electrochemical potential difference across the membrane, which may be very much greater. Conformational channel gates probably relax more slowly than the field within the channel; this would preclude appreciable direct amplification of the stimulus. It is shown, however, that the effect of impermeable cations such as Ca++ is to restore the amplification of the stimulus through its interaction with the electric field. The analysis predicts that the effect of Ca++ should be primarily to affect the number of channels that are open, while only slightly affecting the conductivity of an open channel. PMID:6093903
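
    The role of stochastic gating can be illustrated with a toy two-state channel (a textbook-style sketch, not the paper's model): the opening and closing rates follow a Boltzmann voltage dependence (with kT/e taken as 25 mV, an assumption), and a discrete-time stochastic simulation recovers the equilibrium open fraction:

    ```python
    import math
    import random

    def open_fraction(v_mV, q=1.0, n_steps=200_000, seed=4):
        """Two-state stochastic gate (closed <-> open). With Boltzmann
        voltage-dependent rates, the long-run open fraction should
        approach 1 / (1 + exp(-q * v / 25))."""
        rng = random.Random(seed)
        alpha = math.exp(q * v_mV / 50.0)   # opening rate
        beta = math.exp(-q * v_mV / 50.0)   # closing rate
        dt = 0.01 / max(alpha, beta)        # step small enough that p << 1
        state, time_open = 0, 0             # start closed
        for _ in range(n_steps):
            if state == 0 and rng.random() < alpha * dt:
                state = 1
            elif state == 1 and rng.random() < beta * dt:
                state = 0
            time_open += state
        return time_open / n_steps

    sim = open_fraction(25.0)                       # +25 mV across the gate
    exact = 1.0 / (1.0 + math.exp(-25.0 / 25.0))    # equilibrium prediction
    ```

    The fluctuation effects analyzed in the paper arise precisely when such gate cycles are fast compared with field relaxation, which this equilibrium sketch does not attempt to capture.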

  6. Applicability of the protein environment equilibrium approximation for describing ultrafast biophysical processes

    NASA Astrophysics Data System (ADS)

    Poddubnyy, V. V.; Glebov, I. O.; Sudarkova, S. M.

    2015-06-01

    The theoretical description of ultrafast processes in biological systems, in particular, electron transfer in photosynthetic reaction centers, is an important problem in modern biological physics. Because these processes occur in a protein medium with which an energy exchange is possible, methods of the quantum theory of open systems must be used to describe them. But because of a high process rate and the specifics of the protein environment, basic approximations of this theory might be inapplicable. We study the applicability of the approximation of the protein environment (bath) state invariance for the dissipative dynamics of charge transfer between molecule-pigments contained in reaction centers. For this, we use model systems whose parameters are close to real ones. We conclude that this approximation can be used to describe both the monotonic and the oscillating dynamics of the reaction subsystem in large biological molecules. We consider various mechanisms for bath thermalization and show that the bath thermalization occurs not because of the intramolecular redistribution of the vibrational energy in it but because of its coupling to the reaction subsystem.

  7. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-08-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 µm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 field campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy and secondary ionization mass spectrometry shows that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot

  8. Processing of soot in an urban environment: case study from the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Johnson, K. S.; Zuberi, B.; Molina, L. T.; Molina, M. J.; Iedema, M. J.; Cowin, J. P.; Gaspar, D. J.; Wang, C.; Laskin, A.

    2005-11-01

    Chemical composition, size, and mixing state of atmospheric particles are critical in determining their effects on the environment. There is growing evidence that soot aerosols play a particularly important role in both climate and human health, but still relatively little is known of their physical and chemical nature. In addition, the atmospheric residence times and removal mechanisms for soot are neither well understood nor adequately represented in regional and global climate models. To investigate the effect of locality and residence time on properties of soot and mixing state in a polluted urban environment, particles of diameter 0.2-2.0 μm were collected in the Mexico City Metropolitan Area (MCMA) during the MCMA-2003 Field Campaign from various sites within the city. Individual particle analysis by different electron microscopy methods coupled with energy-dispersive X-ray spectroscopy and secondary ionization mass spectrometry shows that freshly-emitted soot particles become rapidly processed in the MCMA. Whereas fresh particulate emissions from mixed-traffic are almost entirely carbonaceous, consisting of soot aggregates with liquid coatings suggestive of unburned lubricating oil and water, ambient soot particles which have been processed for less than a few hours are heavily internally mixed, primarily with ammonium sulfate. Single particle analysis suggests that this mixing occurs through several mechanisms that require further investigation. In light of previously published results, the internally-mixed nature of processed soot particles is expected to affect heterogeneous chemistry on the soot surface, including interaction with water during wet-removal.

  9. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, and the UK's perceived risk appears to have increased in recent years as surface water flood events have become more severe and frequent. Surface water flood risk currently accounts for 1/3 of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data upon which numerical models are based are often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment for data collection, creating a controlled, closed system in which variables can be altered independently to investigate cause and effect relationships. A physical modelling environment provides a suitable platform to investigate the rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. 
Scaled laboratory experiments using a 9m2, two-tiered 1:100 physical model consisting of: (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity, and; (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled
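
The ">75% CUC" criterion above refers to Christiansen's uniformity coefficient for the simulated rainfall. A minimal sketch (not the authors' code) of how it is computed from catch-can depths:

```python
def christiansen_cuc(depths):
    """Christiansen uniformity coefficient (percent):
    CUC = 100 * (1 - sum(|x_i - mean|) / (n * mean))."""
    n = len(depths)
    mean = sum(depths) / n
    return 100.0 * (1.0 - sum(abs(d - mean) for d in depths) / (n * mean))

# Identical catch depths across the grid give perfect uniformity (CUC = 100)
print(christiansen_cuc([5.0, 5.0, 5.0, 5.0]))
```

A rainfall simulator meeting the stated criterion would report a CUC above 75 across its collection grid.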

  10. Multiscale simulation of molecular processes in cellular environments.

    PubMed

    Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone

    2016-11-13

    We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  11. Multiscale simulation of molecular processes in cellular environments

    NASA Astrophysics Data System (ADS)

    Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone

    2016-11-01

    We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.

  12. Adaptive Matched Field Processing in an Uncertain Propagation Environment

    DTIC Science & Technology

    1992-01-01

    signal. Adaptive processors, such as Capon's MVDR Processor [9], are particularly sensitive to inaccurate or imprecise knowledge of the...problem is a model mismatch problem since it results in a mismatch between the assumed and the actual second-order statistics of the desired signal...characteristics are parameterized by the statistics of the temporally and spatially varying sound speed structure of the ocean C(z, t). This chapter

  13. Computing confidence intervals for point process models.

    PubMed

    Sarma, Sridevi V; Nguyen, David P; Czanner, Gabriela; Wirth, Sylvia; Wilson, Matthew A; Suzuki, Wendy; Brown, Emery N

    2011-11-01

    Characterizing neural spiking activity as a function of intrinsic and extrinsic factors is important in neuroscience. Point process models are valuable for capturing such information; however, the process of fully applying these models is not always obvious. A complete model application has four broad steps: specification of the model, estimation of model parameters given observed data, verification of the model using goodness of fit, and characterization of the model using confidence bounds. Of these steps, only the first three have been applied widely in the literature, suggesting the need to dedicate a discussion to how the time-rescaling theorem, in combination with parametric bootstrap sampling, can be generally used to compute confidence bounds of point process models. In our first example, we use a generalized linear model of spiking propensity to demonstrate that confidence bounds derived from bootstrap simulations are consistent with those computed from closed-form analytic solutions. In our second example, we consider an adaptive point process model of hippocampal place field plasticity for which no analytical confidence bounds can be derived. We demonstrate how to simulate bootstrap samples from adaptive point process models, how to use these samples to generate confidence bounds, and how to statistically test the hypothesis that neural representations at two time points are significantly different. These examples have been designed as useful guides for performing scientific inference based on point process models.
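
As a much-simplified stand-in for the paper's GLM-based procedure (a sketch assuming a homogeneous Poisson spike train, not the authors' method), parametric bootstrap confidence bounds can be computed by re-simulating from the fitted model and taking empirical percentiles:

```python
import math
import random

def bootstrap_rate_ci(spike_count, duration, n_boot=2000, alpha=0.05, seed=0):
    """Parametric bootstrap CI for the rate of a homogeneous Poisson spike train:
    fit lambda_hat, simulate counts from Poisson(lambda_hat * duration),
    re-estimate the rate, and take empirical percentiles of the bootstrap rates."""
    rng = random.Random(seed)
    lam_hat = spike_count / duration

    def poisson(mu):
        # Knuth's multiplication method; adequate for moderate mu
        limit, k, p = math.exp(-mu), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    boots = sorted(poisson(lam_hat * duration) / duration for _ in range(n_boot))
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lam_hat, (lo, hi)

# 100 spikes in 10 s: point estimate 10 Hz with a ~95% interval around it
rate, (lo, hi) = bootstrap_rate_ci(100, 10.0)
```

The same recipe generalizes to the adaptive point process case in the paper: simulate spike trains from the fitted model, re-fit, and read off percentile bounds.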

  14. Model-based design of peptide chromatographic purification processes.

    PubMed

    Gétaz, David; Stroehlein, Guido; Butté, Alessandro; Morbidelli, Massimo

    2013-04-05

    In this work we present a general procedure for the model-based optimization of a polypeptide crude mixture purification process through its application to a case of industrial relevance. This is done to show how much modeling can benefit the optimization of complex chromatographic processes in the industrial environment. The target peptide elution profile was modeled with a two-site adsorption equilibrium isotherm exhibiting two inflection points. The variation of the isotherm parameters with the modifier concentration was accounted for. The adsorption isotherm parameters of the target peptide were obtained by the inverse method. The elution of the impurities was approximated by lumping them into pseudo-impurities and by regressing their adsorption isotherm parameters directly as a function of the corresponding parameters of the target peptide. After model calibration and validation by comparison with suitable experimental data, Pareto optimizations of the process were carried out so as to select the optimal batch process.
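
For illustration only, a generic two-site (bi-Langmuir) form is one common way to write such an isotherm; this is an assumed textbook expression, not the paper's actual isotherm, which has two inflection points and modifier-dependent parameters:

```python
def bi_langmuir(c, a1, b1, a2, b2):
    """Two-site (bi-Langmuir) isotherm: q(c) = a1*c/(1 + b1*c) + a2*c/(1 + b2*c).
    Each site contributes a Henry coefficient a_i and a saturation capacity a_i/b_i."""
    return a1 * c / (1.0 + b1 * c) + a2 * c / (1.0 + b2 * c)

# Linear (Henry) regime at low concentration; saturation a1/b1 + a2/b2 at high
print(bi_langmuir(1e-9, 2.0, 1.0, 3.0, 10.0) / 1e-9)  # ~ a1 + a2
print(bi_langmuir(1e9, 2.0, 1.0, 3.0, 10.0))          # ~ 2.3
```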

  15. Modeling integrated sensor/actuator functions in realistic environments

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Wan; Varadan, Vasundara V.; Varadan, Vijay K.

    1993-07-01

    Smart materials are expected to adapt to their environment and provide a useful response to changes in the environment. Both the sensor and actuator functions with the appropriate feedback mechanism must be integrated and comprise the `brains' of the material. Piezoelectric ceramics have proved to be effective as both sensors and actuators for a wide variety of applications. Thus, realistic simulation models are needed that can predict the performance of smart materials that incorporate piezoceramics. The environment may include the structure on which the transducers are mounted, fluid medium and material damping. In all cases, the smart material should sense the change and make a useful response. A hybrid numerical method involving finite element modeling in the plate structure and transducer region and a plane wave representation in the fluid region is used. The simulation of the performance of smart materials are performed.

  16. Prevalence and concentration of Salmonella and Campylobacter in the processing environment of small-scale pastured broiler farms

    USDA-ARS?s Scientific Manuscript database

    A growing niche in the locally grown food movement is the small scale production of broiler chickens using the pasture-raised poultry production model. Little research exists that focuses on Salmonella and Campylobacter contamination in the environment associated with on-farm processing of pasture-r...

  17. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error...lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...the CCER ontology, and its current component ontologies. Section 3 describes the connections between diagrams, verification rules, and error messages

  18. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-11-13

    Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error...which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...the CCER ontology, and its current component ontologies. Section 3 describes the connections between diagrams, verification rules, and error

  19. Exploring Distance Learning Environments: A Proposal for Model Categorization.

    ERIC Educational Resources Information Center

    Morgado, Eduardo Martins; Yonezawa, Wilson; Reinhard, Nicolau

    This article proposes a categorization model for online distance education environments, based on two different aspects: interaction and content. The proposed categorization, which was based on the experience acquired in developing, implementing, and operating different remote training courses, is aimed at providing evidence to help educational…

  20. A Tutoring and Student Modelling Paradigm for Gaming Environments.

    ERIC Educational Resources Information Center

    Burton, Richard R.; Brown, John Seely

    This paper describes a paradigm for tutorial systems capable of automatically providing feedback and hints in a game environment. The paradigm is illustrated by a tutoring system for the PLATO game "How the West Was Won." The system uses a computer-based "Expert" player to evaluate a student's moves and construct a "differential model" of the…

  1. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application execution. For testing and verification, the environment of an application is simplified and abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
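
The record-then-replay idea can be sketched in a few lines of Python (an illustration of the concept only; JPF-Android itself instruments Java bytecode and its API differs):

```python
import functools

def recording(log):
    """Decorator that logs (args, return) pairs for a function, mimicking the
    instrumentation step described above."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args):
            result = fn(*args)
            log.setdefault(fn.__name__, {})[args] = result
            return result
        return inner
    return wrap

def make_stub(log, name, default=None):
    """Build a stub that replays recorded return values, falling back to a
    default for inputs never seen at runtime (the empty-stub behavior)."""
    recorded = log.get(name, {})
    def stub(*args):
        return recorded.get(args, default)
    return stub

# Record a run of a hypothetical "library" function, then replay from the stub.
log = {}

@recording(log)
def get_battery_level(device_id):
    return {"emulator-1": 87}.get(device_id, 50)

get_battery_level("emulator-1")
stub = make_stub(log, "get_battery_level", default=0)
print(stub("emulator-1"))  # replays the recorded 87; unseen inputs return 0
```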

  2. Modelling between Epistemological Beliefs and Constructivist Learning Environment

    ERIC Educational Resources Information Center

    Çetin-Dindar, Ayla; Kirbulut, Zübeyde Demet; Boz, Yezdan

    2014-01-01

    The purpose of this study was to model the relationship between pre-service chemistry teachers' epistemological beliefs and their preference to use constructivist-learning environment in their future class. The sample was 125 pre-service chemistry teachers from five universities in Turkey. Two instruments were used in this study. One of the…

  3. Health, Supportive Environments, and the Reasonable Person Model

    Treesearch

    Stephen Kaplan; Rachel Kaplan

    2003-01-01

    The Reasonable Person Model is a conceptual framework that links environmental factors with human behavior. People are more reasonable, cooperative, helpful, and satisfied when the environment supports their basic informational needs. The same environmental supports are important factors in enhancing human health. We use this framework to identify the informational...

  5. Modeling battlefield sensor environments with the views workbench

    SciTech Connect

    Woyna, M.A.; Christiansen, J.H.; Hield, C.W.; Simunich, K.L.

    1994-08-01

    The Visual Intelligence and Electronic Warfare Simulation (VIEWS) Workbench software system has been developed by Argonne National Laboratory (ANL) to enable Army intelligence and electronic warfare (IEW) analysts at Unix workstations to conveniently build detailed IEW battlefield scenarios, or "sensor environments," to drive the Army's high-resolution IEW sensor performance models. VIEWS is fully object-oriented, including the underlying database.

  6. GENI: A graphical environment for model-based control

    NASA Astrophysics Data System (ADS)

    Kleban, Stephen; Lee, Martin; Zambre, Yadunath

    1990-08-01

    A new method of operating machine-modeling and beam-simulation programs for accelerator control has been developed. Existing methods, although cumbersome, have been used in control systems for commissioning and operation of many machines. We developed GENI, a generalized graphical interface to these programs for model-based control. This "object-oriented"-like environment is described and some typical applications are presented.

  7. Problems in modeling man machine control behavior in biodynamic environments

    NASA Technical Reports Server (NTRS)

    Jex, H. R.

    1972-01-01

    Reviewed are some current problems in modeling man-machine control behavior in a biodynamic environment. It is given in two parts: (1) a review of the models which are appropriate for manual control behavior and the added elements necessary to deal with biodynamic interfaces; and (2) a review of some biodynamic interface pilot/vehicle problems which have occurred, been solved, or need to be solved.

  8. Modeling of Radiowave Propagation in a Forested Environment

    DTIC Science & Technology

    2014-09-01

    leaves are randomly distributed. This randomness causes attenuation, scattering, diffraction, and absorption of the signal energy. Such characteristics...diffraction, and absorption of the signal energy. This makes radiowave propagation through such environments a complex problem to model. Modeling of...the observation point compared to [18] due to Tamir's assumption that all energy arriving at the canopy edge is coupled

  9. The Icelandic volcanic aeolian environment: Processes and impacts - A review

    NASA Astrophysics Data System (ADS)

    Arnalds, Olafur; Dagsson-Waldhauserova, Pavla; Olafsson, Haraldur

    2016-03-01

    Iceland has the largest area of volcaniclastic sandy desert on Earth, about 22,000 km2. The sand has been mostly produced by glacio-fluvial processes, leaving behind fine-grained unstable sediments which are later re-distributed by repeated aeolian events. Volcanic eruptions add to this pool of unstable sediments, often from subglacial eruptions. Icelandic desert surfaces are divided into sand fields, sandy lavas and sandy lag gravel, each with separate aeolian surface characteristics such as threshold velocities. Storms are frequent due to Iceland's location on the North Atlantic storm track. Dry winds occur on the leeward sides of mountains and glaciers, in spite of the high moisture content of the Atlantic cyclones. Surface winds often move hundreds to more than 1000 kg m-1 per annum, and more than 10,000 kg m-1 have been measured in a single storm. Desertification occurs when aeolian processes push sand fronts forward; these have destroyed many previously fully vegetated ecosystems since the settlement of Iceland in the late ninth century. There are about 135 dust events per annum, ranging from minor storms to >300,000 t of dust emitted in single storms. Dust production is on the order of 30-40 million tons annually, some traveling over 1000 km and deposited on land and sea. Dust deposited on deserts tends to be re-suspended during subsequent storms. High PM10 concentrations occur during major dust storms, and they are more frequent in the wake of volcanic eruptions, such as after the Eyjafjallajökull 2010 eruption. Airborne dust affects human health, with negative effects enhanced by the tubular morphology of the grains and the basaltic composition with its high metal content. Dust deposition on snow and glaciers intensifies melting. Moreover, dust production probably also influences atmospheric conditions and parameters that affect climate change.

  10. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  11. Framework for Modeling the Cognitive Process

    DTIC Science & Technology

    2005-06-16

    Yaworsky Air Force Research Laboratory/IFSB Rome, NY Keywords: Cognitive Process Modeling, Cognition, Conceptual Framework, Information...center of our conceptual framework and will distinguish our use of terms within the context of this framework. 3. A Conceptual Framework for...Modeling the Cognitive Process We will describe our conceptual framework using graphical examples to help illustrate main points. We form the two

  12. An Extension to the Weibull Process Model

    DTIC Science & Technology

    1981-11-01

    AN EXTENSION TO THE WEIBULL PROCESS MODEL...indicating its importance to applications. 1. INTRODUCTION Recent papers by Bain and Engelhardt (1980) and Crow

  13. Waves and coupling processes at the Polar Environment Atmospheric Research Laboratory (PEARL): Observations and science approach

    NASA Astrophysics Data System (ADS)

    Ward, William E.

    Over the past three years, installation of the suite of instruments planned for investigations of atmospheric phenomena from the ground to the mesopause region at the Polar Environment Atmospheric Research Laboratory (PEARL) in the Canadian Arctic (Eureka, Nunavut, 80N, 86W) has been completed and observations have now started. A subset of this instrumentation is associated with the scientific theme, Waves and Coupling Processes of the Middle Atmosphere. This subset includes the E-Region Wind Interferometer, the meteor radar, the Spectral Airglow Temperature Imager (SATI), the PEARL All-Sky Imager, the ozone and Rayleigh/Mie/Raman lidars, the VHF and cloud radar, the Fourier Transform Spectrometer and the Atmospheric Emitted Radiance Interferometer. This instrumentation set allows the wave environment above Eureka to be investigated and the coupling of the dynamics between atmospheric layers and geographical locations to be studied. These studies require contextual information on the large-scale state of the atmosphere, and collaborations with modelling groups, ground-based observatories in the Arctic, and satellite teams have been initiated. This paper will describe the capabilities of the instrumentation involved in these studies, outline the scientific approach and present some initial results. PEARL is supported by the Canadian Foundation for Innovation (CFI); Canadian Foundation for Climate and Atmospheric Science (CFCAS); Canadian Space Agency (CSA); Environment Canada (EC); Government of Canada IPY funding; Ontario Innovation Trust (OIT); Natural Sciences and Engineering Research Council (NSERC); Nova Scotia Research Innovation Trust (NSRIT); Ontario Research Fund (ORF); and the Polar Continental Shelf Program (PCSP).

  14. Hybrid modelling of anaerobic wastewater treatment processes.

    PubMed

    Karama, A; Bernard, O; Genovesi, A; Dochain, D; Benhammou, A; Steyer, J P

    2001-01-01

    This paper presents a hybrid approach for the modelling of an anaerobic digestion process. The hybrid model combines a feed-forward network, describing the bacterial kinetics, and the a priori knowledge based on the mass balances of the process components. We have considered an architecture which incorporates the neural network as a static model of unmeasured process parameters (kinetic growth rate) and an integrator for the dynamic representation of the process using a set of dynamic differential equations. The paper contains a description of the neural network component training procedure. The performance of this approach is illustrated with experimental data.
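
A minimal sketch of the hybrid structure described above, under assumed generic chemostat-style mass balances (the placeholder `monod` function stands in for the trained feed-forward network; none of this is the authors' code):

```python
def simulate_hybrid(mu_model, x0, s0, d, s_in, y, dt, steps):
    """Hybrid model: mass balances supply the structure, while a data-driven
    submodel mu_model (here a stand-in for the neural network) supplies the
    unmeasured growth rate mu(S).
      dX/dt = (mu(S) - D) * X
      dS/dt = D * (S_in - S) - mu(S) * X / Y
    Integrated with explicit Euler for brevity."""
    x, s = x0, s0
    for _ in range(steps):
        mu = mu_model(s)
        x += dt * ((mu - d) * x)
        s += dt * (d * (s_in - s) - mu * x / y)
    return x, s

# Placeholder for the network: a Monod-like rate mu_max * S / (Ks + S)
monod = lambda s: 0.3 * s / (0.5 + s)
x_end, s_end = simulate_hybrid(monod, x0=1.0, s0=5.0, d=0.1,
                               s_in=5.0, y=0.5, dt=0.01, steps=5000)
```

In the paper's architecture the algebraic rate expression is replaced by the trained network, while the differential mass balances stay fixed.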

  15. Distillation modeling for a uranium refining process

    SciTech Connect

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
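
One textbook form of a molecular-basis evaporation model under vacuum is the Hertz-Knudsen flux; this is offered as a hedged sketch of the kind of expression involved, with illustrative salt properties, not the authors' actual model:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def hertz_knudsen_flux(p_sat, p_ambient, molar_mass, temp_k, alpha=1.0):
    """Hertz-Knudsen evaporation flux in kg m^-2 s^-1:
    J = alpha * (p_sat - p_ambient) * sqrt(M / (2*pi*R*T)),
    with evaporation coefficient alpha, pressures in Pa, M in kg/mol."""
    return alpha * (p_sat - p_ambient) * math.sqrt(
        molar_mass / (2.0 * math.pi * R * temp_k))

# e.g. KCl (M = 0.0745 kg/mol) at 1200 K under near-vacuum conditions
print(hertz_knudsen_flux(10.0, 0.0, 0.0745, 1200.0))
```

The flux scales linearly with the overpressure p_sat - p_ambient, which is what makes a moderate vacuum effective for incremental salt removal.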

  16. Direct numerical simulation of microcavitation processes in different bio environments

    NASA Astrophysics Data System (ADS)

    Ly, Kevin; Wen, Sy-Bor; Schmidt, Morgan S.; Thomas, Robert J.

    2017-02-01

    Laser-induced microcavitation refers to the rapid formation and expansion of a vapor bubble inside bio-tissue exposed to intense, pulsed laser energy. With the associated microscale dissection occurring within the tissue, laser-induced microcavitation is a common approach for high-precision bio-surgeries. For example, laser-induced microcavitation is used for laser in-situ keratomileusis (LASIK) to precisely reshape the midstromal corneal tissue through an excimer laser beam. Multiple efforts over the last several years have observed unique characteristics of microcavitation in bio-tissues. For example, it was found that the threshold energy for microcavitation can be significantly reduced when the size of the biostructure is increased. Also, it was found that the dynamics of microcavitation are significantly affected by the elastic moduli of the bio-tissue. However, these efforts have not focused on the early events during microcavitation development. In this study, a direct numerical simulation of the microcavitation process based on the equation of state of the bio-tissue was established. With the direct numerical simulation, we were able to reproduce the dynamics of microcavitation in water-rich bio-tissues. Additionally, an experimental setup was used to verify the simulated early microcavitation formation in deionized water and 10% polyacrylamide (PAA) gel.

  17. ESO C Library for an Image Processing Software Environment (eclipse)

    NASA Astrophysics Data System (ADS)

    Devillard, N.

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems. Running on all Unix-like platforms, eclipse is portable. A high-level interface to Python is foreseen that would allow programmers to prototype their applications much faster than through C programs.

  18. Eclipse: ESO C Library for an Image Processing Software Environment

    NASA Astrophysics Data System (ADS)

    Devillard, Nicolas

    2011-12-01

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.

  19. Threat processing: models and mechanisms.

    PubMed

    Bentz, Dorothée; Schiller, Daniela

    2015-01-01

    The experience of fear is closely linked to the survival of species. Fear can be conceptualized as a brain state that orchestrates defense reactions to threats. To avoid harm, an organism must be equipped with neural circuits that allow learning, detecting, and rapidly responding to threats. Past experience with threat can transform neutral stimuli present at the time of experience into learned threat-related stimuli via associative learning. Pavlovian threat conditioning is the central experimental paradigm to study associative learning. Once learned, these stimulus-response associations are not always expressed; their expression depends on context and on new experiences with the conditioned stimuli. Neural circuits mediating threat learning have the inherent plasticity to adapt to changing environmental threats. Encounters devoid of danger pave the way for extinction or reconsolidation to occur. Extinction and reconsolidation can both lead to changes in the expression of threat-induced defense responses, but they differ in stability and have different neural bases. This review presents the behavioral models and the system-level neural mechanisms of threat learning and modulation in animals and humans.

  20. VARTM Process Modeling of Aerospace Composite Structures

    NASA Technical Reports Server (NTRS)

    Song, Xiao-Lan; Grimsley, Brian W.; Hubert, Pascal; Cano, Roberto J.; Loos, Alfred C.

    2003-01-01

    A three-dimensional model was developed to simulate the VARTM composite manufacturing process. The model considers the two important mechanisms that occur during the process: resin flow, and compaction and relaxation of the preform. The model was used to simulate infiltration of a carbon preform with an epoxy resin by the VARTM process. The model-predicted flow patterns and preform thickness changes agreed qualitatively with the measured values. However, the predicted total infiltration times were much longer than measured, most likely due to the inaccurate preform permeability values used in the simulation.

  1. Declarative business process modelling: principles and modelling languages

    NASA Astrophysics Data System (ADS)

    Goedertier, Stijn; Vanthienen, Jan; Caron, Filip

    2015-02-01

    The business process literature has proposed a multitude of business process modelling approaches or paradigms, each in response to a different business process type with a unique set of requirements. Two polar paradigms, i.e. the imperative and the declarative paradigm, appear to define the extreme positions on the paradigm spectrum. While imperative approaches focus on explicitly defining how an organisational goal should be reached, declarative approaches focus on the directives, policies and regulations restricting the potential ways to achieve the organisational goal. In between, a variety of hybrid paradigms can be distinguished, e.g. advanced and adaptive case management. This article focuses on the less-exposed declarative approach to process modelling. An outline of declarative process modelling and its modelling approaches is presented, followed by an overview of the observed declarative process modelling principles and an evaluation of the declarative process modelling approaches.

  2. Validating instrument models through the calibration process

    NASA Astrophysics Data System (ADS)

    Bingham, G. E.; Tansock, J. J.

    2006-08-01

    The performance of modern IR instruments is becoming so good that meeting science requirements requires that an accurate instrument model be used throughout the design and development process. The huge cost overruns on recent major programs indicate that the design and cost models being used to predict performance have lagged behind anticipated performance. Tuning these models to accurately reflect the true performance of target instruments requires a modeling process that has been developed over several instruments and validated by careful calibration. The process of developing a series of Engineering Development Models is often used on longer-duration programs to achieve this end. The accuracy of the models and their components has to be validated by a carefully planned calibration process, preferably considered in the instrument design. However, a good model does not satisfy all the requirements to bring acquisition programs under control; careful detail in the specification process and a similar, validated model on the government side are also required. This paper discusses the model development process and calibration approaches used to verify and update the models of several new instruments, including the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Far Infrared Spectroscopy of the Troposphere (FIRST) instrument.

  3. How Is the Learning Environment in Physics Lesson with Using 7E Model Teaching Activities

    ERIC Educational Resources Information Center

    Turgut, Umit; Colak, Alp; Salar, Riza

    2017-01-01

    The aim of this research is to reveal the results of the planning, implementation and evaluation of the process for designing learning environments in compliance with the 7E learning cycle model in physics lessons. "Action research", which is a qualitative research pattern, is employed in this research in accordance with the aim of the…

  4. Multidimensional vibrational spectroscopy for tunneling processes in a dissipative environment.

    PubMed

    Ishizaki, Akihito; Tanimura, Yoshitaka

    2005-07-01

    Simulating tunneling processes, as well as their observation, is a challenging problem for many areas. In this study, we consider a double-well potential system coupled to a heat bath with linear-linear (LL) and square-linear (SL) system-bath interactions. The LL interaction leads to longitudinal (T1) and transversal (T2) homogeneous relaxations, whereas the SL interaction leads to inhomogeneous dephasing (T2*) relaxation in the white-noise limit with a rotating-wave approximation. We discuss the dynamics of the double-well system under infrared (IR) laser excitations from a Gaussian-Markovian quantum Fokker-Planck equation approach, which was developed by generalizing Kubo's stochastic Liouville equation. An analytical expression of the Green's function is obtained for the case of two-state-jump modulation by performing the Fourier-Laplace transformation. We then calculate a two-dimensional infrared signal, which is defined by the four-body correlation function of the optical dipole, for various noise correlation times, system-bath coupling parameters, and temperatures. It is shown that the bath-induced vibrational excitation and relaxation dynamics between the tunneling splitting levels can be detected as isolated off-diagonal peaks in third-order two-dimensional infrared (2D-IR) spectroscopy for a specific phase-matching condition. Furthermore, this spectroscopy also allows us to directly evaluate the rate constants for tunneling reactions, which relate to the coherence between the splitting levels; it can be regarded as a novel technique for measuring chemical reaction rates. We depict the change of reaction rates as a function of system-bath coupling strength and temperature through the 2D-IR signal.

  5. Discriminating Tectonic Tremor from Magmatic Processes in Observationally Challenging Environments

    NASA Astrophysics Data System (ADS)

    Brown, J. R.; Beroza, G. C.

    2011-12-01

    Deep tectonic tremor is a long-duration, low-amplitude signal that has been shown to consist of low frequency earthquakes (LFEs) on the plate interface in subduction zones. Detecting LFEs from tremor-like signals in subduction settings can be challenging due to the combination of volcanic seismicity and sparse station geometry. This is particularly true for island arcs such as the Alaska-Aleutian subduction zone, where the islands are small and noise levels are high. We have detected and located LFEs within tremor signals along the Alaska-Aleutian Arc in four locations: Kodiak Island, Alaska Peninsula, eastern Aleutians, and the Andreanof Islands. In all areas, the LFEs are located 10-40 km trenchward of the volcanic chain at depths ranging from 45-70 km. Location errors are significant (+/- 20 km in depth) due to sparse station geometry, such that the tremor could possibly be associated with nearby volcanoes. Since most documented volcanic tremor is located in the shallow crust, it can often be discriminated from tectonic tremor simply based on location. However, deep volcanic tremor has been documented in Hawaii to depths of 40 km and could be more widespread. In the Aleutian arc, deep long period events (DLPs), which are thought to result from the movement of magma and volatiles, have been located as deep as 45 km and sometimes resemble tremor-like signals. The spectral character is another potential discriminant. We compare the cepstra (Fourier transform of the logarithmic power spectrum of a time series) of the tectonic tremor-like signals/LFEs and DLPs associated with volcanoes. Source characteristics of DLPs (non-shear slip) and tectonic tremor/LFEs (shear slip) are distinct and should be noticeable in the cepstral domain. This approach of using tremor locations and cepstral analysis could be useful for detecting and differentiating tectonic tremor from deep volcanic processes in other island arcs as well.
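
    The cepstrum defined in the abstract (Fourier transform of the logarithmic power spectrum) takes only a few lines to compute. A sketch on a synthetic signal with an echo, where the echo appears as a cepstral peak at its lag; the signal parameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lag, a = 4096, 50, 0.8
x = rng.standard_normal(n)
s = x.copy()
s[lag:] += a * x[:-lag]          # signal plus a scaled echo 50 samples later

power = np.abs(np.fft.rfft(s)) ** 2
cepstrum = np.fft.irfft(np.log(power))   # FT of the logarithmic power spectrum

# the echo shows up as a peak at quefrency == lag (skip the low-quefrency bins)
peak = 10 + int(np.argmax(cepstrum[10:n // 2]))
print(peak)   # -> 50
```

    The periodic ripple an echo imprints on the log spectrum collapses to an isolated peak at its delay, which is why cepstra separate source periodicities (here, an echo; in the study, shear- vs non-shear-slip signatures) from the broadband spectrum.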

  6. An information processing model of anxiety: automatic and strategic processes.

    PubMed

    Beck, A T; Clark, D A

    1997-01-01

    A three-stage schema-based information processing model of anxiety is described that involves: (a) the initial registration of a threat stimulus; (b) the activation of a primal threat mode; and (c) the secondary activation of more elaborative and reflective modes of thinking. The defining elements of automatic and strategic processing are discussed with the cognitive bias in anxiety reconceptualized in terms of a mixture of automatic and strategic processing characteristics depending on which stage of the information processing model is under consideration. The goal in the treatment of anxiety is to deactivate the more automatic primal threat mode and to strengthen more constructive reflective modes of thinking. Arguments are presented for the inclusion of verbal mediation as a necessary but not sufficient component in the cognitive and behavioral treatment of anxiety.

  7. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
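
    The controller above minimizes a cost balancing weighted control effort against squared tracking error. A minimal one-step sketch of that cost structure on a scalar linear plant; the plant coefficients and weights are illustrative assumptions, not the paper's crop models.

```python
# One-step model-predictive control on a toy scalar plant x_next = a*x + b*u.
# All numbers are illustrative, not values from the crop production system.
a, b = 0.9, 0.5
q, r_w = 1.0, 0.1          # weights: squared tracking error vs control effort

def cost(x, u, ref):
    x_next = a * x + b * u
    return q * (x_next - ref) ** 2 + r_w * u ** 2

def best_control(x, ref):
    # minimize J(u) = q*(a*x + b*u - ref)^2 + r_w*u^2; set dJ/du = 0
    return q * b * (ref - a * x) / (q * b ** 2 + r_w)

x, ref = 10.0, 20.0
u_star = best_control(x, ref)
print(round(u_star, 3))   # -> 15.714
```

    With r_w = 0 the controller would demand whatever actuation reaches the reference in one step; the effort weight is what keeps set-point changes (light, temperature, CO2) physically moderate.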

  9. Modeling Users, Context and Devices for Ambient Assisted Living Environments

    PubMed Central

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-01-01

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for their users. Being aware of the different characteristics of the entities participating in these situations is vital for efficiently reaching the main goals of the corresponding systems. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give meaning to the gathered data. In this paper, we analyze several solutions from the literature for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. Besides, we highlight different ongoing standardization efforts in this area. We also discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, to finally draw several conclusions about the reviewed works. PMID:24643006

  10. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for their users. Being aware of the different characteristics of the entities participating in these situations is vital for efficiently reaching the main goals of the corresponding systems. To collect different information from these entities, it is necessary to design several formal models which help designers to organize and give meaning to the gathered data. In this paper, we analyze several solutions from the literature for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. Besides, we highlight different ongoing standardization efforts in this area. We also discuss the techniques used, the characteristics modeled and the advantages and drawbacks of each approach, to finally draw several conclusions about the reviewed works.

  11. Strengthening the weak link: Built Environment modelling for loss analysis

    NASA Astrophysics Data System (ADS)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures. 
Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution
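
    The loss calculation the abstract describes - combining hazard intensity with per-property exposure and vulnerability - can be sketched in a few lines. The step vulnerability curve, structure classes and exposure values below are invented for illustration, not the methodology of the paper.

```python
# Toy event-loss calculation: loss_i = value_i * damage_ratio(intensity_i).
# Vulnerability depends on structure type, which is exactly the exposure
# attribute the abstract notes is often missing from portfolios.
def damage_ratio(intensity, structure):
    # crude assumed vulnerability: masonry more fragile than concrete
    thresholds = {"masonry": 0.4, "concrete": 0.7}
    return min(1.0, max(0.0, intensity - thresholds[structure]))

portfolio = [
    {"value": 250_000, "structure": "masonry",  "intensity": 0.9},
    {"value": 400_000, "structure": "concrete", "intensity": 0.9},
    {"value": 300_000, "structure": "masonry",  "intensity": 0.3},  # below threshold
]
loss = sum(p["value"] * damage_ratio(p["intensity"], p["structure"])
           for p in portfolio)
print(loss)
```

    Swapping the structure labels changes the modelled loss even though location, value and hazard intensity are unchanged, which is the data-deficiency point the abstract makes.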

  12. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    NASA Astrophysics Data System (ADS)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China, which will meet increasing demands for fundamental research and technological applications at home and abroad. A new distributed data processing and analysis environment has been developed, which provides generic functionalities for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the client/server (C/S) paradigm, and data analysis and visualization software providing 2D/3D experimental data display. This environment will be widely applied at CSNS for live data processing.

  13. Combining Wireless Sensor Networks and Groundwater Transport Models: Protocol and Model Development in a Simulative Environment

    NASA Astrophysics Data System (ADS)

    Barnhart, K.; Urteaga, I.; Han, Q.; Porta, L.; Jayasumana, A.; Illangasekare, T.

    2007-12-01

    Groundwater transport modeling is intended to aid in remediation processes by providing prediction of plume location and by helping to bridge data gaps in the typically undersampled subsurface environment. Increased availability of computer resources has made computer-based transport models almost ubiquitous in calculating health risks, determining cleanup strategies, guiding environmental regulatory policy, and in determining culpable parties in lawsuits. Despite their broad use, very few studies exist which verify model correctness or even usefulness, and those that do have shown significant discrepancies between predicted and actual results. Better predictions can only be gained from additional and higher quality data, but this is an expensive proposition using current sampling techniques. A promising technology is the use of wireless sensor networks (WSNs), which are comprised of wireless nodes (motes) coupled to in-situ sensors that are capable of measuring hydrological parameters. As the motes are typically battery powered, power consumption is a major concern in routing algorithms. Supplied with predictions about the direction and arrival time of the contaminant, an application-driven routing protocol can become more efficient. A symbiotic relationship then exists between the WSN, which is supplying the data to calibrate the transport model, and the model, which may be supplying predictive information to the WSN for optimum monitoring performance. Many challenges exist before the above can be realized: WSN protocols must mature, as must sensor technology, and inverse models and tools must be developed for integration into the system. As current model calibration, even automatic calibration, still often requires manual tweaking of calibration parameters, implementing this in a real-time closed-loop process may require significant work. 
Based on insights from a previous proof-of-concept intermediate-scale tank experiment, we are developing the models, tools

  14. ARTEMIS: Ares Real Time Environments for Modeling, Integration, and Simulation

    NASA Technical Reports Server (NTRS)

    Hughes, Ryan; Walker, David

    2009-01-01

    This slide presentation reviews the use of ARTEMIS in the development and testing of the Ares launch vehicles. Ares Real Time Environments for Modeling, Integration, and Simulation (ARTEMIS) is the real-time simulation supporting Ares I hardware-in-the-loop (HWIL) testing. ARTEMIS accurately models all Ares/Orion/ground subsystems which interact with Ares avionics components from pre-launch through orbit insertion. The ARTEMIS system integration lab and the STIF architecture are reviewed, the functional components of ARTEMIS are outlined, and an overview of the models with a block diagram is presented.

  15. Simulation models of early visual processes

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Watson, Andrew B.

    1988-01-01

    Several areas of early visual processes are studied using computer models. These models include retinal cone placement, cone color arrangement, development of geniculate receptive fields, cortical simple cells, and motion field extraction. The receptive field of a model cortical unit is indicated schematically.
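
    Model cortical simple cells such as those mentioned above are commonly described with Gabor functions: an oriented sinusoidal grating under a Gaussian envelope. A small sketch of such a receptive field; all parameter values are chosen purely for illustration.

```python
import numpy as np

def gabor(size=21, wavelength=6.0, theta=0.0, sigma=4.0, phase=0.0):
    """2-D Gabor patch: a Gaussian-windowed sinusoidal grating."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # coordinate along the grating
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength + phase)
    return envelope * carrier

rf = gabor()
print(rf.shape, rf[10, 10])   # -> (21, 21) 1.0  (peak at center for phase 0)
```

    Varying theta rotates the preferred orientation and varying phase shifts the on/off subregions, which is how a bank of such units tiles orientation and position in simple-cell models.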

  16. Three Models for the Curriculum Development Process

    ERIC Educational Resources Information Center

    O'Hanlon, James

    1973-01-01

    Presents descriptions of the management, systematic, and open-access curriculum development models to identify the decision-making bases, operational processes, evaluation requirements, and curriculum control methods of each model. A possible relationship among these models is then suggested. (Author/DN)

  17. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    This report documents the design and implementation of an automated process for generating domain-level error statistics for forecasts produced by the US Army Research Laboratory's nowcast model, Weather Running Estimate-Nowcast (WRE-N). These statistics can be used by modelers to improve the accuracy of the WRE-N model.

  18. Thermal modeling of carbon-epoxy laminates in fire environments.

    SciTech Connect

    McGurn, Matthew T. , Buffalo, NY); DesJardin, Paul Edward , Buffalo, NY); Dodd, Amanda B.

    2010-10-01

    A thermal model is developed for the response of carbon-epoxy composite laminates in fire environments. The model is based on a porous media description that includes the effects of gas transport within the laminate along with swelling. Model comparisons are conducted against the data from Quintiere et al. Simulations are conducted for both coupon-level and intermediate-scale one-sided heating tests. Comparisons of the heat release rate (HRR) as well as of the final products (mass fractions, volume percentages, porosity, etc.) are conducted. Overall, the agreement between the available data and the model is excellent considering the simplified approximations used to account for flame heat flux. A sensitivity study using a newly developed swelling model shows the importance of accounting for laminate expansion in the prediction of burnout. Excellent agreement is observed between the model and data for the final product composition, which includes porosity, mass fractions and volume expansion ratio.

  19. Modeling Gene-Environment Interactions With Quasi-Natural Experiments.

    PubMed

    Schmitz, Lauren; Conley, Dalton

    2017-02-01

    This overview develops new empirical models that can effectively document Gene × Environment (G×E) interactions in observational data. Current G×E studies are often unable to support causal inference because they use endogenous measures of the environment or fail to adequately address the nonrandom distribution of genes across environments, confounding estimates. Comprehensive measures of genetic variation are incorporated into quasi-natural experimental designs to exploit exogenous environmental shocks or isolate variation in environmental exposure to avoid potential confounders. In addition, we offer insights from population genetics that improve upon extant approaches to address problems from population stratification. Together, these tools offer a powerful way forward for G×E research on the origin and development of social inequality across the life course.
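
    A G×E interaction of the kind discussed above is usually estimated as the coefficient on a gene-by-environment product term in a regression. A minimal simulated sketch with ordinary least squares; the data-generating coefficients are invented for illustration, and the exogeneity of E is assumed by construction here (the hard part in observational data).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
G = rng.integers(0, 2, n).astype(float)   # e.g. binary carrier status
E = rng.standard_normal(n)                # exogenous environmental shock (assumed)
y = 1.0 + 0.5 * G + 0.3 * E + 0.8 * G * E + 0.5 * rng.standard_normal(n)

# OLS of y on [1, G, E, G*E]; the last coefficient is the GxE interaction
X = np.column_stack([np.ones(n), G, E, G * E])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta.round(2))   # recovers roughly (1.0, 0.5, 0.3, 0.8)
```

    In a quasi-natural experiment, E would be an exogenous shock (a policy change, a lottery); if E instead correlated with unmeasured confounders of y, the interaction estimate would no longer support causal inference.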

  20. Preform Characterization in VARTM Process Model Development

    NASA Technical Reports Server (NTRS)

    Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.

    2004-01-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3-Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
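
    The Darcy permeability characterized above relates flow rate to pressure drop through the preform. A back-of-the-envelope sketch of the standard 1-D reduction of Darcy's law; the numbers are illustrative, not measurements from the study.

```python
# Darcy's law for 1-D steady flow through a porous preform:
#   Q = k * A * dP / (mu * L)   ->   k = Q * mu * L / (A * dP)
# All values below are assumed for illustration.
Q = 2.0e-7                 # volumetric flow rate, m^3/s
mu = 0.1                   # resin viscosity, Pa*s
L = 0.3                    # flow length, m
A = 0.01                   # cross-sectional area, m^2
dP = 101325.0 - 20000.0    # vacuum-driven pressure drop, Pa (< 101.3 kPa)

k = Q * mu * L / (A * dP)
print(f"{k:.3e} m^2")
```

    In the coupled process model, k is not a constant: compaction under the same vacuum pressure reduces preform porosity and hence permeability, which is why the characterization covers both behaviors.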

  1. Hidden process models for animal population dynamics.

    PubMed

    Newman, K B; Buckland, S T; Lindley, S T; Thomas, L; Fernández, C

    2006-02-01

    Hidden process models are a conceptually useful and practical way to simultaneously account for process variation in animal population dynamics and measurement errors in observations and estimates made on the population. Process variation, which can be both demographic and environmental, is modeled by linking a series of stochastic and deterministic subprocesses that characterize processes such as birth, survival, maturation, and movement. Observations of the population can be modeled as functions of true abundance with realistic probability distributions to describe observation or estimation error. Computer-intensive procedures, such as sequential Monte Carlo methods or Markov chain Monte Carlo, condition on the observed data to yield estimates of both the underlying true population abundances and the unknown population dynamics parameters. Formulation and fitting of a hidden process model are demonstrated for Sacramento River winter-run chinook salmon (Oncorhynchus tshawytscha).
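
    The sequential Monte Carlo conditioning described above can be illustrated with a minimal bootstrap particle filter on a toy abundance model. The dynamics, noise levels and parameter values below are invented for illustration, not the salmon model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_part = 30, 1000
phi = 0.95                      # survival rate (toy value, assumed known)

# simulate a "true" population and noisy abundance estimates
truth = np.empty(T); truth[0] = 1000.0
for t in range(1, T):
    truth[t] = phi * truth[t - 1] * np.exp(rng.normal(0, 0.05))  # process noise
obs = truth * np.exp(rng.normal(0, 0.2, T))                      # observation error

# bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.normal(1000.0, 100.0, n_part)
est = np.empty(T)
for t in range(T):
    if t > 0:
        particles = phi * particles * np.exp(rng.normal(0, 0.05, n_part))
    # Gaussian likelihood on log abundance (lognormal observation model)
    w = np.exp(-0.5 * ((np.log(obs[t]) - np.log(particles)) / 0.2) ** 2)
    w /= w.sum()
    est[t] = np.sum(w * particles)                   # posterior mean abundance
    particles = rng.choice(particles, n_part, p=w)   # multinomial resampling

rmse_filter = np.sqrt(np.mean((est - truth) ** 2))
rmse_raw = np.sqrt(np.mean((obs - truth) ** 2))
print(rmse_filter, rmse_raw)   # filtered estimates track truth more closely
```

    Because process noise is smaller than observation error here, conditioning each estimate on the whole observation history smooths out the measurement error, which is the practical payoff of the hidden process formulation.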

  2. Modelling radionuclide distribution and transport in the environment.

    PubMed

    Thiessen, K M; Thorne, M C; Maul, P R; Pröhl, G; Wheater, H S

    1999-01-01

    Mathematical models of radionuclide distribution and transport in the environment have been developed to assess the impact on people of routine and accidental releases of radioactivity from a variety of nuclear activities, including: weapons development, production, and testing; power production; and waste disposal. The models are used to estimate human exposures and doses in situations where measurements have not been made or would be impossible or impractical to make. Model results are used to assess whether nuclear facilities are operated in compliance with regulatory requirements, to determine the need for remediation of contaminated sites, to estimate the effects on human health of past releases, and to predict the potential effects of accidental releases or new facilities. This paper describes the various applications and types of models currently used to represent the distribution and transport of radionuclides in the terrestrial and aquatic environments, as well as integrated global models for selected radionuclides and special issues in the fields of solid radioactive waste disposal and dose reconstruction. Particular emphasis is placed on the issue of improving confidence in the model results, including the importance of uncertainty analysis and of model verification and validation.
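
    The simplest building block of the transport models surveyed above is a first-order compartment whose contents are lost to radioactive decay and to environmental removal. A minimal sketch; the removal rate and initial concentration are assumed for illustration, while the I-131 half-life is a standard physical constant.

```python
import numpy as np

# One-compartment screening model:
#   dC/dt = -(lambda + k_out) * C
# lambda: radioactive decay constant; k_out: environmental removal rate.
half_life_days = 8.02           # physical half-life of I-131, days
lam = np.log(2) / half_life_days
k_out = 0.05                    # removal rate, 1/day (assumed)
C0 = 100.0                      # initial concentration, Bq/kg (assumed)

t = np.linspace(0, 30, 301)     # days
C = C0 * np.exp(-(lam + k_out) * t)

# effective half-life combines the two first-order loss processes
t_eff = np.log(2) / (lam + k_out)
print(round(t_eff, 2))   # -> 5.08 days, shorter than the physical half-life
```

    Real assessment models chain many such compartments (air, soil, vegetation, animal products) with transfer coefficients between them, which is where the uncertainty analysis the paper emphasizes becomes essential.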

  3. A new Mars radiation environment model with visualization

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clowdsley, M. S.; Singleterry, R. C.; Wilson, J. W.

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model--version 2001 (Mars-GRAM 2001). The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.
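
    The atmospheric thickness profile through which particles are transported is, hydrostatically, just the areal density of overlying gas. A back-of-the-envelope sketch with rough global-mean Mars values; these are illustrative stand-ins, not output of Mars-GRAM 2001 or the MOLA-based topography model.

```python
# Hydrostatic column (areal) density of an atmosphere: sigma = P / g.
# Rough global-mean Mars values; the actual model derives altitude-dependent
# profiles from Mars-GRAM 2001 and MOLA topography instead.
P_surface = 610.0     # mean surface pressure, Pa (varies strongly with altitude)
g_mars = 3.71         # surface gravity, m/s^2

sigma_kg_m2 = P_surface / g_mars          # kg/m^2
sigma_g_cm2 = sigma_kg_m2 * 0.1           # 1 kg/m^2 = 0.1 g/cm^2
print(round(sigma_g_cm2, 1))              # ~16 g/cm^2 of mostly-CO2 shielding
```

    This thin column is why surface altitude matters so much on Mars: low-lying regions sit under measurably more shielding mass than high terrain, changing the GCR dose at the surface.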

  5. A new Mars radiation environment model with visualization.

    PubMed

    De Angelis, G; Clowdsley, M S; Singleterry, R C; Wilson, J W

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model--version 2001 (Mars-GRAM 2001). The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. (c) 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  6. Modeling of the Adiabatic and Isothermal Methanation Process

    NASA Astrophysics Data System (ADS)

    Porubova, Jekaterina; Bazbauers, Gatis; Markova, Darja

    2011-01-01

    Increased use of biomass offers one way to reduce anthropogenic impact on the environment. Various biomass conversion processes yield different types of fuels: solid (e.g., bio-carbon), liquid (e.g., biodiesel and ethanol), and gaseous (e.g., biomethane). Biomethane can be used in the transport and energy sectors, and the total methane production efficiency can reach 65%. By modeling the adiabatic and isothermal methanation processes, the more effective of the two for methane production is identified, and the influence of the process parameters on the overall efficiency of methane production is determined.
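
    The stoichiometric ceiling on that efficiency can be checked from the Sabatier reaction CO2 + 4 H2 -> CH4 + 2 H2O and standard lower heating values; a back-of-the-envelope sketch (plant losses push the practical figure down toward the 65% quoted above):

```python
LHV_H2 = 241.8   # kJ/mol, lower heating value of H2
LHV_CH4 = 802.3  # kJ/mol, lower heating value of CH4

def methanation_energy_retention():
    """Fraction of input H2 chemical energy retained in the product CH4
    for CO2 + 4 H2 -> CH4 + 2 H2O (ideal, loss-free stoichiometry)."""
    return LHV_CH4 / (4.0 * LHV_H2)

eta = methanation_energy_retention()  # ~0.83, the loss-free upper bound
```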

  7. Periglacial process research for improved understanding of climate change in periglacial environments

    NASA Astrophysics Data System (ADS)

    Hvidtfeldt Christiansen, Hanne

    2010-05-01

    Periglacial landscapes extend widely outside the glaciated areas, into the areas underlain by permafrost and with seasonal frost. Yet in recent cryosphere research related to periglacial geomorphology, significant attention has been given to a direct climate-permafrost relationship. The focus is on the permafrost thermal state, including the thickness of the active layer, often simplifying how these two key conditions are directly climatically controlled. There has been less focus on understanding and quantifying the different periglacial processes, which largely control the consequences of changing climatic conditions for permafrost and seasonal frost across periglacial environments. It is the complex relationship between climate, micro-climate and local geomorphological, geological and ecological conditions that controls periglacial processes. In several cases local erosion or deposition will affect the rates of landform change significantly more than any climate change. Detailed periglacial process studies will therefore refine predictions of how periglacial landscapes can be expected to respond to climatic changes, and can be built into Earth System Modelling. In particular, combining direct field observations and measurements with remote sensing and geochronological studies of periglacial landforms enables a significantly improved understanding of periglacial process rates. An overview of the state of research in key periglacial processes is given, focusing on ice-wedges, solifluction landforms, and seasonal ground thermal dynamics, all with examples from the high Arctic in Svalbard. Thermal contraction cracking and its seasonal meteorological control is presented, and potential thermal erosion of ice-wedges leading to development of thermokarst is discussed. Local and meteorological controls on solifluction rates are presented and their climatic control indicated. Seasonal ground thermal processes and their dependence on local

  8. On Choosing Between Two Probabilistic Choice Sub-models in a Dynamic Multitask Environment

    NASA Technical Reports Server (NTRS)

    Soulsby, E. P.

    1984-01-01

    An independent random utility model based on Thurstone's Theory of Comparative Judgment and a constant utility model based on Luce's Choice Axiom are reviewed in detail. Predictions from the two models are shown to be equivalent under certain restrictions on the distribution of the underlying random process. Each model is applied as a stochastic choice submodel in a dynamic, multitask environment. Resulting choice probabilities are nearly identical, indicating that, despite their conceptual differences, neither model may be preferred over the other based solely on its predictive capability.
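
    The near-equivalence of the two submodels can be seen already in a binary choice, sketched here with illustrative utilities (the study's task environment is far richer):

```python
import math

def thurstone_p(u_a, u_b, sigma=1.0):
    """Thurstone Case V: P(choose a) with i.i.d. N(0, sigma^2) utility noise."""
    # The difference of two noisy utilities is N(u_a - u_b, 2 sigma^2).
    return 0.5 * (1.0 + math.erf((u_a - u_b) / (2.0 * sigma)))

def luce_p(u_a, u_b):
    """Luce's Choice Axiom with response strengths v = exp(u)."""
    return 1.0 / (1.0 + math.exp(-(u_a - u_b)))

p_thurstone = thurstone_p(1.0, 0.0)  # ~0.76
p_luce = luce_p(1.0, 0.0)            # ~0.73, nearly the same prediction
```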

  9. Database Selection for Processing k Nearest Neighbors Queries in Distributed Environments.

    ERIC Educational Resources Information Center

    Yu, Clement; Sharma, Prasoon; Meng, Weiyi; Qin, Yan

    This paper considers the processing of digital library queries, consisting of a text component and a structured component in distributed environments. The paper concentrates on the processing of the structured component of a distributed query. A method is proposed to identify the databases that are likely to be useful for processing any given…
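
    A hypothetical sketch of the database-selection idea: rank each database by a lower bound on its distance to the query and probe in that order (the names and the 1-D feature space are invented for illustration):

```python
def probe_order(query, databases):
    """Rank databases by a lower bound on the distance from the query to any
    of their items; probing in this order lets a k-NN search stop as soon as
    the next database's bound exceeds the current k-th best distance."""
    bounds = {name: min(abs(query - v) for v in vals)
              for name, vals in databases.items()}
    return sorted(bounds, key=bounds.get)

dbs = {"A": [1.0, 2.0], "B": [10.0, 11.0], "C": [4.0, 5.0]}
order = probe_order(3.5, dbs)  # C is closest to the query, then A, then B
```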

  10. DISTRIBUTED PROCESSING TRADE-OFF MODEL FOR ELECTRIC UTILITY OPERATION

    NASA Technical Reports Server (NTRS)

    Klein, S. A.

    1994-01-01

    The Distributed Processing Trade-off Model for Electric Utility Operation is based upon a study performed for the California Institute of Technology's Jet Propulsion Laboratory. This study presented a technique that addresses the question of trade-offs between expanding a communications network or expanding the capacity of distributed computers in an electric utility Energy Management System (EMS). The technique resulted in the development of a quantitative assessment model that is presented in a Lotus 1-2-3 worksheet environment. The model gives EMS planners a macroscopic tool for evaluating distributed processing architectures and the major technical and economic tradeoffs as well as interactions within these architectures. The model inputs (which may be varied according to application and need) include geographic parameters, data flow and processing workload parameters, operator staffing parameters, and technology/economic parameters. The model's outputs are total cost in various categories, a number of intermediate cost and technical calculation results, as well as graphical presentation of Costs vs. Percent Distribution for various parameters. The model has been implemented on an IBM PC using the Lotus 1-2-3 spreadsheet environment and was developed in 1986. Also included with the spreadsheet model are a number of representative but hypothetical utility system examples.
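
    The central trade-off can be caricatured with hypothetical cost coefficients: distributing processing cuts communications cost but raises distributed-computer cost, so total cost bottoms out at an interior percent distribution:

```python
def total_cost(pct_distributed, comm_cost=100.0, cpu_cost=60.0, fixed=20.0):
    """Hypothetical EMS cost model: communications cost falls linearly with
    distribution while distributed-computer cost grows quadratically."""
    p = pct_distributed / 100.0
    return fixed + comm_cost * (1.0 - p) + cpu_cost * p * p

# Sweep Costs vs. Percent Distribution, as the spreadsheet model plots.
costs = {p: total_cost(p) for p in range(0, 101, 10)}
best = min(costs, key=costs.get)  # interior minimum, neither 0% nor 100%
```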

  12. Modelling the near-Earth space environment using LDEF data

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares actual impact crater measurements and results of the Space Environment (SPENV) Program, developed in-house at POD, to theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is
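
    The size dependence driving the concern about small, non-trackable debris can be sketched with a cumulative power-law flux (the parameters here are invented; the Kessler and Cour-Palais models are much more detailed):

```python
def cumulative_flux(d_cm, a=1e-5, b=2.5):
    """Impacts per m^2 per year producing craters larger than d_cm
    (illustrative power-law parameters, not a fitted environment model)."""
    return a * d_cm ** (-b)

f_small = cumulative_flux(0.01)  # small craters dominate the impact rate
f_large = cumulative_flux(0.05)  # larger craters are far rarer
```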

  14. Complex Unsaturated Zone Flow and Thermohydrologic Processes in a Regulatory Environment: A Perspective on Uncertainty

    NASA Astrophysics Data System (ADS)

    Fedors, R. W.; Manepally, C.; Justus, P. S.; Basagaoglu, H.; Pensado, O.; Dubreuilh, P.

    2007-12-01

    An important part of a risk-informed, performance-based regulatory review of a potential license application for disposal of high-level radioactive waste at Yucca Mountain, Nevada, is the consideration of alternative interpretations and models of risk significant physical processes. The Nuclear Regulatory Commission (NRC) expects that simplified models will be abstracted from complex process-level models to conduct total-system performance assessments. There are several phases or steps to developing an abstracted model and its supporting basis from more detailed and complicated models for each area of the total system. For complex ambient and thermally perturbed flow in fractured tuffs of the unsaturated zone at Yucca Mountain, these steps can be summarized as (i) site characterization and observation, (ii) field and laboratory tests, (iii) conceptual model development, (iv) process-level numerical modeling, and (v) abstraction development. Each step is affected by uncertainty in (i) assessing parameters for models and (ii) conceptualization and understanding of governing processes. Because of the complexity and uncertainty, alternative interpretations and models become important aspects in the regulatory environment. NRC staff gain confidence in performance assessment model results through understanding the uncertainty in the various models. An example of a complex process in the unsaturated zone is seepage into drifts, which leads to liquid water potentially contacting waste packages. Seepage is a risk-important process for the unsaturated zone at Yucca Mountain because of its potential effect on waste package integrity and transport of potentially released radionuclides. Complexities for seepage include (i) characterization of fractures that carry flow, (ii) effect of small to intermediate scale structural features on flow, (iii) consideration of the diverse flow regimes (rivulets, film flow, capillarity) in fractures, (iv) effect of vapor transport associated
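
    One standard way to handle the parameter uncertainty described above is Monte Carlo propagation through the abstracted model. A toy sketch, with an invented seepage abstraction and invented parameter ranges:

```python
import random
import statistics

def seepage_fraction(percolation, threshold):
    """Toy abstraction: only flux above a capillary-diversion threshold seeps."""
    return max(0.0, percolation - threshold) / percolation

def mean_seepage(n=10000, seed=42):
    """Propagate uniform parameter uncertainty through the abstraction."""
    rng = random.Random(seed)
    samples = [seepage_fraction(rng.uniform(5.0, 25.0),   # mm/yr, percolation
                                rng.uniform(2.0, 15.0))   # mm/yr, threshold
               for _ in range(n)]
    return statistics.mean(samples)

estimate = mean_seepage()  # propagated mean, strictly between 0 and 1
```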

  15. Quantum jump model for a system with a finite-size environment.

    PubMed

    Suomela, S; Kutvonen, A; Ala-Nissila, T

    2016-06-01

    Measuring the thermodynamic properties of open quantum systems poses a major challenge. A calorimetric detection has been proposed as a feasible experimental scheme to measure work and fluctuation relations in open quantum systems. However, the detection requires a finite size for the environment, which influences the system dynamics. This process cannot be modeled with the standard stochastic approaches. We develop a quantum jump model suitable for systems coupled to a finite-size environment. We use the method to study the common fluctuation relations and prove that they are satisfied.
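
    The key feature, jump rates that depend on the evolving state of a finite environment, can be caricatured in a toy two-level trajectory (the rates, sizes, and occupation rule are invented; the paper's method is a proper quantum jump unraveling):

```python
import random

def trajectory(steps=2000, dt=0.01, gamma=1.0, env_quanta=5, env_levels=50,
               seed=1):
    """Toy two-level jump trajectory: each emission deposits a quantum in the
    finite environment, raising its occupation and hence the future rates."""
    rng = random.Random(seed)
    excited = True
    for _ in range(steps):
        n = env_quanta / env_levels  # crude environment occupation number
        if excited and rng.random() < gamma * (1.0 + n) * dt:
            excited = False
            env_quanta += 1          # emission heats the environment
        elif not excited and rng.random() < gamma * n * dt:
            excited = True
            env_quanta -= 1          # absorption cools it back down
    return excited, env_quanta

final_state, final_quanta = trajectory()
```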

  16. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
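
    The system-dynamics flavor of SEPS can be suggested by a toy feedback loop in which remaining work creates schedule pressure that erodes productivity (all rates here are invented; SEPS itself models many interacting activities):

```python
def simulate_project(tasks=100.0, staff=5.0, base_rate=1.0, weeks=60):
    """Toy system-dynamics loop: remaining work -> schedule pressure ->
    reduced productivity -> slower completion."""
    done = 0.0
    for week in range(weeks):
        pressure = (tasks - done) / tasks            # fraction still open
        productivity = base_rate * (1.0 - 0.3 * pressure)
        done = min(tasks, done + staff * productivity)
        if done >= tasks:
            break
    return week + 1, done

weeks_needed, completed = simulate_project()  # pace picks up as pressure eases
```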

  17. Job Aiding/Training Decision Process Model

    DTIC Science & Technology

    1992-09-01

    AL-CR-1992-0004; AD-A256 947. Job Aiding/Training Decision Process Model. John P. Zenyuh, Phillip C... Report period: March 1990 - April 1990. Funding: C - F33615-86-C-0545; PE - 62205F; PR - 1121. Contents include: Components to Process Model Decision and Selection Points; Summary of Subject Recommendations for Aiding Approaches.

  18. Gene-Environment Processes Linking Aggression, Peer Victimization, and the Teacher-Child Relationship

    ERIC Educational Resources Information Center

    Brendgen, Mara; Boivin, Michel; Dionne, Ginette; Barker, Edward D.; Vitaro, Frank; Girard, Alain; Tremblay, Richard; Perusse, Daniel

    2011-01-01

    Aggressive behavior in middle childhood is at least partly explained by genetic factors. Nevertheless, estimations of simple effects ignore possible gene-environment interactions (G x E) or gene-environment correlations (rGE) in the etiology of aggression. The present study aimed to simultaneously test for G x E and rGE processes between…

  19. A Delineation of the Cognitive Processes Manifested in a Social Annotation Environment

    ERIC Educational Resources Information Center

    Li, S. C.; Pow, J. W. C.; Cheung, W. C.

    2015-01-01

    This study aims to examine how students' learning trajectories progress in an online social annotation environment, and how their cognitive processes and levels of interaction correlate with their learning outcomes. Three different types of activities (cognitive, metacognitive and social) were identified in the online environment. The time…