Sample records for environment process model

  1. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, with simplifications made for the human-in-the-loop. However, the human element has a big impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We will model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network which can represent incomplete and uncertain socio-cultural information. We will leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer, such as node mobility and transmission parameters. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.

  2. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use, e.g., welding, casting and semiconductor crystal growth, is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space; joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  3. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  4. Modeling snow accumulation and ablation processes in forested environments

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Storck, Pascal; Lettenmaier, Dennis P.

    2009-05-01

    The effects of forest canopies on snow accumulation and ablation processes can be very important for the hydrology of midlatitude and high-latitude areas. A mass and energy balance model for snow accumulation and ablation processes in forested environments was developed utilizing extensive measurements of snow interception and release in a maritime mountainous site in Oregon. The model was evaluated using 2 years of weighing lysimeter data and was able to reproduce the snow water equivalent (SWE) evolution throughout winters both beneath the canopy and in the nearby clearing, with correlations to observations ranging from 0.81 to 0.99. Additionally, the model was evaluated using measurements from a Boreal Ecosystem-Atmosphere Study (BOREAS) field site in Canada to test the robustness of the canopy snow interception algorithm in a much different climate. Simulated SWE was relatively close to the observations for the forested sites, with discrepancies evident in some cases. Although the model formulation appeared robust for both types of climates, sensitivity to parameters such as snow roughness length and maximum interception capacity suggested the magnitude of improvements of SWE simulations that might be achieved by calibration.
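
    A toy single-time-step canopy interception budget conveys the role of the maximum interception capacity parameter mentioned above (a minimal Python sketch with invented variable names and values; the actual model is a full mass-and-energy balance):

      def canopy_snow_step(stored, snowfall, capacity, unload):
          """Advance a toy canopy snow store by one time step (all in mm SWE).

          'capacity' is the maximum interception capacity; snowfall beyond the
          remaining capacity falls through to the ground snowpack, and 'unload'
          is the mass released by melt and sloughing during this step.
          """
          intercepted = min(snowfall, max(capacity - stored, 0.0))
          throughfall = snowfall - intercepted
          stored = max(stored + intercepted - unload, 0.0)
          return stored, throughfall

      # e.g. a nearly full canopy intercepts little of a 10 mm snowfall event:
      print(canopy_snow_step(stored=18.0, snowfall=10.0, capacity=20.0, unload=1.5))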

  5. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for the validation of the model and the thought processes involved in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  6. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many CBs to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
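
    As a rough illustration of the reduction idea (a sketch under assumed notation and an assumed functional form, not the paper's actual formulas), the Python fragment below composes a serial chain of CBs, each characterized by a latency and an asymptotic transfer rate, into one equivalent block whose service time stays exact in both the small- and large-message limits:

      import math

      class CB:
          """Communication block with latency alpha (s) and asymptotic
          transfer rate beta (bytes/s); names are illustrative only."""
          def __init__(self, alpha, beta):
              self.alpha, self.beta = alpha, beta

          def service_time(self, m):
              # One hyperbola that matches both limits:
              # s(0) = alpha, and s(m) ~ m/beta for large m.
              return math.sqrt(self.alpha**2 + (m / self.beta)**2)

      def reduce_serial(cbs):
          """Collapse CBs traversed in sequence (no pipelining): latencies
          add and reciprocal rates add, so both limits are preserved."""
          return CB(sum(cb.alpha for cb in cbs),
                    1.0 / sum(1.0 / cb.beta for cb in cbs))

      path = [CB(50e-6, 1.25e6), CB(200e-6, 10e6)]   # e.g. a NIC stage plus an Ethernet link
      print(reduce_serial(path).service_time(8192))  # seconds for an 8 KB message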

  7. A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Rosu, Serban

    2011-09-01

    This article presents a new approach to some fundamental techniques for solving dynamic programming problems through the use of functional equations. We analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.
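
    The functional-equation machinery referred to here can be illustrated by a generic finite-horizon Bellman recursion (a hedged sketch; the state, action, cost and transition names are invented for illustration and are not taken from the article):

      def backward_induction(T, states, actions, cost, step):
          """Solve V_t(x) = min_u [cost(t, x, u) + V_{t+1}(step(t, x, u))]
          with terminal condition V_T(x) = 0, by backward induction."""
          V = {(T, x): 0.0 for x in states}
          policy = {}
          for t in range(T - 1, -1, -1):
              for x in states:
                  choices = [(cost(t, x, u) + V[(t + 1, step(t, x, u))], u)
                             for u in actions if step(t, x, u) in states]
                  V[(t, x)], policy[(t, x)] = min(choices)
          return V, policy

      # Toy usage: treatment intensity u drives a severity state x toward 0
      # at a per-stage cost (hypothetical numbers):
      V, policy = backward_induction(
          T=5, states=range(6), actions=(0, 1, 2),
          cost=lambda t, x, u: 10 * u + 4 * x,   # treatment cost plus care cost
          step=lambda t, x, u: max(x - u, 0))    # severity decreases with treatment
      print(V[(0, 5)], policy[(0, 5)])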

  8. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open-source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of Costa, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007], and those of the DATools from the former WL|Delft Hydraulics [El Serafy et al. 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, El Serafy G.Y., H. Gerritsen, S. Hummel, A. H. Weerts, A.E. Mynett and M. Tanaka (2007), Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Van Velzen and Verlaan (2007), Meteorologische Zeitschrift, Volume 16, Number 6, December 2007, pp. 777-793(17). Application of generic data assimilation tools (DATools) for flood

  9. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  10. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
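
    A minimal sketch of GP-based calibration in this spirit, written with scikit-learn on synthetic data (the kernel choice, factor ranges and response function are illustrative assumptions, not the authors' setup):

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(0)
      # Exposure conditions: analyte concentration (ppm), temperature (C), humidity (%RH)
      X = rng.uniform([0.0, 10.0, 20.0], [100.0, 40.0, 80.0], size=(40, 3))
      # Synthetic sensor responses with drift-like temperature/humidity terms plus noise
      y = 0.05 * X[:, 0] + 0.01 * X[:, 1] - 0.002 * X[:, 2] + rng.normal(0.0, 0.02, 40)

      kernel = RBF(length_scale=[20.0, 5.0, 10.0]) + WhiteKernel(noise_level=1e-3)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      # Predictive mean and standard deviation at a new exposure condition; the
      # std is the uncertainty quantification the batch-sequential design exploits.
      mean, std = gp.predict(np.array([[50.0, 25.0, 50.0]]), return_std=True)
      print(mean[0], std[0])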

  11. Modeling critical zone processes in intensively managed environments

    NASA Astrophysics Data System (ADS)

    Kumar, Praveen; Le, Phong; Woo, Dong; Yan, Qina

    2017-04-01

    Processes in the Critical Zone (CZ), which sustain terrestrial life, are tightly coupled across hydrological, physical, biochemical, and many other domains over both short and long timescales. In addition, vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with response to increased temperature and altered rainfall pattern, is expected to result in emergent behaviors in ecologic and hydrologic functions, subsequently controlling CZ processes. We hypothesize that the interplay between micro-topographic variability and these emergent behaviors will shape complex responses of a range of ecosystem dynamics within the CZ. Here, we develop a modeling framework ('Dhara') that explicitly incorporates micro-topographic variability based on lidar topographic data with coupling of multi-layer modeling of the soil-vegetation continuum and 3-D surface-subsurface transport processes to study ecological and biogeochemical dynamics. We further couple a C-N model with a physically based hydro-geomorphologic model to quantify (i) how topographic variability controls the spatial distribution of soil moisture, temperature, and biogeochemical processes, and (ii) how farming activities modify the interaction between soil erosion and soil organic carbon (SOC) dynamics. To address the intensive computational demand from high-resolution modeling at lidar data scale, we use a hybrid CPU-GPU parallel computing architecture run over large supercomputing systems for simulations. Our findings indicate that rising CO2 concentration and air temperature have opposing effects on soil moisture, surface water and ponding in topographic depressions. Further, the relatively higher soil moisture and lower soil temperature contribute to decreased soil microbial activities in the low-lying areas due to anaerobic conditions and reduced temperatures. The decreased microbial relevant processes cause the reduction of nitrification rates, resulting in relatively lower nitrate

  12. Process migration in UNIX environments

    NASA Technical Reports Server (NTRS)

    Lu, Chin; Liu, J. W. S.

    1988-01-01

    To support process migration in UNIX environments, the main problem is how to encapsulate the location-dependent features of the system in such a way that a host-independent virtual environment is maintained by the migration handlers on behalf of each migrated process. An object-oriented approach is used to describe the interaction between a process and its environment. More specifically, environmental objects were introduced in UNIX systems to carry out the user-environment interaction. The implementation of the migration handlers is based on both the state consistency criterion and the property consistency criterion.

  13. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have been done with MHD and electrodynamics approaches. One of the most significant results from the simplified two-fluid model simulations was the production of the double-peak structure in the magnetic field signature of the Io flyby, which could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs a fluid description for electrons and neutrals, whereas for ions multilevel drift-kinetic and particle approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution, which cannot be done in fluid simulations. The first results of such a simulation of the dynamics of ions in Io's environment are discussed in this paper.

  14. Developing a multi-systemic fall prevention model, incorporating the physical environment, the care process and technology: a systematic review.

    PubMed

    Choi, Young-Seon; Lawler, Erin; Boenecke, Clayton A; Ponatoski, Edward R; Zimring, Craig M

    2011-12-01

    This paper reports a review that assessed the effectiveness and characteristics of fall prevention interventions implemented in hospitals. A multi-systemic fall prevention model that establishes a practical framework was developed from the evidence. Falls occur through complex interactions between patient-related and environmental risk factors, suggesting a need for multifaceted fall prevention approaches that address both factors. We searched Medline, CINAHL, PsycInfo and the Web of Science databases for references published between January 1990 and June 2009 and scrutinized secondary references from acquired papers. Due to the heterogeneity of interventions and populations, we conducted a quantitative systematic review without a meta-analysis and used a narrative summary to report findings. From the review, three distinct characteristics of fall prevention interventions emerged: (1) the physical environment, (2) the care process and culture and (3) technology. While clinically significant evidence shows the efficacy of environment-related interventions in reducing falls and fall-related injuries, the literature identified few hospitals that had introduced environment-related interventions in their multifaceted fall intervention strategies. Using the multi-systemic fall prevention model, hospitals should promote a practical strategy that benefits from the collective effects of the physical environment, the care process and culture and technology to prevent falls and fall-related injuries. By doing so, they can more effectively address the various risk factors for falling and therefore, prevent falls. Studies that test the proposed model need to be conducted to establish the efficacy of the model in practice. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  15. Near-field environment/processes working group summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, W.M.

    1995-09-01

    This article is a summary of the proceedings of a group discussion which took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas, on July 22-25, 1991. The working group concentrated on the subject of the near-field environment of geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste, and by disturbances caused by the introduction of exotic materials during construction of the repository. The group also discussed the application of modelling of performance-related processes.

  16. Use of an uncertainty analysis for genome-scale models as a prediction tool for microbial growth processes in subsurface environments.

    PubMed

    Klier, Christine

    2012-03-06

    The integration of genome-scale, constraint-based models of microbial cell function into simulations of contaminant transport and fate in complex groundwater systems is a promising approach to help characterize the metabolic activities of microorganisms in natural environments. In constraint-based modeling, the specific uptake flux rates of external metabolites are usually determined by Michaelis-Menten kinetic theory. However, extensive data sets based on experimentally measured values are not always available. In this study, a genome-scale model of Pseudomonas putida was used to study the key issue of uncertainty arising from the parametrization of the influx of two growth-limiting substrates: oxygen and toluene. The results showed that simulated growth rates are highly sensitive to substrate affinity constants and that uncertainties in specific substrate uptake rates have a significant influence on the variability of simulated microbial growth. Michaelis-Menten kinetic theory does not, therefore, seem to be appropriate for descriptions of substrate uptake processes in the genome-scale model of P. putida. Microbial growth rates of P. putida in subsurface environments can only be accurately predicted if the processes of complex substrate transport and microbial uptake regulation are sufficiently understood in natural environments and if data-driven uptake flux constraints can be applied.
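
    For reference, the Michaelis-Menten constraint in question sets the specific uptake flux as v = vmax * S / (Km + S); the toy Python sketch below (illustrative parameter values, not the study's data) shows how strongly the computed flux, and hence the simulated growth, depends on the affinity constant Km:

      def uptake_flux(S, vmax, Km):
          """Michaelis-Menten specific uptake rate: v = vmax * S / (Km + S)."""
          return vmax * S / (Km + S)

      S = 0.05  # dissolved substrate concentration, mmol/L (hypothetical)
      for Km in (0.001, 0.01, 0.1):  # candidate affinity constants, mmol/L
          # vmax in mmol/gDW/h; the spread across plausible Km values mirrors
          # the sensitivity of simulated growth reported in the study.
          print(Km, round(uptake_flux(S, vmax=15.0, Km=Km), 2))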

  17. Dynamical nexus of water supply, hydropower and environment based on the modeling of multiple socio-natural processes: from socio-hydrological perspective

    NASA Astrophysics Data System (ADS)

    Liu, D.; Wei, X.; Li, H. Y.; Lin, M.; Tian, F.; Huang, Q.

    2017-12-01

    In a socio-hydrological system, the ecological functions and environmental services that are maintained are determined by the preferences of society, which makes trade-offs among the values of riparian vegetation, fish, river landscape, water supply, hydropower, navigation and so on. As society develops, these value preferences change, and with them the ecological functions and environmental services that are chosen to be maintained. The aim of this study is to reveal the feedback relationships among water supply, hydropower and the environment and their dynamical feedback mechanisms at the macro-scale, and to establish a socio-hydrological evolution model of the watershed based on the modeling of multiple socio-natural processes. The study focuses on the Han River in China, analyzes the impact of water supply and hydropower on ecology, hydrology and other environmental elements, and studies their effect on ensuring different levels of ecological and environmental water. Water supply and ecology are usually competitive; hydropower and ecology are synergistic in some reservoirs and competitive in others. The study will analyze the multiple mechanisms that implement the dynamical feedbacks of the environment on hydropower, set up quantitative descriptions of the feedback mechanisms, recognize the dominant processes in the feedback relationships between hydropower and the environment, and then analyze the positive and negative feedbacks in the feedback networks. A socio-hydrological evolution model at the watershed scale will be built and applied to simulate the long-term evolution of the watershed under current conditions, and the dynamical nexus of water supply, hydropower and the environment will be investigated.

  18. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The application of the developed property models for the estimation of environment-related properties and the uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes, and they allow one to evaluate the effect of uncertainties in the estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.

  19. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  20. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; Neergaard Parker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free-field charged particle environment required for characterizing energy deposition per unit mass, charge deposition, and dose-rate-dependent conductivity processes needed to evaluate radiation dose and internal (bulk) charging in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft, in a solar, near-polar orbit, provide the particle data over a range of heliospheric latitudes used to derive radiation and charging environments applicable to both a high-inclination 0.5 AU Solar Polar Imager mission and 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  1. Modelling Dust Processing and Evolution in Extreme Environments as seen by Herschel Space Observatory

    NASA Astrophysics Data System (ADS)

    Bocchio, Marco

    2014-09-01

    The main goal of my PhD study is to understand the dust processing that occurs during the mixing between the galactic interstellar medium and the intracluster medium. This process is of particular interest in violent phenomena such as galaxy-galaxy interactions or "ram pressure stripping" due to the infall of a galaxy towards the cluster centre. Initially, I focus my attention on the problem of dust destruction and heating processes, revisiting the models available in the literature. I place particular stress on the cases of extreme environments such as a hot coronal-type gas (e.g., IGM, ICM, HIM) and supernova-generated interstellar shocks. Under these conditions small grains are destroyed on short timescales and large grains are heated by collisions with fast electrons, making the dust spectral energy distribution very different from what is observed in the diffuse ISM. In order to test our models I apply them to the case of an interacting galaxy, NGC 4438. Herschel data for this galaxy indicate the presence of dust with a higher-than-expected temperature. With a multi-wavelength analysis on a pixel-by-pixel basis we show that this hot dust seems to be embedded in a hot ionised gas, therefore undergoing both collisional heating and small-grain destruction. Furthermore, I focus on the long-standing conundrum of the dust destruction and dust formation timescales in the Milky Way. Based on the destruction efficiency in interstellar shocks, previous estimates led to a dust lifetime shorter than the typical timescale for dust formation in AGB stars. Using a recent dust model and an updated dust processing model we re-evaluate the dust lifetime in our Galaxy. Finally, I turn my attention to the phenomenon of ram pressure stripping. The galaxy ESO 137-001 represents one of the best cases to study this effect. Its long H2 tail embedded in a hot and ionised tail raises questions about whether it was stripped from the galaxy or formed downstream in the tail. Based on

  2. A neural network ActiveX based integrated image processing environment.

    PubMed

    Ciuca, I; Jitaru, E; Alaicescu, M; Moisil, I

    2000-01-01

    The paper outlines an integrated image processing environment that uses neural network ActiveX technology for object recognition and classification. The image processing environment, which is Windows-based, encapsulates a Multiple-Document Interface (MDI) and is menu driven. Object (shape) parameter extraction is focused on features that are invariant under translation, rotation and scale transformations. The neural network models that can be incorporated as ActiveX components into the environment allow both clustering and classification of objects from the analysed image. Mapping neural networks perform an input sensitivity analysis on the extracted feature measurements and thus facilitate the removal of irrelevant features and improvements in the degree of generalisation. The program has been used to evaluate the dimensions of the hydrocephalus in a study calculating the Evans index and the angle of the frontal horns of the ventricular system modifications.

  3. An intelligent processing environment for real-time simulation

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Wells, Buren Earl, Jr.

    1988-01-01

    The development of a highly efficient and thus truly intelligent processing environment for real-time general-purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA-type hardware configurations that support improved simulation are investigated. Three direct-execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

  4. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing, ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.

  5. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual located activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost-savings of the newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems, uncovered in work practices studied by the design team, played a significant role in how work actually got done-actual lived work. Multi-tasking, informal assistance and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to

  6. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Reconstruction is therefore often supported by general rules for the perpendicularity and parallelism which are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of e.g. room sizes and corridor widths. In the context of the reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which were originally developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
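
    The core mechanism of a Lindenmayer system is parallel string rewriting; a minimal Python sketch (with a hypothetical corridor/room production rule, not one of the paper's rules):

      def rewrite(axiom, rules, n):
          """Apply L-system production rules to every symbol in parallel, n times."""
          s = axiom
          for _ in range(n):
              s = "".join(rules.get(c, c) for c in s)
          return s

      # Hypothetical production: a corridor segment C spawns a room R on each
      # side; brackets mark branches, as in turtle-graphics L-systems.
      print(rewrite("C", {"C": "C[-R][+R]C"}, 2))
      # -> C[-R][+R]C[-R][+R]C[-R][+R]C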

  7. Gene-Environment Interplay in Twin Models

    PubMed Central

    Hatemi, Peter K.

    2013-01-01

    In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718
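
    For concreteness, the baseline variance decomposition of the classical twin design can be obtained from MZ and DZ twin correlations via Falconer's formulas; the sketch below shows these plain ACE estimates, without the gene-environment-interplay extensions the article discusses:

      def ace_from_twin_correlations(r_mz, r_dz):
          """Falconer estimates for the classical twin design:
          additive genetic a2 = 2*(r_mz - r_dz), shared environment
          c2 = 2*r_dz - r_mz, unique environment e2 = 1 - r_mz."""
          return 2 * (r_mz - r_dz), 2 * r_dz - r_mz, 1 - r_mz

      # e.g. observed correlations of 0.6 (MZ) and 0.4 (DZ):
      print(ace_from_twin_correlations(0.6, 0.4))  # approximately (0.4, 0.2, 0.4)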

  8. Run Environment and Data Management for Earth System Models

    NASA Astrophysics Data System (ADS)

    Widmann, H.; Lautenschlager, M.; Fast, I.; Legutke, S.

    2009-04-01

    The Integrating Model and Data Infrastructure (IMDI) developed and maintained by the Model and Data Group (M&D) comprises the Standard Compile Environment (SCE) and the Standard Run Environment (SRE). The IMDI software has a modular design, which allows a suite of model components to be combined and coupled, and the tasks to be executed independently and on various platforms. Furthermore, the modular structure enables extension to new model combinations and new platforms. The SRE presented here enables the configuration and performance of earth system model experiments, from model integration up to storage and visualization of data. We focus on recently implemented tasks such as synchronous database filling, graphical monitoring and the automatic generation of metadata in XML forms during run time. We also address the capability to run experiments in heterogeneous IT environments with different computing systems for model integration, data processing and storage. These features are demonstrated for model configurations and on platforms used in current or upcoming projects, e.g. MILLENNIUM or IPCC AR5.

  9. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  10. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fire is a risky event that can lead to disaster and massive destruction. The management and disposal of building fires has always attracted much interest from researchers. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, the application requirements and modelling objectives of the building fire scene were analysed in this paper. Then, the four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) were implemented, and the relationships between the elements were discussed as well. Finally, with the theory and framework of VGE, the building fire scene system with VGE was designed in terms of the data environment, the model environment, the expression environment and the collaborative environment. The functions and key techniques in each environment are also analysed, which may provide a reference for further development and other research on VGE.

  11. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through the quantitative description of the complicated pollution processes of the whole watershed system and its parts, a model can identify the main sources and migration pathways of pollutants, estimate the pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied at home and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), models of the water quality of receiving water bodies (QUAL2E and WASP), and watershed models integrating pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics as well as the limitations in practical applications of these models. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analysis of the applications of single and integrated models, the development trend and application prospects of watershed water environment pollution models were discussed.

  12. Float-zone processing in a weightless environment

    NASA Technical Reports Server (NTRS)

    Fowle, A. A.; Haggerty, J. S.; Perron, R. R.; Strong, P. F.; Swanson, J. L.

    1976-01-01

    The results are reported of investigations to: (1) test the validity of analyses which set maximum practical diameters for Si crystals that can be processed by the float-zone method in a near-weightless environment; (2) determine the convective flow patterns induced in a typical float-zone Si melt under conditions perceived to be advantageous to the crystal growth process, using flow visualization techniques applied to a dimensionally scaled model of the melt; (3) revise the estimates of the economic impact of space-produced Si crystals grown by the float-zone method on the U.S. electronics industry; and (4) devise a rational plan for future work related to crystal growth phenomena wherein the low-gravity conditions available at a space site can be used to maximum benefit to the U.S. electronics industry.

  13. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  14. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to model using a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remains a challenging task. The integrated modelling environment Delta Shell simplifies this task. The software components of Delta Shell are easy to reuse separately from each other as well as part of an integrated environment that can run in a command-line or a graphical-user-interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models from the end-user and developer perspectives. The first example shows the coupling of a rainfall-runoff model, a river-flow model and a run-time control model. The second example shows how a coastal morphological database integrates with the coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is also available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.

  15. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  16. Modeling Low-temperature Geochemical Processes

    NASA Astrophysics Data System (ADS)

    Nordstrom, D. K.

    2003-12-01

    Geochemical modeling has become a popular and useful tool for a wide number of applications, from research on the fundamental processes of water-rock interactions to regulatory requirements and decisions regarding permits for industrial and hazardous wastes. In low-temperature environments, generally thought of as those in the temperature range of 0-100 °C and close to atmospheric pressure (1 atm = 1.01325 bar = 101,325 Pa), complex hydrobiogeochemical reactions participate in an array of interconnected processes that affect us, and that, in turn, we affect. Understanding these complex processes often requires tools that are sufficiently sophisticated to portray multicomponent, multiphase chemical reactions yet transparent enough to reveal the main driving forces. Geochemical models are such tools. The major processes that they are required to model include mineral dissolution and precipitation; aqueous inorganic speciation and complexation; solute adsorption and desorption; ion exchange; oxidation-reduction (redox) transformations; gas uptake or production; organic matter speciation and complexation; evaporation; dilution; water mixing; reaction during fluid flow; reaction involving biotic interactions; and photoreaction. These processes occur in rain, snow, fog, dry atmosphere, soils, bedrock weathering, streams, rivers, lakes, groundwaters, estuaries, brines, and diagenetic environments. Geochemical modeling attempts to understand the redistribution of elements and compounds, through anthropogenic and natural means, over a large range of scales, from nanometer to global. "Aqueous geochemistry" and "environmental geochemistry" are often used interchangeably with "low-temperature geochemistry" to emphasize hydrologic or environmental objectives. Recognition of the strategy or philosophy behind the use of geochemical modeling is not often discussed or explicitly described. Plummer (1984, 1992) and Parkhurst and Plummer (1993) compare and contrast two approaches for

  17. Integrated approaches to the application of advanced modeling technology in process development and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  18. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various types of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in the exploratory and experimental settings was registered as a shift towards more complex stages of problem representations in genetics.

  19. Monitoring Biogeochemical Processes in Coral Reef Environments with Remote Sensing: A Cross-Disciplinary Approach.

    NASA Astrophysics Data System (ADS)

    Perez, D.; Phinn, S. R.; Roelfsema, C. M.; Shaw, E. C.; Johnston, L.; Iguel, J.; Camacho, R.

    2017-12-01

    Primary production and calcification are important to measure and monitor over time because of their fundamental roles in carbon cycling and the accretion of habitat structure in reef ecosystems. However, monitoring biogeochemical processes in coastal environments has been difficult due to complications in resolving differences in water optical properties from biological productivity and other sources (sediment, dissolved organics, etc.). This complicates the application of algorithms developed for satellite image data under open-ocean conditions and requires alternative approaches. This project applied a cross-disciplinary approach, bringing established methods for monitoring productivity in terrestrial environments to coral reef systems. The availability of regularly acquired, high-spatial-resolution (<5 m pixel), multispectral satellite imagery has improved mapping and monitoring capabilities for shallow, marine environments such as seagrass and coral reefs. There is potential to further develop optical models for remote sensing applications to estimate and monitor reef system processes, such as primary productivity and calcification. This project collected field measurements of spectral absorptance and primary productivity and calcification rates for two reef systems: Heron Reef, southern Great Barrier Reef and Saipan Lagoon, Commonwealth of the Northern Mariana Islands. Field data were used to parameterize a light-use efficiency (LUE) model, estimating productivity from absorbed photosynthetically active radiation. The LUE model has been successfully applied in terrestrial environments for the past 40 years, and could potentially be used in shallow, marine environments. The model was used in combination with a map of benthic community composition produced from object-based image analysis of WorldView-2 imagery. Light-use efficiency was measured for functional groups: coral, algae, seagrass, and sediment. However, LUE was overestimated for sediment, which led to overestimation
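
    The LUE model mentioned here reduces to production = epsilon * APAR with APAR = fAPAR * PAR; a toy Python version with invented values (not the project's field measurements):

      def gross_production(par, fapar, epsilon):
          """Light-use efficiency (LUE) model: production = epsilon * APAR,
          where APAR = fAPAR * PAR is the absorbed photosynthetically
          active radiation."""
          return epsilon * (fapar * par)

      # Hypothetical coral-dominated pixel: PAR of 40 mol photons/m2/day,
      # 60% absorptance, and an efficiency of 0.02 mol C fixed per mol photons.
      print(gross_production(par=40.0, fapar=0.6, epsilon=0.02))  # mol C/m2/day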

  20. The Conceptualization of the Mathematical Modelling Process in Technology-Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Güzel, Esra Bukova

    2017-01-01

    The aim of the study is to conceptualize the technology-aided mathematical modelling process in the frame of cognitive modelling perspective. The grounded theory approach was adopted in the study. The research was conducted with seven groups consisting of nineteen prospective mathematics teachers. The data were collected from the video records of…

  1. Updated Model of the Solar Energetic Proton Environment in Space

    NASA Astrophysics Data System (ADS)

    Jiggens, Piers; Heynderickx, Daniel; Sandberg, Ingmar; Truscott, Pete; Raukunen, Osku; Vainio, Rami

    2018-05-01

    The Solar Accumulated and Peak Proton and Heavy Ion Radiation Environment (SAPPHIRE) model provides environment specification outputs for all aspects of the Solar Energetic Particle (SEP) environment. The model is based upon a thoroughly cleaned and carefully processed data set. Herein the evolution of the solar proton model is discussed with comparisons to other models and data. This paper discusses the construction of the underlying data set, the modelling methodology, optimisation of fitted flux distributions and extrapolation of model outputs to cover a range of proton energies from 0.1 MeV to 1 GeV. The model provides outputs in terms of mission cumulative fluence, maximum event fluence and peak flux for both solar maximum and solar minimum periods. A new method for describing maximum event fluence and peak flux outputs in terms of 1-in-x-year SPEs is also described. SAPPHIRE proton model outputs are compared with previous models including CREME96, ESP-PSYCHIC and the JPL model. Low energy outputs are compared to SEP data from ACE/EPAM whilst high energy outputs are compared to a new model based on GLEs detected by Neutron Monitors (NMs).
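
    One way such a "1-in-x-year" level can be operationalized (an illustrative assumption, since the paper describes its own method) is as the fluence whose exceedance frequency, given a mean SPE occurrence rate, equals 1/x:

      import numpy as np

      def one_in_x_year_level(event_fluences, events_per_year, x):
          """Fluence exceeded on average once every x years: solve
          events_per_year * P(F > f) = 1/x for f via the empirical quantile."""
          q = 1.0 - 1.0 / (events_per_year * x)
          return np.quantile(np.asarray(event_fluences), q)

      # Synthetic heavy-tailed per-event fluences (cm^-2), ~7 SPEs/year on average:
      fluences = np.random.default_rng(1).lognormal(mean=18.0, sigma=2.0, size=5000)
      print(one_in_x_year_level(fluences, events_per_year=7.0, x=10.0))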

  2. A cluster expansion model for predicting activation barrier of atomic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rehman, Tafizur; Jaipal, M.; Chatterjee, Abhijit, E-mail: achatter@iitk.ac.in

    2013-06-15

    We introduce a procedure based on cluster expansion models for predicting the activation barrier of atomic processes encountered while studying the dynamics of a material system using the kinetic Monte Carlo (KMC) method. Starting with an interatomic potential description, a mathematical derivation is presented to show that the local environment dependence of the activation barrier can be captured using cluster interaction models. Next, we develop a systematic procedure for training the cluster interaction model on-the-fly, which involves: (i) obtaining activation barriers for a handful of local environments using nudged elastic band (NEB) calculations, (ii) identifying the local environment by analyzing the NEB results, and (iii) estimating the cluster interaction model parameters from the activation barrier data. Once a cluster expansion model has been trained, it is used to predict activation barriers without requiring any additional NEB calculations. Numerical studies are performed to validate the cluster expansion model by studying hop processes in Ag/Ag(100). We show that the use of the cluster expansion model with KMC enables the efficient generation of an accurate process rate catalog.
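
    A schematic of the train-then-predict idea with synthetic numbers (not the authors' implementation): fit linear cluster-interaction coefficients to NEB barriers computed for a handful of local environments, then evaluate the fitted model for any new environment without further NEB runs:

      import numpy as np

      def features(occupancy):
          """Constant term plus 0/1 occupancies of the neighbor sites around
          the hopping atom; real cluster expansions also include multi-site
          (pair, triplet, ...) cluster terms."""
          return np.concatenate(([1.0], occupancy))

      rng = np.random.default_rng(1)
      envs = rng.integers(0, 2, size=(20, 8))               # 20 sampled local environments
      barriers = 0.45 + envs @ np.linspace(-0.05, 0.05, 8)  # stand-in for NEB barriers (eV)

      X = np.array([features(e) for e in envs])
      coeffs, *_ = np.linalg.lstsq(X, barriers, rcond=None)  # trained interaction model

      new_env = np.array([1, 0, 1, 1, 0, 0, 1, 0])
      print(features(new_env) @ coeffs)  # predicted barrier (eV), no extra NEB needed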

  3. r-process nucleosynthesis in dynamic helium-burning environments

    NASA Technical Reports Server (NTRS)

    Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.

    1985-01-01

    The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be neutron number densities of 10^20-10^21 per cubic centimeter for times of 0.01-0.1 s and densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high - larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.

  4. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and the chemistry of water (but not flow rates) that may seep into drifts; and submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies…

  6. Chandra Radiation Environment Modeling

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Blackwell, W. C.

    2003-01-01

    CRMFLX (Chandra Radiation Model of ion FluX) is a radiation environment risk mitigation tool for use as a decision aid in planning the operation times of Chandra's Advanced CCD Imaging Spectrometer (ACIS) detector. Accurate prediction of the flux environment for protons with energies of 100-200 keV is needed in order to protect the ACIS detector against proton degradation. Unfortunately, protons of this energy are abundant in the region of space in which Chandra must operate, and the on-board particle detectors do not measure proton flux levels in the required energy range. This presentation will describe the plasma environment data analysis and modeling basis of the CRMFLX engineering environment model developed to predict the proton flux in the solar wind, magnetosheath, and magnetosphere phenomenological regions of geospace. The recently released CRMFLX Version 2 implementation includes an algorithm that propagates flux from an observation location to other regions of the magnetosphere based on convective E×B and ∇B-curvature particle drift motions. This technique has the advantage of more completely filling out the database and makes maximum use of the limited data obtained during high-Kp periods or in areas of the magnetosphere with poor satellite flux measurement coverage.
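
    For reference, the convective drift named above follows the standard guiding-centre formula v = (E x B)/|B|^2; the sketch below evaluates it for arbitrary test fields and is not drawn from the CRMFLX implementation.

    ```python
    import numpy as np

    def exb_drift(E, B):
        """Convective E x B drift velocity, v = (E x B) / |B|^2, in m/s."""
        return np.cross(E, B) / np.dot(B, B)

    E = np.array([0.0, 1e-3, 0.0])    # electric field, V/m (test value)
    B = np.array([0.0, 0.0, 100e-9])  # magnetic field, T (100 nT, test value)
    print(exb_drift(E, B))            # ~1e4 m/s in +x for these fields
    ```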

  7. Learning Environment, Learning Process, Academic Outcomes and Career Success of University Graduates

    ERIC Educational Resources Information Center

    Vermeulen, Lyanda; Schmidt, Henk G.

    2008-01-01

    This study expands on literature covering models on educational productivity, student integration and effectiveness of instruction. An expansion of the literature concerning the impact of higher education on workplace performance is also covered. Relationships were examined between the quality of the academic learning environment, the process of…

  8. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  9. Metal Catalyzed Fusion: Nuclear Active Environment vs. Process

    NASA Astrophysics Data System (ADS)

    Chubb, Talbot

    2009-03-01

    To achieve radiationless d-d fusion and/or other LENR reactions via chemistry, some researchers focus on the environment of the interior or altered near-surface volume of bulk metal; some on the environment inside metal nanocrystals or on their surface; some on the interface between nanometal crystals and ionic crystals; and some on a momentum shock-stimulation reaction process. Experiment says there is also a spontaneous reaction process.

  10. Preface. Forest ecohydrological processes in a changing environment.

    Treesearch

    Xiaohua Wei; Ge Sun; James Vose; Kyoichi Otsuki; Zhiqiang Zhang; Keith Smettem

    2011-01-01

    The papers in this issue are a selection of the presentations made at the second International Conference on Forests and Water in a Changing Environment. This special issue, ‘Forest Ecohydrological Processes in a Changing Environment’, covers topics regarding the effects of forest, land-use and climate changes on ecohydrological processes across forest stand,…

  11. Multispectral simulation environment for modeling low-light-level sensor systems

    NASA Astrophysics Data System (ADS)

    Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.

    1998-11-01

    Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model, which is a first principles based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms. This paper discusses specific aspects of the DIRSIG radiance prediction for low…
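
    A toy two-stage version of such a sensor chain, assuming a Gaussian optics MTF applied in the frequency domain followed by Poisson shot noise; the stage structure, parameters, and stand-in radiance field are illustrative, not the configurable multi-stage model described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def apply_gaussian_mtf(img, sigma_freq=0.15):
        """Blur by multiplying the image spectrum with a Gaussian MTF."""
        fy = np.fft.fftfreq(img.shape[0])[:, None]
        fx = np.fft.fftfreq(img.shape[1])[None, :]
        mtf = np.exp(-(fx**2 + fy**2) / (2 * sigma_freq**2))
        return np.real(np.fft.ifft2(np.fft.fft2(img) * mtf))

    def add_shot_noise(signal_photons):
        """Photon (shot) noise dominates low-light-level imagery."""
        return rng.poisson(np.clip(signal_photons, 0, None)).astype(float)

    radiance = rng.uniform(0, 5, size=(64, 64))  # stand-in for a radiance field
    image = add_shot_noise(apply_gaussian_mtf(radiance) * 20.0)  # gain: 20 photons/unit
    ```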

  12. Modeling the cometary environment using a fluid approach

    NASA Astrophysics Data System (ADS)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and in-situ measurements by a handful of space missions reveal that cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include the important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, the neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environments, they are computationally much more efficient than the alternatives. In simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved results comparable to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool for studying the dynamics of the cometary coma with different production rates and heliocentric distances. The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependence on production rate…

  13. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several models available should be selected as the standard. Most of these discussions at the ISO WG4 meetings and conferences have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a 'process-based' standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which could not only have quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have to be not only clearly and unambiguously defined but also subject to peer review. If a model meets all of these requirements, then it should be acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences, but only some of them; most importantly, however, it allows something which has so far been impossible without ambiguities and disagreement: a comparison of the results of the various models. To date, one of the problems (if not the major one) in comparing the results of the various different SEPE…

  14. Condensation Processes in Astrophysical Environments

    NASA Technical Reports Server (NTRS)

    Nuth, Joseph A., III; Rietmeijer, Frans J. M.; Hill, Hugh G. M.

    2002-01-01

    Astrophysical systems present an intriguing set of challenges for laboratory chemists. Chemistry occurs in regions considered an excellent vacuum by laboratory standards and at temperatures that would vaporize laboratory equipment. Outflows around Asymptotic Giant Branch (AGB) stars have timescales ranging from seconds to weeks, depending on the distance of the region of interest from the star and on the way significant changes in the state variables are defined. The atmospheres of normal stars may only change significantly on several-billion-year timescales. Most laboratory experiments carried out to understand astrophysical processes are not done at conditions that perfectly match the natural suite of state variables or timescales appropriate for natural conditions. Experimenters must make use of simple analog experiments that place limits on the behavior of natural systems, often extrapolating to lower-pressure and/or higher-temperature environments. Nevertheless, we argue that well-conceived experiments will often provide insights into astrophysical processes that are impossible to obtain through models or observations. This is especially true for complex chemical phenomena such as the formation and metamorphism of refractory grains under a range of astrophysical conditions. Data obtained in our laboratory have been surprising in numerous ways, ranging from the composition of the condensates to the thermal evolution of their spectral properties. None of this information could have been predicted from first principles and would not have been credible even if it had been.

  15. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange of and access to EHR data. We propose an adapted method and tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running user interfaces, and data exchange and synchronization tools. PMID:16779049

  16. Using of Group-Modeling in Predesign Phase of New Healthcare Environments: Stakeholders Experiences.

    PubMed

    Elf, Marie; Eldh, Ann Catrine; Malmqvist, Inga; Öhrn, Kerstin; von Koch, Lena

    2016-01-01

    Current research shows a relationship between healthcare architecture and patient-related outcomes. The planning and designing of new healthcare environments is a complex process. The needs of the various end users of the environment must be considered, including the patients, the patients' significant others, and the staff. The aim of this study was to explore the experiences of healthcare professionals participating in group modeling utilizing system dynamics in the predesign phase of new healthcare environments. We engaged healthcare professionals in a series of workshops using system dynamics to discuss the planning of healthcare environments at the beginning of a construction project and then interviewed them about their experience. An explorative and qualitative design was used to describe participants' experiences of participating in the group-modeling projects. Participants (N = 20) were recruited from a larger intervention study using group modeling and system dynamics in planning and designing projects. The interviews were analyzed by qualitative content analysis. Two themes were formed, representing the experiences in the group-modeling process: "Participation in the group modeling generated knowledge and was empowering" and "Participation in the group modeling differed from what was expected and required the dedication of time and skills." The method can support participants in design teams to focus more on their healthcare organization, their care activities, and their aims rather than focusing on detailed layout solutions. This clarification is important when decisions about the design are discussed and prepared and will most likely lead to greater readiness for the future building process. © The Author(s) 2015.

  17. Glass processing in a microgravity environment

    NASA Technical Reports Server (NTRS)

    Uhlmann, D. R.

    1982-01-01

    The basic techniques used in the processing of glasses and crystalline ceramics under terrestrial conditions are briefly reviewed, and the features of the space environment relevant to the processing of glasses are examined. These include reduced gravitational forces, a vacuum of essentially unlimited pumping capacity, unique radiation conditions, and the unlimited dimensions of space. Of these factors, particular attention is given to reduced gravitational forces, and the advantages of containerless processing are discussed. Finally, current programs concerned with glass processing in space are reviewed along with additional areas which merit investigation.

  18. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  19. Research environments that promote integrity.

    PubMed

    Jeffers, Brenda Recchia; Whittemore, Robin

    2005-01-01

    The body of empirical knowledge about research integrity and the factors that promote research integrity in nursing research environments remains small. To propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity, an internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model process components need further exploration to substantiate the use of the model in nursing research environments.

  20. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  1. Multiscale Modeling of Diffusion in a Crowded Environment.

    PubMed

    Meinecke, Lina

    2017-11-01

    We present a multiscale approach to model diffusion in a crowded environment and its effect on reaction rates. Diffusion in biological systems is often modeled by a discrete space jump process in order to capture the inherent noise of biological systems, which becomes important in the low copy number regime. To model diffusion in the crowded cell environment efficiently, we compute the jump rates in this mesoscopic model from local first exit times, which account for the microscopic positions of the crowding molecules, while the diffusing molecules jump on a coarser Cartesian grid. We then extract a macroscopic description from the resulting jump rates, where the excluded volume effect is modeled by a diffusion equation with a space-dependent diffusion coefficient. The crowding molecules can be of arbitrary shape and size, and numerical experiments demonstrate that those factors, together with the size of the diffusing molecule, play a crucial role in the magnitude of the decrease in diffusive motion. When correcting the reaction rates for the altered diffusion, we can show that molecular crowding either enhances or inhibits chemical reactions depending on local fluctuations of the obstacle density.
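
    The qualitative effect can be reproduced with a much cruder experiment than the paper's first-exit-time machinery: a lattice random walk in which jumps into obstacle-occupied sites are rejected. All parameters below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    L, steps, walkers, phi = 64, 1500, 150, 0.3   # grid, time steps, walks, obstacle fraction
    obstacles = rng.random((L, L)) < phi
    obstacles[0, 0] = False                        # keep the start site open
    moves = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])

    def msd(blocked):
        """Mean squared displacement; jumps into blocked sites are rejected."""
        total = 0.0
        for _ in range(walkers):
            pos = np.zeros(2, dtype=int)           # unwrapped displacement
            for _ in range(steps):
                trial = pos + moves[rng.integers(4)]
                if not blocked[trial[0] % L, trial[1] % L]:   # obstacles tile periodically
                    pos = trial
            total += float(pos @ pos)
        return total / walkers

    # Excluded volume lowers the effective diffusion coefficient D_eff ~ MSD/(4t)
    print("D_eff / D_free ~", msd(obstacles) / msd(np.zeros((L, L), bool)))
    ```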

  2. Modeling Standards of Care for an Online Environment

    PubMed Central

    Jones-Schenk, Jan; Rossi, Julia

    1998-01-01

    At Intermountain Health Care in Salt Lake City, a team was created to develop core standards for clinical practice that would enhance consistency of care across the care continuum. The newly developed Standards of Care had to meet the following criteria: be deliverable electronically, be research-based, and support an interdisciplinary care environment along with an exception-based documentation system. The process has slowly evolved, and the team has grown to include clinicians from multiple sites and disciplines who have met on a regular basis for over a year. The first challenge was to develop a model for the standards of care that would be suitable for an online environment.

  3. Automated Environment Generation for Software Model Checking

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.

    2003-01-01

    A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.

  4. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level and methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into the chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure, which is itself the output of the model.
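
    A toy carbohydrate-pool model with the same process skeleton (photosynthesis, translocation, respiration, growth) fits in a few lines; the rates, diurnal PAR cycle, and even stem/root split are invented for illustration and are not the phytotron model's parameters.

    ```python
    # Explicit-Euler sketch of a three-pool carbohydrate/dry-matter model.
    dt, days = 0.1, 30.0
    pools = {"leaf": 1.0, "stem": 0.5, "root": 0.5}   # g carbohydrate
    mass  = {"leaf": 2.0, "stem": 1.0, "root": 1.0}   # g dry matter
    k_photo, k_transloc, k_resp, k_growth = 0.8, 0.2, 0.05, 0.1

    t = 0.0
    while t < days:
        par = max(0.0, 1.0 - abs((t % 1.0) - 0.5) * 2)      # crude diurnal cycle
        pools["leaf"] += k_photo * par * mass["leaf"] * dt  # photosynthesis
        flow = k_transloc * pools["leaf"] * dt              # translocation
        pools["leaf"] -= flow
        pools["stem"] += 0.5 * flow
        pools["root"] += 0.5 * flow
        for organ in pools:
            growth = k_growth * pools[organ] * dt           # carbohydrate -> dry matter
            resp   = k_resp   * pools[organ] * dt           # maintenance respiration
            pools[organ] -= growth + resp
            mass[organ]  += growth
        t += dt

    print({organ: round(m, 2) for organ, m in mass.items()})
    ```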

  5. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

    The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise, that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  6. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    The Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted, aimed at shielding original execution differences so as to create services that can be reused in the web environment. Although some model service standards (such as the Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service onto another computer still encounter problems (e.g., they cannot access the model deployment dependency information). This study presents a strategy for encapsulating geo-analysis models that reduces the problems encountered when sharing models between model providers and model users, and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and…
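
    A minimal sketch of the encapsulation idea: a stand-in model exposed behind an HTTP "execute" endpoint in the spirit of a WPS Execute request. The endpoint path, payload fields, and the model itself are invented for this illustration.

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def run_model(params):
        """Stand-in geo-analysis model: derive 'runoff' from 'rainfall'."""
        return {"runoff": 0.4 * params.get("rainfall", 0.0)}

    class ModelService(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path != "/execute":
                self.send_error(404)
                return
            length = int(self.headers.get("Content-Length", 0))
            params = json.loads(self.rfile.read(length) or b"{}")
            body = json.dumps(run_model(params)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ModelService).serve_forever()
    ```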

  7. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  8. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment being developed since 1996 and running at LASMEA Laboratory, the Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Throughout the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we are presenting the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities for the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is a 3D face-tracking algorithm from appearance.

  9. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a "shared nothing" distributed computing architecture and assumes the use of a computing network where each computing node is independent and self-sufficient. Each node hosts dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed in accordance with the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
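
    A stripped-down version of the metadata-database idea, assuming nothing more than a directory tree of netCDF files indexed into SQLite; a real catalogue would also record the space-time coverage and software run options described above.

    ```python
    import os
    import sqlite3

    def build_index(root, db_path="catalog.db"):
        """Index every *.nc file under root with its host node and size."""
        con = sqlite3.connect(db_path)
        con.execute("""CREATE TABLE IF NOT EXISTS datasets
                       (path TEXT PRIMARY KEY, node TEXT, size_bytes INTEGER)""")
        node = os.uname().nodename if hasattr(os, "uname") else "localhost"
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.endswith(".nc"):
                    full = os.path.join(dirpath, name)
                    con.execute("INSERT OR REPLACE INTO datasets VALUES (?, ?, ?)",
                                (full, node, os.path.getsize(full)))
        con.commit()
        return con

    con = build_index(".")
    print(con.execute("SELECT COUNT(*) FROM datasets").fetchone()[0], "files indexed")
    ```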

  10. Modelling of processes occurring in deep geological repository - Development of new modules in the GoldSim environment

    NASA Astrophysics Data System (ADS)

    Vopálka, D.; Lukin, D.; Vokál, A.

    2006-01-01

    Three new modules modelling the processes that occur in a deep geological repository have been prepared in the GoldSim computer code environment (using its Transport Module). These modules help in understanding the role of selected parameters in the near-field region of the final repository and in preparing one's own complex model of repository behaviour. The source term module includes radioactive decay and ingrowth in the canister, first-order degradation of the fuel matrix, solubility limitation of the concentration of the studied nuclides, and diffusive migration through the surrounding bentonite layer controlled by an output boundary condition formulated with respect to the rate of water flow in the rock. The corrosion module describes corrosion of canisters made of carbon steel and the transport of corrosion products in the near-field region. This module computes balance equations between dissolving species and species transported by diffusion and/or advection from the surface of a solid material. The diffusion module, which also includes a non-linear form of the interaction isotherm, can be used for the evaluation of small-scale diffusion experiments.
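
    A toy balance for the source term module, combining the named processes (decay, first-order matrix degradation, and a solubility cap on dissolved concentration); the nuclide parameters and the explicit Euler stepping are illustrative assumptions, not GoldSim's solver.

    ```python
    import math

    half_life = 2.1e5 * 365.25 * 86400    # s, hypothetical nuclide
    lam = math.log(2) / half_life         # decay constant, 1/s
    k_deg = 1e-11                         # fuel-matrix degradation rate, 1/s
    c_sol, V = 1e-6, 1.0                  # solubility limit (mol/m^3), water volume (m^3)

    n_matrix, n_water = 10.0, 0.0         # mol in the fuel matrix / in water
    dt, t_end = 3.15e7, 3.15e10           # one-year steps over 1000 years

    t = 0.0
    while t < t_end:
        released = k_deg * n_matrix * dt            # congruent release from the matrix
        n_matrix -= released + lam * n_matrix * dt  # degradation + decay
        n_water  += released - lam * n_water * dt
        n_water   = min(n_water, c_sol * V)         # precipitate the excess
        t += dt

    print(f"dissolved inventory after 1000 y: {n_water:.3e} mol")
    ```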

  11. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

    In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods for space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and in automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is considerable. Most such standards are related to the production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to the modeling of processes occurring in nanostructured materials under space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate methods (or a group of methods) for computer modeling of the various processes that can occur in nanostructured materials under the influence of different space environment components. In addition, the TS includes the description of possible…

  12. Building an environment model using depth information

    NASA Technical Reports Server (NTRS)

    Roth-Tabak, Y.; Jain, Ramesh

    1989-01-01

    Modeling the environment is one of the most crucial issues for the development and research of autonomous robots and tele-perception. Though the physical robot operates (navigates and performs various tasks) in the real world, any type of reasoning, such as situation assessment, planning or reasoning about action, is performed based on information in its internal world. Hence, the robot's intentional actions are inherently constrained by the models it has. These models may serve as interfaces between sensing modules and reasoning modules, or, in the case of telerobots, as interfaces between the human operator and the distant robot. A robot operating in a known restricted environment may have a priori knowledge of its whole possible work domain, which will be assimilated in its World Model. As the information in the World Model is relatively fixed, an Environment Model must be introduced to cope with changes in the environment and to allow the exploration of entirely new domains. Introduced here is an algorithm that uses dense range data collected at various positions in the environment to refine, update, or generate a 3-D volumetric model of the environment. The model, which is intended for autonomous robot navigation and tele-perception, consists of cubic voxels with the possible attributes: Void, Full, and Unknown. Experimental results from simulations of range data in synthetic environments are given. The quality of the results shows great promise for dealing with noisy input data. Performance measures for the algorithm are defined, and quantitative results for noisy data and positional uncertainty are presented.
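
    The Void/Full/Unknown bookkeeping reduces to a few lines: each range reading clears the voxels its ray traverses and marks the terminating voxel occupied. The 2-D grid and axis-aligned rays below are simplifications of the paper's 3-D volumetric model; a real ray needs a 3-D grid traversal.

    ```python
    import numpy as np

    UNKNOWN, VOID, FULL = 0, 1, 2
    grid = np.full((20, 20), UNKNOWN, dtype=np.uint8)

    def integrate_range(grid, sensor_xy, direction, range_cells):
        """Update the grid from one range reading along a unit axis direction."""
        x, y = sensor_xy
        dx, dy = direction
        for step in range(1, range_cells):
            grid[x + dx * step, y + dy * step] = VOID            # free space along the ray
        grid[x + dx * range_cells, y + dy * range_cells] = FULL  # surface hit

    integrate_range(grid, (10, 10), (1, 0), 6)   # reading of 6 cells toward +x
    integrate_range(grid, (10, 10), (0, 1), 3)
    print((grid == VOID).sum(), "void,", (grid == FULL).sum(), "full voxels")
    ```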

  13. Modeling the Space Radiation Environment

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2006-01-01

    There has been a renaissance of interest in space radiation environment modeling. This has been fueled by the growing need to replace the long-standing AP-8 and AE-8 trapped particle models, the interplanetary exploration initiative, modern satellite instrumentation that has led to unprecedented measurement accuracy, and the pervasive use of Commercial Off-The-Shelf (COTS) microelectronics that require more accurate predictive capabilities. The objective of this viewgraph presentation was to provide a basic understanding of the components of the space radiation environment and their variations, review traditional radiation effects application models, and present recent developments.

  14. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  15. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  16. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  17. Students' Mental Models of the Environment

    ERIC Educational Resources Information Center

    Shepardson, Daniel P.; Wee, Bryan; Priddy, Michelle; Harbor, Jon

    2007-01-01

    What are students' mental models of the environment? In what ways, if any, do students' mental models vary by grade level or community setting? These two questions guided the research reported in this article. The Environments Task was administered to students from 25 different teacher-classrooms. The student responses were first inductively…

  18. An integrated model of social environment and social context for pediatric rehabilitation.

    PubMed

    Batorowicz, Beata; King, Gillian; Mishra, Lipi; Missiuna, Cheryl

    2016-01-01

    This article considers the conceptualization and operationalization of "social environment" and "social context" with implications for research and practice with children and youth with impairments. We first discuss social environment and social context as constructs important for understanding the interaction between external environmental qualities and the individual's experience. The article considers existing conceptualizations within the psychological and sociological bodies of literature, research using these concepts, current developmental theories, and issues in the understanding of environment and participation within rehabilitation science. We then describe a model that integrates a person-focused perspective with an environment-focused perspective and that outlines the mechanisms through which children/youth and the social environment interact and transact. Finally, we consider the implications of the proposed model for research and clinical practice. This conceptual model directs researchers and practitioners toward interventions that will address the mechanisms of child-environment interaction and that will build capacity within both children and their social environments, including families, peer groups and communities. Health is created and lived by people within the settings of their everyday life, where they learn, work, play, and love [p. 2]. Understanding how social environment and personal factors interact over time to affect the development of children/youth can influence the design of services for children and youth with impairments. The model described integrates the individual-focused and environment-focused perspectives and outlines the mechanisms of the ongoing reciprocal interaction between children/youth and their social environments: provision of opportunities, resources and supports, and contextual processes of choice, active engagement and collaboration. Addressing these mechanisms could contribute to creating healthier environments in which all…

  19. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  20. Development of a Heterogenic Distributed Environment for Spatial Data Processing Using Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.

    2016-06-01

    We are developing a unified distributed communication environment for the processing of spatial data which integrates web, desktop and mobile platforms and combines a volunteer computing model with public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to the required data volume and computing power, while keeping infrastructure costs at a minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. The recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated using this innovative software environment. The new system will provide spatial data processing, analysis and 3D visualization and will be tested on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach, it will be possible to organize research and the representation of results at a new technological level, which provides more possibilities for the immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams and will provide access to existing spatially distributed information, for which we suggest the implementation of a user interface as an advanced front-end, e.g., for a virtual globe system.

  1. Measurement and modeling of moist processes

    NASA Technical Reports Server (NTRS)

    Cotton, William; Starr, David; Mitchell, Kenneth; Fleming, Rex; Koch, Steve; Smith, Steve; Mailhot, Jocelyn; Perkey, Don; Tripoli, Greg

    1993-01-01

    The keynote talk summarized five years of work simulating observed mesoscale convective systems with the RAMS (Regional Atmospheric Modeling System) model. Excellent results are obtained when simulating squall lines or other convective systems that are strongly forced by fronts or other lifting mechanisms. Less strongly forced systems are difficult to model. The next topic in this colloquium was the measurement of water vapor and other constituents of the hydrologic cycle. Impressive accuracy was shown in measuring water vapor with both the airborne DIAL (Differential Absorption Lidar) system and the ground-based Raman Lidar. NMC's plans for initializing land water hydrology in mesoscale models were presented before water vapor measurement concepts for GCIP were discussed. The subject of using satellite data to provide mesoscale moisture and wind analyses came next. Recent activities in the modeling of moist processes in mesoscale systems were then reported on. These modeling activities at the Canadian Atmospheric Environment Service (AES) used a hydrostatic, variable-resolution grid model. Next, the spatial resolution effects on moisture budgets were discussed; in particular, the effects of temporal resolution on heat and moisture budgets for cumulus parameterization. The colloquium concluded with modeling of scale interaction processes.

  2. An Overview of NASA's Orbital Debris Environment Model

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2010-01-01

    Using updated measurement data, analysis tools, and modeling techniques, the NASA Orbital Debris Program Office has created a new Orbital Debris Environment Model. This model extends the coverage of orbital debris flux throughout the Earth orbit environment and includes information on the mass density of the debris as well as the uncertainties in the model environment. This paper will give an overview of this model and its implications for spacecraft risk analysis.
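
    A typical downstream use of such a flux output, assuming Poisson impact statistics: the expected number of impacts on area A over time t is N = F*A*t, and the probability of at least one impact is 1 - exp(-N). The flux value below is a placeholder, not an output of the NASA model.

    ```python
    import math

    F = 1e-5   # impacts per m^2 per year above some size threshold (placeholder)
    A = 12.0   # spacecraft cross-sectional area, m^2
    t = 7.0    # mission duration, years

    N = F * A * t                  # expected number of impacts
    p_hit = 1.0 - math.exp(-N)     # Poisson probability of at least one impact
    print(f"expected impacts: {N:.2e}, P(>=1) = {p_hit:.2e}")
    ```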

  3. Exploring Undergraduate Students' Mental Models of the Environment: Are They Related to Environmental Affect and Behavior?

    ERIC Educational Resources Information Center

    Liu, Shu-Chiu; Lin, Huann-shyang

    2015-01-01

    A draw-and-explain task and questionnaire were used to explore Taiwanese undergraduate students' mental models of the environment and whether and how they relate to their environmental affect and behavioral commitment. We found that students generally held incomplete mental models of the environment, focusing on objects rather than on processes or…

  4. Commentary on the shifting processes model: a conceptual model for weight management.

    PubMed

    Pagoto, Sherry; Rodrigues, Stephanie

    2013-12-01

    Macchi and colleagues propose a theoretical model that merges concepts from the biopsychosocial model and family systems theory to produce a broader framework for understanding weight loss and maintenance (see record 2013-28564-001). The Shifting Processes Model views individual weight loss and maintenance in the context of family dynamics, including family eating and exercise habits, home environment, and family relationships. The authors reason that traditional models put the burden of change on the individual rather than the family system, when the latter is an important context of individual behavior.

  5. Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Simpson, J.; Baker, D.; Braun, S.; Chou, M.-D.; Ferrier, B.; Johnson, D.; Khain, A.; Lang, S.; Lynn, B.

    2001-01-01

    The response of cloud systems to their environment is an important link in a chain of processes responsible for monsoons, frontal depressions, El Niño Southern Oscillation (ENSO) episodes and other climate variations (e.g., 30-60 day intra-seasonal oscillations). Numerical models of cloud properties provide essential insights into the interactions of clouds with each other, with their surroundings, and with land and ocean surfaces. Significant advances are currently being made in the modeling of rainfall and rain-related cloud processes, ranging in scale from the very small up to the simulation of an extensive population of raining cumulus clouds in a tropical- or midlatitude-storm environment. The Goddard Cumulus Ensemble (GCE) model is a multi-dimensional nonhydrostatic dynamic/microphysical cloud-resolving model. It has been used to simulate many different mesoscale convective systems in various geographic locations. In this paper, recent GCE model improvements (microphysics, radiation and surface processes) will be described, as well as their impact on the development of precipitation events in various geographic locations. The performance of these new physical processes will be examined by comparing the model results with observations. In addition, the explicit interactive processes between cloud, radiation and surface processes will be discussed.

  6. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high-resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology become more sophisticated, it is becoming increasingly hard for non-computational scientists to use the systems that incorporate such models effectively. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
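
    The start/stop/pause/roll-back steering can be illustrated on a deliberately simple compartment model; the SIR dynamics, checkpoint scheme, and intervention below are stand-ins for the environment's far richer individual-based machinery.

    ```python
    import copy

    state = {"S": 9990.0, "I": 10.0, "R": 0.0, "beta": 0.3, "gamma": 0.1}

    def step(s):
        """One day of SIR dynamics."""
        n = s["S"] + s["I"] + s["R"]
        new_inf = s["beta"] * s["S"] * s["I"] / n
        new_rec = s["gamma"] * s["I"]
        s["S"] -= new_inf
        s["I"] += new_inf - new_rec
        s["R"] += new_rec

    snapshots = []
    for day in range(60):
        if day % 10 == 0:
            snapshots.append((day, copy.deepcopy(state)))  # checkpoint for roll-back
        if day == 30:
            state["beta"] *= 0.5   # analyst pauses and applies an intervention
        step(state)

    day, state = snapshots[3]      # roll back to day 30 and steer differently
    print("rolled back to day", day, "with", round(state["I"]), "infectious")
    ```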

  7. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high-resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology become more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  8. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    NASA Astrophysics Data System (ADS)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To support this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit against exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities, including the time taken to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.
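
    As a language-neutral illustration of the kind of mechanism students assemble in such toolkits — variation, inheritance and selection acting on a population of agents — here is a minimal Python sketch (EvoBuild itself builds NetLogo models; the trait, survival rule and mutation parameters below are invented for illustration):

    ```python
    import random

    random.seed(1)

    # Each agent is reduced to one heritable trait: camouflage quality in [0, 1].
    population = [random.random() for _ in range(200)]

    def survives(trait):
        """Selection: better-camouflaged agents are predated less often."""
        return random.random() < 0.2 + 0.6 * trait

    def offspring(trait):
        """Inheritance with variation: child trait = parent trait + mutation."""
        return min(1.0, max(0.0, trait + random.gauss(0.0, 0.05)))

    for generation in range(50):
        survivors = [t for t in population if survives(t)]
        # Survivors reproduce until the population is back to size 200.
        population = [offspring(random.choice(survivors)) for _ in range(200)]

    print(f"mean trait after selection: {sum(population) / len(population):.2f}")
    ```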

  9. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

The design of enterprise information systems requires students to master technical skills for eliciting, modeling, and reengineering business processes, as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students; rather, they must be experienced and learned by students. This…

  10. The national operational environment model (NOEM)

    NASA Astrophysics Data System (ADS)

    Salerno, John J.; Romano, Brian; Geiler, Warren

    2011-06-01

The National Operational Environment Model (NOEM) is a strategic analysis/assessment tool that provides insight into the complex state space (as a system) that is today's modern operational environment. The NOEM supports baseline forecasts by generating plausible futures based on the current state. It supports what-if analysis by forecasting ramifications of potential "Blue" actions on the environment. The NOEM also supports sensitivity analysis by identifying, in support of the Commander, possible pressure (leverage) points that could resolve forecasted instabilities, and by ranking sensitivities for each leverage point and response. The NOEM can be used to assist decision makers, analysts and researchers in understanding the inner workings of a region or nation state and the consequences of implementing specific policies, and it offers the ability to plug in new operational environment theories/models as they mature. The NOEM is built upon an open-source, license-free set of capabilities, and aims to provide support for pluggable modules that make up a given model. The NOEM currently has an extensive number of modules (e.g., economic, security and social well-being pieces such as critical infrastructure) completed, along with a number of tools to exercise them. The focus this year is on modeling the social and behavioral aspects of a populace within their environment, primarily the formation of various interest groups, their beliefs, their requirements, their grievances, their affinities, and the likelihood of a wide range of their actions, depending on their perceived level of security and happiness. As such, several research efforts are currently underway to model human behavior from a group perspective, in the pursuit of eventual integration and balance of populace needs/demands within their respective operational environment and the capacity to meet those demands. In this paper we will provide an overview of the NOEM, the need for and a description of its main components

  11. Processes controlling the physico-chemical micro-environments associated with Pompeii worms

    NASA Astrophysics Data System (ADS)

    Le Bris, N.; Zbinden, M.; Gaill, F.

    2005-06-01

Alvinella pompejana is a tube-dwelling polychaete colonizing hydrothermal smokers of the East Pacific Rise. Extreme temperature, low pH and millimolar sulfide levels have been reported in its immediate surroundings. The conditions experienced by this organism and its associated microbes are, however, poorly known and the processes controlling the physico-chemical gradients in this environment remain to be elucidated. Using miniature in situ sensors coupled with close-up video imagery, we have characterized fine-scale pH and temperature profiles in the biogeoassemblage constituting A. pompejana colonies. Steep discontinuities at both the individual and the colony scale were highlighted, indicating a partitioning of the vent fluid-seawater interface into chemically and thermally distinct micro-environments. The comparison of geochemical models with these data furthermore reveals that temperature is not a relevant tracer of the fluid dilution at these scales. The inner-tube micro-environment is expected to be supplied from the seawater-dominated medium overlying tube openings and to undergo subsequent conductive heating through the tube walls. Its neutral pH is likely to be associated with moderately oxidative conditions. Such a model provides an explanation of the atypical thermal and chemical patterns that were previously reported for this medium from discrete samples and in situ measurements. Conversely, the medium surrounding the tubes is shown to be dominated by the fluid venting from the chimney wall. This hot fluid appears to be gradually cooled (120-30 °C) as it passes through the thickness of the worm colony, as a result of a thermal exchange mechanism induced by the tube assemblage. Its pH, however, remains very low (pH ~4), and reducing conditions can be expected in this medium. Such a thermal and chemical buffering mechanism is consistent with the mineralogical anomalies previously highlighted and provides a first explanation of the exceptional ability of

  12. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

Chu, Shao-Sheng R.; Allen, Christopher S.

    2010-01-01

Acoustic modeling can be used to identify key noise sources, determine/analyze sub-allocated requirements, keep track of the accumulation of minor noise sources, and to predict vehicle noise levels at various stages in vehicle development, first with estimates of noise sources, later with experimental data. This paper describes the implementation of acoustic modeling for design purposes by incrementally increasing model fidelity and validating the accuracy of the model while predicting the noise of sources under various conditions. During FY 07, a simple-geometry Statistical Energy Analysis (SEA) model was developed and validated using a physical mockup and acoustic measurements. A process for modeling the effects of absorptive wall treatments and the resulting reverberation environment was developed. During FY 08, a model with more complex and representative geometry of the Orion Crew Module (CM) interior was built, and noise predictions based on input noise sources were made. A corresponding physical mockup was also built. Measurements were made inside this mockup, and comparisons were made with the model and showed excellent agreement. During FY 09, the fidelity of the mockup and corresponding model were increased incrementally by including a simple ventilation system. The airborne noise contribution of the fans was measured using a sound intensity technique, since the sound power levels were not known beforehand. This is in contrast to earlier studies, where Reference Sound Sources (RSS) with known sound power levels were used. Comparisons of the modeling results with the measurements in the mockup again showed excellent agreement. During FY 10, the fidelity of the mockup and the model were further increased by including an ECLSS (Environmental Control and Life Support System) wall, associated closeout panels, and the gap between the ECLSS wall and the mockup wall. The effects of sealing the gap and adding sound-absorptive treatment to the ECLSS wall were also modeled and validated.
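
    The effect of absorptive wall treatment on the reverberant field can be illustrated with the classic diffuse-field relation L_p = L_W + 10*log10(4/R), where R = S*a/(1 - a) is the room constant for total surface area S and mean absorption coefficient a. This textbook formula is an illustrative stand-in, not the SEA model used in the paper:

    ```python
    import math

    def reverberant_spl(sound_power_level_db, surface_area_m2, mean_absorption):
        """Diffuse-field sound pressure level from a source of known power:
        L_p = L_W + 10*log10(4/R),  R = S*a/(1 - a)  (room constant, m^2)."""
        room_constant = surface_area_m2 * mean_absorption / (1.0 - mean_absorption)
        return sound_power_level_db + 10.0 * math.log10(4.0 / room_constant)

    # Hypothetical fan of L_W = 70 dB in a 60 m^2 mockup:
    # bare walls vs. absorptive treatment.
    for alpha in (0.05, 0.30):
        print(f"mean absorption {alpha:.2f}: "
              f"{reverberant_spl(70.0, 60.0, alpha):.1f} dB reverberant SPL")
    ```

    Raising the mean absorption from 0.05 to 0.30 lowers the reverberant level by roughly 9 dB in this toy case, which is the qualitative effect wall treatments are modeled to have.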

  13. A stochastic vision-based model inspired by zebrafish collective behaviour in heterogeneous environments

    PubMed Central

    Collignon, Bertrand; Séguret, Axel; Halloy, José

    2016-01-01

    Collective motion is one of the most ubiquitous behaviours displayed by social organisms and has led to the development of numerous models. Recent advances in the understanding of sensory system and information processing by animals impels one to revise classical assumptions made in decisional algorithms. In this context, we present a model describing the three-dimensional visual sensory system of fish that adjust their trajectory according to their perception field. Furthermore, we introduce a stochastic process based on a probability distribution function to move in targeted directions rather than on a summation of influential vectors as is classically assumed by most models. In parallel, we present experimental results of zebrafish (alone or in group of 10) swimming in both homogeneous and heterogeneous environments. We use these experimental data to set the parameter values of our model and show that this perception-based approach can simulate the collective motion of species showing cohesive behaviour in heterogeneous environments. Finally, we discuss the advances of this multilayer model and its possible outcomes in biological, physical and robotic sciences. PMID:26909173
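
    The paper's key modeling choice — drawing a target direction from a probability distribution over candidate headings instead of summing influence vectors — can be sketched as follows (the weighting kernel and all parameter values are illustrative assumptions, not the published model):

    ```python
    import math
    import random

    def choose_heading(candidate_angles, attractiveness):
        """Draw one heading from a probability distribution over candidate
        directions, rather than summing influence vectors."""
        return random.choices(candidate_angles, weights=attractiveness, k=1)[0]

    def angular_distance(a, b):
        """Smallest angle between two headings, in degrees."""
        return min(abs(a - b), 360.0 - abs(a - b))

    # 36 candidate headings; attractiveness peaks toward a perceived neighbour
    # at 45 degrees and decays with angular distance (illustrative kernel).
    angles = list(range(0, 360, 10))
    neighbour = 45.0
    weights = [math.exp(-(angular_distance(a, neighbour) / 60.0) ** 2)
               for a in angles]
    print("sampled heading:", choose_heading(angles, weights), "degrees")
    ```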

  14. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-12-10

    Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error

  15. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side-scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction-of-arrival trajectories of multiple echoes impinging on the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness-of-fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.

  16. Neuroscientific Model of Motivational Process

    PubMed Central

    Kim, Sung-il

    2013-01-01

Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  17. Neuroscientific model of motivational process.

    PubMed

    Kim, Sung-Il

    2013-01-01

Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  18. Study on intelligent processing system of man-machine interactive garment frame model

    NASA Astrophysics Data System (ADS)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical moving parts and a centralized data acquisition device. The sensor devices collect information on environmental changes caused by a person approaching the garment frame model; the data acquisition device gathers the information produced by the sensor devices; the voice processing module performs speaker-independent speech recognition to enable human-machine interaction; and the mechanical moving parts produce the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices are connected to the data acquisition device by one-way connections, the data acquisition device and the voice processing module are connected two-way, and the data acquisition device is connected one-way to the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with a customer, realizing man-machine interaction in place of the current rigid frame model.

  19. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings showed that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
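
    For reference, the master equation invoked here is, in its standard Lindblad (GKSL) form, the following (a generic statement of the apparatus, not the specific decision-making model of the paper):

    ```latex
    \frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H, \rho]
      + \sum_k \gamma_k \left( L_k \rho L_k^{\dagger}
      - \tfrac{1}{2}\left\{ L_k^{\dagger} L_k,\ \rho \right\} \right)
    ```

    Here ρ is the density operator (in this setting, representing the mental state), H generates its closed-system Schrödinger evolution, and the operators L_k with rates γ_k describe the interaction with the environmental 'bath' that produces decoherence.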

  20. Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment.

    PubMed

    Berkes, Pietro; Orbán, Gergo; Lengyel, Máté; Fiser, József

    2011-01-07

    The brain maintains internal models of its environment to interpret sensory inputs and to prepare actions. Although behavioral studies have demonstrated that these internal models are optimally adapted to the statistics of the environment, the neural underpinning of this adaptation is unknown. Using a Bayesian model of sensory cortical processing, we related stimulus-evoked and spontaneous neural activities to inferences and prior expectations in an internal model and predicted that they should match if the model is statistically optimal. To test this prediction, we analyzed visual cortical activity of awake ferrets during development. Similarity between spontaneous and evoked activities increased with age and was specific to responses evoked by natural scenes. This demonstrates the progressive adaptation of internal models to the statistics of natural stimuli at the neural level.
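
    Analyses of this kind quantify the match between two distributions over neural activity patterns; one standard measure is the Kullback–Leibler divergence. A minimal sketch (the pattern frequencies below are invented, and the paper's exact statistic and preprocessing are not reproduced here):

    ```python
    import math

    def kl_divergence(p, q, eps=1e-12):
        """D_KL(P || Q) between two discrete activity-pattern distributions."""
        return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

    # Empirical frequencies of (binarized) multi-unit activity patterns:
    evoked      = [0.40, 0.30, 0.20, 0.10]   # driven by natural scenes
    spontaneous = [0.38, 0.31, 0.21, 0.10]   # darkness / no stimulus
    print(f"D_KL(evoked || spontaneous) = "
          f"{kl_divergence(evoked, spontaneous):.4f} nats")
    ```

    Under the optimal-internal-model hypothesis, this divergence should shrink with development, as the abstract reports for the similarity between spontaneous and evoked activity.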

  1. Coarse-grained models of key self-assembly processes in HIV-1

    NASA Astrophysics Data System (ADS)

    Grime, John

    Computational molecular simulations can elucidate microscopic information that is inaccessible to conventional experimental techniques. However, many processes occur over time and length scales that are beyond the current capabilities of atomic-resolution molecular dynamics (MD). One such process is the self-assembly of the HIV-1 viral capsid, a biological structure that is crucial to viral infectivity. The nucleation and growth of capsid structures requires the interaction of large numbers of capsid proteins within a complicated molecular environment. Coarse-grained (CG) models, where degrees of freedom are removed to produce more computationally efficient models, can in principle access large-scale phenomena such as the nucleation and growth of HIV-1 capsid lattice. We report here studies of the self-assembly behaviors of a CG model of HIV-1 capsid protein, including the influence of the local molecular environment on nucleation and growth processes. Our results suggest a multi-stage process, involving several characteristic structures, eventually producing metastable capsid lattice morphologies that are amenable to subsequent capsid dissociation in order to transmit the viral infection.

  2. Model-Based Analysis of Cell Cycle Responses to Dynamically Changing Environments

    PubMed Central

    Seaton, Daniel D; Krishnan, J

    2016-01-01

    Cell cycle progression is carefully coordinated with a cell’s intra- and extracellular environment. While some pathways have been identified that communicate information from the environment to the cell cycle, a systematic understanding of how this information is dynamically processed is lacking. We address this by performing dynamic sensitivity analysis of three mathematical models of the cell cycle in Saccharomyces cerevisiae. We demonstrate that these models make broadly consistent qualitative predictions about cell cycle progression under dynamically changing conditions. For example, it is shown that the models predict anticorrelated changes in cell size and cell cycle duration under different environments independently of the growth rate. This prediction is validated by comparison to available literature data. Other consistent patterns emerge, such as widespread nonmonotonic changes in cell size down generations in response to parameter changes. We extend our analysis by investigating glucose signalling to the cell cycle, showing that known regulation of Cln3 translation and Cln1,2 transcription by glucose is sufficient to explain the experimentally observed changes in cell cycle dynamics at different glucose concentrations. Together, these results provide a framework for understanding the complex responses the cell cycle is capable of producing in response to dynamic environments. PMID:26741131
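
    Dynamic sensitivity analysis of this sort perturbs a parameter and tracks the response of a model output. A minimal finite-difference sketch on a toy exponential-growth/threshold-division model (not one of the three published cell-cycle models; all parameters are illustrative):

    ```python
    def mean_cycle_duration(k_growth, size_div=2.0, t_end=100.0, dt=0.001):
        """Toy cell-cycle model: size grows exponentially and halves at a
        threshold; returns the mean time between divisions."""
        size, t = 1.0, 0.0
        division_times = []
        while t < t_end:
            size *= 1.0 + k_growth * dt
            if size >= size_div:
                size /= 2.0
                division_times.append(t)
            t += dt
        gaps = [b - a for a, b in zip(division_times, division_times[1:])]
        return sum(gaps) / len(gaps)

    # Finite-difference sensitivity of cycle duration to the growth rate:
    k, dk = 0.05, 0.05 * 0.01
    base, perturbed = mean_cycle_duration(k), mean_cycle_duration(k + dk)
    print(f"d(duration)/d(k_growth) ≈ {(perturbed - base) / dk:.0f}")
    # Faster growth -> shorter cycles: the anticorrelation the models predict.
    ```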

  3. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-powered switching applications in extreme environments. In particular, Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors' (MOSFETs) maturing process technology has produced a plethora of commercially available power-dense, low on-state-resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.
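
    A damage-accumulation lifetime assessment of the kind described typically combines time-at-temperature with an accelerated time-to-failure law, for example a Miner's-rule sum over an Arrhenius model. A hedged sketch (the prefactor, activation energy and mission profile below are placeholders, not the paper's data):

    ```python
    import math

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def time_to_failure_hours(temp_kelvin, a0=1e-7, ea_ev=1.1):
        """Arrhenius time-to-failure model: TTF = A0 * exp(Ea / (kB * T))."""
        return a0 * math.exp(ea_ev / (K_B * temp_kelvin))

    def accumulated_damage(mission_profile):
        """Miner's rule: damage = sum(time_i / TTF(T_i)); failure at damage >= 1."""
        return sum(hours / time_to_failure_hours(temp_c + 273.15)
                   for temp_c, hours in mission_profile)

    # (temperature in deg C, hours spent there) over one mission phase:
    profile = [(25.0, 2000.0), (125.0, 500.0), (200.0, 100.0)]
    print(f"accumulated damage fraction: {accumulated_damage(profile):.4f}")
    ```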

  4. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in a resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then, based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed to model the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Thus, a unified resource representation and RESTful service descriptions are proposed for more effective cross-system integration. A case study is given to illustrate the approach and the desirable features of the approach are discussed.
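
    The Petri-net execution semantics underlying XML-net reduce to a simple rule: a transition is enabled when every input place holds enough tokens, and firing consumes input tokens and produces output tokens. A minimal place/transition sketch for one supply-chain step (place and transition names are illustrative, and XML-net's typed tokens and XML documents are not modeled):

    ```python
    # Marking: tokens per place. Transitions: (inputs, outputs) as token counts.
    marking = {"order_received": 1, "stock_available": 1,
               "order_shipped": 0, "invoice_sent": 0}

    transitions = {
        "ship_order": ({"order_received": 1, "stock_available": 1},
                       {"order_shipped": 1}),
        # reads and restores order_shipped (read-arc emulation):
        "bill_customer": ({"order_shipped": 1},
                          {"invoice_sent": 1, "order_shipped": 1}),
    }

    def enabled(name):
        inputs, _ = transitions[name]
        return all(marking[p] >= n for p, n in inputs.items())

    def fire(name):
        inputs, outputs = transitions[name]
        assert enabled(name), f"{name} is not enabled"
        for p, n in inputs.items():
            marking[p] -= n          # consume input tokens
        for p, n in outputs.items():
            marking[p] += n          # produce output tokens

    fire("ship_order")
    fire("bill_customer")
    print(marking)  # order shipped and invoice sent
    ```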

  5. The statistical analysis of multi-environment data: modeling genotype-by-environment interaction and its genetic basis

    PubMed Central

    Malosetti, Marcos; Ribaut, Jean-Marcel; van Eeuwijk, Fred A.

    2013-01-01

    Genotype-by-environment interaction (GEI) is an important phenomenon in plant breeding. This paper presents a series of models for describing, exploring, understanding, and predicting GEI. All models depart from a two-way table of genotype by environment means. First, a series of descriptive and explorative models/approaches are presented: Finlay–Wilkinson model, AMMI model, GGE biplot. All of these approaches have in common that they merely try to group genotypes and environments and do not use other information than the two-way table of means. Next, factorial regression is introduced as an approach to explicitly introduce genotypic and environmental covariates for describing and explaining GEI. Finally, QTL modeling is presented as a natural extension of factorial regression, where marker information is translated into genetic predictors. Tests for regression coefficients corresponding to these genetic predictors are tests for main effect QTL expression and QTL by environment interaction (QEI). QTL models for which QEI depends on environmental covariables form an interesting model class for predicting GEI for new genotypes and new environments. For realistic modeling of genotypic differences across multiple environments, sophisticated mixed models are necessary to allow for heterogeneity of genetic variances and correlations across environments. The use and interpretation of all models is illustrated by an example data set from the CIMMYT maize breeding program, containing environments differing in drought and nitrogen stress. To help readers to carry out the statistical analyses, GenStat® programs, 15th Edition and Discovery® version, are presented as “Appendix.” PMID:23487515
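
    Of the descriptive approaches listed, the Finlay–Wilkinson model is the simplest: each genotype's means are regressed on the environmental index (the environment mean over all genotypes), and the slope measures that genotype's sensitivity to environments. A minimal sketch on an invented two-way table (the paper's own worked examples use GenStat; this Python version is only illustrative):

    ```python
    import numpy as np

    # Two-way table of genotype-by-environment means (rows: genotypes).
    yields = np.array([[4.1, 5.0, 6.2],    # G1
                       [4.8, 5.1, 5.6],    # G2
                       [3.5, 5.2, 7.0]])   # G3

    env_index = yields.mean(axis=0)        # environmental index per environment

    for g, row in enumerate(yields, start=1):
        # Regress each genotype's means on the environmental index.
        slope, intercept = np.polyfit(env_index, row, deg=1)
        label = ("average sensitivity" if abs(slope - 1.0) < 0.1
                 else "sensitive" if slope > 1.0 else "stable")
        print(f"G{g}: slope = {slope:.2f} ({label})")
    ```

    Slopes near 1 indicate average responsiveness; slopes above (below) 1 flag genotypes that exaggerate (buffer) environmental differences, which is the GEI pattern this model describes.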

  6. A network-based training environment: a medical image processing paradigm.

    PubMed

    Costaridou, L; Panayiotakis, G; Sakellaropoulos, P; Cavouras, D; Dimopoulos, J

    1998-01-01

    The capability of interactive multimedia and Internet technologies is investigated with respect to the implementation of a distance learning environment. The system is built according to a client-server architecture, based on the Internet infrastructure, composed of server nodes conceptually modelled as WWW sites. Sites are implemented by customization of available components. The environment integrates network-delivered interactive multimedia courses, network-based tutoring, SIG support, information databases of professional interest, as well as course and tutoring management. This capability has been demonstrated by means of an implemented system, validated with digital image processing content, specifically image enhancement. Image enhancement methods are theoretically described and applied to mammograms. Emphasis is given to the interactive presentation of the effects of algorithm parameters on images. The system end-user access depends on available bandwidth, so high-speed access can be achieved via LAN or local ISDN connections. Network based training offers new means of improved access and sharing of learning resources and expertise, as promising supplements in training.

  7. Simple Thermal Environment Model (STEM) User's Guide

    NASA Technical Reports Server (NTRS)

    Justus, C.G.; Batts, G. W.; Anderson, B. J.; James, B. F.

    2001-01-01

This report presents a Simple Thermal Environment Model (STEM) for determining appropriate engineering design values to specify the thermal environment of Earth-orbiting satellites. The thermal environment of a satellite consists of three components: (1) direct solar radiation, (2) Earth-atmosphere reflected shortwave radiation, as characterized by Earth's albedo, and (3) Earth-atmosphere-emitted outgoing longwave radiation (OLR). This report, together with a companion "guidelines" report, provides methodology and guidelines for selecting "design points" for thermal environment parameters for satellites and spacecraft systems. The methods and models reported here are outgrowths of Earth Radiation Budget Experiment (ERBE) satellite data analysis and thermal environment specifications discussed by Anderson and Smith (1994). In large part, this report is intended to update (and supersede) those results.
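
    The three components combine into the absorbed heat load on a spacecraft surface. A minimal sketch using representative values (solar constant ≈ 1367 W/m², albedo ≈ 0.3, OLR ≈ 240 W/m²); the view factor and surface properties are illustrative assumptions, and STEM's design-point selection logic is not reproduced:

    ```python
    def absorbed_flux(absorptivity, emissivity,
                      solar=1367.0, albedo=0.3, olr=240.0,
                      view_factor_earth=0.85):
        """Total absorbed flux (W/m^2) on a sunlit, nadir-facing surface:
        direct solar + Earth-reflected shortwave + Earth-emitted longwave."""
        q_solar = absorptivity * solar
        q_albedo = absorptivity * albedo * solar * view_factor_earth
        q_olr = emissivity * olr * view_factor_earth
        return q_solar + q_albedo + q_olr

    # White paint (low alpha, high epsilon) vs. a high alpha/epsilon surface:
    print(f"white paint : {absorbed_flux(0.2, 0.9):6.1f} W/m^2")
    print(f"bare alloy  : {absorbed_flux(0.4, 0.1):6.1f} W/m^2")
    ```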

  8. Forest Canopy Processes in a Regional Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Makar, Paul; Staebler, Ralf; Akingunola, Ayodeji; Zhang, Junhua; McLinden, Chris; Kharol, Shailesh; Moran, Michael; Robichaud, Alain; Zhang, Leiming; Stroud, Craig; Pabla, Balbir; Cheung, Philip

    2016-04-01

Forest canopies have typically been absent or highly parameterized in regional chemical transport models. Some forest-related processes are often considered - for example, biogenic emissions from the forests are included as a flux lower boundary condition on vertical diffusion, as is deposition to vegetation. However, real forest canopies comprise a much more complicated set of processes, at scales below the "transport model-resolved scale" of vertical levels usually employed in regional transport models. Advective and diffusive transport within the forest canopy typically scale with the height of the canopy, and the former process tends to dominate over the latter. Emissions of biogenic hydrocarbons arise from the foliage, which may be located tens of metres above the surface, while emissions of biogenic nitric oxide from decaying plant matter are located at the surface - in contrast to the surface flux boundary condition usually employed in chemical transport models. Deposition, similarly, is usually parameterized as a flux boundary condition, but may be differentiated between fluxes to vegetation and fluxes to the surface when the canopy scale is considered. The chemical environment also changes within forest canopies: shading, temperature, and relative humidity changes with height within the canopy may influence chemical reaction rates. These processes have been observed in a host of measurement studies, and have been simulated using site-specific one-dimensional forest canopy models. Their influence on regional scale chemistry has been unknown, until now. In this work, we describe the results of the first attempt to include complex canopy processes within a regional chemical transport model (GEM-MACH). The original model core was subdivided into "canopy" and "non-canopy" subdomains. In the former, three additional near-surface layers based on spatially and seasonally varying satellite-derived canopy height and leaf area index were added to the original model

  9. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  10. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture

    PubMed Central

    Rooney, Kevin K.; Condia, Robert J.; Loschky, Lester C.

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one’s fist at arm’s length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  11. Focal and Ambient Processing of Built Environments: Intellectual and Atmospheric Experiences of Architecture.

    PubMed

    Rooney, Kevin K; Condia, Robert J; Loschky, Lester C

    2017-01-01

    Neuroscience has well established that human vision divides into the central and peripheral fields of view. Central vision extends from the point of gaze (where we are looking) out to about 5° of visual angle (the width of one's fist at arm's length), while peripheral vision is the vast remainder of the visual field. These visual fields project to the parvo and magno ganglion cells, which process distinctly different types of information from the world around us and project that information to the ventral and dorsal visual streams, respectively. Building on the dorsal/ventral stream dichotomy, we can further distinguish between focal processing of central vision, and ambient processing of peripheral vision. Thus, our visual processing of and attention to objects and scenes depends on how and where these stimuli fall on the retina. The built environment is no exception to these dependencies, specifically in terms of how focal object perception and ambient spatial perception create different types of experiences we have with built environments. We argue that these foundational mechanisms of the eye and the visual stream are limiting parameters of architectural experience. We hypothesize that people experience architecture in two basic ways based on these visual limitations; by intellectually assessing architecture consciously through focal object processing and assessing architecture in terms of atmosphere through pre-conscious ambient spatial processing. Furthermore, these separate ways of processing architectural stimuli operate in parallel throughout the visual perceptual system. Thus, a more comprehensive understanding of architecture must take into account that built environments are stimuli that are treated differently by focal and ambient vision, which enable intellectual analysis of architectural experience versus the experience of architectural atmosphere, respectively. We offer this theoretical model to help advance a more precise understanding of the

  12. Physical Conditions of Eta Car Complex Environment Revealed From Photoionization Modeling

    NASA Technical Reports Server (NTRS)

    Verner, E. M.; Bruhweiler, F.; Nielsen, K. E.; Gull, T.; Kober, G. Vieira; Corcoran, M.

    2006-01-01

The very massive star, Eta Carinae, is enshrouded in an unusual complex environment of nebulosities and structures. The circumstellar gas gives rise to distinct absorption and emission components at different velocities and distances from the central source(s). Through photoionization modeling, we find that the radiation field from the more massive B-star companion supports the low-ionization structure throughout the 5.54-year period. The radiation field of an evolved O-star is required to produce the higher-ionization emission seen across the broad maximum. Our studies utilize the HST/STIS data and model calculations of various regimes, from doubly ionized species (T = 10,000 K) to the low-temperature (T = 760 K) conditions conducive to molecule formation (CH and OH). Overall analysis suggests high depletion of C and O and enrichment of He and N. The sharp molecular and ionic absorptions in this extensively CNO-processed material offer a unique environment for studying the chemistry, dust formation processes, and nucleosynthesis in the ejected layers of a highly evolved massive star.

  13. SPARX, a new environment for Cryo-EM image processing.

    PubMed

    Hohn, Michael; Tang, Grant; Goodyear, Grant; Baldwin, P R; Huang, Zhong; Penczek, Pawel A; Yang, Chao; Glaeser, Robert M; Adams, Paul D; Ludtke, Steven J

    2007-01-01

    SPARX (single particle analysis for resolution extension) is a new image processing environment with a particular emphasis on transmission electron microscopy (TEM) structure determination. It includes a graphical user interface that provides a complete graphical programming environment with a novel data/process-flow infrastructure, an extensive library of Python scripts that perform specific TEM-related computational tasks, and a core library of fundamental C++ image processing functions. In addition, SPARX relies on the EMAN2 library and cctbx, the open-source computational crystallography library from PHENIX. The design of the system is such that future inclusion of other image processing libraries is a straightforward task. The SPARX infrastructure intelligently handles retention of intermediate values, even those inside programming structures such as loops and function calls. SPARX and all dependencies are free for academic use and available with complete source.

  14. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  15. Specification, testing, and interpretation of gene-by-measured-environment interaction models in the presence of gene-environment correlation

    PubMed Central

    Rathouz, Paul J.; Van Hulle, Carol A.; Lee Rodgers, Joseph; Waldman, Irwin D.; Lahey, Benjamin B.

    2009-01-01

Purcell (2002) proposed a bivariate biometric model for testing and quantifying the interaction between latent genetic influences and measured environments in the presence of gene-environment correlation. Purcell's model extends the Cholesky model to include gene-environment interaction. We examine a number of closely related alternative models that do not involve gene-environment interaction but which may fit the data as well as Purcell's model. Because failure to consider these alternatives could lead to spurious detection of gene-environment interaction, we propose alternative models for testing gene-environment interaction in the presence of gene-environment correlation, including one based on the correlated factors model. In addition, we note mathematical errors in the calculation of effect size via variance components in Purcell's model. We propose a statistical method for deriving and interpreting variance decompositions that are true to the fitted model. PMID:18293078

  16. Electronic materials processing and the microgravity environment

    NASA Technical Reports Server (NTRS)

    Witt, A. F.

    1988-01-01

    The nature and origin of deficiencies in bulk electronic materials for device fabrication are analyzed. It is found that gravity generated perturbations during their formation account largely for the introduction of critical chemical and crystalline defects and, moreover, are responsible for the still existing gap between theory and experiment and thus for excessive reliance on proprietary empiricism in processing technology. Exploration of the potential of reduced gravity environment for electronic materials processing is found to be not only desirable but mandatory.

  17. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed and a cross-computing environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  18. Teaching Process Writing in an Online Environment

    ERIC Educational Resources Information Center

    Carolan, Fergal; Kyppö, Anna

    2015-01-01

    This reflective practice paper offers some insights into teaching an interdisciplinary academic writing course aimed at promoting process writing. The study reflects on students' acquisition of writing skills and the teacher's support practices in a digital writing environment. It presents writers' experiences related to various stages of process…

  19. Cabin Environment Physics Risk Model

    NASA Technical Reports Server (NTRS)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  20. Virtual Research Environments for Natural Hazard Modelling

    NASA Astrophysics Data System (ADS)

    Napier, Hazel; Aldridge, Tim

    2017-04-01

    The Natural Hazards Partnership (NHP) is a group of 17 collaborating public sector organisations providing a mechanism for co-ordinated advice to government and agencies responsible for civil contingency and emergency response during natural hazard events. The NHP has set up a Hazard Impact Model (HIM) group tasked with modelling the impact of a range of UK hazards with the aim of delivery of consistent hazard and impact information. The HIM group consists of 7 partners initially concentrating on modelling the socio-economic impact of 3 key hazards - surface water flooding, land instability and high winds. HIM group partners share scientific expertise and data within their specific areas of interest including hydrological modelling, meteorology, engineering geology, GIS, data delivery, and modelling of socio-economic impacts. Activity within the NHP relies on effective collaboration between partners distributed across the UK. The NHP are acting as a use case study for a new Virtual Research Environment (VRE) being developed by the EVER-EST project (European Virtual Environment for Research - Earth Science Themes: a solution). The VRE is allowing the NHP to explore novel ways of cooperation including improved capabilities for e-collaboration, e-research, automation of processes and e-learning. Collaboration tools are complemented by the adoption of Research Objects, semantically rich aggregations of resources enabling the creation of uniquely identified digital artefacts resulting in reusable science and research. Application of the Research Object concept to HIM development facilitates collaboration, by encapsulating scientific knowledge in a shareable format that can be easily shared and used by partners working on the same model but within their areas of expertise. This paper describes the application of the VRE to the NHP use case study. It outlines the challenges associated with distributed partnership working and how they are being addressed in the VRE. A case

  1. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    ERIC Educational Resources Information Center

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  2. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, from both individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises, the emergence of interaction patterns between firms, and management environments. Agent-based models are the leading approach in this attempt.

  3. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models

    PubMed Central

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A.; Burgueño, Juan; Pérez-Rodríguez, Paulino; de los Campos, Gustavo

    2016-01-01

The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance–covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have better prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. PMID:27793970
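
    The two kernels named — linear (GBLUP) and Gaussian (GK) — can both be built from a centered marker matrix. A standard construction is sketched below on invented data; the bandwidth h and the median distance scaling are common choices assumed here, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # 8 lines x 100 markers, coded 0/1/2 (toy data):
    X = rng.integers(0, 3, size=(8, 100)).astype(float)
    X -= X.mean(axis=0)                      # center marker codes

    # Linear (GBLUP) kernel: genomic relationship matrix G = XX' / p
    G = X @ X.T / X.shape[1]

    # Gaussian kernel: K_ij = exp(-h * d_ij^2 / median(d^2)), d = Euclidean
    d2 = np.square(X[:, None, :] - X[None, :, :]).sum(axis=2)
    h = 1.0
    off_diag = d2[np.triu_indices_from(d2, k=1)]
    K = np.exp(-h * d2 / np.median(off_diag))

    print("GBLUP kernel diagonal:", np.round(np.diag(G), 2))
    print("GK off-diagonal range:",
          K[np.triu_indices_from(K, k=1)].min().round(2), "to",
          K[np.triu_indices_from(K, k=1)].max().round(2))
    ```

    In the multi-environment models, either kernel is then combined with the between-environment genetic correlation matrix via a Kronecker product to give the covariance of the genetic effects u.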

  4. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

Record excerpt (fragments of the report's references and table of contents; page numbers omitted): 2.2.3 Trace-based protocol analysis; 2.2.4 Summary of important data features; 2.3 Tools related to process model testing; 2.3.1 Tools for building ...; 3. Requirements for testing process models using trace-based protocol analysis; 3.1 Definition of trace-based protocol analysis (TBPA).

  5. An approach for investigation of secure access processes at a combined e-learning environment

    NASA Astrophysics Data System (ADS)

    Romansky, Radi; Noninska, Irina

    2017-12-01

The article discusses an approach to investigating processes for regulating security and privacy control in a heterogeneous e-learning environment realized as a combination of traditional and cloud-based means and tools. The authors' proposal for a combined e-learning system architecture is presented, and the main subsystems and procedures are discussed. A formalization of the processes for using different types of resources (public, private internal and private external) is proposed. The apparatus of Markov chains (MC) is used for modeling and analytical investigation of secure access to the resources, and some assessments are presented.
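
    The Markov-chain apparatus mentioned can be illustrated by computing the stationary distribution of a small access-process chain (the states and transition probabilities below are invented for illustration, not taken from the paper's model):

    ```python
    import numpy as np

    # States of a resource-access process: 0 = authenticate, 1 = access public,
    # 2 = access private (internal/external), 3 = denied / re-authenticate.
    P = np.array([[0.0, 0.6, 0.3, 0.1],
                  [0.2, 0.5, 0.2, 0.1],
                  [0.3, 0.2, 0.4, 0.1],
                  [0.8, 0.0, 0.0, 0.2]])

    # The stationary distribution pi solves pi P = pi with sum(pi) = 1,
    # i.e., it is the eigenvector of P' for eigenvalue 1:
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi /= pi.sum()
    print("long-run share of time per state:", pi.round(3))
    ```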

  6. Understanding Immersivity: Image Generation and Transformation Processes in 3D Immersive Environments

    PubMed Central

    Kozhevnikov, Maria; Dhond, Rupali P.

    2012-01-01

    Most research on three-dimensional (3D) visual-spatial processing has been conducted using traditional non-immersive 2D displays. Here we investigated how individuals generate and transform mental images within 3D immersive (3DI) virtual environments, in which the viewers perceive themselves as being surrounded by a 3D world. In Experiment 1, we compared participants’ performance on the Shepard and Metzler (1971) mental rotation (MR) task across the following three types of visual presentation environments; traditional 2D non-immersive (2DNI), 3D non-immersive (3DNI – anaglyphic glasses), and 3DI (head mounted display with position and head orientation tracking). In Experiment 2, we examined how the use of different backgrounds affected MR processes within the 3DI environment. In Experiment 3, we compared electroencephalogram data recorded while participants were mentally rotating visual-spatial images presented in 3DI vs. 2DNI environments. Overall, the findings of the three experiments suggest that visual-spatial processing is different in immersive and non-immersive environments, and that immersive environments may require different image encoding and transformation strategies than the two other non-immersive environments. Specifically, in a non-immersive environment, participants may utilize a scene-based frame of reference and allocentric encoding whereas immersive environments may encourage the use of a viewer-centered frame of reference and egocentric encoding. These findings also suggest that MR performed in laboratory conditions using a traditional 2D computer screen may not reflect spatial processing as it would occur in the real world. PMID:22908003

  7. Thoughts About Created Environment: A Neuman Systems Model Concept.

    PubMed

    Verberk, Frans; Fawcett, Jacqueline

    2017-04-01

    This essay is about the Neuman systems model concept of the created environment. The essay, based on work by Frans Verberk, a Neuman systems model scholar from the Netherlands, extends understanding of the created environment by explaining how this distinctive perspective of environment represents an elaboration of the physiological, psychological, sociocultural, developmental, and spiritual variables, which are other central concepts of the Neuman Systems Model.

  8. Charged Particle Environment Definition for NGST: Model Development

    NASA Technical Reports Server (NTRS)

    Blackwell, William C.; Minow, Joseph I.; Evans, Steven W.; Hardage, Donna M.; Suggs, Robert M.

    2000-01-01

    NGST will operate in a halo orbit about the L2 point, 1.5 million km from the Earth, where the spacecraft will periodically travel through the magnetotail region. There are a number of tools available to calculate the high-energy, ionizing radiation particle environment from galactic cosmic rays and from solar disturbances. However, space environment tools are not generally available to provide assessments of the charged particle environment and its variations in the solar wind, magnetosheath, and magnetotail at L2 distances. An engineering-level phenomenology code (LRAD) was therefore developed to facilitate the definition of charged particle environments in the vicinity of the L2 point in support of the NGST program. LRAD contains models tied to satellite measurement data of the solar wind and magnetotail regions. The model provides the particle flux and fluence calculations necessary to predict spacecraft charging conditions and the degradation of materials used in the construction of NGST. This paper describes the LRAD environment models for the deep magnetotail (XGSE < -100 Re) and solar wind, and presents predictions of the charged particle environment for NGST.

  9. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial Revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations, however, are very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  10. Bayesian Genomic Prediction with Genotype × Environment Interaction Kernel Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Montesinos-López, Osval A; Burgueño, Juan; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo

    2017-01-05

    The phenomenon of genotype × environment (G × E) interaction in plant breeding decreases selection accuracy, thereby negatively affecting genetic gains. Several genomic prediction models incorporating G × E have been recently developed and used in genomic selection of plant breeding programs. Genomic prediction models for assessing multi-environment G × E interaction are extensions of a single-environment model, and have advantages and limitations. In this study, we propose two multi-environment Bayesian genomic models: the first model considers genetic effects (u) that can be assessed by the Kronecker product of variance-covariance matrices of genetic correlations between environments and genomic kernels through markers under two linear kernel methods, linear (genomic best linear unbiased predictors, GBLUP) and Gaussian (Gaussian kernel, GK). The other model has the same genetic component as the first model (u) plus an extra component, f, that captures random effects between environments that were not captured by the random effects u. We used five CIMMYT data sets (one maize and four wheat) that were previously used in different studies. Results show that models with G × E always have higher prediction ability than single-environment models, and the higher prediction ability of multi-environment models with u and f over the multi-environment model with only u occurred 85% of the time with GBLUP and 45% of the time with GK across the five data sets. The latter result indicated that including the random effect f is still beneficial for increasing prediction ability after adjusting by the random effect u. Copyright © 2017 Cuevas et al.

  11. An approach for modelling snowcover ablation and snowmelt runoff in cold region environments

    NASA Astrophysics Data System (ADS)

    Dornes, Pablo Fernando

    Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomenon; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models, a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. Application of the same modelling approach at a larger scale using the same landscape based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with

  12. An Instructional Method for the AutoCAD Modeling Environment.

    ERIC Educational Resources Information Center

    Mohler, James L.

    1997-01-01

    Presents a command organizer for AutoCAD to aid new users in operating within the 3-D modeling environment. Addresses analyzing the problem, visualization skills, nonlinear tools, a static view of a dynamic model, the AutoCAD organizer, environment attributes, and control of the environment. Contains 11 references. (JRH)

  13. Microbial consortia in meat processing environments

    NASA Astrophysics Data System (ADS)

    Alessandria, V.; Rantsiou, K.; Cavallero, M. C.; Riva, S.; Cocolin, L.

    2017-09-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The description of the microbial consortia in the meat processing environment is important since it is a first step in understanding possible routes of product contamination. Furthermore, it may contribute to the development of sanitation programs for effective pathogen removal. The purpose of this study was to characterize the microbiota in the environment of meat processing plants: the microbiota of three different meat plants was studied by both traditional and molecular methods (PCR-DGGE) in two different periods. Different levels of contamination emerged between the three plants as well as between the two sampling periods. Conventional methods of killing free-living bacteria through antimicrobial agents and disinfection are often ineffective against bacteria within a biofilm. The use of gas-discharge plasmas can potentially offer a good alternative to conventional sterilization methods. A further purpose of this study was to measure the effectiveness of Atmospheric Pressure Plasma (APP) surface treatments against bacteria in biofilms. Biofilms produced by three different L. monocytogenes strains on a stainless steel surface were subjected to three different conditions (power, exposure time) of APP. Our results showed that most of the culturable cells are inactivated after the plasma exposure, but RNA analysis by qPCR highlighted the entry of the cells into the viable but nonculturable (VBNC) state, confirming the hypothesis that cells are damaged after plasma treatment but, in a first step, still remain alive. Understanding the effects of APP on L. monocytogenes biofilms can improve the development of sanitation programs that use APP for effective pathogen removal.

  14. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic or oceanic region. Under Naval Oceanographic Office (NAVOCEANO) funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted Colorado University's numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphical User Interface.

  15. Ada (Trade name) Compiler Validation Summary Report: Rational. Rational Environment (Trademark) A952. Rational Architecture (R1000 (Trade name) Model 200).

    DTIC Science & Technology

    1987-05-06

    Rational. Rational Environment A_9_5_2. Rational Architecture (R1000 Model 200) ... validation testing performed on the Rational Environment, A_9_5_2, using Version 1.8 of the Ada Compiler Validation Capability (ACVC). The Rational Environment is hosted on a Rational Architecture (R1000 Model 200) operating under Rational Environment, Release A_9_5_2. Programs processed by this

  16. Modelling urban rainfall-runoff responses using an experimental, two-tiered physical modelling environment

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Pattison, Ian; Yu, Dapeng

    2016-04-01

    Surface water (pluvial) flooding occurs when rainwater from intense precipitation events is unable to infiltrate into the subsurface or drain via natural or artificial drainage channels. Surface water flooding poses a serious hazard to urban areas across the world, with perceived risk in the UK appearing to have increased in recent years as surface water flood events have seemed more severe and frequent. Surface water flood risk currently accounts for one third of all UK flood risk, with approximately two million people living in urban areas at risk of a 1 in 200-year flood event. Research often focuses upon using numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, field data available for model calibration and validation is limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data upon which numerical models are based are often erroneous and inconclusive. Physical models offer a novel, alternative and innovative environment to collect data within, creating a controlled, closed system where variables can be altered independently to investigate cause-and-effect relationships. A physical modelling environment provides a suitable platform to investigate rainfall-runoff processes occurring within an urban catchment. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9 m2, two-tiered 1:100 physical model consisting of (i) a low-cost rainfall simulator component able to simulate consistent, uniformly distributed (>75% CUC) rainfall events of varying intensity and (ii) a fully interchangeable, modular plot surface have been conducted to investigate and quantify the influence of a number of terrestrial and meteorological factors on overland flow and rainfall-runoff patterns within a modelled

  17. Use and perception of the environment: cultural and developmental processes

    Treesearch

    Martin M. Chemers; Irwin Altman

    1977-01-01

    This paper presents a "social systems" orientation for integrating the diverse aspects of environment, culture, and individual behavior. It suggests that a wide range of variables, including the physical environment, cultural and social processes, environmental perceptions and cognitions, behavior, and products of behavior, are connected in a complex,...

  18. Modeling the role of environment in addiction.

    PubMed

    Caprioli, Daniele; Celentano, Michele; Paolone, Giovanna; Badiani, Aldo

    2007-11-15

    The aim of this review is to provide an overview of the main types of animal models used to investigate the modulatory role of environment on drug addiction. The environment can alter the responsiveness to addictive drugs in at least three major ways. First, adverse life experiences can make an individual more vulnerable to develop drug addiction or to relapse into drug seeking. Second, neutral environmental cues can acquire, through Pavlovian conditioning, the ability to trigger drug seeking even after long periods of abstinence. Third, the environment immediately surrounding drug taking can alter the behavioral, subjective, and rewarding effects of a given drug, thus influencing the propensity to use the same drug again. We have focused in particular on the results obtained using an animal model we have developed to study the latter type of drug-environment interaction.

  19. The AE-8 trapped electron model environment

    NASA Technical Reports Server (NTRS)

    Vette, James I.

    1991-01-01

    The machine sensible version of the AE-8 electron model environment was completed in December 1983. It has been sent to users on the model environment distribution list and is made available to new users by the National Space Science Data Center (NSSDC). AE-8 is the last in a series of terrestrial trapped radiation models that includes eight proton and eight electron versions. With the exception of AE-8, all these models were documented in formal reports as well as being available in a machine sensible form. The purpose of this report is to complete the documentation, finally, for AE-8 so that users can understand its construction and see the comparison of the model with the new data used, as well as with the AE-4 model.

  20. Spacecraft Internal Acoustic Environment Modeling

    NASA Technical Reports Server (NTRS)

    Allen, Christopher; Chu, S. Reynold

    2008-01-01

    The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements, and thus provide a safe and habitable acoustic environment for the crews, and to validate the developed models by building physical mockups and conducting acoustic measurements.

  1. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    NASA Astrophysics Data System (ADS)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  2. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

  3. Operational SAR Data Processing in GIS Environments for Rapid Disaster Mapping

    NASA Astrophysics Data System (ADS)

    Bahr, Thomas

    2014-05-01

    DEM without the need of ground control points. This step includes radiometric calibration. (3) A subsequent change detection analysis generates the final map showing the extent of the flash flood on Nov. 5th 2010. The underlying algorithms are provided by three different sources: geocoding & radiometric calibration (2) is a standard functionality of the commercial SARscape Toolbox for ArcGIS. This toolbox is extended by the filter tool (1), which is called from the SARscape modules in ENVI. The change detection analysis (3) is based on ENVI processing routines and scripted with IDL. (2) and (3) are integrated with ArcGIS using a predefined Python interface. These three processing steps are combined using the ArcGIS ModelBuilder to create a new model for rapid disaster mapping in ArcGIS, based on SAR data. Moreover, this model can be decoupled from its desktop environment and published to users across the ArcGIS Server enterprise. Thus, disaster zones, e.g., after severe flooding, can be automatically identified and mapped to support local task forces, using an operational workflow for SAR image analysis that can be executed by the responsible operators without SAR expert knowledge.
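
    Step (3) is the only part of this workflow that reduces to plain image arithmetic, so it can be illustrated without the SARscape/ENVI stack. The sketch below is a generic backscatter-drop change detection on calibrated SAR images, not the authors' IDL implementation; the threshold value and the toy arrays are assumptions.

```python
# Hedged sketch of a change detection step for flood mapping: flag pixels
# whose calibrated backscatter (in dB) dropped sharply between acquisitions,
# since open water typically appears dark in SAR amplitude imagery.
import numpy as np

def flood_mask(pre_db, post_db, drop_db=3.0):
    """Pixels whose backscatter fell by more than `drop_db` decibels."""
    return (pre_db - post_db) > drop_db

rng = np.random.default_rng(1)
pre = rng.normal(-8.0, 1.0, (512, 512))   # pre-event sigma0 in dB (toy data)
post = pre.copy()
post[200:300, 100:400] -= 6.0             # simulated flooded area
mask = flood_mask(pre, post)
print(f"flooded fraction: {mask.mean():.3f}")
```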

  4. An ecohydrologic model for a shallow groundwater urban environment.

    PubMed

    Arden, Sam; Ma, Xin Cissy; Brown, Mark

    2014-01-01

    The urban environment is a patchwork of natural and artificial surfaces that results in complex interactions with and impacts to natural hydrologic cycles. Evapotranspiration is a major hydrologic flow that is often altered through urbanization, although the mechanisms of change are sometimes difficult to tease out due to difficulty in effectively simulating soil-plant-atmosphere interactions. This paper introduces a simplified yet realistic model that is a combination of existing surface runoff and ecohydrology models designed to increase the quantitative understanding of complex urban hydrologic processes. Results demonstrate that the model is capable of simulating the long-term variability of major hydrologic fluxes as a function of impervious surface, temperature, water table elevation, canopy interception, soil characteristics, precipitation and complex mechanisms of plant water uptake. These understandings have potential implications for holistic urban water system management.

  5. Modeling human behaviors and reactions under dangerous environment.

    PubMed

    Kang, J; Wright, D K; Qin, S F; Zhao, Y

    2005-01-01

    This paper describes the framework of a real-time simulation system to model human behavior and reactions in dangerous environments. The system utilizes the latest 3D computer animation techniques, combined with artificial intelligence, robotics and psychology, to model human behavior, reactions and decision making under expected/unexpected dangers, in real time, in virtual environments. The development of the system includes: classification of the conscious/subconscious behaviors and reactions of different people; capturing different motion postures with the Eagle Digital System; establishing 3D character animation models; establishing 3D models for the scene; planning the scenario and the contents; and programming within Virtools Dev. Programming within Virtools Dev is subdivided into modeling dangerous events, modeling the character's perceptions, modeling the character's decision making, modeling the character's movements, modeling the character's interaction with the environment, and setting up the virtual cameras. The real-time simulation of human reactions in hazardous environments is invaluable in military defense, fire escape, rescue operation planning, traffic safety studies, and safety planning in chemical factories, as well as the design of buildings, airplanes, ships and trains. Currently, human motion modeling can be realized through established technology, whereas integrating perception and intelligence into a virtual human's motion is still a huge undertaking. The challenges here are the synchronization of motion and intelligence, the accurate modeling of human vision, smell, touch and hearing, and the diversity and effects of emotion and personality in decision making. There are three types of software platforms that could be employed to realize the motion and intelligence within one system, and their advantages and disadvantages are discussed.

  6. Performance analysis of no-vent fill process for liquid hydrogen tank in terrestrial and on-orbit environments

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yanzhong; Zhang, Feini; Ma, Yuan

    2015-12-01

    Two finite difference computer models, aimed at predicting the no-vent fill process in normal-gravity and microgravity environments, respectively, are developed to investigate the filling performance in a liquid hydrogen (LH2) tank. In the normal-gravity case model, the tank/fluid system is divided into five control volumes: ullage, bulk liquid, gas-liquid interface, ullage-adjacent wall, and liquid-adjacent wall. In the microgravity case model, a vapor-liquid thermal equilibrium state is maintained throughout the process, and only two nodes representing the fluid and wall regions are applied. To capture the liquid-wall heat transfer accurately, a series of heat transfer mechanisms are considered and modeled successively, including film boiling, transition boiling, nucleate boiling, and liquid natural convection. The two models are validated by comparing their predictions with experimental data, which shows good agreement. The two models are then used to investigate the performance of no-vent fill under different conditions, and several conclusions are obtained. It is shown that in the normal-gravity environment the no-vent fill experiences a continuous pressure rise during the whole process, and the maximum pressure occurs at the end of the operation, while the maximum pressure of the microgravity case occurs at the beginning stage of the process. Moreover, increasing the inlet mass flux has an apparent influence on the pressure evolution of the no-vent fill process in normal gravity but little influence in microgravity. A larger initial wall temperature brings about more significant liquid evaporation during the filling operation and thus causes higher pressure evolution, whether the filling process occurs under normal-gravity or microgravity conditions. Reducing the inlet liquid temperature can improve the filling performance in normal gravity, but cannot significantly reduce the maximum pressure in microgravity. The presented work benefits the

  7. Modeling the space debris environment with MASTER-2009 and ORDEM2010

    NASA Astrophysics Data System (ADS)

    Flegel, Sven Kevin; Krisko, Paula; Gelhaus, Johannes; Wiedemann, Carsten; Moeckel, Marek; Krag, Holger; Klinkrad, Heiner; Xu, Yu-Lin; Horstman, Matthew; Matney, Mark; Vörsmann, Peter

    The two software tools MASTER-2009 and ORDEM2010 are the ESA and NASA reference software tools, respectively, that describe the Earth's debris environment. The primary goal of both programs is to allow users to estimate the object flux onto a target object for mission planning. The current paper describes the basic distinctions in the model philosophies. At the core of each model lies the method by which the object environment is established. Central to this process is the role played by the results from radar/telescope observations or impact fluxes on surfaces returned from Earth orbit. The ESA Meteoroid and Space Debris Terrestrial Environment Reference Model (MASTER) is engineered to give a realistic description of the natural and the man-made particulate environment of the Earth. Debris sources are simulated based on detailed lists of known historical events such as fragmentations or solid rocket motor firings, or through simulation of secondary debris such as impact ejecta or the release of paint flakes from degrading spacecraft surfaces. The resulting population is then validated against historical telescope/radar campaigns using the ESA Program for Radar and Optical Observation Forecasting (PROOF) and against object impact fluxes on surfaces returned from space. The NASA Orbital Debris Engineering Model (ORDEM) series is designed to provide reliable estimates of orbital debris flux on spacecraft and through telescope or radar fields-of-view. Central to the model series is the empirical nature of the input populations. These are derived from NASA orbital debris modeling but verified, where possible, with measurement data from various sources. The latest version of the series, ORDEM2010, compiles over two decades of data from NASA radar systems, telescopes, in-situ sources, and ground tests that are analyzed by statistical methods. For increased understanding of the application ranges of the two programs, the current paper provides an overview of the two

  8. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    NASA Astrophysics Data System (ADS)

    Sun, Daner; Looi, Chee-Kit

    2013-02-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as develop critical learning skills through a model-based collaborative inquiry approach. It is intended to support collaborative inquiry, real-time social interaction, and progressive modeling, and to provide multiple sources of scaffolding for students. We first discuss the theoretical underpinnings for synthesizing the WiMVT design framework, introduce the components and features of the system, and describe the proposed work flow of WiMVT instruction. We also elucidate our research approach that supports the development of the system. Finally, the findings of a pilot study are briefly presented to demonstrate the potential learning efficacy of the WiMVT implementation in science learning. Implications are drawn on how to improve the existing system, refine teaching strategies, and provide feedback to researchers, designers and teachers. This pilot study informs designers like us on how to narrow the gap between the learning environment's intended design and its actual usage in the classroom.

  9. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
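
    The projection at the heart of the proposed transformation can be illustrated with a toy model: keep only the interactions in which a given role participates, tagged as sends or receives. The data shapes below are assumptions made for illustration, not the authors' metamodel or their MDA tooling.

```python
# Minimal sketch of deriving one enterprise's interface process from a
# collaborative process model by projecting onto its role.
from dataclasses import dataclass

@dataclass
class Interaction:
    sender: str
    receiver: str
    message: str

collaborative = [
    Interaction("Buyer", "Seller", "PurchaseOrder"),
    Interaction("Seller", "Buyer", "OrderConfirmation"),
    Interaction("Seller", "Carrier", "ShippingRequest"),
]

def interface_process(model, role):
    """Keep only interactions the role participates in, tagged send/receive."""
    return [("send" if i.sender == role else "receive", i.message)
            for i in model if role in (i.sender, i.receiver)]

print(interface_process(collaborative, "Seller"))
# [('receive', 'PurchaseOrder'), ('send', 'OrderConfirmation'), ('send', 'ShippingRequest')]
```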

  10. Process material management in the Space Station environment

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Humphries, W. R.

    1988-01-01

    The Space Station will provide a unique facility for conducting material-processing and life-science experiments under microgravity conditions. These conditions place special requirements on the U.S. Laboratory for storing and transporting chemicals and process fluids, reclaiming water from selected experiments, treating and storing experiment wastes, and providing vacuum utilities. To meet these needs and provide a safe laboratory environment, the Process Material Management System (PMMS) is being developed. Preliminary design requirements and concepts related to the PMMS are addressed, and the MSFC PMMS breadboard test facility and a preliminary plan for validating the overall system design are discussed.

  11. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR), and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment(RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under Naval Oceanographic Office (NAVO-CEANO) funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted the Colorado University's numerical ocean model, known as CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphic user Interface (GUI).

  12. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE PAGES

    Humbird, David; Trendewicz, Anna; Braun, Robert; ...

    2017-01-12

    A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.
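
    For readers without access to Aspen Custom Modeler, the flavor of a lumped fast-pyrolysis kinetics model can be sketched as a set of competing first-order reactions (biomass to gas, tar, and char). The rate constants below are invented for illustration and are far simpler than the detailed kinetic scheme used in the reactor model above.

```python
# Generic lumped kinetics sketch: biomass decomposes into gas, tar, and char
# via competing first-order reactions integrated with scipy's ODE solver.
from scipy.integrate import solve_ivp

k_gas, k_tar, k_char = 0.6, 2.0, 0.4      # 1/s at an assumed reactor temperature

def rhs(t, y):
    biomass = y[0]
    return [-(k_gas + k_tar + k_char) * biomass,   # biomass consumption
            k_gas * biomass,                        # gas formation
            k_tar * biomass,                        # tar (bio-oil) formation
            k_char * biomass]                       # char formation

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 0.0, 0.0, 0.0])
for name, val in zip(["biomass", "gas", "tar", "char"], sol.y[:, -1]):
    print(f"{name:>7}: {val:.3f}")                  # mass fractions after 2 s
```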

  13. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbird, David; Trendewicz, Anna; Braun, Robert

    A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.

  14. Modeling the Blast Load Simulator Airblast Environment using First Principles Codes. Report 1, Blast Load Simulator Environment

    DTIC Science & Technology

    2016-11-01

    ERDC/GSL TR-16-31 Modeling the Blast Load Simulator Airblast Environment Using First Principles Codes Report 1, Blast Load... Simulator Airblast Environment using First Principles Codes Report 1, Blast Load Simulator Environment Gregory C. Bessette, James L. O’Daniel... evaluate several first principles codes (FPCs) for modeling airblast environments typical of those encountered in the BLS. The FPCs considered were

  15. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for the digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
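
    A minimal sketch of the kind of predictor-corrector scheme evaluated here: an Adams-Bashforth two-step predictor followed by a trapezoidal corrector, applied to a small linear state-space system standing in for the turbofan model. The matrix, step size, and bootstrap step are assumptions for illustration.

```python
# Predictor-corrector (PECE) step: AB2 predictor, trapezoidal corrector,
# applied to the linear system x' = A x.
import numpy as np

A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])        # illustrative stable linear dynamics
f = lambda x: A @ x

def pc_step(x_prev, x_curr, h):
    """One AB2-predictor / trapezoidal-corrector step."""
    x_pred = x_curr + h * (1.5 * f(x_curr) - 0.5 * f(x_prev))   # predict
    return x_curr + 0.5 * h * (f(x_curr) + f(x_pred))           # correct

h, x = 0.01, np.array([1.0, 1.0])
x_prev = x - h * f(x)              # bootstrap the back value with an Euler step
for _ in range(500):
    x_prev, x = x, pc_step(x_prev, x, h)
print(x)                           # decays toward the origin, as expected
```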

  16. Prevalence and survival of Listeria monocytogenes in Danish aquatic and fish-processing environments.

    PubMed

    Hansen, Cisse Hedegaard; Vogel, Birte Fonnesbech; Gram, Lone

    2006-09-01

    Listeria monocytogenes contamination of ready-to-eat food products such as cold-smoked fish is often caused by pathogen subtypes persisting in food-processing environments. The purpose of the present study was to determine whether these L. monocytogenes subtypes can be found in the outside environment, i.e., outside food processing plants, and whether they survive better in the aquatic environment than do other strains. A total of 400 samples were collected from the outside environment, fish slaughterhouses, fish farms, and a smokehouse. L. monocytogenes was not detected in a freshwater stream, but prevalence increased with the degree of human activity: 2% in seawater fish farms, 10% in freshwater fish farms, 16% in fish slaughterhouses, and 68% in a fish smokehouse. The fish farms and slaughterhouses processed Danish rainbow trout, whereas the smokehouse was used for farm-raised Norwegian salmon. No variation with season was observed. Inside the processing plants, the pattern of randomly amplified polymorphic DNA (RAPD) types was homogeneous, but greater diversity existed among isolates from the outside environments. The RAPD type dominating the inside of the fish smokehouse was found only sporadically in outside environments. To examine survival in different environments, L. monocytogenes or Listeria innocua strains were inoculated into freshwater and saltwater microcosms. Pathogen counts decreased over time in Instant Ocean and remained constant in phosphate-buffered saline. In contrast, counts decreased rapidly in natural seawater and fresh water. The count reduction was much slower when the natural waters were autoclaved or filtered (0.2-μm pore size), indicating that the pathogen reduction in natural waters was attributable to a biological mechanism, e.g., protozoan grazing. A low prevalence of L. monocytogenes was found in the outside environment, and the bacteria did not survive well in natural environments. Therefore, L. monocytogenes in the outer

  17. Influence of fractal substructures of the percolating cluster on transferring processes in macroscopically disordered environments

    NASA Astrophysics Data System (ADS)

    Kolesnikov, B. P.

    2017-11-01

    This work addresses the problem of finding the effective kinetic properties of macroscopically disordered environments (MDEs). These properties characterize an MDE as a whole on scales that significantly exceed the size of the macroscopic inhomogeneities. The structure of an MDE is considered as a complex of interpenetrating percolating and finite clusters consolidated from like components, whose topological characteristics influence the properties of the whole environment. The influence of the percolating cluster's fractal substructures (backbone, skeleton of the backbone, red bonds) on transfer processes during crossover (a structural transition from the fractal to the homogeneous state) is investigated, based on the proposed mathematical approach for finding the effective conductivity of MDEs and on the percolating cluster model. The nature of the change of the critical conductivity exponent t during crossover, from the value characteristic of the region close to the percolation threshold to the value corresponding to the homogeneous state, is demonstrated. The proposed model describes transfer processes in MDEs with a finite conductivity ratio of the «conductive» and «low-conductive» phases above and below the percolation threshold and in the smearing region (an analogue of the blur region of a second-order phase transition).
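
    The notion of a percolating cluster used above can be made concrete with a small numerical experiment: populate a 2D site lattice just above the percolation threshold and test whether any cluster spans it. This sketch uses scipy's connected-component labeling; the lattice size and occupation probability are illustrative choices, not parameters from the paper.

```python
# Toy site-percolation experiment: find clusters and test for a spanning
# (percolating) cluster; p is set just above the square-lattice site
# threshold (~0.5927).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
p, n = 0.60, 200
occupied = rng.random((n, n)) < p
labels, _ = ndimage.label(occupied)            # 4-connected cluster labels

top = set(labels[0][labels[0] > 0])            # cluster ids touching top edge
bottom = set(labels[-1][labels[-1] > 0])       # cluster ids touching bottom edge
spanning = top & bottom                        # clusters touching both edges
print("percolates:", bool(spanning))
if spanning:
    cluster = np.isin(labels, list(spanning))
    print(f"percolating-cluster density: {cluster.mean():.3f}")
```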

  18. Process engineering concerns in the lunar environment

    NASA Technical Reports Server (NTRS)

    Sullivan, T. A.

    1990-01-01

    The paper discusses the constraints imposed on a production process by the lunar or Martian environment and by the space transportation system. A proposed chemical route to produce oxygen from iron oxide bearing minerals (including ilmenite) is presented in three different configurations which vary in complexity. A design for thermal energy storage is presented that could both provide power during the lunar night and act as a blast protection barrier for the outpost. A process to release carbon from the lunar regolith as methane is proposed, capitalizing on the greater abundance and favorable physical properties of methane relative to hydrogen to benefit the entire system.
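
    The abstract names the chemical route only in general terms. For orientation, the widely cited hydrogen-reduction route for ilmenite is sketched below; the paper's specific configurations may differ, so treat this as background chemistry rather than the authors' flowsheet.

```latex
% Hedged sketch: hydrogen reduction of ilmenite followed by water
% electrolysis, with the hydrogen recycled to the reduction step.
\begin{align*}
  \mathrm{FeTiO_3} + \mathrm{H_2} &\longrightarrow \mathrm{Fe} + \mathrm{TiO_2} + \mathrm{H_2O}
    && (\text{high-temperature reduction})\\
  2\,\mathrm{H_2O} &\xrightarrow{\text{electrolysis}} 2\,\mathrm{H_2} + \mathrm{O_2}
    && (\mathrm{H_2}\ \text{recycled, } \mathrm{O_2}\ \text{stored})
\end{align*}
```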

  19. Meteoroid Environment Modeling: the Meteoroid Engineering Model and Shower Forecasting

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.

    2017-01-01

    The meteoroid environment is often divided conceptually into meteor showers plus a sporadic background component. The sporadic complex poses the bulk of the risk to spacecraft, but showers can produce significant short-term enhancements of the meteoroid flux. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. Both MEM and the forecast are used by multiple manned spaceflight projects in their meteoroid risk evaluation, and both tools are being revised to incorporate recent meteor velocity, density, and timing measurements. MEM describes the sporadic meteoroid complex and calculates the flux, speed, and directionality of the meteoroid environment relative to a user-supplied spacecraft trajectory, taking the spacecraft's motion into account. MEM is valid in the inner solar system and offers near-Earth and cis-lunar environments. While the current version of MEM offers a nominal meteoroid environment corresponding to a single meteoroid bulk density, the next version, MEM R3, will offer both flux uncertainties and a density distribution in addition to a revised near-Earth environment. We have updated the near-Earth meteor speed distribution and have made the first determination of uncertainty in this distribution. We have also derived a meteor density distribution from the work of Kikwaya et al. (2011). The annual meteor shower forecast takes the form of a report and data tables that can be used in conjunction with an existing MEM assessment. Fluxes are typically quoted to a constant limiting kinetic energy in order to comport with commonly used ballistic limit equations. For the 2017 annual forecast, the MEO substantially revised the list of showers and their characteristics using 14 years of meteor flux measurements from the Canadian Meteor Orbit Radar (CMOR). Defunct or insignificant showers were removed and the temporal profiles of many showers
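
    The "constant limiting kinetic energy" convention mentioned above has a simple consequence worth making explicit: at fixed kinetic energy E = mv²/2, a faster shower reaches the same damage threshold at a smaller particle mass. The sketch below shows the scaling; the energy value is an arbitrary illustrative number, not an MEO forecast parameter, while the Perseid and Leonid speeds are their commonly quoted values.

```python
# At constant limiting kinetic energy E, the limiting mass scales as 1/v^2.
def limiting_mass_g(E_joules, v_km_s):
    v = v_km_s * 1e3                      # km/s -> m/s
    return 2.0 * E_joules / v**2 * 1e3    # kg -> g

for name, v in [("slow sporadic", 20.0), ("Perseids", 59.0), ("Leonids", 71.0)]:
    print(f"{name:>13}: limiting mass = {limiting_mass_g(52.0, v):.2e} g")
```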

  20. Physical Processes and Real-Time Chemical Measurement of the Insect Olfactory Environment

    PubMed Central

    Abrell, Leif; Hildebrand, John G.

    2009-01-01

    Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems. PMID:18548311

  1. Modeling of Radiowave Propagation in a Forested Environment

    DTIC Science & Technology

    2014-09-01

    Propagation models used in wireless communication system design play an... domains. Applications in both domains require communication devices and sensors to be operated in forested environments. Various methods have been... wireless communication system design play an important role in overall link performance. Propagation models in a forested environment, in particular

  2. The dynamic radiation environment assimilation model (DREAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
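
    The Kalman-filtering idea behind DREAM can be conveyed with a one-dimensional toy: a random-walk state (standing in for an electron-flux level) observed with noise, with the model forecast and the observation blended at each step. All noise levels and dynamics below are invented; DREAM's actual filter operates on physics-based radiation belt models.

```python
# Minimal scalar Kalman filter: random-walk dynamics, noisy observations.
import numpy as np

rng = np.random.default_rng(42)
n, q, r = 200, 0.05, 0.5                         # steps, process var, obs var
truth = np.cumsum(rng.normal(0, q**0.5, n))      # hidden state trajectory
obs = truth + rng.normal(0, r**0.5, n)           # noisy satellite measurements

x, p, est = 0.0, 1.0, []
for z in obs:
    p += q                       # predict: random-walk model inflates variance
    k = p / (p + r)              # Kalman gain
    x += k * (z - x)             # update state with the observation
    p *= (1 - k)                 # update variance
    est.append(x)

rms = lambda e: np.sqrt(np.mean(np.square(e)))
print(f"RMS error raw obs: {rms(obs - truth):.3f}, "
      f"filtered: {rms(np.array(est) - truth):.3f}")
```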

  3. Meteoroid Environment Modeling: the Meteoroid Engineering Model and Shower Forecasting

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.

    2017-01-01

    INTRODUCTION: The meteoroid environment is often divided conceptually into meteor showers and the sporadic meteor background. It is commonly but incorrectly assumed that meteoroid impacts primarily occur during meteor showers; instead, the vast majority of hazardous meteoroids belong to the sporadic complex. Unlike meteor showers, which persist for a few hours to a few weeks, sporadic meteoroids impact the Earth's atmosphere and spacecraft throughout the year. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. The sporadic complex, despite its year-round activity, is not isotropic in its directionality. Instead, the apparent points of origin of sporadic meteoroids, or radiants, are organized into groups called "sources". The speed, directionality, and size distribution of these sporadic sources are modeled by the Meteoroid Engineering Model (MEM), which is currently in its second major release version (MEMR2) [Moorhead et al., 2015]. MEM provides the meteoroid flux relative to a user-provided spacecraft trajectory; it provides the total flux as well as the flux per angular bin, speed interval, and on specific surfaces (ram, wake, etc.). Because the sporadic complex dominates the meteoroid flux, MEM is the most appropriate model to use in spacecraft design. Although showers make up a small fraction of the meteoroid environment, they can produce significant short-term enhancements of the meteoroid flux. Thus, it can be valuable to consider showers when assessing risks associated with vehicle operations that are brief in duration. To assist with such assessments, the MEO issues an annual forecast that reports meteor shower fluxes as a function of time and compares showers with the time-averaged total meteoroid flux. This permits missions to do quick assessments of the increase in risk posed by meteor showers.

  4. The Rapid Integration and Test Environment: A Process for Achieving Software Test Acceptance

    DTIC Science & Technology

    2010-05-01

    Test Environment: A Process for Achieving Software Test Acceptance ... The Rapid Integration and Test Environment: A Process for Achieving Software Test Acceptance Patrick V... was awarded the Bronze Star. Introduction: The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office

  5. A Novel Petri Nets-Based Modeling Method for the Interaction between the Sensor and the Geographic Environment in Emerging Sensor Networks

    PubMed Central

    Zhang, Feng; Xu, Yuetong; Chou, Jarong

    2016-01-01

    The service of a sensor device in Emerging Sensor Networks (ESNs) is an extension of traditional Web services. Through the sensor network, the service of a sensor device can communicate directly with entities in the geographic environment, and can even impact those entities directly. The interaction between the sensor device in ESNs and the geographic environment is very complex, and modeling this interaction is a challenging problem. This paper proposed a novel Petri Nets-based modeling method for the interaction between the sensor device and the geographic environment. The sensor device service in ESNs is more easily affected by the geographic environment than a traditional Web service. Therefore, the response time, fault-tolerant ability, and resource consumption become important factors in the performance of the whole sensor application system. Thus, this paper classified IoT services as sensing services and controlling services according to the interaction between the IoT service and the geographic entity, and classified GIS services as data services and processing services. Then, this paper designed and analyzed a service algebra and a Colored Petri Nets model for modeling the geo-feature, the IoT service, the GIS service, and the interaction process between the sensor and the geographic environment. Finally, the modeling process is discussed through examples. PMID:27681730
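
    The token-game semantics underlying a Petri-net model can be shown in a few lines: places hold tokens, and a transition fires only when every input place is marked. The tiny sensing-to-processing net below is an invented example, not the colored Petri net defined in the paper.

```python
# Toy Petri-net execution: consume tokens from input places, produce tokens
# in output places when a transition is enabled.
marking = {"sensor_ready": 1, "geo_data": 1, "processed": 0}

transitions = {
    "process_observation": (["sensor_ready", "geo_data"], ["processed"]),
}

def fire(name):
    inputs, outputs = transitions[name]
    if all(marking[p] > 0 for p in inputs):   # enabled only if inputs marked
        for p in inputs:
            marking[p] -= 1                   # consume input tokens
        for p in outputs:
            marking[p] += 1                   # produce output tokens
        return True
    return False

print(fire("process_observation"), marking)
print(fire("process_observation"))            # second firing is disabled
```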

  6. X-ray emission processes in stars and their immediate environment

    PubMed Central

    Testa, Paola

    2010-01-01

    A decade of X-ray stellar observations with Chandra and XMM-Newton has led to significant advances in our understanding of the physical processes at work in hot (magnetized) plasmas in stars and their immediate environment, providing new perspectives and challenges, and in turn the need for improved models. The wealth of high-quality stellar spectra has allowed us to investigate, in detail, the characteristics of the X-ray emission across the Hertzsprung-Russell (HR) diagram. Progress has been made in addressing issues ranging from classical stellar activity in stars with solar-like dynamos (such as flares, activity cycles, spatial and thermal structuring of the X-ray emitting plasma, and evolution of X-ray activity with age), to X-ray generating processes (e.g., accretion, jets, magnetically confined winds) that were poorly understood in the pre-Chandra/XMM-Newton era. I will discuss the progress made in the study of high energy stellar physics and its impact in a wider astrophysical context, focusing on the role of spectral diagnostics now accessible. PMID:20360562

  7. An extended car-following model considering the acceleration derivative in some typical traffic environments

    NASA Astrophysics Data System (ADS)

    Zhou, Tong; Chen, Dong; Liu, Weining

    2018-03-01

    Based on the full velocity difference and acceleration car-following model, an extended car-following model is proposed by considering the derivative of the vehicle’s acceleration. The stability condition is given by applying control theory. Considering some typical traffic environments, the results of theoretical analysis and numerical simulation show that the extended model reproduces the acceleration of a string of vehicles more realistically than previous models in the starting process, the stopping process, and sudden braking. Meanwhile, traffic jams occur more easily when the coefficient of the vehicle’s acceleration derivative increases, as shown by the space-time evolution. The results confirm that the vehicle’s acceleration derivative plays an important role in the traffic jamming transition and the evolution of traffic congestion.
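
    To see qualitatively how a finite acceleration derivative changes car-following dynamics, the sketch below simulates an optimal-velocity/full-velocity-difference model on a ring road in which the acceleration relaxes toward its target with a time lag (i.e., bounded jerk). The coefficients and the lag formulation are illustrative assumptions and do not reproduce the paper's exact model.

```python
# Ring-road car-following with a lagged (finite-jerk) acceleration response.
import numpy as np

def V(dx, v_max=2.0, xc=4.0):
    """Optimal-velocity function (a common tanh form)."""
    return v_max * (np.tanh(dx - xc) + np.tanh(xc)) / 2.0

N, L, dt, steps = 20, 100.0, 0.05, 4000
kappa, lam, tau = 0.85, 0.4, 0.5       # sensitivities and acceleration time lag
x = np.linspace(0.0, L, N, endpoint=False)
x[0] += 0.5                             # perturb one vehicle
v = np.full(N, V(L / N))                # start at the equilibrium speed
a = np.zeros(N)

for _ in range(steps):
    dx = (np.roll(x, -1) - x) % L       # headway to the vehicle ahead
    dv = np.roll(v, -1) - v             # relative speed
    a_target = kappa * (V(dx) - v) + lam * dv
    a += (a_target - a) * dt / tau      # finite jerk: acceleration lags target
    v = np.maximum(v + a * dt, 0.0)
    x = (x + v * dt) % L

print(f"speed spread after perturbation: {v.std():.4f}")
```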

  8. Creating an inclusive mall environment with the PRECEDE-PROCEED model: a living lab case study.

    PubMed

    Ahmed, Sara; Swaine, Bonnie; Milot, Marc; Gaudet, Caroline; Poldma, Tiiu; Bartlett, Gillian; Mazer, Barbara; Le Dorze, Guylaine; Barbic, Skye; Rodriguez, Ana Maria; Lefebvre, Hélène; Archambault, Philippe; Kairy, Dahlia; Fung, Joyce; Labbé, Delphine; Lamontagne, Anouk; Kehayia, Eva

    2017-10-01

    Although public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The Rehabilitation Living Lab in the Mall is a unique, interdisciplinary, and multi-sectoral research project with an aim to transform a shopping complex in Montreal, Canada, into an inclusive environment optimizing the participation and social inclusion of all people. The PRECEDE-PROCEED Model (PPM), a community-oriented and participatory planning model, was applied as a framework. The PPM comprises nine steps divided between planning, implementation, and evaluation. The PPM is well suited as a framework for the development of an inclusive mall. Its ecological approach considers the environment, as well as the social and individual factors relating to mall users' needs and expectations. Transforming a mall to be more inclusive is a complex process involving many stakeholders. The PPM allows the synthesis of several sources of information, as well as the identification and prioritization of key issues to address. The PPM also helps to frame and drive the implementation and evaluate the components of the project. This knowledge can help others interested in using the PPM to create similar enabling and inclusive environments world-wide. Implications for rehabilitation: While public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The PRECEDE-PROCEED Model (PPM) is well suited as a framework for the development, implementation, and evaluation of an inclusive mall. Environmental barriers can negatively impact the rehabilitation process by impeding the restoration and augmentation of function. Removing barriers to social participation and independent living by improving inclusivity in the mall and other environments positively impacts the lives of people with disabilities.

  9. Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed

    NASA Astrophysics Data System (ADS)

    Sethi, H. R.; Ralph, John E.

    1989-03-01

    The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally all aspects of the system and environment should be integrated into a complete end-to-end simulation but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make end-to-end simulation possible. However, the complexity and speed of such simulations create difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user-friendly interface and a very flexible, high power, high resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of data bases needs to be generated by the component models and these data bases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provides the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence will be discussed. Parallel processing will make real time end-to-end simulation possible and will greatly improve the

  10. Bayesian GGE biplot models applied to maize multi-environments trials.

    PubMed

    de Oliveira, L A; da Silva, C P; Nuvunga, J J; da Silva, A Q; Balestre, M

    2016-06-17

    The additive main effects and multiplicative interaction (AMMI) and the genotype main effects and genotype x environment interaction (GGE) models stand out among the linear-bilinear models used in genotype x environment interaction studies. Despite the advantages of their use to describe genotype x environment (AMMI) or genotype and genotype x environment (GGE) interactions, these methods have known limitations that are inherent to fixed effects models, including difficulty in treating variance heterogeneity and missing data. Traditional biplots include no measure of uncertainty regarding the principal components. The present study aimed to apply the Bayesian approach to GGE biplot models and assess the implications for selecting stable and adapted genotypes. Our results demonstrated that the Bayesian approach applied to GGE models with non-informative priors was consistent with the traditional GGE biplot analysis, although the credible region incorporated into the biplot enabled distinguishing, based on probability, the performance of genotypes and their relationships with the environments in the biplot. Those regions also enabled the identification of groups of genotypes and environments with similar effects in terms of adaptability and stability. The relative position of genotypes and environments in biplots is highly affected by the experimental accuracy. Thus, incorporation of uncertainty in biplots is a key tool for breeders to make decisions regarding stability selection and adaptability and the definition of mega-environments.
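
    For orientation, the classical (non-Bayesian) GGE biplot reduces to a singular value decomposition of the environment-centered two-way table; the Bayesian treatment in the paper places priors on these bilinear terms and summarizes posterior draws as credible regions. The sketch below shows only the classical point estimate, on made-up data.

        import numpy as np

        # Classical GGE biplot: environment-centered table decomposed by SVD.
        # The 8 x 5 yield table below is fabricated for illustration.
        rng = np.random.default_rng(1)
        Y = rng.normal(5.0, 1.0, size=(8, 5))         # genotypes x environments

        G = Y - Y.mean(axis=0)                        # remove environment main effects
        U, s, Vt = np.linalg.svd(G, full_matrices=False)

        f = 0.5                                       # symmetric singular-value split
        gen_scores = U[:, :2] * s[:2] ** f            # genotype biplot coordinates
        env_scores = Vt[:2, :].T * s[:2] ** (1 - f)   # environment biplot coordinates

        explained = (s[:2] ** 2).sum() / (s ** 2).sum()
        print(f"first two axes explain {explained:.1%} of G+GE variation")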

  11. Modeling hyporheic zone processes

    USGS Publications Warehouse

    Runkel, Robert L.; McKnight, Diane M.; Rajaram, Harihar

    2003-01-01

    Stream biogeochemistry is influenced by the physical and chemical processes that occur in the surrounding watershed. These processes include the mass loading of solutes from terrestrial and atmospheric sources, the physical transport of solutes within the watershed, and the transformation of solutes due to biogeochemical reactions. Research over the last two decades has identified the hyporheic zone as an important part of the stream system in which these processes occur. The hyporheic zone may be loosely defined as the porous areas of the stream bed and stream bank in which stream water mixes with shallow groundwater. Exchange of water and solutes between the stream proper and the hyporheic zone has many biogeochemical implications, due to differences in the chemical composition of surface and groundwater. For example, surface waters are typically oxidized environments with relatively high dissolved oxygen concentrations. In contrast, reducing conditions are often present in groundwater systems leading to low dissolved oxygen concentrations. Further, microbial oxidation of organic materials in groundwater leads to supersaturated concentrations of dissolved carbon dioxide relative to the atmosphere. Differences in surface and groundwater pH and temperature are also common. The hyporheic zone is therefore a mixing zone in which there are gradients in the concentrations of dissolved gasses, the concentrations of oxidized and reduced species, pH, and temperature. These gradients lead to biogeochemical reactions that ultimately affect stream water quality. Due to the complexity of these natural systems, modeling techniques are frequently employed to quantify process dynamics.
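
    One widely used quantitative approach in this context is the transient-storage model, which treats the hyporheic zone as a storage compartment exchanging solute with the stream at a first-order rate. The explicit time-stepping sketch below is a minimal illustration with assumed parameter values, not any particular published code.

        import numpy as np

        # One-dimensional stream with advection, dispersion, and first-order
        # exchange with a hyporheic storage zone; all parameters are assumed.
        nx, dx, dt = 200, 5.0, 1.0          # cells, cell size (m), time step (s)
        u, D = 0.3, 1.0                     # velocity (m/s), dispersion (m^2/s)
        alpha, A_ratio = 2e-4, 2.0          # exchange rate (1/s), A / A_storage

        C = np.zeros(nx)                    # stream concentration
        Cs = np.zeros(nx)                   # storage-zone concentration
        for step in range(3600):
            C[0] = 1.0 if step < 600 else 0.0                 # 10-min solute pulse
            adv = -u * np.gradient(C, dx)
            disp = D * np.gradient(np.gradient(C, dx), dx)
            C = C + dt * (adv + disp + alpha * (Cs - C))
            Cs = Cs + dt * alpha * A_ratio * (C - Cs)         # storage-zone response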

  12. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part I. Model validation

    USDA-ARS?s Scientific Manuscript database

    Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...

  13. Fermentation process tracking through enhanced spectral calibration modeling.

    PubMed

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will materialize in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that incorporates a wavelength selection procedure, spectral window selection (SWS), in which windows of wavelengths are automatically selected and subsequently used as the basis of the calibration model. However, due to the non-uniqueness of the windows selected when the algorithm is executed repeatedly, multiple models are constructed and these are then combined using stacking, thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
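
    A rough sketch of the general recipe follows; the window width, the number of retained windows, and the inverse-error stacking weights are all illustrative choices, and the paper's SWS algorithm and stacking scheme differ in their specifics.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 400))                # 60 spectra, 400 wavelengths
        y = X[:, 120:140].sum(axis=1) + rng.normal(scale=0.5, size=60)

        # Score contiguous wavelength windows by cross-validated PLS error.
        width, step = 40, 20
        windows = [(lo, lo + width) for lo in range(0, 400 - width + 1, step)]
        errors = [-cross_val_score(PLSRegression(n_components=3), X[:, lo:hi], y,
                                   cv=5, scoring="neg_mean_squared_error").mean()
                  for lo, hi in windows]

        # Keep the best windows, fit one PLS model per window, stack predictions.
        best = np.argsort(errors)[:3]
        models = [PLSRegression(n_components=3).fit(X[:, windows[i][0]:windows[i][1]], y)
                  for i in best]
        w = 1.0 / np.array([errors[i] for i in best])         # inverse-error weights
        w /= w.sum()
        y_hat = sum(wi * m.predict(X[:, windows[i][0]:windows[i][1]]).ravel()
                    for wi, m, i in zip(w, models, best))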

  14. Models of Solar Wind Structures and Their Interaction with the Earth's Space Environment

    NASA Astrophysics Data System (ADS)

    Watermann, J.; Wintoft, P.; Sanahuja, B.; Saiz, E.; Poedts, S.; Palmroth, M.; Milillo, A.; Metallinou, F.-A.; Jacobs, C.; Ganushkina, N. Y.; Daglis, I. A.; Cid, C.; Cerrato, Y.; Balasis, G.; Aylward, A. D.; Aran, A.

    2009-11-01

    The discipline of “Space Weather” is built on the scientific foundation of solar-terrestrial physics but with a strong orientation toward applied research. Models describing the solar-terrestrial environment are therefore at the heart of this discipline, for both physical understanding of the processes involved and establishing predictive capabilities of the consequences of these processes. Depending on the requirements, purely physical models, semi-empirical or empirical models are considered to be the most appropriate. This review focuses on the interaction of solar wind disturbances with geospace. We cover interplanetary space, the Earth’s magnetosphere (with the exception of radiation belt physics), the ionosphere (with the exception of radio science), the neutral atmosphere and the ground (via electromagnetic induction fields). Space weather relevant state-of-the-art physical and semi-empirical models of the various regions are reviewed. They include models for interplanetary space, its quiet state and the evolution of recurrent and transient solar perturbations (corotating interaction regions, coronal mass ejections, their interplanetary remnants, and solar energetic particle fluxes). Models of coupled large-scale solar wind-magnetosphere-ionosphere processes (global magnetohydrodynamic descriptions) and of inner magnetosphere processes (ring current dynamics) are discussed. Achievements in modeling the coupling between magnetospheric processes and the neutral and ionized upper and middle atmospheres are described. Finally we mention efforts to compile comprehensive and flexible models from selections of existing modules applicable to particular regions and conditions in interplanetary space and geospace.

  15. Process Model for Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Adams, Glynn

    1996-01-01

    forging effect of the shoulder. The energy balance at the boundary of the plastic region with the environment required that energy flow away from the boundary in both radial directions. One resolution to this problem may be to introduce a time dependency into the process model, allowing the energy flow to oscillate across this boundary. Finally, experimental measurements are needed to verify the concepts used here and to aid in improving the model.

  16. A Data Stream Model For Runoff Simulation In A Changing Environment

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Shao, J.; Zhang, H.; Wang, G.

    2017-12-01

    Runoff simulation is of great significance for water engineering design, water disaster control, and water resources planning and management in a catchment or region. A large number of methods, including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used worldwide during the past decades. Most existing models assume that the relationship between runoff and its impacting factors is stationary. However, in a changing environment (e.g., climate change, human disturbance), their relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expanding rules, and simulation. The first step is to initialize a rule set. When a new observation arrives, the model checks which rule covers it and then uses that rule for simulation. Meanwhile, the Page-Hinkley (PH) change detection test is used to monitor the online simulation error of each rule. If a change is detected, the corresponding rule is removed from the rule set. In the second step, each rule that covers more than a given number of instances is expanded. In the third step, a simulation model at each leaf node is learnt with a perceptron without an activation function and is updated as new observations arrive. Taking the Fuxi River catchment as a case study, we applied the model to simulate the monthly runoff in the catchment. Results show that an abrupt change is detected in 1997 by the Page-Hinkley change detection test, which is consistent with the historical record of flooding. In addition, the model achieves good simulation results with an RMSE of 13.326 and outperforms many established methods. The findings demonstrate that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
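
    The Page-Hinkley test at the core of the rule-retirement step admits a compact implementation. The sketch below uses the standard cumulative formulation; the delta and threshold values are illustrative, not those of the study.

        # Page-Hinkley test: flag a rule when its cumulative deviation from the
        # running mean error exceeds a threshold.
        class PageHinkley:
            def __init__(self, delta=0.005, threshold=5.0):
                self.delta, self.threshold = delta, threshold
                self.n, self.mean, self.cum, self.cum_min = 0, 0.0, 0.0, 0.0

            def update(self, error):
                # Feed one simulation error; return True when a change is detected.
                self.n += 1
                self.mean += (error - self.mean) / self.n
                self.cum += error - self.mean - self.delta
                self.cum_min = min(self.cum_min, self.cum)
                return self.cum - self.cum_min > self.threshold

        ph = PageHinkley()
        for e in [0.1] * 50 + [1.5] * 20:      # error level jumps, e.g. after a flood
            if ph.update(e):
                print("change detected: retire this rule")
                break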

  17. Family Environment and Cognitive Development: Twelve Analytic Models

    ERIC Educational Resources Information Center

    Walberg, Herbert J.; Marjoribanks, Kevin

    1976-01-01

    The review indicates that refined measures of the family environment and the use of complex statistical models increase the understanding of the relationships between socioeconomic status, sibling variables, family environment, and cognitive development. (RC)

  18. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.

  19. A validated agent-based model to study the spatial and temporal heterogeneities of malaria incidence in the rainforest environment.

    PubMed

    Pizzitutti, Francesco; Pan, William; Barbieri, Alisson; Miranda, J Jaime; Feingold, Beth; Guedes, Gilvan R; Alarcon-Valenzuela, Javiera; Mena, Carlos F

    2015-12-22

    The Amazon environment has been exposed in recent decades to radical changes that have been accompanied by a remarkable rise of both Plasmodium falciparum and Plasmodium vivax malaria. The malaria transmission process is highly influenced by factors such as spatial and temporal heterogeneities of the environment and individual-based characteristics of mosquito and human populations. All these determinant factors can be simulated effectively through agent-based models. This paper presents a validated agent-based model of local-scale malaria transmission. The model reproduces the environment of a typical riverine village in the northern Peruvian Amazon, where malaria transmission is highly seasonal and apparently associated with flooding of large areas caused by the neighbouring river. Agents representing humans, mosquitoes and the two species of Plasmodium (P. falciparum and P. vivax) are simulated in a spatially explicit representation of the environment around the village. The model environment includes climate, house positions, and elevation. A representation of changes in the extent of mosquito breeding areas caused by river flooding is also included in the simulation environment. A calibration process was carried out to reproduce the variations of the monthly malaria incidence over a period of 3 years. The calibrated model is also able to reproduce the spatial heterogeneities of local-scale malaria transmission. A "what if" eradication strategy scenario is proposed: if the mosquito breeding sites are eliminated through larval habitat management in a buffer area extending at least 200 m around the village, malaria transmission is eradicated from the village. The use of agent-based models can effectively reproduce the spatiotemporal variations of malaria transmission in a low-endemicity environment dominated by river flooding, as in the Amazon.
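
    As a flavor of the approach, the heavily simplified loop below couples a seasonal flooding signal to mosquito abundance and infection spread. Every rate, and the mean-field transmission shortcut, is invented for illustration and bears no relation to the calibrated model.

        import numpy as np

        rng = np.random.default_rng(3)
        n_people = 300
        infected = rng.random(n_people) < 0.05        # initial prevalence

        for month in range(36):
            flood = 0.5 + 0.5 * np.sin(2 * np.pi * month / 12)  # river flooding cycle
            n_mosq = int(2000 * flood)                # breeding area sets abundance
            bites = rng.integers(0, n_people, size=n_mosq)
            # mean-field shortcut: bite infectivity scales with current prevalence
            transmit = rng.random(n_mosq) < 0.3 * infected.mean()
            infected[np.unique(bites[transmit])] = True
            infected &= rng.random(n_people) > 0.25   # monthly recovery
            print(month, int(infected.sum()))         # seasonal incidence pattern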

  20. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and by indicating future directions for the KFP environment.

  1. Event-based hydrological modeling for detecting dominant hydrological process and suitable model strategy for semi-arid catchments

    NASA Astrophysics Data System (ADS)

    Huang, Pengnian; Li, Zhijia; Chen, Ji; Li, Qiaoling; Yao, Cheng

    2016-11-01

    Properly simulating the hydrological processes in semi-arid areas is still challenging. This study assesses the impact of different modeling strategies on simulating flood processes in semi-arid catchments. Four classic hydrological models, TOPMODEL, XINANJIANG (XAJ), SAC-SMA and TANK, were selected and applied to three semi-arid catchments in North China. Based on analysis and comparison of the simulation results of these classic models, four new flexible models were constructed and used to further investigate the suitability of various modeling strategies for semi-arid environments. Numerical experiments were also designed to examine the performances of the models. The results show that in semi-arid catchments a suitable model needs to include at least one nonlinear component to simulate the main process of surface runoff generation. If there are more than two nonlinear components in the hydrological model, they should be arranged in parallel, rather than in series. In addition, the results show that the parallel nonlinear components should be combined by multiplication rather than addition. Moreover, this study reveals that the key hydrological process over semi-arid catchments is infiltration-excess surface runoff, a nonlinear component.
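
    The structural conclusion, that parallel nonlinear components should be combined by multiplication rather than addition, can be illustrated schematically; the two component functions below are invented stand-ins for the components compared in the study.

        import numpy as np

        def infiltration_excess(p, f_cap=8.0, k=0.4):
            # Nonlinear infiltration-excess response to rainfall intensity p (mm/h).
            return 1.0 - np.exp(-k * np.maximum(p - f_cap, 0.0))

        def wetness_factor(s, s_max=100.0, b=2.0):
            # Nonlinear wetness factor from soil-moisture storage s (mm).
            return (np.clip(s, 0.0, s_max) / s_max) ** b

        p, s = 20.0, 40.0
        runoff_coeff = infiltration_excess(p) * wetness_factor(s)   # multiplicative,
        surface_runoff = runoff_coeff * p                           # not additive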

  2. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries to add functionality to the application and drivers to fire the application execution. For testing and verification, the environment of an application is abstracted and simplified using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry-points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
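
    Although JPF-Android targets Java bytecode, the record-then-replay idea is language-agnostic. The following Python analogue is purely illustrative: a decorator logs a method's return values at runtime, and a generated stub replays them during a later verification run instead of returning defaults.

        import functools

        LOG = []                              # runtime log of calls into the "library"

        def record(fn):
            # Instrumentation: log arguments and return values during a normal run.
            @functools.wraps(fn)
            def wrapper(*args):
                result = fn(*args)
                LOG.append({"method": fn.__name__, "args": args, "return": result})
                return result
            return wrapper

        @record
        def battery_level():                  # hypothetical library call
            return 87                         # value the real device would supply

        battery_level()                       # instrumented run collects real values

        def make_stub(method):
            # Generated stub: replay recorded return values instead of defaults.
            values = [e["return"] for e in LOG if e["method"] == method] or [None]
            it = iter(values)
            return lambda *a: next(it, values[-1])

        stub = make_stub("battery_level")
        assert stub() == 87                   # verification run sees a realistic value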

  3. Multi-model-based interactive authoring environment for creating shareable medical knowledge.

    PubMed

    Ali, Taqdir; Hussain, Maqbool; Ali Khan, Wajahat; Afzal, Muhammad; Hussain, Jamil; Ali, Rahman; Hassan, Waseem; Jamshed, Arif; Kang, Byeong Ho; Lee, Sungyoung

    2017-10-01

    criteria, which we assessed through the system- and user-centric evaluation processes. For system-centric evaluation, we compared the implementation of clinical information modelling system requirements in our proposed system and in existing systems. The results suggested that 82.05% of the requirements were fully supported, 7.69% were partially supported, and 10.25% were not supported by our system. In the existing systems, 35.89% of requirements were fully supported, 28.20% were partially supported, and 35.89% were not supported. For user-centric evaluation, the assessment criterion was 'ease of use'. Our proposed system showed 15 times better results with respect to MLM creation time than the existing systems. Moreover, on average, the participants made only one error in MLM creation using our proposed system, but 13 errors per MLM using the existing systems. We provide a user-friendly authoring environment for creation of shareable and interoperable knowledge for CDSS to overcome knowledge acquisition complexity. The authoring environment uses state-of-the-art decision support-related clinical standards with increased ease of use. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  5. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  6. A Big Data-driven Model for the Optimization of Healthcare Processes.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to outperform in this environment, healthcare organizations need to be agile enough to respond to these increasingly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be utilized to extract useful insights from mountains of raw data, and make them work in ways beyond the abilities of human brains, or IT systems from just a year ago. This paper presents a framework that helps healthcare professionals gain better insight within and across business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.

  7. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  8. Representing environment-induced helix-coil transitions in a coarse grained peptide model

    NASA Astrophysics Data System (ADS)

    Dalgicdir, Cahit; Globisch, Christoph; Sayar, Mehmet; Peter, Christine

    2016-10-01

    Coarse grained (CG) models are widely used in studying peptide self-assembly and nanostructure formation. One of the recurrent challenges in CG modeling is the problem of limited transferability, for example to different thermodynamic state points and system compositions. Understanding transferability is generally a prerequisite to knowing for which problems a model can be reliably used and predictive. For peptides, one crucial transferability question is whether a model reproduces the molecule's conformational response to a change in its molecular environment. This is of particular importance since CG peptide models often have to resort to auxiliary interactions that aid secondary structure formation. Such interactions take care of properties of the real system that are per se lost in the coarse graining process such as dihedral-angle correlations along the backbone or backbone hydrogen bonding. These auxiliary interactions may then easily overstabilize certain conformational propensities and therefore destroy the ability of the model to respond to stimuli and environment changes, i.e. they impede transferability. In the present paper we have investigated a short peptide with amphiphilic EALA repeats which undergoes conformational transitions between a disordered and a helical state upon a change in pH value or due to the presence of a soft apolar/polar interface. We designed a base CG peptide model that does not carry a specific (backbone) bias towards a secondary structure. This base model was combined with two typical approaches of ensuring secondary structure formation, namely a Cα-Cα-Cα-Cα pseudodihedral angle potential or a virtual site interaction that mimics hydrogen bonding. We have investigated the ability of the two resulting CG models to represent the environment-induced conformational changes in the helix-coil equilibrium of EALA. We show that with both approaches a CG peptide model can be obtained that is environment-transferable and that

  9. Meteoroid Environment Modeling: The Meteoroid Engineering Model and Shower Forecasting

    NASA Technical Reports Server (NTRS)

    Moorhead, Althea V.

    2017-01-01

    The meteoroid environment is often divided conceptually into meteor showers and the sporadic meteor background. It is commonly but incorrectly assumed that meteoroid impacts primarily occur during meteor showers; instead, the vast majority of hazardous meteoroids belong to the sporadic complex. Unlike meteor showers, which persist for a few hours to a few weeks, sporadic meteoroids impact the Earth's atmosphere and spacecraft throughout the year. The Meteoroid Environment Office (MEO) has produced two environment models to handle these cases: the Meteoroid Engineering Model (MEM) and an annual meteor shower forecast. The sporadic complex, despite its year-round activity, is not isotropic in its directionality. Instead, their apparent points of origin, or radiants, are organized into groups called "sources". The speed, directionality, and size distribution of these sporadic sources are modeled by the Meteoroid Engineering Model (MEM), which is currently in its second major release version (MEMR2) [Moorhead et al., 2015]. MEM provides the meteoroid flux relative to a user-provided spacecraft trajectory; it provides the total flux as well as the flux per angular bin, speed interval, and on specific surfaces (ram, wake, etc.). Because the sporadic complex dominates the meteoroid flux, MEM is the most appropriate model to use in spacecraft design. Although showers make up a small fraction of the meteoroid environment, they can produce significant short-term enhancements of the meteoroid flux. Thus, it can be valuable to consider showers when assessing risks associated with vehicle operations that are brief in duration. To assist with such assessments, the MEO issues an annual forecast that reports meteor shower fluxes as a function of time and compares showers with the time-averaged total meteoroid flux. This permits missions to do quick assessments of the increase in risk posed by meteor showers. Section II describes MEM in more detail and describes our current efforts

  10. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Y.S. Wu

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  11. A poisson process model for hip fracture risk.

    PubMed

    Schechner, Zvi; Luo, Gangming; Kaufman, Jonathan J; Siffert, Robert S

    2010-08-01

    The primary method for assessing fracture risk in osteoporosis relies on measurement of bone mass. Estimation of fracture risk is most often evaluated using logistic or proportional hazards models. Notwithstanding the success of these models, there is still much uncertainty as to who will or will not suffer a fracture. This has led to a search for other components besides mass that affect bone strength. The purpose of this paper is to introduce a new mechanistic stochastic model that characterizes the risk of hip fracture in an individual. A Poisson process is used to model the occurrence of falls, which are assumed to occur at a rate, lambda. The load induced by a fall is assumed to be a random variable that has a Weibull probability distribution. The combination of falls together with loads leads to a compound Poisson process. By retaining only those occurrences of the compound Poisson process that result in a hip fracture, a thinned Poisson process is defined that itself is a Poisson process. The fall rate is modeled as an affine function of age, and hip strength is modeled as a power law function of bone mineral density (BMD). The risk of hip fracture can then be computed as a function of age and BMD. By extending the analysis to a Bayesian framework, the conditional densities of BMD given a prior fracture and no prior fracture can be computed and shown to be consistent with clinical observations. In addition, the conditional probabilities of fracture given a prior fracture and no prior fracture can also be computed, and also demonstrate results similar to clinical data. The model elucidates the fact that the hip fracture process is inherently random and improvements in hip strength estimation over and above that provided by BMD operate in a highly "noisy" environment and may therefore have little ability to impact clinical practice.
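
    The mechanistic chain described above (Poisson falls, Weibull loads, thinning by a strength threshold) is straightforward to simulate. All numerical constants below, the affine fall-rate, the Weibull scale, and the power-law strength coefficients, are illustrative placeholders rather than the paper's fitted values.

        import numpy as np

        rng = np.random.default_rng(42)

        def fracture_prob(age, bmd, years=1.0, n_sim=20_000):
            lam = 0.5 + 0.05 * max(age - 65, 0)            # affine fall rate (falls/yr)
            strength = 8000.0 * bmd ** 1.7                 # power-law hip strength (N)
            falls = rng.poisson(lam * years, size=n_sim)   # Poisson number of falls
            fractured = np.zeros(n_sim, dtype=bool)
            for i, k in enumerate(falls):
                if k:                                      # Weibull load per fall
                    loads = 3000.0 * rng.weibull(2.0, size=k)
                    fractured[i] = loads.max() > strength  # keep only fracturing falls
            return fractured.mean()                        # thinned-process estimate

        print(fracture_prob(age=80, bmd=0.7))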

  12. Modeling Small-Scale Nearshore Processes

    NASA Astrophysics Data System (ADS)

    Slinn, D.; Holland, T.; Puleo, J.; Puleo, J.; Hanes, D.

    2001-12-01

    In recent years advances in high performance computing have made it possible to gain new qualitative and quantitative insights into the behavior and effects of coastal processes using high-resolution physical-mathematical models. The Coastal Dynamics program at the U.S. Office of Naval Research under the guidance of Dr. Thomas Kinder has encouraged collaboration between modelers, theoreticians, and field and laboratory experimentalists and supported innovative modeling efforts to examine a wide range of nearshore processes. An area of emphasis has been small-scale, time-dependent, turbulent flows, such as the wave bottom boundary layer, breaking surface waves, and the swash zone and their effects on shoaling waves, mean currents, and sediment transport that integrate to impact the long-term and large-scale response of the beach system to changing environmental conditions. Examples of small-scale modeling studies supported by CD-321 related to our work include simulation of the wave bottom boundary layer. Under mild wave field conditions the seabed forms sand ripples and simulations demonstrate that the ripples cause increases in the bed friction, the kinetic energy dissipation rates, the boundary layer thickness, and turbulence in the water column. Under energetic wave field conditions the ripples are sheared smooth and sheet flow conditions can predominate, causing the top few layers of sand grains to move as a fluidized bed, making large aggregate contributions to sediment transport. Complementary models of aspects of these processes have been developed simultaneously in various directions (e.g., Jenkins and Hanes, JFM 1998; Drake and Calantoni, 2001; Trowbridge and Madsen, JGR, 1984). Insight into near-bed fluid-sediment interactions has also been advanced using Navier-Stokes based models of swash events. Our recent laboratory experiments at the Waterways Experiment Station demonstrate that volume-of-fluid models can predict salient features of swash uprush

  13. Kinetic Modeling of the Lunar Dust-Plasma Environment

    NASA Astrophysics Data System (ADS)

    Kallio, Esa; Alho, Markku; Alvarez, Francisco; Barabash, Stas; Dyadechkin, Sergey; Fernandes, Vera; Futaana, Yoshifumi; Harri, Ari-Matti; Haunia, Touko; Heilimo, Jyri; Holmström, Mats; Jarvinen, Riku; Lue, Charles; Makela, Jakke; Porjo, Niko; Schmidt, Walter; Shahab, Fatemi; Siili, Tero; Wurz, Peter

    2014-05-01

    Modeling of the lunar dust and plasma environment is a challenging task because a self-consistent model should include ions, electrons and dust particles and numerous other factors. However, most of the parameters are not well established or constrained by measurements in the lunar environment. More precisely, a comprehensive model should contain electrons originating from 1) the solar wind, 2) the lunar material (photoelectrons, secondary electrons) and 3) the lunar dust. Ions originate from the solar wind, the lunar material, the lunar exosphere and the dust. To model the role of the dust in the lunar plasma environment is a highly complex task since the properties of the dust particles in the exosphere are poorly known (e.g. mass, size, shape, conductivity) or not known (e.g. charge and photoelectron emission) and probably are time dependent. Models should also include the effects of interactions between the surface and solar wind and energetic particles, and micrometeorites. Largely different temporal and spatial scales are also a challenge for the numerical models. In addition, the modeling of a region on the Moon - for example on the South Pole - at a given time requires also knowledge of the solar illumination conditions at that time, mineralogical and electric properties of the local lunar surface, lunar magnetic anomalies, solar UV flux and the properties of the solar wind. Harmful effects of lunar dust to technical devices and to human health as well as modeling of the properties of the lunar plasma and dust environment have been topics of two ESA funded projects L-DEPP and DPEM. In the presentation we will summarize some basic results and characteristics of plasma and fields near and around the Moon as studied and discovered in these projects. Especially, we analyse three different space and time scales by kinetic models: [1] the "microscale" region near surface with an electrostatic PIC (ions and electrons are particles) model, [2] the "mesoscale

  14. Parallel processing optimization strategy based on MapReduce model in cloud storage environment

    NASA Astrophysics Data System (ADS)

    Cui, Jianming; Liu, Jiayi; Li, Qiuyan

    2017-05-01

    Currently, many cloud storage systems package a large number of documents only after all packets have been received. In this stored procedure, from the local transmitter to the server, packing and unpacking consume a lot of time, and transmission efficiency is low as well. A new parallel processing algorithm is proposed to optimize the transmission mode. Following the MapReduce model, MPI technology is used to execute the Mapper and Reducer mechanisms in parallel. Simulation experiments on the Hadoop cloud computing platform show that this algorithm can not only accelerate the file transfer rate, but also shorten the waiting time of the Reducer mechanism. It breaks through the traditional sequential transmission constraint and reduces storage coupling to improve transmission efficiency.
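
    A minimal scatter/gather skeleton with mpi4py conveys the Mapper/Reducer arrangement; the sum-of-squares workload is illustrative and unrelated to the paper's file-transfer experiments. Run with, e.g., mpiexec -n 4 python script.py.

        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        if rank == 0:
            data = list(range(100))                       # work produced by rank 0
            chunks = [data[i::size] for i in range(size)] # split across ranks
        else:
            chunks = None

        chunk = comm.scatter(chunks, root=0)              # Mapper input per rank
        mapped = sum(x * x for x in chunk)                # each rank maps its chunk

        total = comm.reduce(mapped, op=MPI.SUM, root=0)   # Reducer combines results
        if rank == 0:
            print("sum of squares:", total)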

  15. Ant-mediated ecosystem processes are driven by trophic community structure but mainly by the environment.

    PubMed

    Salas-Lopez, Alex; Mickal, Houadria; Menzel, Florian; Orivel, Jérôme

    2017-01-01

    The diversity and functional identity of organisms are known to be relevant to the maintenance of ecosystem processes but can be variable in different environments. Particularly, it is uncertain whether ecosystem processes are driven by complementary effects or by dominant groups of species. We investigated how community structure (i.e., the diversity and relative abundance of biological entities) explains the community-level contribution of Neotropical ant communities to different ecosystem processes in different environments. Ants were attracted with food resources representing six ant-mediated ecosystem processes in four environments: ground and vegetation strata in cropland and forest habitats. The exploitation frequencies of the baits were used to calculate the taxonomic and trophic structures of ant communities and their contribution to ecosystem processes considered individually or in combination (i.e., multifunctionality). We then investigated whether community structure variables could predict ecosystem processes and whether such relationships were affected by the environment. We found that forests presented a greater biodiversity and trophic complementarity and lower dominance than croplands, but this did not affect ecosystem processes. In contrast, trophic complementarity was greater on the ground than on vegetation and was followed by greater resource exploitation levels. Although ant participation in ecosystem processes can be predicted by means of trophic-based indices, we found that variations in community structure and performance in ecosystem processes were best explained by environment. We conclude that determining the extent to which the dominance and complementarity of communities affect ecosystem processes in different environments requires a better understanding of resource availability to different species.

  16. Sanitizing in Dry-Processing Environments Using Isopropyl Alcohol Quaternary Ammonium Formula.

    PubMed

    Kane, Deborah M; Getty, Kelly J K; Mayer, Brian; Mazzotta, Alejandro

    2016-01-01

    Dry-processing environments are particularly challenging to clean and sanitize because introduced water can favor growth and establishment of pathogenic microorganisms such as Salmonella. Our objective was to determine the efficacy of an isopropyl alcohol quaternary ammonium (IPAQuat) formula for eliminating potential Salmonella contamination on food contact surfaces. Clean stainless steel coupons and conveyor belt materials used in dry-processing environments were spot inoculated in the center of coupons (5 by 5 cm) with a six-serotype composite of Salmonella (approximately 10 log CFU/ml), subjected to IPAQuat sanitizer treatments with exposure times of 30 s, 1 min, or 5 min, and then swabbed for enumeration of posttreatment survivors. A subset of inoculated surfaces was soiled with a breadcrumb-flour blend and allowed to sit on the laboratory bench for a minimum of 16 h before sanitation. Pretreatment Salmonella populations (inoculated controls, 0 s treatment) were approximately 7.0 log CFU/25 cm(2), and posttreatment survivors were 1.31, 0.72, and < 0.7 (detection limit) log CFU/25 cm(2) after sanitizer exposure for 30 s, 1 min, or 5 min, respectively, for both clean (no added soil) and soiled surfaces. Treatment with the IPAQuat formula using 30-s sanitizer exposures resulted in 5.68-log reductions, whereas >6.0-log reductions were observed for sanitizer exposures of 1 and 5 min. Because water is not introduced into the processing environment with this approach, the IPAQuat formula could have sanitation applications in dry-processing environments to eliminate potential contamination from Salmonella on food contact surfaces.

  17. A Neural Network Model to Learn Multiple Tasks under Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Tsumori, Kenji; Ozawa, Seiichi

    When environments are dynamically changed for agents, the knowledge acquired in an environment might become useless in the future. In such dynamic environments, agents should be able to not only acquire new knowledge but also modify old knowledge in learning. However, modifying all knowledge acquired before is not efficient, because knowledge once acquired may be useful again when a similar environment reappears, and some knowledge can be shared among different environments. To learn efficiently in such environments, we propose a neural network model that consists of the following modules: a resource allocating network, long-term & short-term memory, and an environment change detector. We evaluate the model under a class of dynamic environments where multiple function approximation tasks are sequentially given. The experimental results demonstrate that the proposed model possesses stable incremental learning, accurate environmental change detection, proper association and recall of old knowledge, and efficient knowledge transfer.

  18. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

    Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a

  19. Modeling mechanical cardiopulmonary interactions for virtual environments.

    PubMed

    Kaye, J M

    1997-01-01

    We have developed a computer system for modeling mechanical cardiopulmonary behavior in an interactive, 3D virtual environment. The system consists of a compact, scalar description of cardiopulmonary mechanics, with an emphasis on respiratory mechanics, that drives deformable 3D anatomy to simulate mechanical behaviors of and interactions between physiological systems. Such an environment can be used to facilitate exploration of cardiopulmonary physiology, particularly in situations that are difficult to reproduce clinically. We integrate 3D deformable body dynamics with new, formal models of (scalar) cardiorespiratory physiology, associating the scalar physiological variables and parameters with corresponding 3D anatomy. Our approach is amenable to modeling patient-specific circumstances in two ways. First, using CT scan data, we apply semi-automatic methods for extracting and reconstructing the anatomy to use in our simulations. Second, our scalar models are defined in terms of clinically-measurable, patient-specific parameters. This paper describes our approach and presents a sample of results showing normal breathing and acute effects of pneumothoraces.

  20. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological-processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative works, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing to i) couple models developed de novo or from existing source code, and which are dynamically plugged to the platform, ii) represent landscapes as hierarchical graphs, taking into account multi-scale, spatial heterogeneities and landscape objects connectivity, iii) run and explore simulations in many ways : using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing to plug existing models licensed under any license. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network

  1. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  2. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
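
    The core computation, estimating correlations among process elements from assessment results, can be sketched as follows; the ratings matrix and the CMMI process-area labels are fabricated for illustration.

        import numpy as np

        # Correlate process elements across assessed projects so an improvement
        # plan can group elements that tend to move together.
        rng = np.random.default_rng(7)
        ratings = rng.integers(1, 6, size=(12, 6)).astype(float)  # projects x elements
        elements = ["REQM", "PP", "PMC", "CM", "MA", "PPQA"]

        R = np.corrcoef(ratings, rowvar=False)        # element-by-element correlations
        i, j = np.triu_indices_from(R, k=1)
        for a, b, r in sorted(zip(i, j, R[i, j]), key=lambda t: -abs(t[2]))[:3]:
            print(f"{elements[a]} ~ {elements[b]}: r = {r:+.2f}")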

  3. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.

  4. The Methodology of Interactive Parametric Modelling of Construction Site Facilities in BIM Environment

    NASA Astrophysics Data System (ADS)

    Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana

    2014-11-01

    Information technology is becoming a strong tool in different industries, including construction. The recent trend in building design is leading toward the creation of the most comprehensive virtual building model (Building Information Model) in order to solve all the problems relating to the project as early as in the designing phase. Building information modelling is a new way of approaching the design of building project documentation. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recently, research on designing the construction process conditions has centred on improvement of general practice in planning and on new approaches to construction site layout planning. The state of the art in the field of designing the construction process conditions indicated an unexplored problem related to connecting a knowledge system with the construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present a methodology for the execution of a 3D construction site facility allocation model (3D CSF-IAM), based on principles of parametric and interactive modelling.

  5. Programmatic access to logical models in the Cell Collective modeling environment via a REST API.

    PubMed

    Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš

    2016-01-01

    Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through a Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data from almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
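
    Because the API is plain REST returning JSON, it can be exercised from any HTTP client. A minimal Python sketch follows; the base URL is the one given above, but the "/model" resource path and the "name" field are illustrative guesses, not documented routes.

      import requests

      API_ROOT = "http://thecellcollective.org/tccapi"  # base endpoint from the abstract

      # Request a list of public models (the exact route is an assumption).
      resp = requests.get(f"{API_ROOT}/model", timeout=30)
      resp.raise_for_status()

      models = resp.json()  # output is standard JSON, per the abstract
      for model in models[:5]:
          print(model.get("name", model))  # the "name" field is a guess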

  6. NG6: Integrated next generation sequencing storage and processing environment.

    PubMed

    Mariette, Jérôme; Escudié, Frédéric; Allias, Nicolas; Salin, Gérald; Noirot, Céline; Thomas, Sylvain; Klopp, Christophe

    2012-09-09

    Next generation sequencing platforms are now well established in sequencing centres and some laboratories. Upcoming smaller-scale machines such as the 454 Junior from Roche or the MiSeq from Illumina will increase the number of laboratories hosting a sequencer. In such a context, it is important to provide these teams with an easily manageable environment to store and process the produced reads. We describe a user-friendly information system able to manage large sets of sequencing data. It includes, on the one hand, a workflow environment already containing pipelines adapted to different input formats (sff, fasta, fastq and qseq), different sequencers (Roche 454, Illumina HiSeq) and various analyses (quality control, assembly, alignment, diversity studies, …) and, on the other hand, a secured web site giving access to the results. The connected user is able to download raw and processed data and browse through the analysis result statistics. The provided workflows can easily be modified or extended and new ones can be added. Ergatis is used as the workflow building, running and monitoring system. The analyses can be run locally or in a cluster environment using Sun Grid Engine. NG6 is a complete information system designed to answer the needs of a sequencing platform. It provides a user-friendly interface to process, store and download high-throughput sequencing data.

  7. Adaptive User Model for Web-Based Learning Environment.

    ERIC Educational Resources Information Center

    Garofalakis, John; Sirmakessis, Spiros; Sakkopoulos, Evangelos; Tsakalidis, Athanasios

    This paper describes the design of an adaptive user model and its implementation in an advanced Web-based Virtual University environment that encompasses combined and synchronized adaptation between educational material and well-known communication facilities. The Virtual University environment has been implemented to support a postgraduate…

  8. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful when it can be generated automatically out of Business Process Management (BPM) environments from the existing business process models and from performance parameters monitored from the executed business process instances. Currently, some of the available BPM environments offer basic performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into already existing BPM environments. The approach abstracts from specific process modelling techniques, which enables automatic decision support spanning processes across numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
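
    To give a flavor of the kind of what-if simulation meant here, the sketch below Monte-Carlo-simulates a toy three-step process and compares baseline cycle time against a scenario in which the approval step is 20% faster. All durations and distributions are invented for illustration; a real integration would derive them from monitored process instances.

      import random

      def cycle_time(approval_scale=1.0):
          """End-to-end duration (hours) of a toy three-step process."""
          receive = random.triangular(1, 5, 2)
          approve = random.triangular(2, 12, 4) * approval_scale
          fulfill = random.triangular(3, 10, 6)
          return receive + approve + fulfill

      random.seed(0)
      baseline = [cycle_time() for _ in range(10_000)]
      whatif = [cycle_time(approval_scale=0.8) for _ in range(10_000)]  # 20% faster approval
      mean = lambda xs: sum(xs) / len(xs)
      print(f"baseline: {mean(baseline):.1f} h, what-if: {mean(whatif):.1f} h")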

  9. Modelling the cohesive sediment transport in the marine environment: the case of Thermaikos Gulf

    NASA Astrophysics Data System (ADS)

    Krestenitis, Y. N.; Kombiadou, K. D.; Savvidis, Y. G.

    2007-02-01

    The transport of fine-grained sediments in the marine environment entails risks of pollutant intrusions from substances adsorbed onto the cohesive flocs' surfaces, gradually released to the aquatic field. These substances include nutrients such as nitrate, phosphate and silicate compounds from drainage from fertilization of adjacent cultivated areas that enter the coastal areas through rivers and streams, or trace metals as remainders from urban and industrial activities. As a consequence, knowledge of the motion and distribution of sediment particles coming from a given pollutant source is expected to provide the 'bulk' information on pollutant distribution, necessary for determining the region of influence of the source and for estimating probable trophic levels of the seawater and potential environmental risks. To that aim, a numerical model has been developed to predict the fate of sediments introduced into the marine environment from different pollution sources, such as river outflows, erosion of the seabed, aeolian transported material and drainage systems. The proposed three-dimensional mathematical model is based on the particle tracking method, according to which matter concentration is expressed by particles, each representing a particular amount of sedimentary mass, passively advected and dispersed by the currents. The processes affecting the characteristics and propagation of sedimentary material in the marine environment, incorporated in the parameterization, include, apart from advection and dispersion, cohesive sediment and near-bed processes. The movement of the particles, along with variations in sedimentary characteristics and state, carried by each particle as personal information, is traced with time. Specifically, concerning transport processes, the local seawater velocity and the particle's settling control advection, whereas random Brownian motion due to turbulence simulates turbulent diffusion. The vertical stratification of the water-column is
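
    The advection-plus-random-walk scheme described above translates almost directly into code. Below is a minimal sketch of a single particle's update, with a uniform current and illustrative settling velocity and diffusivity values; the paper's actual parameterization and flow fields are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(42)

      def step(pos, current, w_settling, K_h, K_v, dt):
          """Advance one particle: advection by the local current plus settling,
          and a random-walk increment representing turbulent diffusion."""
          drift = current * dt + np.array([0.0, 0.0, -w_settling * dt])
          # Brownian increment scaled by sqrt(2 * K * dt) in each direction.
          K = np.array([K_h, K_h, K_v])
          return pos + drift + rng.normal(0.0, 1.0, 3) * np.sqrt(2.0 * K * dt)

      pos = np.zeros(3)                        # particle released at the source
      current = np.array([0.10, 0.05, 0.0])    # local seawater velocity (m/s)
      for _ in range(1000):                    # 1000 steps of 60 s each
          pos = step(pos, current, w_settling=1e-4, K_h=1.0, K_v=1e-3, dt=60.0)
      print("final position (m):", pos)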

  10. Modelling the cohesive sediment transport in the marine environment: the case of Thermaikos Gulf

    NASA Astrophysics Data System (ADS)

    Krestenitis, Y. N.; Kombiadou, K. D.; Savvidis, Y. G.

    2006-07-01

    The transport of fine-grained sediments in the marine environment entails risks of pollutant intrusions from substances adsorbed onto the cohesive flocs' surfaces, gradually released to the aquatic field. These substances include nutrients such as nitrate, phosphate and silicate compounds from drainage from fertilization of adjacent cultivated areas that enter the coastal areas through rivers and streams, or trace metals as remainders from urban and industrial activities. As a consequence, knowledge of the motion and distribution of sediment particles coming from a given pollutant source is expected to provide the 'bulk' information on pollutant distribution, necessary for determining the region of influence of the source and for estimating probable trophic levels of the seawater and potential environmental risks. To that aim, a numerical model has been developed to predict the fate of sediments introduced into the marine environment from different pollution sources, such as river outflows, erosion of the seabed, aeolian transported material and drainage systems. The proposed three-dimensional mathematical model is based on the particle tracking method, according to which matter concentration is expressed by particles, each representing a particular amount of sedimentary mass, passively advected and dispersed by the currents. The processes affecting the characteristics and propagation of sedimentary material in the marine environment, incorporated in the parameterization, include, apart from advection and dispersion, cohesive sediment and near-bed processes. The movement of the particles, along with variations in sedimentary characteristics and state, carried by each particle as personal information, is traced with time. Specifically, concerning transport processes, the local seawater velocity and the particle's settling control advection, whereas random Brownian motion due to turbulence simulates turbulent diffusion. The vertical stratification of the water

  11. Securing Provenance of Distributed Processes in an Untrusted Environment

    NASA Astrophysics Data System (ADS)

    Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi

    Recently, there has been much concern about the provenance of distributed processes, that is, about the documentation of the origin and the processes that produce an object in a distributed system. Provenance has many applications in the forms of medical records, documentation of processes in computer systems, recording the origin of data in the cloud, and documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of correct nodes, additions of fake nodes and edges, and unauthorized access to sensitive nodes and edges. In this paper, we propose an integrity mechanism for the provenance graph using digital signatures involving three parties: the process executors who are responsible for creating the nodes, a provenance owner that records the nodes to the provenance store, and a trusted party that we call the Trusted Counter Server (TCS), which records the number of nodes stored by the provenance owner. We show that the mechanism can detect integrity problems in the provenance graph, namely unauthorized and malicious “authorized” updates, even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS needs only minimal storage (linear in the number of provenance owners). To protect confidentiality and for efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure
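
    The counting idea is simple enough to sketch. In the toy below, an HMAC stands in for the executors' digital signatures, and the TCS is reduced to a single integer per owner; this is an illustrative simplification, not the paper's protocol.

      import hashlib, hmac, json

      EXECUTOR_KEY = b"executor-secret"  # placeholder for a real signing key

      def make_node(node_id, data, parents):
          """Create a provenance node signed by the process executor."""
          node = {"id": node_id, "data": data, "parents": parents}
          payload = json.dumps(node, sort_keys=True).encode()
          node["sig"] = hmac.new(EXECUTOR_KEY, payload, hashlib.sha256).hexdigest()
          return node

      store = []       # provenance store kept by the (untrusted) owner
      tcs_count = 0    # trusted counter: one integer per provenance owner

      for i in range(3):
          store.append(make_node(i, f"step-{i}", parents=[i - 1] if i else []))
          tcs_count += 1   # the TCS records every node the owner stores

      del store[1]         # a colluding owner silently deletes a node
      print("TCS detects deletion:", len(store) != tcs_count)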

  12. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    PubMed

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  13. ISLE (Image and Signal Processing LISP Environment) reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sherwood, R.J.; Searfus, R.M.

    1990-01-01

    ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person developing image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop an algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.

  14. A multiarchitecture parallel-processing development environment

    NASA Technical Reports Server (NTRS)

    Townsend, Scott; Blech, Richard; Cole, Gary

    1993-01-01

    A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.

  15. A collaborative molecular modeling environment using a virtual tunneling service.

    PubMed

    Lee, Jun; Kim, Jee-In; Kang, Lin-Woo

    2012-01-01

    Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problems caused by these temporal and spatial differences. However, traditional approaches did not sufficiently consider the integration of different computing environments, which are characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment that integrates different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, a collaborative crystallographic object-oriented toolkit, with VRMMS, a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results in pilot experiments.

  16. A Collaborative Molecular Modeling Environment Using a Virtual Tunneling Service

    PubMed Central

    Lee, Jun; Kim, Jee-In; Kang, Lin-Woo

    2012-01-01

    Collaborative research on three-dimensional molecular modeling can be limited by different time zones and locations. A networked virtual environment can be utilized to overcome the problems caused by these temporal and spatial differences. However, traditional approaches did not sufficiently consider the integration of different computing environments, which are characterized by types of applications, roles of users, and so on. We propose a collaborative molecular modeling environment that integrates different molecular modeling systems using a virtual tunneling service. We integrated Co-Coot, a collaborative crystallographic object-oriented toolkit, with VRMMS, a virtual reality molecular modeling system, through a collaborative tunneling system. The proposed system showed reliable quantitative and qualitative results in pilot experiments. PMID:22927721

  17. PROCESS DESIGN FOR ENVIRONMENT: A MULTI-OBJECTIVE FRAMEWORK UNDER UNCERTAINTY

    EPA Science Inventory

    Designing chemical processes for the environment requires consideration of several indexes of environmental impact, including ozone depletion and global warming potentials, human and aquatic toxicity, photochemical oxidation, and acid rain potentials. Current methodologies like t...

  18. Analytical Model For Fluid Dynamics In A Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Naumann, Robert J.

    1995-01-01

    Report presents an analytical approximation methodology for coupled fluid-flow, heat, and mass-transfer equations in a microgravity environment. Engineering estimates accurate to within a factor of 2 can be made quickly and easily, eliminating the need for time-consuming and costly numerical modeling. Any proposed experiment can be reviewed to see how it would perform in a microgravity environment. The model has been applied in a commercial setting for the preliminary design of low-Grashof/Rayleigh-number experiments.
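
    The leading-order reasoning behind such estimates can be illustrated with the Grashof number, Gr = g·β·ΔT·L³/ν², which scales linearly with gravity: reducing g by six orders of magnitude reduces Gr by the same factor. The property values below are typical of water and are illustrative only.

      # Gr = g * beta * dT * L**3 / nu**2, linear in g.
      g_earth = 9.81        # m/s^2
      g_micro = 9.81e-6     # assumed quasi-steady microgravity level
      beta = 2.1e-4         # thermal expansion coefficient of water, 1/K
      dT = 10.0             # imposed temperature difference, K
      L = 0.01              # characteristic length, m
      nu = 1.0e-6           # kinematic viscosity of water, m^2/s

      def grashof(g):
          return g * beta * dT * L**3 / nu**2

      print(f"Gr at 1 g:          {grashof(g_earth):.3g}")
      print(f"Gr in microgravity: {grashof(g_micro):.3g}")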

  19. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  20. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing images of neural cells to analyze their growth process in a culture environment. We have applied several image processing techniques for: 1) environmental noise reduction, 2) neural cell segmentation, 3) neural cell classification based on the growth conditions of their dendrites, and 4) extraction and measurement of neuron features (e.g., cell body area, number of dendrites, axon length). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
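
    A toy version of the noise-reduction, segmentation, and measurement steps (the classification step and the ANN edge detector are beyond this sketch) can be written with standard scientific-Python tools; the synthetic image and the 0.5 threshold are illustrative assumptions.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)
      image = rng.normal(0.1, 0.05, (128, 128))   # noisy background
      image[40:60, 40:60] += 1.0                  # synthetic cell body
      image[80:95, 90:110] += 1.0                 # another cell

      smoothed = ndimage.gaussian_filter(image, sigma=2)   # noise reduction
      mask = smoothed > 0.5                                # crude segmentation
      labels, n_cells = ndimage.label(mask)                # connected components
      areas = ndimage.sum(mask, labels, index=range(1, n_cells + 1))

      print(f"{n_cells} cells detected, areas (pixels): {areas}")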

  1. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final two years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  2. Entity Modeling and Immersive Decision Environments

    DTIC Science & Technology

    2011-09-01

    Simulation Technologies (REST). Lerman, D. J. (2010). Correct Weather Modeling of non-Standard Days (10F-SIW-004). In Proceedings of the 2010 Fall Simulation Interoperability Workshop (Fall SIW). Orlando, FL: SISO. Most flight simulators compute and fly in a weather environment that matches a

  3. Examining Student Research Choices and Processes in a Disintermediated Searching Environment

    ERIC Educational Resources Information Center

    Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie

    2013-01-01

    Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…

  4. A Sandbox Environment for the Community Sensor Model Standard

    NASA Astrophysics Data System (ADS)

    Hare, T. M.; Laura, J. R.; Humpreys, I. R.; Wilson, T. J.; Hahn, M. A.; Shepherd, M. R.; Sides, S. C.

    2017-06-01

    Here we present ongoing work that Astrogeology is undertaking to provide a programming sandbox environment for the Community Sensor Model standard. We define a sandbox as a testing environment that allows programmers to experiment.

  5. Constitutive and damage material modeling in a high pressure hydrogen environment

    NASA Technical Reports Server (NTRS)

    Russell, D. A.; Fritzemeier, L. G.

    1991-01-01

    Numerous components in reusable space propulsion systems such as the SSME are exposed to high pressure gaseous hydrogen environments. Flow areas and passages in the fuel turbopump, fuel and oxidizer preburners, main combustion chamber, and injector assembly contain high pressure hydrogen either high in purity or as hydrogen rich steam. Accurate constitutive and damage material models applicable to high pressure hydrogen environments are therefore needed for engine design and analysis. Existing constitutive and cyclic crack initiation models were evaluated only for conditions of oxidizing environments. The main objective is to evaluate these models for applicability to high pressure hydrogen environments.

  6. A UML approach to process modelling of clinical practice guidelines for enactment.

    PubMed

    Knape, T; Hederman, L; Wade, V P; Gargan, M; Harris, C; Rahman, Y

    2003-01-01

    Although clinical practice guidelines (CPGs) have been suggested as a means of encapsulating best practice in evidence-based medical treatment, their usage in clinical environments has been disappointing. Criticisms of guideline representations have been that they are predominantly narrative and are difficult to incorporate into clinical information systems. This paper analyses the use of UML process modelling techniques for guideline representation and proposes the automated generation of executable guidelines using XMI. This hybrid UML-XMI approach provides flexible authoring of guideline decision and control structures whilst integrating appropriate data flow. It also uses an open XMI standard interface to allow the use of authoring tools and process control systems from multiple vendors. The paper first surveys CPG modelling formalisms, followed by a brief introduction to process modelling in UML. Furthermore, the modelling of CPGs in UML is presented, leading to a case study of encoding a diabetes mellitus CPG using UML.

  7. Challenging the Expanding Environment Model of Teaching Elementary Social Studies.

    ERIC Educational Resources Information Center

    Palmer, Jesse

    1989-01-01

    Looks at criticism of the Expanding Environments Model in the elementary school social studies curriculum. Cites recent reports that recommend a history-centered elementary curriculum. States that teaching methods may be the cause of historical, civic, and geographic illiteracy rather than the Expanding Environments Model. (LS)

  8. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials

    PubMed Central

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A.; Burgueño, Juan; Bandeira e Sousa, Massaine; Crossa, José

    2018-01-01

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe), where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using the GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close-to-zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe) including the random intercepts of the lines with the GK method had important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances, but lower genomic prediction accuracy. PMID:29476023

  9. Genomic Prediction of Genotype × Environment Interaction Kernel Regression Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Soberanis, Víctor; Pérez-Elizalde, Sergio; Pérez-Rodríguez, Paulino; Campos, Gustavo de Los; Montesinos-López, O A; Burgueño, Juan

    2016-11-01

    In genomic selection (GS), genotype × environment interaction (G × E) can be modeled by a marker × environment interaction (M × E). The G × E may be modeled through a linear kernel or a nonlinear (Gaussian) kernel. In this study, we propose using two nonlinear Gaussian kernels: the reproducing kernel Hilbert space with kernel averaging (RKHS KA) and the Gaussian kernel with the bandwidth estimated through an empirical Bayesian method (RKHS EB). We performed single-environment analyses and extended them to account for G × E interaction (GBLUP-G × E, RKHS KA-G × E and RKHS EB-G × E) in wheat (Triticum aestivum L.) and maize (Zea mays L.) data sets. For single-environment analyses of the wheat and maize data sets, RKHS EB and RKHS KA had higher prediction accuracy than GBLUP for all environments. For the wheat data, the RKHS KA-G × E and RKHS EB-G × E models showed up to 60 to 68% superiority over the corresponding single environment for pairs of environments with positive correlations. For the wheat data set, the models with Gaussian kernels had accuracies up to 17% higher than that of GBLUP-G × E. For the maize data set, the prediction accuracy of RKHS EB-G × E and RKHS KA-G × E was, on average, 5 to 6% higher than that of GBLUP-G × E. The superiority of the Gaussian kernel models over the linear kernel is due to more flexible kernels that account for small, complex marker main effects and marker-specific interaction effects. Copyright © 2016 Crop Science Society of America.
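
    The two kernel types compared above are easy to contrast in code. The sketch below builds a linear (GBLUP-style) genomic relationship kernel and a Gaussian kernel from a random toy marker matrix; the median-distance scaling and the implicit bandwidth h = 1 are common illustrative choices, whereas the paper estimates the bandwidth empirically or averages over kernels.

      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.integers(0, 2, size=(50, 200)).astype(float)  # toy marker matrix

      # Linear (GBLUP-style) genomic relationship kernel.
      Xc = X - X.mean(axis=0)
      K_linear = Xc @ Xc.T / X.shape[1]

      # Gaussian kernel: K[i, j] = exp(-h * d_ij^2 / median(d^2)), h = 1 assumed.
      d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
      K_gaussian = np.exp(-d2 / np.median(d2[d2 > 0]))

      print(K_linear.shape, K_gaussian.shape)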

  10. Modeling and control for closed environment plant production systems

    NASA Technical Reports Server (NTRS)

    Fleisher, David H.; Ting, K. C.; Janes, H. W. (Principal Investigator)

    2002-01-01

    A computer program was developed to study multiple crop production and control in controlled environment plant production systems. The program simulates crop growth and development under nominal and off-nominal environments. Time-series crop models for wheat (Triticum aestivum), soybean (Glycine max), and white potato (Solanum tuberosum) are integrated with a model-based predictive controller. The controller evaluates and compensates for effects of environmental disturbances on crop production scheduling. The crop models consist of a set of nonlinear polynomial equations, six for each crop, developed using multivariate polynomial regression (MPR). Simulated data from DSSAT crop models, previously modified for crop production in controlled environments with hydroponics under elevated atmospheric carbon dioxide concentration, were used for the MPR fitting. The model-based predictive controller adjusts light intensity, air temperature, and carbon dioxide concentration set points in response to environmental perturbations. Control signals are determined from minimization of a cost function, which is based on the weighted control effort and squared-error between the system response and desired reference signal.
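
    The controller's core idea, minimizing squared tracking error plus weighted control effort, can be reduced to a one-variable toy as shown below. The quadratic "crop response" stands in for the MPR crop models, and all coefficients are invented for illustration.

      from scipy.optimize import minimize

      reference = 10.0   # desired growth-related output
      w_effort = 0.05    # weight on control effort (assumed)

      def crop_response(u):
          """Illustrative stand-in for an MPR crop model (u = light setpoint)."""
          return 2.0 * u - 0.04 * u**2

      def cost(u):
          err = crop_response(u[0]) - reference
          return err**2 + w_effort * u[0]**2   # squared error + weighted effort

      result = minimize(cost, x0=[1.0])
      print(f"optimal setpoint: {result.x[0]:.2f}, cost: {result.fun:.4f}")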

  11. A Delineation of the Cognitive Processes Manifested in a Social Annotation Environment

    ERIC Educational Resources Information Center

    Li, S. C.; Pow, J. W. C.; Cheung, W. C.

    2015-01-01

    This study aims to examine how students' learning trajectories progress in an online social annotation environment, and how their cognitive processes and levels of interaction correlate with their learning outcomes. Three different types of activities (cognitive, metacognitive and social) were identified in the online environment. The time…

  12. Rumor Processes in Random Environment on ℕ and on Galton-Watson Trees

    NASA Astrophysics Data System (ADS)

    Bertacchi, Daniela; Zucca, Fabio

    2013-11-01

    The aim of this paper is to study rumor processes in random environment. In a rumor process, a signal starts from the stations of a fixed vertex (the root) and travels on a graph from vertex to vertex. We consider two rumor processes. In the firework process, each station, when reached by the signal, transmits it up to a random distance. In the reverse firework process, on the other hand, stations do not send any signal but they “listen” for it up to a random distance. The first random environment that we consider is the deterministic 1-dimensional tree ℕ with a random number of stations on each vertex; in this case the root is the origin of ℕ. We give conditions for survival/extinction on almost every realization of the sequence of stations. Later on, we study the processes on Galton-Watson trees with a random number of stations on each vertex. We show that if the probability of survival is positive, then there is survival on almost every realization of the infinite tree such that there is at least one station at the root. We characterize the survival of the process in some cases, and we give sufficient conditions for survival/extinction.
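
    A quick Monte Carlo sketch of the firework process on the non-negative integers: each vertex past the root hosts a station with some probability, and each reached station relays the signal a geometric random distance. Both distributions and the survival horizon are illustrative choices, not the paper's general setting.

      import numpy as np

      rng = np.random.default_rng(7)

      def firework_survives(p_station, p_geom, horizon=100_000):
          """One realization: True if the signal reaches the horizon."""
          frontier = 0   # rightmost vertex reached by the signal so far
          v = 0
          while v <= frontier:
              if v == 0 or rng.random() < p_station:   # the root always transmits
                  frontier = max(frontier, v + rng.geometric(p_geom))
              if frontier >= horizon:
                  return True
              v += 1
          return False   # the signal died out before the horizon

      trials = 200
      hits = sum(firework_survives(0.9, 0.2) for _ in range(trials))
      print(f"estimated survival probability: {hits / trials:.2f}")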

  13. Trust Model to Enhance Security and Interoperability of Cloud Environment

    NASA Astrophysics Data System (ADS)

    Li, Wenjuan; Ping, Lingdi

    Trust is one of the most important means to improve security and enable interoperability of current heterogeneous, independent cloud platforms. This paper first analyzes several trust models used in large distributed environments and then introduces a novel cloud trust model to solve security issues in cross-cloud environments, in which cloud customers can choose different providers' services and resources in heterogeneous domains can cooperate. The model is domain-based. It divides one cloud provider's resource nodes into the same domain and sets a trust agent. It distinguishes between two different roles, cloud customer and cloud server, and designs different strategies for them. In our model, trust recommendation is treated as one type of cloud service, just like computation or storage. The model achieves both identity authentication and behavior authentication. The results of emulation experiments show that the proposed model can efficiently and safely construct trust relationships in cross-cloud environments.

  14. Defining the Environment in Gene–Environment Research: Lessons From Social Epidemiology

    PubMed Central

    Daw, Jonathan; Freese, Jeremy

    2013-01-01

    In this article, we make the case that social epidemiology provides a useful framework to define the environment within gene–environment (G×E) research. We describe the environment in a multilevel, multidomain, longitudinal framework that accounts for upstream processes influencing health outcomes. We then illustrate the utility of this approach by describing how intermediate levels of social organization, such as neighborhoods or schools, are key environmental components of G×E research. We discuss different models of G×E research and encourage public health researchers to consider the value of including genetic information from their study participants. We also encourage researchers interested in G×E interplay to consider the merits of the social epidemiology model when defining the environment. PMID:23927514

  15. Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training

    NASA Astrophysics Data System (ADS)

    Macris, A.; Malamateniou, F.; Vassilacopoulos, G.

    Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enabling) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e., training scenarios serving specific training needs) so that it is made reusable.

  16. Gene-environment interaction on neural mechanisms of orthographic processing in Chinese children

    PubMed Central

    Su, Mengmeng; Wang, Jiuju; Maurer, Urs; Zhang, Yuping; Li, Jun; McBride-Chang, Catherine; Tardif, Twila; Liu, Youyi; Shu, Hua

    2015-01-01

    The ability to process and identify visual words requires efficient orthographic processing of print, consisting of letters in alphabetic languages or characters in Chinese. The N170 is a robust neural marker for orthographic processes. Both genetic and environmental factors, such as home literacy, have been shown to influence orthographic processing at the behavioral level, but their relative contributions and interactions are not well understood. The present study aimed to reveal possible gene-by-environment interactions on orthographic processing at the behavioral and neural levels in a sample of typically developing children. Sixty 12-year-old Chinese children from a 10-year longitudinal sample underwent an implicit visual-word color decision task on real words and stroke combinations. The ERP analysis focused on the increase of the occipito-temporal N170 to words compared to stroke combinations. The genetic analysis focused on two SNPs (rs1419228, rs1091047) in the gene DCDC2, based on previous findings linking these two SNPs to orthographic coding. Home literacy was measured earlier as the number of children's books at home when the children were 3 years old. Relative to stroke combinations, real words evoked a greater N170 in bilateral posterior brain regions. A significant interaction between rs1091047 and home literacy was observed on the changes of the N170 comparing real words to stroke combinations in the left hemisphere. In particular, children carrying the major allele “G” showed a similar N170 effect irrespective of their environment, while children carrying the minor allele “C” showed a smaller N170 effect in a low home-literacy environment than in a rich one. PMID:26294811

  17. A neuroconstructivist model of past tense development and processing.

    PubMed

    Westermann, Gert; Ruh, Nicolas

    2012-07-01

    We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated account of characteristic errors during learning the past tense, adult generalization to pseudoverbs, and dissociations between verbs observed after brain damage in aphasic patients. We put forward a theory of verb inflection in which a functional processing architecture develops through interactions between experience-dependent brain development and the structure of the environment, in this case, the statistical properties of verbs in the language. The outcome of this process is a structured processing system giving rise to graded dissociations between verbs that are easy and verbs that are hard to learn and process. In contrast to dual-mechanism accounts of inflection, we argue that describing dissociations as a dichotomy between regular and irregular verbs is a post hoc abstraction and is not linked to underlying processing mechanisms. We extend current single-mechanism accounts of inflection by highlighting the role of structural adaptation in development and in the formation of the adult processing system. In contrast to some single-mechanism accounts, we argue that the link between irregular inflection and verb semantics is not causal and that existing data can be explained on the basis of phonological representations alone. This work highlights the benefit of taking brain development seriously in theories of cognitive development. Copyright 2012 APA, all rights reserved.

  18. A proposed model for an optimal mentoring environment for medical residents: a literature review.

    PubMed

    Davis, Orin C; Nakamura, Jeanne

    2010-06-01

    To develop a model of the optimal mentoring environment for medical residents. The authors propose that such an environment is a function of a relationship that rests upon a set of interactional foundations that allow a protégé to capitalize on the strengths of the mentor, and it facilitates behaviors that will enable the protégé to develop and internalize the requisite knowledge, skills, and attitudes (KSAs) as fully as possible. The authors searched the literature using Web of Science and Google Scholar in 2007-2008 to identify articles addressing the mentoring process and the context in which it occurs (mentoring environment), and the effect both have on KSA development. The authors distilled the attributes of a good mentor that were consistent across the 20 papers that met inclusion criteria and described good mentoring of residents or curricula for training mentors or residents. The authors identified six interactional foundations that underlie the optimal mentoring relationship: emotional safety, support, protégé-centeredness, informality, responsiveness, and respect. These foundations enable protégés to engage in four key developmental behaviors: exercising independence, reflecting, extrapolating, and synthesizing. This model identifies mentoring practices that empower protégés to engage in developmental behaviors that will help them become the best physicians possible. Educators may use this model to develop training tools to teach attendings how to create an optimal mentoring environment. Researchers can use the model to help guide their future investigations of mentoring in medicine.

  19. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    NASA Astrophysics Data System (ADS)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    Manufacturing process environments require reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to operate at its maximum designed capability and produce quality products. However, for various reasons, machines are often unable to achieve the desired performance. Since performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure machine performance. The reliable results produced by OEE can then be used to propose suitable corrective actions. Many published papers discuss the purpose and benefits of OEE, covering the what and why factors; however, the how factor has received little attention, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework for implementing OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring and then improving the performance of their machines.
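
    For reference, the OEE arithmetic itself is the standard product of availability, performance, and quality; the shift figures in the sketch below are invented for illustration.

      # OEE = Availability x Performance x Quality (standard definition).
      planned_time = 480.0       # minutes in the shift
      downtime = 45.0            # breakdowns, setups, adjustments
      ideal_cycle_time = 0.5     # minutes per part at designed capability
      total_parts = 700
      defective_parts = 20

      availability = (planned_time - downtime) / planned_time
      performance = ideal_cycle_time * total_parts / (planned_time - downtime)
      quality = (total_parts - defective_parts) / total_parts

      oee = availability * performance * quality
      print(f"A={availability:.1%}  P={performance:.1%}  Q={quality:.1%}  OEE={oee:.1%}")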

  20. Genomic-Enabled Prediction Kernel Models with Random Intercepts for Multi-environment Trials.

    PubMed

    Cuevas, Jaime; Granato, Italo; Fritsche-Neto, Roberto; Montesinos-Lopez, Osval A; Burgueño, Juan; Bandeira E Sousa, Massaine; Crossa, José

    2018-03-28

    In this study, we compared the prediction accuracy of the main genotypic effect model (MM) without G×E interactions, the multi-environment single variance G×E deviation model (MDs), and the multi-environment environment-specific variance G×E deviation model (MDe), where the random genetic effects of the lines are modeled with the markers (or pedigree). With the objective of further modeling the genetic residual of the lines, we incorporated the random intercepts of the lines (l) and generated another three models. Each of these 6 models was fitted with a linear kernel method (Genomic Best Linear Unbiased Predictor, GB) and a Gaussian kernel (GK) method. We compared these 12 model-method combinations with another two multi-environment G×E interaction models with unstructured variance-covariances (MUC) using the GB and GK kernels (4 model-method combinations). Thus, we compared the genomic-enabled prediction accuracy of a total of 16 model-method combinations on two maize data sets with positive phenotypic correlations among environments, and on two wheat data sets with complex G×E that includes some negative and close-to-zero phenotypic correlations among environments. The two models (MDs and MDe with the random intercept of the lines and the GK method) were computationally efficient and gave high prediction accuracy in the two maize data sets. Regarding the more complex G×E wheat data sets, the model-method combinations with G×E (MDs and MDe) including the random intercepts of the lines with the GK method had important savings in computing time as compared with the G×E interaction multi-environment models with unstructured variance-covariances, but lower genomic prediction accuracy. Copyright © 2018 Cuevas et al.

  1. Model-based metrics of human-automation function allocation in complex work environments

    NASA Astrophysics Data System (ADS)

    Kim, So Young

    Function allocation is the design decision that assigns work functions to all agents in a team, both human and automated. Efforts to guide function allocation systematically have been studied in many fields such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary issues with function allocation. Four distinctive perspectives emerged from a review of these fields: technology-centered, human-centered, team-oriented, and work-oriented. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), team structure and processes, and work structure and the work environment. Together, these perspectives identify the following eight issues with function allocation: 1) workload, 2) incoherency in function allocations, 3) mismatches between responsibility and authority, 4) interruptive automation, 5) automation boundary conditions, 6) function allocation preventing human adaptation to context, 7) function allocation destabilizing the humans' work environment, and 8) mission performance. Addressing these issues systematically requires formal models and simulations that include all necessary aspects of human-automation function allocation: the work environment, the dynamics inherent to the work, agents, and relationships among them. Also, addressing these issues requires not only a (static) model, but also a (dynamic) simulation that captures temporal aspects of work, such as the timing of actions and their impact on the agent's work. Therefore, with properly modeled work as described by the work environment, the dynamics inherent to the work, agents, and relationships among them, a modeling framework developed by this thesis, which includes static work models and dynamic simulation, can capture the

  2. Surfactants in aquatic and terrestrial environment: occurrence, behavior, and treatment processes.

    PubMed

    Jardak, K; Drogui, P; Daghrir, R

    2016-02-01

    Surfactants belong to a group of chemicals that are well known for their cleaning properties. Their excessive use as ingredients in care products (e.g., shampoos, body wash) and in household cleaning products (e.g., dishwashing detergents, laundry detergents, hard-surface cleaners) has led to the discharge of highly contaminated wastewaters into aquatic and terrestrial environments. Once they reach the different environmental compartments (rivers, lakes, soils, and sediments), surfactants can undergo aerobic or anaerobic degradation. The most studied surfactants so far are linear alkylbenzene sulfonate (LAS), quaternary ammonium compounds (QACs), alkylphenol ethoxylates (APEOs), and alcohol ethoxylates (AEOs). Concentrations of surfactants in wastewaters can range from a few micrograms to hundreds of milligrams in some cases, and can reach several grams in sludge used for soil amendment in agricultural areas. Above legislated standards, surfactants can be toxic to aquatic and terrestrial organisms, which makes treatment processes necessary before their discharge into the environment. Given this fact, biological and chemical processes should be considered for better surfactant removal. In this review, we investigate several issues with regard to: (1) the toxicity of surfactants in the environment, (2) their behavior in different ecological systems, and (3) the different treatment processes used in wastewater treatment plants in order to reduce the effects of surfactants on living organisms.

  3. Recent corrections to meteoroid environment models

    NASA Astrophysics Data System (ADS)

    Moorhead, A.; Brown, P.; Campbell-Brown, M. D.; Moser, D. E.; Blaauw, R. C.; Cooke, W.

    2017-12-01

    The dynamical and physical characteristics of a meteoroid affect its behavior in the atmosphere and the damage it does to spacecraft surfaces. Accurate environment models must therefore correctly describe the speed, size, density, and direction of meteoroids. However, the measurement of dynamical characteristics such as speed is subject to observational biases, and physical properties such as size and density cannot be directly measured. De-biasing techniques and proxies are needed to overcome these challenges. In this presentation, we discuss several recent improvements to the derivation of the meteoroid velocity, directionality, and bulk density distributions. We derive our speed distribution from observations made by the Canadian Meteor Orbit Radar. These observations are de-biased using modern descriptions of the ionization efficiency and sharpened to remove the effects of measurement uncertainty, and the result is a meteoroid speed distribution that is skewed slower than in previous analyses. We also adopt a higher fidelity density distribution than that used by many older models. In our distribution, meteoroids with TJ < 2 are assigned to a low-density population, while those with TJ > 2 have higher densities. This division and the distributions themselves are derived from the densities reported by Kikwaya et al. (2009, 2011). These changes have implications for the environment. For instance, helion and antihelion meteors have lower speeds and higher densities than apex and toroidal meteors. A slower speed distribution therefore corresponds to a sporadic environment that is more completely dominated by the helion and antihelion sources than in previous models. Finally, assigning these meteors high densities further increases their significance from a spacecraft damage perspective.
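
    The Tisserand-parameter split described above amounts to a two-population lookup. In the sketch below the representative densities are placeholders; the actual model uses the density distributions fitted by Kikwaya et al. (2009, 2011), not single values.

      def bulk_density(tj):
          """Assumed representative bulk density (kg/m^3) for a meteoroid with
          Tisserand parameter tj; placeholder two-population split."""
          return 1000.0 if tj < 2.0 else 3000.0

      for tj in (1.5, 2.5):
          print(f"TJ = {tj}: density ~ {bulk_density(tj):.0f} kg/m^3")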

  4. Recent Corrections to Meteoroid Environment Models

    NASA Technical Reports Server (NTRS)

    Moorhead, A. V.; Brown, P. G.; Campbell-Brown, M. D.; Moser, D. E.; Blaauw, R. C.; Cooke, W. J.

    2017-01-01

    The dynamical and physical characteristics of a meteoroid affect its behavior in the atmosphere and the damage it does to spacecraft surfaces. Accurate environment models must therefore correctly describe the speed, size, density, and direction of meteoroids. However, the measurement of dynamical characteristics such as speed is subject to observational biases, and physical properties such as size and density cannot be directly measured. De-biasing techniques and proxies are needed to overcome these challenges. In this presentation, we discuss several recent improvements to the derivation of the meteoroid velocity, directionality, and bulk density distributions. We derive our speed distribution from observations made by the Canadian Meteor Orbit Radar. These observations are de-biased using modern descriptions of the ionization efficiency and sharpened to remove the effects of measurement uncertainty, and the result is a meteoroid speed distribution that is skewed slower than in previous analyses. We also adopt a higher fidelity density distribution than that used by many older models. In our distribution, meteoroids with TJ < 2 are assigned to a low-density population, while those with TJ > 2 have higher densities. This division and the distributions themselves are derived from the densities reported by Kikwaya et al. (2009, 2011). These changes have implications for the environment. For instance, helion and antihelion meteors have lower speeds and higher densities than apex and toroidal meteors. A slower speed distribution therefore corresponds to a sporadic environment that is more completely dominated by the helion and antihelion sources than in previous models. Finally, assigning these meteors high densities further increases their significance from a spacecraft damage perspective.

  5. Modeling the C. elegans nematode and its environment using a particle system.

    PubMed

    Rönkkö, Mauno; Wong, Garry

    2008-07-21

    A particle system, as understood in computer science, is a novel technique for modeling living organisms in their environment. Such particle systems have traditionally been used for modeling the complex dynamics of fluids and gases. In the present study, a particle system was devised to model the movement and feeding behavior of the nematode Caenorhabditis elegans in three different virtual environments: gel, liquid, and soil. The results demonstrate that distinct movements of the nematode can be attributed to its mechanical interactions with the virtual environment. These results also revealed emergent properties associated with modeling organisms within environment-based systems.

  6. A New Fractal Model of Chromosome and DNA Processes

    NASA Astrophysics Data System (ADS)

    Bouallegue, K.

    The dynamic structure of the chromosome remains unknown. Can fractals and chaos be used as new tools to model, identify, and generate the structure of chromosomes? Fractals and chaos offer a rich environment for exploring and modeling the complexity of nature. In a sense, fractal geometry is used to describe, model, and analyze the complex forms found in nature. Fractals have also been used widely not only in biology but also in medicine. In this context, a fractal is an object that displays self-similarity under magnification and can be constructed using a simple motif (an image repeated on ever-reduced scales). It is worth noting that identifying which model a chromosome belongs to has become a challenge. Several different models (hierarchical coiling, folded fiber, and radial loop) have been proposed for the mitotic chromosome, but none has yet yielded a dynamic model. This paper is an attempt to solve the topological problems involved in modelling chromosome and DNA processes. By combining the fractal Julia process with a numerical dynamical system, we arrive at four main results. First, we develop not only a model of the chromosome but also models of mitosis and meiosis. Equally important, we identify the centromere position through the numerical model. More importantly, we describe the processes of both mitotic and meiotic cell division. All in all, the results suggest that this work could have a strong impact on human welfare and may contribute to the cure of genetic diseases.
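
    For readers unfamiliar with the Julia process the paper builds on, the sketch below iterates the quadratic map z ← z² + c and renders the escape-time field as coarse ASCII art; the constant c is an arbitrary illustrative choice, unrelated to the paper's chromosome model.

      import numpy as np

      c = complex(-0.8, 0.156)               # illustrative Julia constant
      xs = np.linspace(-1.5, 1.5, 80)
      ys = np.linspace(-1.5, 1.5, 40)
      z = xs[None, :] + 1j * ys[:, None]
      escape = np.zeros(z.shape, dtype=int)

      for n in range(60):                    # iterate z <- z^2 + c
          active = np.abs(z) <= 2.0          # points that have not escaped yet
          z[active] = z[active] ** 2 + c
          escape[active] = n

      palette = " .:-=+*#%@"
      for row in escape:                     # coarse ASCII rendering
          print("".join(palette[min(v // 7, 9)] for v in row))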

  7. Business Process Modeling: Perceived Benefits

    NASA Astrophysics Data System (ADS)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  8. Space Environment Effects: Low-Altitude Trapped Radiation Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Pfitzer, K. A.

    1998-01-01

    Accurate models of the Earth's trapped energetic proton environment are required for both piloted and robotic space missions. For piloted missions, the concern is mainly total dose to the astronauts, particularly in long-duration missions and during extravehicular activity (EVA). As astronomical and remote-sensing detectors become more sensitive, the proton flux can also induce unwanted backgrounds in these instruments. This report describes the development of a new model of the low-altitude trapped proton environment. The model is based on nearly 20 years of data from the TIROS/NOAA weather satellites. The model, which has been designated NOAAPRO (for NOAA protons), predicts the integral omnidirectional proton flux in three energy ranges: >16, >36, and >80 MeV. It contains a true solar cycle variation and accounts for the secular variation in the Earth's magnetic field. It also extends to lower values of the magnetic L parameter than does AP8. Thus, the model addresses the major shortcomings of AP8.

  9. Space environment and lunar surface processes, 2

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1982-01-01

    The top few millimeters of a surface exposed to space represent a physically and chemically active zone with properties different from those of a surface in the environment of a planetary atmosphere. To meet the need for a quantitative synthesis of the various processes contributing to the evolution of the surfaces of the Moon, Mercury, the asteroids, and similar bodies (exposure to solar wind, solar flare particles, galactic cosmic rays, heating from solar radiation, and meteoroid bombardment), the MESS 2 computer program was developed. This program differs from earlier work in that the surface processes are broken down as a function of size scale and treated in three dimensions with good resolution on each scale. The results obtained apply to the development of soil near the surface and are based on lunar conditions. Parameters can be adjusted to describe asteroid regoliths and other space-related bodies.

  10. A GIS based model for active transportation in the built environment

    NASA Astrophysics Data System (ADS)

    Addison, Veronica Marie Medina

    Obesity and physical inactivity have been major risk factors associated with morbidity and mortality in the United States, and both have been on the rise. Determining connections between this trend and the environment could lead to a built environment that is conducive to healthy, active people. In my previous research, I studied the built environment and its connection to health. For my dissertation, I build on this fundamental work by incorporating energy, specifically by studying the built environment and its connection to energy expenditure. This research models the built environment and combines it with human energy expenditure information in order to provide a planning tool that allows an individual to actively address health issues, particularly obesity. This research focuses on the design and development of an internet-based model that enables individuals to understand their own energy expenditure in relation to their environment. The model estimates the energy expended by an individual navigating the campus. This is accomplished by using Geographic Information Systems (GIS) to model the campus and using the model as the basis for calculating energy expended through active transportation. Using GIS to create the model allows for the incorporation of built environment factors such as elevation, and of energy expenditure in relation to physical exertion rate. This research will contribute to the long-term solution to the obesity epidemic by creating healthy communities through smart growth and sustainable design. It provides users with a tool to use in their current environment for their personal and community well-being.
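
    A sketch of the kind of slope-adjusted calculation such a tool might perform over GIS path segments. The walking-cost model and all coefficients below are rough assumptions for illustration, not the dissertation's calibrated values.

      # Estimate walking energy expenditure over (distance, elevation gain) segments.
      def segment_kcal(dist_m, elev_gain_m, weight_kg, speed_ms=1.4):
          grade = elev_gain_m / dist_m if dist_m else 0.0
          # Flat-walking cost plus an uphill penalty (coefficients are illustrative).
          watts = 2.0 * weight_kg + weight_kg * (1.5 * speed_ms ** 2
                                                 + 35.0 * max(grade, 0.0) * speed_ms)
          seconds = dist_m / speed_ms
          return watts * seconds / 4184.0              # joules -> kcal

      path = [(200.0, 3.0), (150.0, -2.0), (400.0, 10.0)]   # hypothetical campus route
      total_kcal = sum(segment_kcal(d, e, weight_kg=70.0) for d, e in path)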

  11. CoLeMo: A Collaborative Learning Environment for UML Modelling

    ERIC Educational Resources Information Center

    Chen, Weiqin; Pedersen, Roger Heggernes; Pettersen, Oystein

    2006-01-01

    This paper presents the design, implementation, and evaluation of a distributed collaborative UML modelling environment, CoLeMo. CoLeMo is designed for students studying UML modelling. It can also be used as a platform for collaborative design of software. We conducted formative evaluations and a summative evaluation to improve the environment and…

  12. Improving science and mathematics education with computational modelling in interactive engagement environments

    NASA Astrophysics Data System (ADS)

    Neves, Rui Gomes; Teodoro, Vítor Duarte

    2012-09-01

    A teaching approach aiming at an epistemologically balanced integration of computational modelling in science and mathematics education is presented. The approach is based on interactive engagement learning activities built around computational modelling experiments that span the range of different kinds of modelling from explorative to expressive modelling. The activities are designed to make a progressive introduction to scientific computation without requiring prior development of a working knowledge of programming, generate and foster the resolution of cognitive conflicts in the understanding of scientific and mathematical concepts and promote performative competency in the manipulation of different and complementary representations of mathematical models. The activities are supported by interactive PDF documents which explain the fundamental concepts, methods and reasoning processes using text, images and embedded movies, and include free space for multimedia enriched student modelling reports and teacher feedback. To illustrate, an example from physics implemented in the Modellus environment and tested in undergraduate university general physics and biophysics courses is discussed.

  13. Time Domain and Frequency Domain Deterministic Channel Modeling for Tunnel/Mining Environments.

    PubMed

    Zhou, Chenming; Jacksha, Ronald; Yan, Lincan; Reyes, Miguel; Kovalchik, Peter

    2017-01-01

    Understanding wireless channels in complex mining environments is critical for designing optimized wireless systems operated in these environments. In this paper, we propose two physics-based, deterministic ultra-wideband (UWB) channel models for characterizing wireless channels in mining/tunnel environments - one in the time domain and the other in the frequency domain. For the time domain model, a general Channel Impulse Response (CIR) is derived and the result is expressed in the classic UWB tapped delay line model. The derived time domain channel model takes into account major propagation controlling factors including tunnel or entry dimensions, frequency, polarization, electrical properties of the four tunnel walls, and transmitter and receiver locations. For the frequency domain model, a complex channel transfer function is derived analytically. Based on the proposed physics-based deterministic channel models, channel parameters such as delay spread, multipath component number, and angular spread are analyzed. It is found that, despite the presence of heavy multipath, both channel delay spread and angular spread for tunnel environments are relatively small compared to those of typical indoor environments. The results and findings in this paper have application in the design and deployment of wireless systems in underground mining environments.
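
    The tapped-delay-line form of the CIR makes the delay-spread analysis easy to sketch; the tap delays and amplitudes below are hypothetical, but the RMS delay spread formula is the standard one.

      import numpy as np

      # h(t) = sum_k a_k * delta(t - tau_k): tapped-delay-line CIR.
      tau = np.array([0.0, 15e-9, 40e-9, 90e-9])   # tap delays [s] (hypothetical)
      amp = np.array([1.0, 0.6, 0.3, 0.1])         # tap amplitudes (hypothetical)

      p = amp ** 2 / np.sum(amp ** 2)              # normalized power delay profile
      mean_delay = np.sum(p * tau)
      rms_delay_spread = np.sqrt(np.sum(p * (tau - mean_delay) ** 2))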

  14. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the interactions and extensibility required by reservoir engineers. In this paper, we present research characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  15. How Is the Learning Environment in Physics Lesson with Using 7E Model Teaching Activities

    ERIC Educational Resources Information Center

    Turgut, Umit; Colak, Alp; Salar, Riza

    2017-01-01

    The aim of this research is to reveal the results in the planning, implementation and evaluation of the process for learning environments to be designed in compliance with 7E learning cycle model in physics lesson. "Action research", which is a qualitative research pattern, is employed in this research in accordance with the aim of the…

  16. Correcting Inadequate Model Snow Process Descriptions Dramatically Improves Mountain Hydrology Simulations

    NASA Astrophysics Data System (ADS)

    Pomeroy, J. W.; Fang, X.

    2014-12-01

    The vast effort in hydrology devoted to parameter calibration as a means to improve model performance assumes that the models concerned are not fundamentally wrong. By focussing on finding optimal parameter sets and ascribing poor model performance to parameter or data uncertainty, these efforts may fail to consider the need to improve models with more intelligent descriptions of hydrological processes. To test this hypothesis, a flexible physically based hydrological model including a full suite of snow hydrology processes as well as warm season, hillslope and groundwater hydrology was applied to Marmot Creek Research Basin, Canadian Rocky Mountains, where excellent driving meteorology and basin biophysical descriptions exist. Model parameters were set from values found in the basin or from similar environments; no parameters were calibrated. The model was tested against snow surveys and streamflow observations. The model used algorithms that describe snow redistribution, sublimation and forest canopy effects on snowmelt and evaporative processes that are rarely implemented in hydrological models. To investigate the contribution of these processes to model predictive capability, the model was "falsified" by deleting parameterisations for forest canopy snow mass and energy, blowing snow, intercepted rain evaporation, and sublimation. Model falsification by ignoring forest canopy processes contributed to a large increase in SWE errors for forested portions of the research basin, with RMSE increasing from 19 to 55 mm and mean bias (MB) increasing from 0.004 to 0.62. In the alpine tundra portion, removing blowing snow processes resulted in an increase in model SWE MB from 0.04 to 2.55 on north-facing slopes and from -0.006 to -0.48 on south-facing slopes. Eliminating these algorithms degraded streamflow prediction, with the Nash-Sutcliffe efficiency dropping from 0.58 to 0.22 and MB increasing from 0.01 to 0.09. These results show dramatic model improvements by including snow
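
    The scores used in this falsification experiment are standard; a short Python sketch of RMSE, mean bias, and Nash-Sutcliffe efficiency as commonly defined (the abstract gives no formulas, so the relative-bias convention below is an assumption consistent with the dimensionless MB values quoted).

      import numpy as np

      def rmse(sim, obs):
          return np.sqrt(np.mean((sim - obs) ** 2))

      def mean_bias(sim, obs):
          # Relative bias: one common convention for a dimensionless MB.
          return np.mean(sim - obs) / np.mean(obs)

      def nash_sutcliffe(sim, obs):
          return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)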

  17. Diffusion Dominant Solute Transport Modelling in Fractured Media Under Deep Geological Environment - 12211

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwong, S.; Jivkov, A.P.

    2012-07-01

    Deep geologic disposal of high-activity and long-lived radioactive waste is gaining increasing support in many countries, where a suitable low-permeability geological formation in combination with engineered barriers is used to provide long-term waste containment and minimise the impacts to the environment and risk to the biosphere. This modelling study examines solute transport in fractured media under low flow velocities that are relevant to a deep geological environment. In particular, reactive solute transport through fractured media is studied using a 2-D model, which considers advection and diffusion, to explore the coupled effects of kinetic and equilibrium chemical processes. The effects of water velocity in the fracture, matrix porosity and diffusion on solute transport are investigated and discussed. Some illustrative modelled results are presented to demonstrate the use of the model to examine the effects of media degradation on solute transport, under the influences of hydrogeological (diffusion dominant) and microbially mediated chemical processes. The challenges facing the prediction of long-term degradation, such as crack evolution, interaction and coalescence, are highlighted. The potential of a novel microstructure-informed modelling approach to account for these effects is discussed, particularly with respect to investigating the impact of multiple phenomena on material performance. The GRM code is used to examine the effects of media degradation for a geological waste disposal package, under the combined hydrogeological (diffusion dominant) and chemical effects in low groundwater flow conditions that are typical of deep geological disposal systems. An illustrative reactive transport modelling application demonstrates the use of the code to examine the interplay of kinetically controlled biogeochemical reactive processes with advective and diffusive transport, under the influence of media degradation. The initial model results are encouraging, which

  18. Exposing earth surface process model simulations to a large audience

    NASA Astrophysics Data System (ADS)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and has therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform for reaching audiences outside the science community is museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors, that can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and superstorm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least two model datasets a year, and will soon provide displays of global river sediment fluxes and changes in the sea-ice-free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Dataset storyboards and teacher follow-up materials associated with the simulations are developed to address Common Core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they are looking at numerical model results, that the underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  19. An open system approach to process reengineering in a healthcare operational environment.

    PubMed

    Czuchry, A J; Yasin, M M; Norris, J

    2000-01-01

    The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology utilizing an open system orientation coupled with process reengineering is applied to overcome operational and patient-related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.

  20. Real-time modeling of primitive environments through wavelet sensors and Hebbian learning

    NASA Astrophysics Data System (ADS)

    Vaccaro, James M.; Yaworsky, Paul S.

    1999-06-01

    Modeling the world through sensory input necessarily provides a unique perspective for the observer. Given a limited perspective, objects and events cannot always be encoded precisely but must involve crude, quick approximations to deal with sensory information in a real-time manner. As an example, when avoiding an oncoming car, a pedestrian needs to identify the fact that a car is approaching before ascertaining the model or color of the vehicle. In our methodology, we use wavelet-based sensors with self-organized learning to encode basic sensory information in real-time. The wavelet-based sensors provide necessary transformations while a rank-based Hebbian learning scheme encodes a self-organized environment through translation, scale and orientation invariant sensors. Such a self-organized environment is made possible by combining wavelet sets which are orthonormal, log-scale with linear orientation and have automatically generated membership functions. In earlier work we used Gabor wavelet filters, rank-based Hebbian learning and an exponential modulation function to encode textural information from images. Many different types of modulation are possible, but based on biological findings the exponential modulation function provided a good approximation of first spike coding of 'integrate and fire' neurons. These types of Hebbian encoding schemes (e.g., exponential modulation, etc.) are useful for quick response and learning, provide several advantages over contemporary neural network learning approaches, and have been found to quantize data nonlinearly. By combining wavelets with Hebbian learning we can provide a real-time front-end for modeling an intelligent process, such as the autonomous control of agents in a simulated environment.
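
    A compact sketch of such a front end, pairing a Gabor filter with a rank-based Hebbian update under an exponential modulation function; the kernel and learning parameters are assumptions, not the authors' settings.

      import numpy as np

      def gabor(size=16, wavelength=8.0, theta=0.0, sigma=4.0):
          # Real-valued Gabor kernel at orientation theta.
          half = size // 2
          y, x = np.mgrid[-half:half, -half:half]
          xr = x * np.cos(theta) + y * np.sin(theta)
          return np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / wavelength)

      def hebbian_update(w, responses, lr=0.05, decay=2.0):
          # Rank-based Hebbian step: stronger (earlier) responses receive larger
          # weight changes via an exponential modulation of their rank.
          order = np.argsort(-responses)
          mod = np.exp(-decay * np.arange(len(responses)) / len(responses))
          w[order] += lr * mod * responses[order]
          return w / np.linalg.norm(w)                 # keep weights bounded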

  1. Parameter and Process Significance in Mechanistic Modeling of Cellulose Hydrolysis

    NASA Astrophysics Data System (ADS)

    Rotter, B.; Barry, A.; Gerhard, J.; Small, J.; Tahar, B.

    2005-12-01

    The rate of cellulose hydrolysis, and of associated microbial processes, is important in determining the stability of landfills and their potential impact on the environment, as well as associated time scales. To permit further exploration in this field, a process-based model of cellulose hydrolysis was developed. The model, which is relevant to both landfill and anaerobic digesters, includes a novel approach to biomass transfer between a cellulose-bound biofilm and biomass in the surrounding liquid. Model results highlight the significance of the bacterial colonization of cellulose particles by attachment through contact in solution. Simulations revealed that enhanced colonization, and therefore cellulose degradation, was associated with reduced cellulose particle size, higher biomass populations in solution, and increased cellulose-binding ability of the biomass. A sensitivity analysis of the system parameters revealed different sensitivities to model parameters for a typical landfill scenario versus that for an anaerobic digester. The results indicate that relative surface area of cellulose and proximity of hydrolyzing bacteria are key factors determining the cellulose degradation rate.
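
    A minimal sketch of the kind of dynamics described above: planktonic biomass colonizes cellulose by contact, and hydrolysis is limited by the colonized surface area. The rate constants, surface-area proxy and forward-Euler scheme are illustrative assumptions, not the paper's formulation.

      # State: cellulose C, biofilm biomass B, planktonic biomass P.
      k_hyd, k_att, yield_b, dt = 0.02, 0.001, 0.4, 0.1   # illustrative constants

      C, B, P = 100.0, 0.1, 1.0
      for _ in range(1000):
          area = C ** (2.0 / 3.0)            # crude surface-area proxy for particles
          hydrolysis = k_hyd * area * B      # surface-limited hydrolysis rate
          attach = k_att * P * area          # colonization by contact in solution
          C = max(C - dt * hydrolysis, 0.0)
          B += dt * (yield_b * hydrolysis + attach)
          P -= dt * attach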

  2. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    PubMed

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences using this notation to model Pathology processes, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model in which management and improvements are more easily implemented by health professionals.

  3. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes

    PubMed Central

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Óscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-01-01

    Background Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous experiences using this notation to model Pathology processes, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. Methods With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dept. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. Results The modelling of the processes of Anatomic Pathology is presented using BPMN. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. Conclusion The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model in which management and improvements are more easily implemented by health professionals. PMID:18673511

  4. Orthogonal Gaussian process models

    DOE PAGES

    Plumlee, Matthew; Joseph, V. Roshan

    2017-01-01

    Gaussian process models are widely adopted for nonparametric/semi-parametric modeling. Identifiability issues occur when the mean model contains polynomials with unknown coefficients. Though the resulting predictions are unaffected, this leads to poor estimation of the coefficients in the mean model, and thus the estimated mean model loses interpretability. This paper introduces a new Gaussian process model whose stochastic part is orthogonal to the mean part to address this issue. The paper also discusses applications to multi-fidelity simulations using data examples.
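
    To make the identifiability issue concrete: in a universal-kriging-style GP with mean X*beta, the generalized least squares estimate of beta is entangled with the correlated stochastic part. A minimal numpy sketch of that standard estimation step (not the authors' orthogonal construction):

      import numpy as np

      def gls_beta(X, y, K, jitter=1e-6):
          # GLS estimate under GP covariance K: beta = (X' K^-1 X)^-1 X' K^-1 y.
          Ki = np.linalg.inv(K + jitter * np.eye(len(y)))
          return np.linalg.solve(X.T @ Ki @ X, X.T @ Ki @ y)

      x = np.linspace(0.0, 1.0, 30)
      X = np.column_stack([np.ones_like(x), x])                    # linear mean model
      K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.2) ** 2)    # squared-exp kernel
      y = 1.0 + 2.0 * x + np.random.default_rng(1).multivariate_normal(np.zeros(30), K)
      beta = gls_beta(X, y, K)   # recovers roughly [1, 2], with inflated variance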

  5. Orthogonal Gaussian process models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plumlee, Matthew; Joseph, V. Roshan

    Gaussian process models are widely adopted for nonparametric/semi-parametric modeling. Identifiability issues occur when the mean model contains polynomials with unknown coefficients. Though the resulting predictions are unaffected, this leads to poor estimation of the coefficients in the mean model, and thus the estimated mean model loses interpretability. This paper introduces a new Gaussian process model whose stochastic part is orthogonal to the mean part to address this issue. The paper also discusses applications to multi-fidelity simulations using data examples.

  6. Current models of the intensely ionizing particle environment in space

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.

    1988-01-01

    The Cosmic Ray Effects on MicroElectronics (CREME) model that is currently in use to estimate single event effect rates in spacecraft is described. The CREME model provides a description of the radiation environment in interplanetary space near the orbit of the earth that contains no major deficiencies. The accuracy of the galactic cosmic ray model is limited by the uncertainties in solar modulation. The model for solar energetic particles could be improved by making use of all the data that has been collected on solar energetic particle events. There remain major uncertainties about the environment within the earth's magnetosphere, because of the uncertainties over the charge states of the heavy ions in the anomalous component and solar flares, and because of trapped heavy ions. The present CREME model is valid only at 1 AU, but it could be extended to other parts of the heliosphere. There is considerable data on the radiation environment from 0.2 to 35 AU in the ecliptic plane. This data could be used to extend the CREME model.

  7. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.

  8. Interactive Schematic Integration Within the Propellant System Modeling Environment

    NASA Technical Reports Server (NTRS)

    Coote, David; Ryan, Harry; Burton, Kenneth; McKinney, Lee; Woodman, Don

    2012-01-01

    Task requirements for rocket propulsion test preparations of the test stand facilities drive the need to model the test facility propellant systems prior to constructing physical modifications. The Propellant System Modeling Environment (PSME) is an initiative designed to enable increased efficiency and expanded capabilities to a broader base of NASA engineers in the use of modeling and simulation (M&S) technologies for rocket propulsion test and launch mission requirements. PSME will enable a wider scope of users to utilize M&S of propulsion test and launch facilities for predictive and post-analysis functionality by offering a clean, easy-to-use, high-performance application environment.

  9. Modeling and dynamic environment analysis technology for spacecraft

    NASA Astrophysics Data System (ADS)

    Fang, Ren; Zhaohong, Qin; Zhong, Zhang; Zhenhao, Liu; Kai, Yuan; Long, Wei

    Spacecraft sustain complex and severe vibration and acoustic environments during flight. Predicting the resulting structural responses, through numerical prediction of fluctuating pressure, model updating, and random vibration and acoustic analysis, plays an important role during the design, manufacture and ground testing of spacecraft. In this paper, Monotony Integrative Large Eddy Simulation (MILES) is introduced to predict the fluctuating pressure on the fairing. The exact flow structures on the fairing wall surface under different Mach numbers are obtained, and then a spacecraft model is constructed using the finite element method (FEM). According to the modal test data, the model is updated by the penalty method. On this basis, the random vibration and acoustic responses of the fairing and satellite are analyzed by different methods. The simulated results agree well with the experimental ones, which shows the validity of the modeling and dynamic environment analysis technology. This information can better support test planning, the definition of test conditions and the design of optimal structures.

  10. A Markov Environment-dependent Hurricane Intensity Model and Its Comparison with Multiple Dynamic Models

    NASA Astrophysics Data System (ADS)

    Jing, R.; Lin, N.; Emanuel, K.; Vecchi, G. A.; Knutson, T. R.

    2017-12-01

    A Markov environment-dependent hurricane intensity model (MeHiM) is developed to simulate the climatology of hurricane intensity given the surrounding large-scale environment. The model considers three unobserved discrete states representing respectively storm's slow, moderate, and rapid intensification (and deintensification). Each state is associated with a probability distribution of intensity change. The storm's movement from one state to another, regarded as a Markov chain, is described by a transition probability matrix. The initial state is estimated with a Bayesian approach. All three model components (initial intensity, state transition, and intensity change) are dependent on environmental variables including potential intensity, vertical wind shear, midlevel relative humidity, and ocean mixing characteristics. This dependent Markov model of hurricane intensity shows a significant improvement over previous statistical models (e.g., linear, nonlinear, and finite mixture models) in estimating the distributions of 6-h and 24-h intensity change, lifetime maximum intensity, and landfall intensity, etc. Here we compare MeHiM with various dynamical models, including a global climate model [High-Resolution Forecast-Oriented Low Ocean Resolution model (HiFLOR)], a regional hurricane model (Geophysical Fluid Dynamics Laboratory (GFDL) hurricane model), and a simplified hurricane dynamic model [Coupled Hurricane Intensity Prediction System (CHIPS)] and its newly developed fast simulator. The MeHiM developed based on the reanalysis data is applied to estimate the intensity of simulated storms to compare with the dynamical-model predictions under the current climate. The dependences of hurricanes on the environment under current and future projected climates in the various models will also be compared statistically.
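
    A toy version of the Markov backbone described above: three latent intensification states, a transition matrix, and state-conditional intensity-change distributions. All numbers are hypothetical, and the environmental covariate dependence that defines the actual MeHiM is omitted.

      import numpy as np

      rng = np.random.default_rng(7)
      # States: 0 slow, 1 moderate, 2 rapid intensification/deintensification.
      T = np.array([[0.80, 0.15, 0.05],
                    [0.20, 0.60, 0.20],
                    [0.10, 0.30, 0.60]])            # transition probabilities
      dv = [(0.0, 2.0), (5.0, 4.0), (12.0, 6.0)]    # (mean, sd) of 6-h change [kt]

      def simulate(v0=40.0, steps=20, s=0):
          v = [v0]
          for _ in range(steps):
              s = rng.choice(3, p=T[s])             # Markov state transition
              mu, sd = dv[s]
              v.append(max(v[-1] + rng.normal(mu, sd), 0.0))
          return v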

  11. Parametric Modelling of As-Built Beam Framed Structure in Bim Environment

    NASA Astrophysics Data System (ADS)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, management of attribute and dynamic information, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, a typical BIM software package, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin, introduced in this paper, can build the parametric beam model from total station points and terrestrial laser scanning data via the Autodesk Revit API. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data processing steps and different platforms into the uniform Revit software.

  12. Time Domain and Frequency Domain Deterministic Channel Modeling for Tunnel/Mining Environments

    PubMed Central

    Zhou, Chenming; Jacksha, Ronald; Yan, Lincan; Reyes, Miguel; Kovalchik, Peter

    2018-01-01

    Understanding wireless channels in complex mining environments is critical for designing optimized wireless systems operated in these environments. In this paper, we propose two physics-based, deterministic ultra-wideband (UWB) channel models for characterizing wireless channels in mining/tunnel environments - one in the time domain and the other in the frequency domain. For the time domain model, a general Channel Impulse Response (CIR) is derived and the result is expressed in the classic UWB tapped delay line model. The derived time domain channel model takes into account major propagation controlling factors including tunnel or entry dimensions, frequency, polarization, electrical properties of the four tunnel walls, and transmitter and receiver locations. For the frequency domain model, a complex channel transfer function is derived analytically. Based on the proposed physics-based deterministic channel models, channel parameters such as delay spread, multipath component number, and angular spread are analyzed. It is found that, despite the presence of heavy multipath, both channel delay spread and angular spread for tunnel environments are relatively small compared to those of typical indoor environments. The results and findings in this paper have application in the design and deployment of wireless systems in underground mining environments. PMID:29457801

  13. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    ERIC Educational Resources Information Center

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  14. Modeling the Biodegradability of Chemical Compounds Using the Online CHEmical Modeling Environment (OCHEM)

    PubMed Central

    Vorberg, Susann

    2013-01-01

    Biodegradability describes the capacity of substances to be mineralized by free-living bacteria. It is a crucial property in estimating a compound's long-term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined in a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660. PMID:27485201
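
    The consensus scheme the abstract describes can be sketched simply: several classifiers vote, the majority gives the consensus label, and the vote agreement doubles as a per-compound confidence estimate. The toy classifiers and data below are stand-ins, not OCHEM's models.

      import numpy as np

      def consensus_predict(models, X):
          # Stack individual model votes (0/1 per compound); the majority is the
          # consensus label and the agreement fraction a per-compound confidence.
          votes = np.stack([m(X) for m in models])
          frac = votes.mean(axis=0)
          return (frac >= 0.5).astype(int), np.abs(frac - 0.5) * 2.0

      models = [lambda X: (X[:, 0] > 0).astype(int),
                lambda X: (X[:, 1] > 0).astype(int),
                lambda X: (X.sum(axis=1) > 0).astype(int)]
      labels, confidence = consensus_predict(models, np.array([[1.0, -2.0], [0.5, 0.5]]))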

  15. Mathematical modeling in biological populations through branching processes. Application to salmonid populations.

    PubMed

    Molina, Manuel; Mota, Manuel; Ramos, Alfonso

    2015-01-01

    This work deals with mathematical modeling through branching processes. We consider sexually reproducing animal populations where, in each generation, the number of progenitor couples is determined in a non-predictable environment. By using a class of two-sex branching processes, we describe their demographic dynamics and provide several probabilistic and inferential contributions. They include results about the extinction of the population and the estimation of the offspring distribution and its main moments. We also present an application to salmonid populations.
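
    A minimal simulation in the spirit of the class of processes studied: each generation, a random environment thins the available progenitor couples, each couple produces a random number of offspring, and offspring are split between females and males. The offspring law and mating function are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      def next_generation(females, males):
          couples = min(females, males)                           # simple mating function
          couples = rng.binomial(couples, rng.uniform(0.5, 1.0))  # random environment
          offspring = int(rng.poisson(2.2, size=couples).sum())   # per-couple offspring
          f = rng.binomial(offspring, 0.5)                        # female/male split
          return f, offspring - f

      f, m = 10, 10
      for gen in range(50):
          f, m = next_generation(f, m)
          if f + m == 0:
              break                                               # extinction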

  16. Measurement and Model Linkages in Assessing School Environments.

    ERIC Educational Resources Information Center

    Schmitt, Neal

    Detailed methodology used to evaluate a causal model of school environment is presented in this report. The model depicts societal features that influence school district values and organizational characteristics, which in turn influence school operations and personnel attitudes and values. These school variables affect school community members'…

  17. Acoustic Environment of Haro Strait: Preliminary Propagation Modeling and Data Analysis

    DTIC Science & Technology

    2006-08-01

    Field measurements and acoustic propagation modeling for the frequency range 1–10 kHz are combined to analyze the acoustic environment of Haro Strait of Puget Sound, an area frequented by the southern resident killer whales and subject to shipping noise. Keywords: Haro Strait, Puget Sound, acoustic environment, shallow water, acoustic model, southern resident killer whales, shipping noise.

  18. Chaotic home environment is associated with reduced infant processing speed under high task demands.

    PubMed

    Tomalski, Przemysław; Marczuk, Karolina; Pisula, Ewa; Malinowska, Anna; Kawa, Rafał; Niedźwiecka, Alicja

    2017-08-01

    Early adversity has profound long-term consequences for child development across domains. The effects of early adversity on structural and functional brain development have been shown for infants under 12 months of age. However, the causal mechanisms of these effects remain relatively unexplored. Using a visual habituation task, we investigated whether a chaotic home environment may affect processing speed in 5.5-month-old infants (n=71). We found detrimental effects of chaos on processing speed for complex but not for simple visual stimuli. No effects of socio-economic status on infant processing speed were found, although the sample was predominantly middle class. Our results indicate that a chaotic early environment may adversely affect processing speed in early infancy, but only when greater cognitive resources need to be deployed. The study highlights an attractive avenue for research on the mechanisms linking the home environment with the development of attention control.

  19. Modeling persistence of motion in a crowded environment: The diffusive limit of excluding velocity-jump processes

    NASA Astrophysics Data System (ADS)

    Gavagnin, Enrico; Yates, Christian A.

    2018-03-01

    Persistence of motion is the tendency of an object to maintain motion in a direction for short time scales without necessarily being biased in any direction in the long term. One of the most appropriate mathematical tools to study this behavior is an agent-based velocity-jump process. In the absence of agent-agent interaction, the mean-field continuum limit of the agent-based model (ABM) gives rise to the well known hyperbolic telegraph equation. When agent-agent interaction is included in the ABM, a strictly advective system of partial differential equations (PDEs) can be derived at the population level. However, no diffusive limit of the ABM has been obtained from such a model. Connecting the microscopic behavior of the ABM to a diffusive macroscopic description is desirable, since it allows the exploration of a wider range of scenarios and establishes a direct connection with commonly used statistical tools of movement analysis. In order to connect the individual-level ABM to a diffusive PDE at the population level, we consider a generalization of the agent-based velocity-jump process on a two-dimensional lattice with three forms of agent interaction. This generalization allows us to take a diffusive limit and obtain a faithful population-level description. We investigate the properties of the model at both the individual and population levels and we elucidate some of the models' key characteristic features. In particular, we show an intrinsic anisotropy inherent to the models and we find evidence of a spontaneous form of aggregation at both the micro- and macroscales.
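
    A sketch of a single-agent velocity-jump walk on a 2D lattice with persistence, from which an effective diffusion coefficient can be read off the mean-squared displacement; the turning rate is illustrative, and the agent-agent exclusion central to the paper is omitted.

      import numpy as np

      rng = np.random.default_rng(11)
      DIRS = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)], dtype=float)

      def walk(steps=10000, turn_prob=0.1):
          # The agent keeps its velocity and reorients uniformly at rate
          # turn_prob: a small turn_prob means strong persistence of motion.
          pos, d = np.zeros(2), rng.integers(4)
          sq_disp = np.empty(steps)
          for t in range(steps):
              if rng.random() < turn_prob:
                  d = rng.integers(4)
              pos += DIRS[d]
              sq_disp[t] = pos @ pos
          return sq_disp

      slope = np.polyfit(np.arange(10000), walk(), 1)[0]   # MSD ~ 4*D*t in 2D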

  20. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    DTIC Science & Technology

    2017-09-14

    objective in virtual environment research and design is the maintenance of adequate consistency levels in the face of limited system resources such as... provides some commentary with regard to system design considerations and future research directions. DVEs are often designed as a... exceed the system's requirements. Research into predictive models of virtual environment consistency is needed to provide designers the tools to

  1. A Cooperative Model for IS Security Risk Management in Distributed Environment

    PubMed Central

    Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively. PMID:24563626

  2. A cooperative model for IS security risk management in distributed environment.

    PubMed

    Feng, Nan; Zheng, Chundong

    2014-01-01

    Given the increasing cooperation between organizations, the flexible exchange of security information across the allied organizations is critical to effectively manage information systems (IS) security in a distributed environment. In this paper, we develop a cooperative model for IS security risk management in a distributed environment. In the proposed model, the exchange of security information among the interconnected IS under a distributed environment is supported by Bayesian networks (BNs). In addition, for an organization's IS, a BN is utilized to represent its security environment and dynamically predict its security risk level, by which the security manager can select an optimal action to safeguard the firm's information resources. An actual case study illustrates the cooperative model presented in this paper and shows how it can be exploited to manage distributed IS security risk effectively.
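
    A tiny illustration of the underlying BN mechanics: an alert observed at a partner organization updates the posterior risk level via Bayes' rule. The two-node network and all probabilities are invented for illustration.

      # Prior risk and likelihoods of observing an alert at the partner IS.
      p_risk = 0.1
      p_alert_given_risk = 0.8
      p_alert_given_safe = 0.2

      # Posterior risk after the partner shares an observed alert (~0.31).
      evidence = p_alert_given_risk * p_risk + p_alert_given_safe * (1 - p_risk)
      posterior = p_alert_given_risk * p_risk / evidence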

  3. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    PubMed

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used routinely in industry (e.g. the food, cosmetic or pharmaceutical industry) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited to supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphical user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python to provide high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support to the specific needs of users that want to develop, use and maintain predictive models in corporate environments. The technologies used by e
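
    The inheritance mechanism described above maps naturally onto Python classes. A hypothetical sketch (the class and helper names are not eTOXlab's) of a child model overriding one step while inheriting the rest of the workflow:

      def default_descriptors(mol):
          return [len(mol), mol.count("C")]        # toy descriptors on a SMILES string

      class BaseModel:
          def descriptors(self, mol):
              return default_descriptors(mol)
          def predict(self, mol):
              # Shared workflow inherited by every child model.
              return self.apply(self.descriptors(mol))
          def apply(self, x):
              raise NotImplementedError

      class MyEndpointModel(BaseModel):
          # Only the piece that differs is overridden; predict() is inherited.
          def apply(self, x):
              return sum(x) > 8                    # toy decision rule

      print(MyEndpointModel().predict("CCOC(=O)C"))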

  4. Modelling the near-Earth space environment using LDEF data

    NASA Technical Reports Server (NTRS)

    Atkinson, Dale R.; Coombs, Cassandra R.; Crowell, Lawrence B.; Watts, Alan J.

    1992-01-01

    Near-Earth space is a dynamic environment that is currently not well understood. In an effort to better characterize the near-Earth space environment, this study compares actual impact crater measurement data and the Space Environment (SPENV) Program developed in-house at POD to theoretical models established by Kessler (NASA TM-100471, 1987) and Cour-Palais (NASA SP-8013, 1969). With the continuing escalation of debris there will exist a definite hazard to unmanned satellites as well as manned operations. Since the smaller non-trackable debris has the highest impact rate, it is clearly necessary to establish the true debris environment for all particle sizes. Proper comprehension of the near-Earth space environment and its origin will permit improvement in spacecraft design and mission planning, thereby reducing potential disasters and extreme costs. Results of this study directly relate to the survivability of future spacecraft and satellites that are to travel through and/or reside in low Earth orbit (LEO). More specifically, these data are being used to: (1) characterize the effects of the LEO micrometeoroid and debris environment on satellite designs and components; (2) update the current theoretical micrometeoroid and debris models for LEO; (3) help assess the survivability of spacecraft and satellites that must travel through or reside in LEO, and the probability of their collision with already resident debris; and (4) help define and evaluate future debris mitigation and disposal methods. Combined model predictions match relatively well with the LDEF data for impact craters larger than approximately 0.05 cm diameter; however, for smaller impact craters, the combined predictions diverge and do not reflect the sporadic clouds identified by the Interplanetary Dust Experiment (IDE) aboard LDEF. The divergences cannot currently be explained by the authors or model developers. The mean flux of small craters (approximately 0.05 cm diameter) is

  5. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; Neergaard, Linda F.

    2004-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for characterizing the radiation dose and internal charging environments in the solar wind. The SSRE model defines the 0.01 keV to 1 MeV charged particle environment for use in testing the radiation dose vulnerability of candidate solar sail materials and for use in evaluating the internal charging effects in the interplanetary environment. Solar wind and energetic particle instruments aboard the Ulysses spacecraft provide the particle data used to derive the environments for the high inclination 0.5 AU Solar Polar Imager mission and the 1.0 AU L1 solar sail missions. Ulysses is the only spacecraft to sample high latitude solar wind environments far from the ecliptic plane and is therefore uniquely capable of providing the information necessary for defining radiation environments for the Solar Polar Imager spacecraft. Cold plasma moments are used to derive differential flux spectra based on Kappa distribution functions. Energetic particle flux measurements are used to constrain the high energy, non-thermal tails of the distribution functions providing a comprehensive electron, proton, and helium spectra from less than 0.01 keV to a few MeV.
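
    For reference, the isotropic kappa distribution commonly used for such solar wind fits, which tends to a Maxwellian as kappa grows large; this is the textbook form, with parameter values below chosen arbitrarily for illustration.

      import numpy as np
      from scipy.special import gamma

      def kappa_dist(v, n, theta, kappa):
          # Isotropic kappa velocity distribution; theta is the thermal speed,
          # kappa the spectral index controlling the non-thermal tail.
          norm = n / (np.pi ** 1.5 * theta ** 3 * kappa ** 1.5)
          norm *= gamma(kappa + 1.0) / gamma(kappa - 0.5)
          return norm * (1.0 + v ** 2 / (kappa * theta ** 2)) ** (-(kappa + 1.0))

      f = kappa_dist(v=4.0e5, n=5.0e6, theta=3.0e4, kappa=4.0)   # illustrative values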

  6. Modelling the morphodynamics and co-evolution of coast and estuarine environments

    NASA Astrophysics Data System (ADS)

    Morris, Chloe; Coulthard, Tom; Parsons, Daniel R.; Manson, Susan; Barkwith, Andrew

    2017-04-01

    The morphodynamics of coast and estuarine environments are known to be sensitive to environmental change and sea-level rise. However, whilst these systems have received considerable individual research attention, how they interact and co-evolve is relatively understudied. These systems are intrinsically linked, and it is therefore advantageous to study them holistically in order to build a more comprehensive understanding of their behaviour and to inform sustainable management over the long term. Complex environments such as these are often studied using numerical modelling techniques, but owing to the limited research in this area, existing models are currently not capable of simulating dynamic coast-estuarine interactions. A new model is being developed by coupling the one-line Coastline Evolution Model (CEM) with CAESAR-Lisflood (C-L), a hydrodynamic Landscape Evolution Model. It is intended that the eventual model be used to advance the understanding of these systems and how they may evolve over the mid to long term in response to climate change. In the UK, the Holderness Coast, Humber Estuary and Spurn Point system offers a diverse and complex case study for this research. Holderness is one of the fastest eroding coastlines in Europe, and research suggests that the large volumes of material removed from its cliffs are responsible for the formation of the Spurn Point feature and for the Holocene infilling of the Humber Estuary. Marine, fluvial and coastal processes are continually reshaping this system, and over the next century it is predicted that climate change could lead to increased erosion along the coast and increased supply of material to the Humber Estuary and Spurn Point. How this manifests will be hugely influential to the future morphology of these systems and the existence of Spurn Point. Progress to date includes a new version of the CEM that has been prepared for integration into C-L and includes an improved graphical user interface and more complex

  7. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important topic in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is feasible. The firing characteristics of the generated VPCs are similar to those of biological place cells, and VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of firing field (AFFF) and the firing rate's threshold (FRT).

  8. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important topic in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is feasible. The firing characteristics of the generated VPCs are similar to those of biological place cells, and VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of firing field (AFFF) and the firing rate's threshold (FRT). PMID:27597859
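
    A sketch of the similarity-and-recruitment step both versions of the abstract outline, using the Euclidean-distance-plus-Gaussian form they name; the field-width parameter (playing the AFFF role) and the firing-rate threshold FRT are free parameters here.

      import numpy as np

      def firing_rate(landmarks, center, sigma=1.0):
          # Gaussian of the Euclidean distance between the current landmark
          # feature vector and a place cell's stored center.
          d = np.linalg.norm(landmarks - center)
          return np.exp(-d ** 2 / (2.0 * sigma ** 2))

      def update_place_cells(cells, landmarks, frt=0.3, sigma=1.0):
          rates = [firing_rate(landmarks, c, sigma) for c in cells]
          if not cells or max(rates) < frt:
              cells.append(np.array(landmarks))    # recruit a new place cell
          return cells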

  9. Engineered Barrier System: Physical and Chemical Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan "Technical Work Plan for: In-Drift Geochemistry Modeling" (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  10. An u-Service Model Based on a Smart Phone for Urban Computing Environments

    NASA Astrophysics Data System (ADS)

    Cho, Yongyun; Yoe, Hyun

    In urban computing environments, all services should be based on the interaction between humans and the environments around them, which occurs frequently and ordinarily at home and in the office. This paper proposes a u-service model based on a smart phone for urban computing environments. The suggested service model includes a context-aware and personalized service scenario development environment that can instantly describe a user's u-service demand or situation information with smart devices. To do this, the architecture of the suggested service model consists of a graphical service editing environment for smart devices, a u-service platform, and an infrastructure with sensors and WSN/USN. The graphical editor expresses contexts as execution conditions of a new service through an ontology-based context model. The service platform handles the service scenario according to contexts. With the suggested service model, a user in urban computing environments can quickly and easily create new u-services using smart devices.

  11. The Role of Structural Models in the Solar Sail Flight Validation Process

    NASA Technical Reports Server (NTRS)

    Johnston, John D.

    2004-01-01

    NASA is currently soliciting proposals via the New Millennium Program ST-9 opportunity for a potential Solar Sail Flight Validation (SSFV) experiment to develop and operate in space a deployable solar sail that can be steered and provides measurable acceleration. The approach planned for this experiment is to test and validate models and processes for solar sail design, fabrication, deployment, and flight. These models and processes would then be used to design, fabricate, and operate scalable solar sails for future space science missions. There are six validation objectives planned for the ST-9 SSFV experiment: 1) Validate solar sail design tools and fabrication methods; 2) Validate controlled deployment; 3) Validate in-space structural characteristics (focus of poster); 4) Validate solar sail attitude control; 5) Validate solar sail thrust performance; 6) Characterize the sail's electromagnetic interaction with the space environment. This poster presents a top-level assessment of the role of structural models in the validation process for in-space structural characteristics.

  12. Parameter Estimation and Model Selection for Indoor Environments Based on Sparse Observations

    NASA Astrophysics Data System (ADS)

    Dehbi, Y.; Loch-Dehbi, S.; Plümer, L.

    2017-09-01

    This paper presents a novel method for parameter estimation and model selection for the reconstruction of indoor environments based on sparse observations. While most approaches to the reconstruction of indoor models rely on dense observations, we predict scenes of the interior with high accuracy in the absence of indoor measurements. We use a model-based top-down approach and incorporate strong prior knowledge, including probability density functions for model parameters and sparse observations such as room areas and the building footprint. The floorplan model is characterized by linear and bilinear relations with discrete and continuous parameters. We focus on the stochastic estimation of model parameters based on a topological model derived by combinatorial reasoning in a first step. A Gauss-Markov model is applied for estimation and simulation of the model parameters. Symmetries are represented and exploited during the estimation process. Background knowledge as well as observations are incorporated in a maximum likelihood estimation, and model selection is performed using AIC/BIC. The likelihood is also used for the detection and correction of potential errors in the topological model. Estimation results are presented and discussed.
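
    A minimal sketch of the selection step described above, assuming the maximized log-likelihoods of candidate floorplan models are already available; the candidate names and numbers below are invented for illustration.

        import numpy as np

        def aic_bic(log_likelihood, k, n):
            # Information criteria for a fitted model with k free parameters
            # estimated from n observations.
            aic = 2 * k - 2 * log_likelihood
            bic = k * np.log(n) - 2 * log_likelihood
            return aic, bic

        # Hypothetical candidates: (name, maximized log-likelihood, #parameters)
        candidates = [("grid-3-rooms", -42.1, 6), ("grid-4-rooms", -39.8, 9)]
        n_obs = 25                                   # number of sparse observations
        scores = {name: aic_bic(ll, k, n_obs) for name, ll, k in candidates}
        best = min(scores, key=lambda name: scores[name][1])   # select by BIC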

  13. Exploring the Potential of Aerial Photogrammetry for 3d Modelling of High-Alpine Environments

    NASA Astrophysics Data System (ADS)

    Legat, K.; Moe, K.; Poli, D.; Bollmann, E.

    2016-03-01

    High-alpine areas are subject to rapid topographic changes, mainly caused by natural processes like glacial retreat and other geomorphological processes, and also due to anthropogenic interventions like construction of slopes and infrastructure in skiing resorts. Consequently, the demand for highly accurate digital terrain models (DTMs) in alpine environments has arisen. Public administrations often have dedicated resources for the regular monitoring of glaciers and natural hazard processes. In the case of glaciers, traditional monitoring encompasses in-situ measurements of area and length and the estimation of volume and mass changes. Next to field measurements, data for such monitoring programs can be derived from DTMs and digital ortho photos (DOPs). Skiing resorts, on the other hand, require DTMs as input for planning and - more recently - for RTK-GNSS supported ski-slope grooming. Although different in scope, the demands of both user groups are similar: high-quality and up-to-date terrain data for extended areas often characterised by difficult accessibility and large elevation ranges. Over the last two decades, airborne laser scanning (ALS) has replaced photogrammetric approaches as state-of-the-art technology for the acquisition of high-resolution DTMs also in alpine environments. Reasons include the higher productivity compared to (manual) stereo-photogrammetric measurements, canopy-penetration capability, and limitations of photo measurements on sparsely textured surfaces like snow or ice. Nevertheless, the last few years have shown strong technological advances in the field of aerial camera technology, image processing and photogrammetric software, which have led to new possibilities for image-based DTM generation even in alpine terrain. At Vermessung AVT, an Austrian-based surveying company, and its subsidiary Terra Messflug, very promising results have been achieved for various projects in high-alpine environments, using images acquired by large-format digital

  14. On the upscaling of process-based models in deltaic applications

    NASA Astrophysics Data System (ADS)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing morphological indicators, such as distributary channel networking and delta volumes, derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes of a series of simulations designed to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm of added noise. The overall results show that the variability of the accelerated models falls within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.
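
    The autogenic-variability procedure described above amounts to re-running identical models from slightly perturbed initial beds. A minimal sketch, assuming uniform noise and an arbitrary grid size (the abstract states only the 1 cm amplitude):

        import numpy as np

        rng = np.random.default_rng(seed=1)
        bathymetry = np.zeros((200, 300))   # initial bed elevation in metres (assumed grid)

        def perturbed_copy(bed, amplitude=0.01):
            # Return a copy of the initial bathymetry with +/- 1 cm noise added,
            # used to build an ensemble for quantifying autogenic variability.
            return bed + rng.uniform(-amplitude, amplitude, size=bed.shape)

        # Each ensemble member would then be run through the (otherwise identical)
        # delta model and its morphological indicators compared.
        ensemble = [perturbed_copy(bathymetry) for _ in range(10)]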

  15. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
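
    The sketch below conveys the flavor of such a variance-based process sensitivity index on a toy system: each process is represented by two competing models with their own random parameter, and the index is estimated as the variance of the conditional mean divided by the total variance. The stand-in models, the equal prior model weights, and the binned Monte Carlo estimator are all assumptions for illustration, not the paper's formulation.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two hypothetical models per process (recharge R, geology K).
        recharge_models = [lambda p: 0.2 * p, lambda p: 0.1 * p + 0.5]
        geology_models = [lambda p: 10.0 ** p, lambda p: 5.0 * p + 1.0]

        def sample_process(models, n):
            idx = rng.integers(len(models), size=n)   # equal prior model weights
            par = rng.uniform(0.5, 1.5, size=n)       # each model's random parameter
            return np.array([models[i](p) for i, p in zip(idx, par)])

        N = 20000
        R = sample_process(recharge_models, N)
        K = sample_process(geology_models, N)
        Y = R / K                                     # toy system response

        def process_sensitivity(X, Y, bins=40):
            # Binned Monte Carlo estimate of Var(E[Y|X]) / Var(Y), where X
            # carries both model-choice and parameter uncertainty of a process.
            edges = np.quantile(X, np.linspace(0.0, 1.0, bins + 1))
            groups = np.digitize(X, edges[1:-1])
            means = np.array([Y[groups == g].mean() for g in range(bins)])
            weights = np.array([(groups == g).mean() for g in range(bins)])
            return np.sum(weights * (means - Y.mean()) ** 2) / Y.var()

        ps_recharge = process_sensitivity(R, Y)
        ps_geology = process_sensitivity(K, Y)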

  16. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach to organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  17. A Comprehensive Model of the Meteoroids Environment Around Mercury

    NASA Astrophysics Data System (ADS)

    Pokorny, P.; Sarantos, M.; Janches, D.

    2018-05-01

    We present a comprehensive dynamical model for the meteoroid environment around Mercury, comprising meteoroids originating from asteroids and from short- and long-period comets. Our model is fully calibrated and provides predictions for different values of Mercury's true anomaly angle (TAA).

  18. A Context-Adaptive Teacher Training Model in a Ubiquitous Learning Environment

    ERIC Educational Resources Information Center

    Chen, Min; Chiang, Feng Kuang; Jiang, Ya Na; Yu, Sheng Quan

    2017-01-01

    In view of the discrepancies in teacher training and teaching practice, this paper put forward a context-adaptive teacher training model in a ubiquitous learning (u-learning) environment. The innovative model provides teachers of different subjects with adaptive and personalized learning content in a u-learning environment, implements intra- and…

  19. Atomistic Modeling of Corrosion Events at the Interface between a Metal and Its Environment

    DOE PAGES

    Taylor, Christopher D.

    2012-01-01

    Atomistic simulation is a powerful tool for probing the structure and properties of materials and the nature of chemical reactions. Corrosion is a complex process that involves chemical reactions occurring at the interface between a material and its environment and is, therefore, highly suited to study by atomistic modeling techniques. In this paper, the complex nature of corrosion processes and mechanisms is briefly reviewed. Various atomistic methods for exploring corrosion mechanisms are then described, and recent applications in the literature surveyed. Several instances of the application of atomistic modeling to corrosion science are then reviewed in detail, including studies of the metal-water interface, the reaction of water on electrified metallic interfaces, the dissolution of metal atoms from metallic surfaces, and the role of competitive adsorption in controlling the chemical nature and structure of a metallic surface. Some perspectives are then given concerning the future of atomistic modeling in the field of corrosion science.

  20. Design Process for Online Websites Created for Teaching Turkish as a Foreign Language in Web Based Environments

    ERIC Educational Resources Information Center

    Türker, Fatih Mehmet

    2016-01-01

    In today's world, where online learning environments have increased their efficiency in education and training, the design of the websites prepared for education and training purposes has become an important process. This study is about the teaching process of the online learning environments created to teach Turkish in web based environments, and…

  1. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    PubMed

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive (a) control information; (b) childhood obesity risk information; or (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.
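
    To give a flavor of the micro-level choice modeling described above, the sketch below scores each available food with a multinomial-logit rate built from two covariates the study highlights, buffet ordering and choice inertia. It is a simplified stand-in for the relational event framework, with invented coefficient values.

        import numpy as np

        def choice_probs(order_pos, inertia, b_order=-0.3, b_inertia=0.8):
            # Multinomial-logit event probabilities over the available foods;
            # a negative order coefficient favors foods earlier in the buffet,
            # a positive inertia coefficient favors repeating the last category.
            logits = b_order * order_pos + b_inertia * inertia
            e = np.exp(logits - logits.max())        # numerically stable softmax
            return e / e.sum()

        order_pos = np.arange(6)                     # six foods, left to right
        inertia = np.array([0, 1, 0, 0, 0, 0])       # food 1 matches the prior choice
        p = choice_probs(order_pos, inertia)         # probability of each next pick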

  2. In-vehicle group activity modeling and simulation in sensor-based virtual environment

    NASA Astrophysics Data System (ADS)

    Shirkhodaie, Amir; Telagamsetti, Durga; Poshtyar, Azin; Chan, Alex; Hu, Shuowen

    2016-05-01

    Human group activity recognition is a very complex and challenging task, especially for Partially Observable Group Activities (POGA) that occur in confined spaces with limited visual observability and often under severe occlusion. In this paper, we present the IRIS Virtual Environment Simulation Model (VESM) for the modeling and simulation of dynamic POGA. More specifically, we address sensor-based modeling and simulation of a specific category of POGA, called In-Vehicle Group Activities (IVGA). In VESM, human-like animated characters, called humanoids, are employed to simulate complex in-vehicle group activities within the confined space of a modeled vehicle. Each articulated humanoid is kinematically modeled with comparable physical attributes and appearances that are linkable to its human counterpart. Each humanoid exhibits harmonious full-body motion - simulating human-like gestures and postures, facial impressions, and hand motions for coordinated dexterity. VESM facilitates the creation of interactive scenarios consisting of multiple humanoids with different personalities and intentions, which are capable of performing complicated human activities within the confined space inside a typical vehicle. In this paper, we demonstrate the efficiency and effectiveness of VESM in terms of its capabilities to seamlessly generate time-synchronized, multi-source, and correlated imagery datasets of IVGA, which are useful for the training and testing of multi-source full-motion video processing and annotation. Furthermore, we demonstrate full-motion video processing of such simulated scenarios under different operational contextual constraints.

  3. Optimization of multi-environment trials for genomic selection based on crop models.

    PubMed

    Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J

    2017-08-01

    We propose a statistical criterion to optimize multi-environment trials to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS), which refers to the use of genome-wide information for predicting breeding values of selection candidates, need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling through crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing the MET used for calibration. The objective of this study was to determine a method to optimize the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined for this purpose and was evaluated on simulated and real data, with the example of wheat phenology. The MET defined with OptiMET allowed the genetic parameters to be estimated with lower error, leading to higher QTL detection power and higher prediction accuracy. METs defined with OptiMET were on average more efficient than random METs composed of twice as many environments, in terms of the quality of the parameter estimates. OptiMET is thus a valuable tool to determine optimal experimental conditions to best exploit METs and the phenotyping tools that are currently being developed.

  4. Modeling of Solid Waste Processing Options in BIO-Plex

    NASA Technical Reports Server (NTRS)

    Rodriguez, Luis F.; Finn, Cory; Kang, Sukwon; Hogan, John; Luna, Bernadette (Technical Monitor)

    2000-01-01

    BIO-Plex is a ground-based test bed currently under development by NASA for testing technologies and practices that may be utilized in future long-term life support missions. All aspects of such an Advanced Life Support (ALS) System must be considered to confidently construct a reliable system, which will not only allow the crew to survive in harsh environments, but also allow the crew time to perform meaningful research. Effective handling of solid wastes is a critical aspect of the system, especially when recovery of resources contained in the waste is required. This is particularly important for ALS System configurations that include a Biomass Production Chamber. In these cases, significant amounts of inedible biomass waste may be produced, which can ultimately serve as a repository of necessary resources for sustaining life, notably carbon, water, and plant nutrients. Numerous biological and physicochemical solid waste processing options have been considered. Biological options include composting, aerobic digestion, and anaerobic digestion. Physicochemical options include pyrolysis, SCWO (supercritical water oxidation), various incineration configurations, microwave incineration, magnetically assisted gasification, and low temperature plasma reaction. Modeling of these options is a necessary step to assist in the design process. A previously developed top-level model of BIO-Plex implemented in MATLAB Simulink® for systems analysis and design has been adopted for this analysis. Previously, this model considered only incineration for solid waste processing. Present work, reported here, includes the expansion of this model to include a wider array of solid waste processing options selected from the above options, bearing in mind potential, near-term solid waste treatment systems. Furthermore, a trade study has also been performed among these solid waste processing technologies in an effort to determine the ideal technology for long-term life support

  5. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal, and to identify areas of improvement, of a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  6. Numerical model describing optimization of fibres winding process on open and closed frame

    NASA Astrophysics Data System (ADS)

    Petrů, M.; Mlýnek, J.; Martinec, T.

    2016-08-01

    This article discusses a numerical model describing the optimization of the fibre winding process on open and closed frames. The production quality of this type of composite frame depends primarily on the correct winding of fibres onto a polyurethane core. In particular, it is necessary to ensure the correct fibre winding angles on the polyurethane core and the homogeneity of the individual winding layers. The article describes a mathematical model for using an industrial robot in filament winding and how to calculate the robot's trajectory. The polyurethane core is fastened to the robot end-effector so that, during the winding process, it passes through a fibre-processing head along a suitably determined robot end-effector trajectory. We use the described numerical model and matrix calculus to compute the trajectory of the robot end-effector that produces the desired passage of the frame through the fibre-processing head. The calculation of the trajectory was programmed in the Delphi development environment. The relations of the numerical model are important for solving the real passage of a polyurethane core through the fibre-processing head.
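
    The sketch below illustrates the kind of matrix calculus involved: for each contact point on the core, an end-effector pose is composed from a rotation (maintaining the winding angle) and a translation that places that point on the fixed fibre-processing head. The geometry, the constant 45-degree angle, and all coordinates are assumptions for illustration, not the article's numerical model.

        import numpy as np

        def rot_z(theta):
            # Homogeneous 4x4 rotation about the z-axis.
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s, 0.0, 0.0],
                             [s, c, 0.0, 0.0],
                             [0.0, 0.0, 1.0, 0.0],
                             [0.0, 0.0, 0.0, 1.0]])

        def translate(v):
            # Homogeneous 4x4 translation by vector v.
            T = np.eye(4)
            T[:3, 3] = v
            return T

        head = np.array([1.2, 0.0, 0.8, 1.0])   # fixed head position, base frame (assumed)
        core_points = [np.array([x, 0.05, 0.0, 1.0]) for x in np.linspace(0.0, 0.6, 25)]

        trajectory = []
        for p in core_points:
            R = rot_z(np.deg2rad(45.0))          # assumed constant winding angle
            T = translate(head[:3] - (R @ p)[:3]) @ R   # end-effector pose for this step
            trajectory.append(T)
            assert np.allclose(T @ p, head)      # contact point lands on the head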

  7. [Effect of solution environments on ceramic membrane microfiltration of model system of Chinese medicines].

    PubMed

    Zhang, Lianjun; Lu, Jin; Le, Kang; Fu, Tingming; Guo, Liwei

    2010-07-01

    To investigate the effect of different solution environments on the ceramic membrane microfiltration of a model system of Chinese medicines. Taking the binary soybean protein-berberine system as the research object, and using flux, transmittance of berberine, and trapping rate of protein as indexes, the effects of different solution environments on the membrane process were investigated. When the concentration of soybean protein was under 1 g x L(-1), the membrane flux was at its minimum, and the trapping of berberine decreased slightly as the concentration increased. When the pH was 4, the flux was at its maximum, the trapping rate of protein was 99%, and the transmittance of berberine exceeded 60%. The efficiency of membrane separation can be improved by optimizing the solution environment of the water extraction of Chinese medicines. Membrane separation is most efficient when the pH is adjusted to the isoelectric point of the proteins, as proteins are the main pollutant in the aqueous solution.

  8. EIT forward problem parallel simulation environment with anisotropic tissue and realistic electrode models.

    PubMed

    De Marco, Tommaso; Ries, Florian; Guermandi, Marco; Guerrieri, Roberto

    2012-05-01

    Electrical impedance tomography (EIT) is an imaging technology based on impedance measurements. To retrieve meaningful insights from these measurements, EIT relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of current flows therein. The nonhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeoff between physical accuracy and technical feasibility, which at present severely limits the capabilities of EIT. This work presents a complete algorithmic flow for an accurate EIT modeling environment featuring high anatomical fidelity with a spatial resolution equal to that provided by an MRI and a novel realistic complete electrode model implementation. At the same time, we demonstrate that current graphics processing unit (GPU)-based platforms provide enough computational power that a domain discretized with five million voxels can be numerically modeled in about 30 s.

  9. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale behind their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  10. Revisiting Gaussian Process Regression Modeling for Localization in Wireless Sensor Networks

    PubMed Central

    Richter, Philipp; Toledano-Ayala, Manuel

    2015-01-01

    Signal strength-based positioning in wireless sensor networks is a key technology for seamless, ubiquitous localization, especially in areas where Global Navigation Satellite System (GNSS) signals propagate poorly. To enable wireless local area network (WLAN) location fingerprinting in larger areas while maintaining accuracy, methods to reduce the effort of radio map creation must be consolidated and automatized. Gaussian process regression has been applied to overcome this issue, with auspicious results, but the fit of the model was never thoroughly assessed. Instead, most studies trained a readily available model, relying on the zero mean and squared exponential covariance function, without further scrutiny. This paper studies Gaussian process regression model selection for WLAN fingerprinting in indoor and outdoor environments. We train several models for indoor, outdoor, and combined areas; we evaluate them quantitatively and compare them by means of adequate model measures, hence assessing the fit of these models directly. To illuminate the quality of the model fit, the residuals of the proposed model are investigated as well. Comparative experiments on positioning performance verify and conclude the model selection. In this way, we show that the standard model is not the most appropriate, discuss alternatives, and present our best candidate. PMID:26370996
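
    In the spirit of the comparison above, the sketch below fits the commonly assumed zero-mean squared-exponential model alongside one alternative candidate and compares their log marginal likelihoods on a synthetic radio map; the toy path-loss field and kernel settings are assumptions for illustration.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, Matern, WhiteKernel

        # Synthetic radio map: 2-D positions and received signal strength (dBm).
        X = np.random.default_rng(3).uniform(0.0, 50.0, size=(120, 2))
        y = -40.0 - 20.0 * np.log10(1.0 + np.linalg.norm(X - 25.0, axis=1))

        # The "standard" model most studies assume: zero mean + squared exponential.
        standard = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1.0))
        # One alternative candidate: rougher Matern covariance with estimated mean.
        candidate = GaussianProcessRegressor(kernel=Matern(10.0, nu=1.5) + WhiteKernel(1.0),
                                             normalize_y=True)

        for name, model in [("RBF, zero mean", standard), ("Matern, est. mean", candidate)]:
            model.fit(X, y)
            print(name, "log marginal likelihood:", model.log_marginal_likelihood_value_)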

  11. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  12. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction

    PubMed Central

    Bandeira e Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-01-01

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models was fitted using two kernel methods: a linear kernel, the Genomic Best Linear Unbiased Predictor (GBLUP, GB), and a nonlinear Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (the HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and MDs models fitted with the Gaussian kernel (MDe-GK and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved for GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. PMID:28455415
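
    For reference, the sketch below constructs the two kinds of kernel compared in the study from a marker matrix: a linear GBLUP-style genomic relationship kernel and a nonlinear Gaussian kernel on marker distances. The centering and median-distance bandwidth conventions are common choices assumed here, not necessarily the authors' exact ones.

        import numpy as np

        def linear_kernel(X):
            # GBLUP-style relationship matrix from a column-centered marker matrix.
            Xc = X - X.mean(axis=0)
            return Xc @ Xc.T / X.shape[1]

        def gaussian_kernel(X, bandwidth=None):
            # Gaussian kernel on squared Euclidean marker distances; the median
            # nonzero distance is used as a default bandwidth (an assumption).
            d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
            h = bandwidth or np.median(d2[d2 > 0])
            return np.exp(-d2 / h)

        X = np.random.default_rng(0).integers(0, 3, size=(50, 500)).astype(float)
        K_gb, K_gk = linear_kernel(X), gaussian_kernel(X)   # toy 0/1/2 marker data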

  13. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction.

    PubMed

    Bandeira E Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-06-07

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models was fitted using two kernel methods: a linear kernel, the Genomic Best Linear Unbiased Predictor (GBLUP, GB), and a nonlinear Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (the HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and MDs models fitted with the Gaussian kernel (MDe-GK and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved for GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. Copyright © 2017 Bandeira e Sousa et al.

  14. GREENSCOPE: A Method for Modeling Chemical Process ...

    EPA Pesticide Factsheets

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evalua

  15. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  16. Heatstroke model for desert dry-heat environment and observed organ damage.

    PubMed

    ou Zhou, Ren; Liu, Jiang Wei; Zhang, Dong; Zhang, Qiong

    2014-06-01

    Heatstroke is one of the most common clinical emergencies. Heatstroke that occurs in a dry-heat environment such as a desert is usually more severe and often leads to death. However, the pathophysiologic mechanisms of heatstroke in the desert dry-heat environment have not been reported. Our objectives were to establish a rat model of heatstroke for the desert dry-heat environment, to assess the differing degrees of organ damage, and to preliminarily discuss the mechanism of heatstroke in the desert dry-heat environment. First, we established a rat heatstroke model for the desert dry-heat environment. Second, we assessed changes in the morphology and blood indicators of heatstroke rats in the desert dry-heat environment. The heatstroke rats exhibited characteristic changes in mean arterial pressure, core temperature, and heart rate. Organ damage progressed from mild to serious levels, specifically in morphology, in blood enzymology parameters such as alanine aminotransferase, aspartate aminotransferase, creatinine, urea, uric acid, creatine kinase-MB, and creatine kinase, and in blood gas parameters such as base excess of extracellular fluid and bicarbonate ions (HCO3-). We successfully established a rat heatstroke model for the desert dry-heat environment. We identified characteristic changes in the physiological indicators of heatstroke rats and varying degrees of organ damage, which were aggravated as the heatstroke evolved in the desert dry-heat environment. We preliminarily discussed the mechanism of heatstroke in the desert dry-heat environment. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. Hyporheic flow and transport processes: mechanisms, models, and biogeochemical implications

    USGS Publications Warehouse

    Boano, Fulvio; Harvey, Judson W.; Marion, Andrea; Packman, Aaron I.; Revelli, Roberto; Ridolfi, Luca; Anders, Wörman

    2014-01-01

    Fifty years of hyporheic zone research have shown the important role played by the hyporheic zone as an interface between groundwater and surface waters. However, it is only in the last two decades that what began as an empirical science has become a mechanistic science devoted to modeling studies of the complex fluid dynamical and biogeochemical mechanisms occurring in the hyporheic zone. These efforts have led to the picture of surface-subsurface water interactions as regulators of the form and function of fluvial ecosystems. Rather than being isolated systems, surface water bodies continuously interact with the subsurface. Exploration of hyporheic zone processes has led to a new appreciation of their wide reaching consequences for water quality and stream ecology. Modern research aims toward a unified approach, in which processes occurring in the hyporheic zone are key elements for the appreciation, management, and restoration of the whole river environment. In this unifying context, this review summarizes results from modeling studies and field observations about flow and transport processes in the hyporheic zone and describes the theories proposed in hydrology and fluid dynamics developed to quantitatively model and predict the hyporheic transport of water, heat, and dissolved and suspended compounds from sediment grain scale up to the watershed scale. The implications of these processes for stream biogeochemistry and ecology are also discussed.

  18. Hyporheic flow and transport processes: Mechanisms, models, and biogeochemical implications

    NASA Astrophysics Data System (ADS)

    Boano, F.; Harvey, J. W.; Marion, A.; Packman, A. I.; Revelli, R.; Ridolfi, L.; Wörman, A.

    2014-12-01

    Fifty years of hyporheic zone research have shown the important role played by the hyporheic zone as an interface between groundwater and surface waters. However, it is only in the last two decades that what began as an empirical science has become a mechanistic science devoted to modeling studies of the complex fluid dynamical and biogeochemical mechanisms occurring in the hyporheic zone. These efforts have led to the picture of surface-subsurface water interactions as regulators of the form and function of fluvial ecosystems. Rather than being isolated systems, surface water bodies continuously interact with the subsurface. Exploration of hyporheic zone processes has led to a new appreciation of their wide reaching consequences for water quality and stream ecology. Modern research aims toward a unified approach, in which processes occurring in the hyporheic zone are key elements for the appreciation, management, and restoration of the whole river environment. In this unifying context, this review summarizes results from modeling studies and field observations about flow and transport processes in the hyporheic zone and describes the theories proposed in hydrology and fluid dynamics developed to quantitatively model and predict the hyporheic transport of water, heat, and dissolved and suspended compounds from sediment grain scale up to the watershed scale. The implications of these processes for stream biogeochemistry and ecology are also discussed.

  19. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    PubMed

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model, to which we make adaptations for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making subsystem to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also by patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for the dynamic environment of laparoscopic surgery. This revised model may be used as a foundation for a model describing robotic

  20. The Role of the Built Environment: How Decentralized Nurse Stations Shape Communication, Patient Care Processes, and Patient Outcomes.

    PubMed

    Real, Kevin; Bardach, Shoshana H; Bardach, David R

    2017-12-01

    Increasingly, health communication scholars are attending to how hospital built environments shape communication, patient care processes, and patient outcomes. This multimethod study was conducted on two floors of a newly designed urban hospital. Nine focus group interviews were conducted with 35 health care professionals from 10 provider groups. Seven of the groups were homogeneous by profession or level: nursing (three groups), nurse managers (two groups), and one group each of nurse care technicians ("techs") and physicians. Two mixed groups were comprised of staff from pharmacy, occupational therapy, patient care facilitators, physical therapy, social work, and pastoral care. Systematic qualitative analysis was conducted using a conceptual framework based on systems theory and prior health care design and communication research. Additionally, quantitative modeling was employed to assess walking distances in two different hospital designs. Results indicate nurses walked significantly more in the new hospital environment. Qualitative analysis revealed three insights developed in relationship to system structures, processes, and outcomes. First, decentralized nurse stations changed system interdependencies by reducing nurse-to-nurse interactions and teamwork while heightening nurse interdependencies and teamwork with other health care occupations. Second, many nursing-related processes remained centralized while nurse stations were decentralized, creating systems-based problems for nursing care. Third, nursing communities of practice were adversely affected by the new design. Implications of this study suggest that nurse station design shapes communication, patient care processes, and patient outcomes. Further, it is important to understand how the built environment, often treated as invisible in communication research, is crucial to understanding communication within complex health care systems.

  1. Comprehensive Numerical Modeling of the Blast Furnace Ironmaking Process

    NASA Astrophysics Data System (ADS)

    Zhou, Chenn; Tang, Guangwu; Wang, Jichao; Fu, Dong; Okosun, Tyamo; Silaen, Armin; Wu, Bin

    2016-05-01

    Blast furnaces are counter-current chemical reactors, widely utilized in the ironmaking industry. Hot reduction gases injected from lower regions of the furnace ascend, reacting with the descending burden. Through this reaction process, iron ore is reduced into liquid iron that is tapped from the furnace hearth. Due to the extremely harsh environment inside the blast furnace, it is difficult to measure or observe internal phenomena during operation. Through the collaboration between steel companies and the Center for Innovation through Visualization and Simulation, multiple computational fluid dynamics (CFD) models have been developed to simulate the complex multiphase reacting flow in the three regions of the furnace: the shaft, the raceway, and the hearth. The models have been used effectively to troubleshoot and optimize blast furnace operations. In addition, the CFD models have been integrated with virtual reality. An interactive virtual blast furnace has been developed for training purposes. This paper summarizes the developments and applications of blast furnace CFD models and the virtual blast furnace.

  2. A new Mars radiation environment model with visualization

    NASA Technical Reports Server (NTRS)

    De Angelis, G.; Clowdsley, M. S.; Singleterry, R. C.; Wilson, J. W.

    2004-01-01

    A new model for the radiation environment to be found on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed at the NASA Langley Research Center. Solar-modulated primary particles rescaled for Mars conditions are transported through the Martian atmosphere, with temporal properties modeled with variable timescales, down to the surface, with altitude and backscattering patterns taken into account. The Martian atmosphere has been modeled by using the Mars Global Reference Atmospheric Model--version 2001 (Mars-GRAM 2001). The altitude to compute the atmospheric thickness profile has been determined by using a model for the topography based on the data provided by the Mars Orbiter Laser Altimeter (MOLA) instrument on board the Mars Global Surveyor (MGS) spacecraft. The Mars surface composition has been modeled based on averages over the measurements obtained from orbiting spacecraft and at various landing sites, taking into account the possible volatile inventory (e.g., CO2 ice, H2O ice) along with its time variation throughout the Martian year. Particle transport has been performed with the HZETRN heavy ion code. The Mars Radiation Environment Model has been made available worldwide through the Space Ionizing Radiation Effects and Shielding Tools (SIREST) website, a project of NASA Langley Research Center. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  3. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process-level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  4. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process-level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  5. Blob-Spring Model for the Dynamics of Ring Polymer in Obstacle Environment

    NASA Astrophysics Data System (ADS)

    Lele, Ashish K.; Iyer, Balaji V. S.; Juvekar, Vinay A.

    2008-07-01

    The dynamical behavior of cyclic macromolecules in a fixed obstacle (FO) environment is very different from the behavior of linear chains in the same topological environment; while the latter relax by a snake-like reptational motion from their chain ends, the former can relax only by contour length fluctuations since they are endless. Duke, Obukhov and Rubinstein proposed a scaling model (the DOR model) to interpret the dynamical scaling exponents shown by Monte Carlo simulations of rings in a FO environment. We present a model (blob-spring model) to describe the dynamics of a flexible and non-concatenated ring polymer in a FO environment, based on a theoretical formulation developed for the dynamics of an unentangled fractal polymer. We argue that the perpetual evolution of the ring perimeter by the motion of contour segments results in an extra frictional load. Our model predicts self-similar dynamics with scaling exponents for the molecular weight dependence of the diffusion coefficient and relaxation times that are in agreement with the scaling model proposed by Obukhov et al.

  6. Visualizing the process of interaction in a 3D environment

    NASA Astrophysics Data System (ADS)

    Vaidya, Vivek; Suryanarayanan, Srikanth; Krishnan, Kajoli; Mullick, Rakesh

    2007-03-01

    As the imaging modalities used in medicine transition to increasingly three-dimensional data, the question of how best to interact with and analyze this data becomes ever more pressing. Immersive virtual reality systems seem to hold promise in tackling this, but how individuals learn and interact in these environments is not fully understood. Here we attempt to show some methods by which user interaction in a virtual reality environment can be visualized, and how this can allow us to gain greater insight into the process of interaction and learning in these systems. Also explored is the possibility of using this method to improve understanding and management of ergonomic issues within an interface.

  7. Reaction time for processing visual stimulus in a computer-assisted rehabilitation environment.

    PubMed

    Sanchez, Yerly; Pinzon, David; Zheng, Bin

    2017-10-01

    To examine the reaction time of human subjects processing information presented in the visual channel under both direct vision and a virtual rehabilitation environment while walking. Visual stimuli comprised eight math problems displayed in the peripheral vision of seven healthy human subjects in a virtual rehabilitation training environment (computer-assisted rehabilitation environment, CAREN) and a direct vision environment. Subjects were required to verbally report the results of these math calculations within a short period of time. Reaction time, measured by a Tobii eye tracker, and calculation accuracy were recorded and compared between the direct vision and virtual rehabilitation environments. Performance outcomes measured for both groups included reaction time, reading time, answering time and the verbal answer score. A significant difference between the groups was found only for reaction time (p = .004). Participants had more difficulty recognizing the first equation in the virtual environment, and their reaction time was faster in the direct vision environment. This reaction time delay should be kept in mind when designing skill training scenarios in virtual environments. This was a pilot project for a series of studies assessing the cognitive ability of stroke patients undertaking a rehabilitation program within a virtual training environment. Implications for rehabilitation: eye tracking is a reliable tool that can be employed in rehabilitation virtual environments; reaction time changes between direct vision and virtual environments.
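
    With seven subjects in a within-subject design, the comparison described above reduces to a paired test on per-subject reaction times. The sketch below shows the shape of that analysis; the numbers are invented placeholders, not the study's data.

```python
# Paired comparison of reaction times, direct vision vs. virtual display.
# All values are fabricated for illustration.
import numpy as np
from scipy import stats

direct_rt  = np.array([0.42, 0.39, 0.45, 0.41, 0.38, 0.44, 0.40])  # seconds
virtual_rt = np.array([0.55, 0.49, 0.60, 0.52, 0.47, 0.58, 0.51])

t, p = stats.ttest_rel(virtual_rt, direct_rt)   # within-subject design
print(f"mean delay = {np.mean(virtual_rt - direct_rt):.3f} s, "
      f"t = {t:.2f}, p = {p:.4f}")
```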

  8. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  9. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
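
    The hybrid idea can be caricatured in a few lines: a central broker handles subscription management (the client-server half) while data is forwarded directly between peers (the peer-to-peer half). The class names and topic below are invented; this is a conceptual sketch, not the information sharing protocol described in the paper.

```python
# Conceptual sketch of a hybrid client-server / peer-to-peer data flow.
from collections import defaultdict

class Broker:                       # client-server role: subscription registry
    def __init__(self):
        self.subscribers = defaultdict(list)
    def subscribe(self, topic, peer):
        self.subscribers[topic].append(peer)
    def peers_for(self, topic):
        return list(self.subscribers[topic])

class Peer:                         # peer-to-peer role: direct data exchange
    def __init__(self, name, broker):
        self.name, self.broker, self.latest = name, broker, {}
    def publish(self, topic, value):
        for peer in self.broker.peers_for(topic):   # broker only names peers;
            peer.receive(topic, value, self.name)   # data flows peer to peer
    def receive(self, topic, value, source):
        self.latest[topic] = (value, source)

broker = Broker()
fdo, eecom = Peer("FDO", broker), Peer("EECOM", broker)
broker.subscribe("cabin_pressure", eecom)
fdo.publish("cabin_pressure", 101.3)
print(eecom.latest)    # {'cabin_pressure': (101.3, 'FDO')}
```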

  10. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  11. Conservation Process Model (cpm): a Twofold Scientific Research Scope in the Information Modelling for Cultural Heritage

    NASA Astrophysics Data System (ADS)

    Fiorani, D.; Acierno, M.

    2017-05-01

    The aim of the present research is to develop an instrument able to adequately support the conservation process by means of a twofold approach, based on both a BIM environment and ontology formalisation. Although BIM has been successfully experimented with within the AEC (Architecture Engineering Construction) field, it has shown many drawbacks for architectural heritage. To cope with the unicity and, more generally, the complexity of ancient buildings, the applications developed so far have adapted BIM poorly to conservation design, with unsatisfactory results (Dore, Murphy 2013; Carrara 2014). In order to combine the achievements reached within AEC through the BIM environment (design control and management) with an appropriate, semantically enriched and flexible representation of the heritage object, the presented model has at its core a knowledge base developed through information ontologies and oriented around the formalization and computability of all the knowledge necessary for the full comprehension of the object of architectural heritage and its conservation. Such a knowledge representation is worked out upon conceptual categories defined above all within the scope of architectural criticism and conservation. The present paper aims at further extending the scope of conceptual modelling within cultural heritage conservation already formalized by the model. A special focus is directed on decay analysis and the surfaces conservation project.

  12. Topobathymetric LiDAR point cloud processing and landform classification in a tidal environment

    NASA Astrophysics Data System (ADS)

    Skovgaard Andersen, Mikkel; Al-Hamdani, Zyad; Steinbacher, Frank; Rolighed Larsen, Laurids; Brandbyge Ernstsen, Verner

    2017-04-01

    Historically, it has been difficult to create high resolution Digital Elevation Models (DEMs) in land-water transition zones due to shallow water depths and often challenging environmental conditions. This gap of information has been reflected as a "white ribbon" with no data in the land-water transition zone. In recent years, the technology of airborne topobathymetric Light Detection and Ranging (LiDAR) has proven capable of filling this gap by simultaneously capturing topographic and bathymetric elevation information, using only a single green laser. We collected green LiDAR point cloud data in the Knudedyb tidal inlet system in the Danish Wadden Sea in spring 2014. Creating a DEM from a point cloud requires the general processing steps of data filtering, water surface detection and refraction correction. However, there is no transparent and reproducible method for processing green LiDAR data into a DEM, specifically regarding the procedure of water surface detection and modelling. We developed a step-by-step procedure for creating a DEM from raw green LiDAR point cloud data, including a procedure for making a Digital Water Surface Model (DWSM) (see Andersen et al., 2017). Two different classification analyses were applied to the high resolution DEM: a geomorphometric and a morphological classification, respectively. The classification methods were originally developed for a small test area, but in this work we have used them to classify the complete Knudedyb tidal inlet system. References: Andersen MS, Gergely Á, Al-Hamdani Z, Steinbacher F, Larsen LR, Ernstsen VB (2017). Processing and performance of topobathymetric lidar data for geomorphometric and morphological classification in a high-energy tidal environment. Hydrol. Earth Syst. Sci., 21: 43-63, doi:10.5194/hess-21-43-2017. Acknowledgements: This work was funded by the Danish Council for Independent Research | Natural Sciences through the project "Process-based understanding and
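
    Of the three processing steps named above, refraction correction is the most self-contained, so a hedged sketch of the standard geometry is given here. It assumes the water surface has already been detected, and corrects a sub-surface return for Snell bending and the slower speed of light in water; it illustrates the textbook correction, not the authors' published procedure.

```python
# Two-step refraction correction for a green-lidar return below a detected
# water surface (textbook geometry; constants approximate).
import numpy as np

N_WATER = 1.33                     # refractive index of water

def refraction_correct(apparent_range_m, incidence_deg):
    """apparent_range_m: in-water slant range computed with the air speed
    of light; incidence_deg: beam angle from vertical at the surface."""
    theta_air = np.radians(incidence_deg)
    theta_w = np.arcsin(np.sin(theta_air) / N_WATER)   # Snell's law
    true_range = apparent_range_m / N_WATER            # light is slower in water
    depth = true_range * np.cos(theta_w)               # vertical component
    offset = true_range * np.sin(theta_w)              # horizontal shift
    return depth, offset

print(refraction_correct(5.0, incidence_deg=20.0))
```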

  13. Comparing Two Types of Model Progression in an Inquiry Learning Environment with Modelling Facilities

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Lazonder, Ard W.; de Jong, Ton

    2011-01-01

    The educational advantages of inquiry learning environments that incorporate modelling facilities are often challenged by students' poor inquiry skills. This study examined two types of model progression as means to compensate for these skill deficiencies. Model order progression (MOP), the predicted optimal variant, gradually increases the…

  14. a Framework for Voxel-Based Global Scale Modeling of Urban Environments

    NASA Astrophysics Data System (ADS)

    Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe

    2016-10-01

    The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but this approach has disadvantages. These are readily addressed by volumetric representations, especially when considering selective data acquisition, change detection and fast-changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data-quality-based approach for the import of range measurements are proposed. The capabilities of the framework are shown on a mobile laser scanning dataset of the Technical University of Munich. Furthermore, the loss of the compression techniques is evaluated and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage and real-time rendering of large urban models are feasible, even with off-the-shelf hardware.
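
    To make the contrast with raw point clouds concrete, the toy sketch below bins a synthetic cloud into an occupancy dictionary keyed by voxel index, a crude stand-in for the octree-style volumetric structures such frameworks typically use. All sizes and thresholds are invented.

```python
# Toy voxelization: from a point cloud to an occupancy grid.
import numpy as np
from collections import Counter

def voxelize(points, voxel_size=0.5):
    """Map Nx3 points to integer voxel indices and count hits per voxel."""
    idx = np.floor(points / voxel_size).astype(int)
    return Counter(map(tuple, idx))

rng = np.random.default_rng(1)
cloud = rng.uniform(0, 10, size=(100_000, 3))          # fake MLS point cloud
grid = voxelize(cloud)
occupied = {v: n for v, n in grid.items() if n >= 3}   # crude noise filter
print(len(occupied), "occupied voxels from", len(cloud), "points")
```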

  15. Realistic Modeling of Wireless Network Environments

    DTIC Science & Technology

    2015-03-01

    wireless environment, namely vehicular networks. We also made a number of improvements to an emulation-based wireless testbed to improve channel model […] and the two wireless devices used in the experiment. This testbed was used for point-to-point vehicular wireless experiments with […] DSRC-based vehicular networks (~5.9 GHz). We were able to meet that goal, as described below.

  16. Adapting Evaluations of Alternative Payment Models to a Changing Environment.

    PubMed

    Grannemann, Thomas W; Brown, Randall S

    2018-04-01

    To identify the most robust methods for evaluating alternative payment models (APMs) in the emerging health care delivery system environment. We assess the impact of widespread testing of alternative payment models on the ability to find credible comparison groups. We consider the applicability of factorial research designs for assessing the effects of these models. The widespread adoption of alternative payment models could effectively eliminate the possibility of comparing APM results with a "pure" control or comparison group unaffected by other interventions. In this new environment, factorial experiments have distinct advantages over the single-model experimental or quasi-experimental designs that have been the mainstay of recent tests of Medicare payment and delivery models. The best prospects for producing definitive evidence of the effects of payment incentives for APMs include fractional factorial experiments that systematically vary requirements and payment provisions within a payment model. © Health Research and Educational Trust.
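
    The factorial logic is easy to make concrete. The sketch below enumerates a full 2^3 design over three hypothetical APM provisions (the factor names are invented); a fractional factorial would run only a balanced subset of these cells.

```python
# Enumerate a full 2^3 factorial design over hypothetical APM provisions.
from itertools import product
from math import prod

factors = ["shared_savings", "quality_bonus", "downside_risk"]   # invented
design = list(product([0, 1], repeat=len(factors)))              # 8 cells

for cell in design:
    print({f: lvl for f, lvl in zip(factors, cell)})

# A half-fraction keeps only the cells whose +1/-1 coded levels multiply
# to +1, halving cost while confounding only higher-order interactions.
half = [c for c in design if prod(2 * l - 1 for l in c) == 1]
```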

  17. A road map for integrating eco-evolutionary processes into biodiversity models.

    PubMed

    Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique

    2013-05-01

    The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
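
    As a concrete anchor for this discussion, the sketch below integrates a classic Levins-type metapopulation model, the kind of minimal building block that an integrated framework such as the one proposed extends with abiotic constraints, dispersal, biotic interactions and evolution. Parameter values are illustrative.

```python
# Levins metapopulation model: dp/dt = c*p*(1 - p) - e*p.
import numpy as np
from scipy.integrate import solve_ivp

def levins(t, p, c=0.5, e=0.1):
    return c * p * (1.0 - p) - e * p   # colonization minus extinction

sol = solve_ivp(levins, (0, 100), [0.05])
print("equilibrium occupancy ~", round(float(sol.y[0, -1]), 3),
      "(theory: 1 - e/c =", 1 - 0.1 / 0.5, ")")
```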

  18. Radiography for intensive care: participatory process analysis in a PACS-equipped and film/screen environment

    NASA Astrophysics Data System (ADS)

    Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes

    2002-05-01

    If new technology is introduced into medical practice, it must prove to make a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and the economic benefits are still under debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews with the involved staff, structural analysis and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care' from the ordering of an image to the provision of the final product (image + report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.

  19. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua

    2014-11-01

    Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, it does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models, and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as RELAP5 [4]. The overall methodology aims to: • address multiple aging mechanisms involving a large number of components in a computationally feasible manner where the sequencing of events is conditioned on the physical conditions predicted in a simulation

  20. Workflows for microarray data processing in the Kepler environment.

    PubMed

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  1. Polymerisation processes in epoxy resins under influence of free space environment

    NASA Astrophysics Data System (ADS)

    Kondyurin, A.; Lauke, B.; Kondyurina, I.

    The creation of large-size constructions in space or on celestial bodies is possible by way of chemical reactions of liquid viscous components under space environment conditions [1-2]. In particular, a new technology for a large-size space module for electronic components, energy and materials production is being developed on the basis of a polymerisation technique. The factors of the free space environment have a significant influence on the polymerisation processes. The polymerisation processes in active liquid components are sensitive to microgravity, temperature variations (-150…+150 °C), high vacuum (10⁻³…10⁻⁷ Pa), atomic oxygen flux (in LEO), UV and VUV irradiation, X-ray and γ-irradiation, and high-energy electron and ion fluxes. Experiments on polymerisation processes under simulated free space conditions were conducted. The influences of high vacuum, a high-energy ion beam, and rf- and mw-plasma on the polymerisation of epoxy resins were observed, as were the effects of evaporation of low-molecular-weight components, free-radical formation, additional chemical reactions and mixing processes during polymerisation. Our results showed that the space factors can initiate the polymerisation reaction in the epoxy matrix of glass and carbon fibre composites. The result can be used in a technology for large-size constructions in Earth orbit, in far space and on celestial bodies, such as deployed antennas, solar sail stringers, solar shield stringers, frames for large-size space stations, frames for Moon, Mars and asteroid bases, and frames for space plants in Earth orbit and on other celestial bodies. The study was partially supported by the Alexander von Humboldt Foundation (A. Kondyurin) and the European Space Agency, ESTEC (contract 17083/03/NL/Sfe "Space Environmental Effects on the Polymerisation of Composite Structures"). 1. A. Kondyurin, B. Lauke, Polymerisation processes in simulated free space conditions, Proceedings of the 9th International Symposium on Materials in a Space Environment

  2. Visual Modelling of Learning Processes

    ERIC Educational Resources Information Center

    Copperman, Elana; Beeri, Catriel; Ben-Zvi, Nava

    2007-01-01

    This paper introduces various visual models for the analysis and description of learning processes. The models analyse learning on two levels: the dynamic level (as a process over time) and the functional level. Two types of model for dynamic modelling are proposed: the session trace, which documents a specific learner in a particular learning…

  3. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part II. Seedling emergence timing

    USDA-ARS?s Scientific Manuscript database

    Predictions of seedling emergence timing for spring wheat are facilitated by process-based modeling of the microsite environment in the shallow seedling recruitment zone. Hourly temperature and water profiles within the recruitment zone for 60 days after planting were simulated from the process-base...

  4. Effect of river flow fluctuations on riparian vegetation dynamics: Processes and models

    NASA Astrophysics Data System (ADS)

    Vesipa, Riccardo; Camporeale, Carlo; Ridolfi, Luca

    2017-12-01

    Several decades of field observations, laboratory experiments and mathematical modelling have demonstrated that the riparian environment is a disturbance-driven ecosystem, and that the main source of disturbance is river flow fluctuations. The focus of the present work is the key role that flow fluctuations play in determining the abundance, zonation and species composition of patches of riparian vegetation. To this aim, the scientific literature on the subject over the last 20 years has been reviewed. First, the most relevant ecological, morphological and chemical mechanisms induced by river flow fluctuations are described from a process-based perspective. The role of flow variability is discussed for the processes that affect the recruitment of vegetation, the vegetation during its adult life, and the morphological and nutrient dynamics occurring in the riparian habitat. Particular emphasis is given to studies that aimed to quantify the effect of these processes on vegetation and to link them to the statistical characteristics of river hydrology. Second, the advances made from a modeling point of view are considered and discussed. The main models that have been developed to describe the dynamics of riparian vegetation are presented. Different modeling approaches are compared, and the corresponding advantages and drawbacks are pointed out. Finally, attention is paid to identifying the processes considered by the models, and these processes are compared with those that have actually been observed or measured in field and laboratory studies.

  5. Interdisciplinary shared governance: a partnership model for high performance in a managed care environment.

    PubMed

    Anderson, D A; Bankston, K; Stindt, J L; Weybright, D W

    2000-09-01

    Today's managed care environment is forcing hospitals to seek new and innovative ways to deliver a seamless continuum of high-quality care and services to defined populations at lower costs. Many are striving to achieve this goal through the implementation of shared governance models that support point-of-service decision making, interdisciplinary partnerships, and the integration of work across clinical settings and along the service delivery continuum. The authors describe the key processes and strategies used to facilitate the design and successful implementation of an interdisciplinary shared governance model at The University Hospital, Cincinnati, Ohio. Implementation costs and initial benefits obtained over a 2-year period also are identified.

  6. A virtual environment for modeling and testing sensemaking with multisensor information

    NASA Astrophysics Data System (ADS)

    Nicholson, Denise; Bartlett, Kathleen; Hoppenfeld, Robert; Nolan, Margaret; Schatz, Sae

    2014-05-01

    Given today's challenging Irregular Warfare, members of small infantry units must be able to function as highly sensitized perceivers throughout large operational areas. Improved Situation Awareness (SA) in rapidly changing fields of operation may also save lives of law enforcement personnel and first responders. Critical competencies for these individuals include sociocultural sensemaking, the ability to assess a situation through the perception of essential salient environmental and behavioral cues, and intuitive sensemaking, which allows experts to act with the utmost agility. Intuitive sensemaking and intuitive decision making (IDM), which involve processing information at a subconscious level, have been cited as playing a critical role in saving lives and enabling mission success. This paper discusses the development of a virtual environment for modeling, analysis and human-in-the-loop testing of perception, sensemaking, intuitive sensemaking, decision making (DM), and IDM performance, using state-of-the-art scene simulation and modeled imagery from multi-source systems, under the "Intuition and Implicit Learning" Basic Research Challenge (I2BRC) sponsored by the Office of Naval Research (ONR). We present results from our human systems engineering approach including 1) development of requirements and test metrics for individual and integrated system components, 2) the system architecture design 3) images of the prototype virtual environment testing system and 4) a discussion of the system's current and future testing capabilities. In particular, we examine an Enhanced Interaction Suite testbed to model, test, and analyze the impact of advances in sensor spatial, and temporal resolution to a user's intuitive sensemaking and decision making capabilities.

  7. Rapid prototyping and AI programming environments applied to payload modeling

    NASA Technical Reports Server (NTRS)

    Carnahan, Richard S., Jr.; Mendler, Andrew P.

    1987-01-01

    This effort focused on using artificial intelligence (AI) programming environments and rapid prototyping to aid in both space flight manned and unmanned payload simulation and training. Significant problems addressed are the large amount of development time required to design and implement just one of these payload simulations and the relative inflexibility of the resulting model to accepting future modification. Results of this effort have suggested that both rapid prototyping and AI programming environments can significantly reduce development time and cost when applied to the domain of payload modeling for crew training. The techniques employed are applicable to a variety of domains where models or simulations are required.

  8. Exploring the Processes of Generating LOD (0-2) Citygml Models in Greater Municipality of Istanbul

    NASA Astrophysics Data System (ADS)

    Buyuksalih, I.; Isikdag, U.; Zlatanova, S.

    2013-08-01

    3D models of cities, visualised and explored in 3D virtual environments, have been available for several years. Currently, a large number of impressive, realistic 3D models are regularly presented at scientific, professional and commercial events. One of the most promising developments is the OGC standard CityGML. CityGML is an object-oriented model that supports 3D geometry and thematic semantics, attributes and relationships, and offers advanced options for realistic visualization. One of the very attractive characteristics of the model is its support for 5 levels of detail (LOD), starting from a less accurate 2.5D model (LOD0) and ending with a very detailed indoor model (LOD4). Different local government offices and municipalities have different needs when utilizing CityGML models, and the process of model generation depends on local and domain-specific needs. Although the processes (i.e. the tasks and activities) for generating the models differ depending on the utilization purpose, there are also some common tasks (i.e. common-denominator processes) in the generation of CityGML models. This paper focuses on defining the common tasks in the generation of LOD (0-2) CityGML models and representing them in a formal way with process modeling diagrams.

  9. Pharmaceutical process chemistry: evolution of a contemporary data-rich laboratory environment.

    PubMed

    Caron, Stéphane; Thomson, Nicholas M

    2015-03-20

    Over the past 20 years, the industrial laboratory environment has gone through a major transformation in the industrial process chemistry setting. In order to discover and develop robust and efficient syntheses and processes for a pharmaceutical portfolio with growing synthetic complexity and increased regulatory expectations, the round-bottom flask and other conventional equipment familiar to a traditional organic chemistry laboratory are being replaced. The new process chemistry laboratory fosters multidisciplinary collaborations by providing a suite of tools capable of delivering deeper process understanding through mechanistic insights and detailed kinetics translating to greater predictability at scale. This transformation is essential to the field of organic synthesis in order to promote excellence in quality, safety, speed, and cost efficiency in synthesis.

  10. Model of a programmable quantum processing unit based on a quantum transistor effect

    NASA Astrophysics Data System (ADS)

    Ablayev, Farid; Andrianov, Sergey; Fetisov, Danila; Moiseev, Sergey; Terentyev, Alexandr; Urmanchev, Andrey; Vasiliev, Alexander

    2018-02-01

    In this paper we propose a model of a programmable quantum processing device realizable with existing nano-photonic technologies. It can be viewed as a basis for new high performance hardware architectures. Protocols for the physical implementation of the device, based on controlled photon transfer and atomic transitions, are presented. These protocols are designed for executing basic single-qubit and multi-qubit gates forming a universal set. We analyze the possible operation of this quantum computer scheme. We then formalize the physical architecture by a mathematical model of a Quantum Processing Unit (QPU), which we use as a basis for the Quantum Programming Framework. This framework makes it possible to perform universal quantum computations in a multitasking environment.
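
    For readers unfamiliar with what a "universal set" of gates buys, the sketch below simulates one common choice (H, T, CNOT) as plain matrix algebra and uses it to prepare a Bell state. This is a generic illustration, not the specific gate set or photonic protocol of the paper.

```python
# A common universal gate set, simulated as matrices on state vectors.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard
T = np.diag([1, np.exp(1j * np.pi / 4)])              # pi/8 phase gate
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])

state = np.zeros(4, dtype=complex); state[0] = 1.0    # |00>
state = np.kron(H, np.eye(2)) @ state                 # H on the first qubit
state = CNOT @ state                                  # entangle
print(np.round(state, 3))                             # Bell state amplitudes
```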

  11. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  12. Three layer functional model and energy exchange concept of aging process

    PubMed Central

    Mihajlovic, William

    2006-01-01

    Relying on a certain degree of abstraction, we can propose that no particular distinction exists between animate or living matter and inanimate matter. While focusing attention on some specifics, the dividing line between the two can be drawn. The most apparent distinction is in the level of structural and functional organization with the dissimilar streams of ‘energy flow’ between the observed entity and the surrounding environment. In essence, living matter is created from inanimate matter which is organized to contain internal intense energy processes and maintain lower intensity energy exchange processes with the environment. Taking internal and external energy processes into account, we contend in this paper that living matter can be referred to as matter of dissipative structure, with this structure assumed to be a common quality of all living creatures and living matter in general. Interrupting internal energy conversion processes and terminating the controlled energy exchange with the environment leads to degeneration of dissipative structure and reduction of the same to inanimate matter (gas, liquid and/or solid inanimate substances), and ultimately to what can be called ‘death.’ This concept of what we call dissipative nature can be extended from living organisms to social groups of animals, to mankind. An analogy based on the organization of matter provides a basis for a functional model of living entities. The model relies on the parallels among the three central structures of any cell (nucleus, cytoplasm and outer membrane) and the human body (central organs, body fluids along with the connective tissues, and external skin integument). This three-part structural organization may be observed almost universally in nature. It can be observed from the atomic structure to the planetary and intergalactic organizations. This similarity is corroborated by the membrane theory applied to living organisms. According to the energy nature of living matter

  13. Modeling Gene-Environment Interactions With Quasi-Natural Experiments.

    PubMed

    Schmitz, Lauren; Conley, Dalton

    2017-02-01

    This overview develops new empirical models that can effectively document Gene × Environment (G×E) interactions in observational data. Current G×E studies are often unable to support causal inference because they use endogenous measures of the environment or fail to adequately address the nonrandom distribution of genes across environments, confounding estimates. Comprehensive measures of genetic variation are incorporated into quasi-natural experimental designs to exploit exogenous environmental shocks or isolate variation in environmental exposure to avoid potential confounders. In addition, we offer insights from population genetics that improve upon extant approaches to address problems from population stratification. Together, these tools offer a powerful way forward for G×E research on the origin and development of social inequality across the life course. © 2015 Wiley Periodicals, Inc.
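
    In regression form, the designs described above amount to estimating the coefficient on a gene-by-shock interaction. The sketch below simulates such a setup with an exogenous binary shock; all data and effect sizes are fabricated, purely to show the estimating equation y = b0 + b1*G + b2*E + b3*(G x E) + error.

```python
# Stylized G x E regression with a simulated exogenous shock.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
G = rng.normal(size=n)                  # polygenic score
E = rng.integers(0, 2, size=n)          # exogenous shock, e.g. a policy change
y = 0.3 * G + 0.5 * E + 0.2 * G * E + rng.normal(size=n)

X = np.column_stack([np.ones(n), G, E, G * E])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated [const, G, E, GxE]:", np.round(beta, 3))
```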

  14. Modeling nuclear processes by Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviour. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetics equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
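
    The same block-diagram exercise translates directly into code. Below is a hedged Python analogue (scipy rather than Simulink) of one of the listed examples, point reactor kinetics with a single delayed-neutron group; the constants are generic textbook values, not taken from the paper.

```python
# Point kinetics with one delayed-neutron group, after a small reactivity step.
import numpy as np
from scipy.integrate import solve_ivp

BETA, LAM, GEN = 0.0065, 0.08, 1e-4      # beta, decay const (1/s), gen. time (s)

def kinetics(t, y, rho=0.001):           # rho: inserted reactivity
    n, c = y                             # neutron density, precursor density
    dn = (rho - BETA) / GEN * n + LAM * c
    dc = BETA / GEN * n - LAM * c
    return [dn, dc]

y0 = [1.0, BETA / (GEN * LAM)]           # steady state at rho = 0
sol = solve_ivp(kinetics, (0, 10), y0, max_step=0.01)
print("relative power after 10 s:", round(float(sol.y[0, -1]), 2))
```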

  15. Neutron Environment Calculations for Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Clowdsley, M. S.; Wilson, J. W.; Shinn, J. L.; Badavi, F. F.; Heinbockel, J. H.; Atwell, W.

    2001-01-01

    The long-term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind, which varies over the solar cycle. The HZETRN high charge and energy transport code developed at NASA Langley Research Center can be used to evaluate the neutron environment on ISS. A time-dependent model for the ambient environment in low Earth orbit is used. This model includes GCR radiation moderated by the Earth's magnetic field, trapped protons, and a recently completed model of the albedo neutron environment formed through the interaction of galactic cosmic rays with the Earth's atmosphere. Using this code, the neutron environments for space shuttle missions were calculated and compared to measurements made by the Johnson Space Center with onboard detectors. The models discussed herein are being developed to evaluate the natural and induced environment data for the Intelligent Synthesis Environment Project and eventual use in spacecraft optimization.

  16. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  17. Mathematical modeling of heat treatment processes conserving biological activity of plant bioresources

    NASA Astrophysics Data System (ADS)

    Rodionova, N. S.; Popov, E. S.; Pozhidaeva, E. A.; Pynzar, S. S.; Ryaskina, L. O.

    2018-05-01

    The aim of this study is to develop a mathematical model of the heat exchange during LT-processing, in order to estimate the dynamics of temperature field changes and optimize the regime parameters, taking into account the non-stationarity of the process and the physicochemical and thermophysical properties of food systems. The application of LT-processing, based on the use of low-temperature modes in the thermal culinary processing of raw materials with preliminary vacuum packaging in a polymer heat-resistant film, is a promising trend in the development of techniques and technology in the catering field. The application of LT-processing to food raw materials preserves the biologically active substances in food environments, which are characterized by a certain thermolability, and also extends the shelf life and maintains the high consumer characteristics of food systems, which are capillary-porous bodies. In the mathematical modeling of LT-processing, the symbolic mathematics package Maple was used, as well as the flexPDE package, which uses the finite element method for modeling objects with distributed parameters. The experimental results were processed with software developed in the Python 3.4 programming language. To calculate and optimize the parameters of the LT-processing of polycomponent food systems, the differential equation of non-stationary heat conduction was used, the solution of which gives the temperature at any point of the solid at any moment. The present study provides data on the thermophysical characteristics of a polycomponent food system based on plant raw materials, with the help of which a physico-mathematical model of LT-processing has been developed. The obtained mathematical model allows determination of the dynamics of the temperature field in different sections of LT-processed polycomponent food systems on the basis of calculating the
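
    To show what solving that equation looks like in practice, here is a minimal explicit finite-difference sketch for a one-dimensional slab. The diffusivity, geometry and bath temperature are assumed placeholder values, not the paper's measured properties.

```python
# 1D unsteady heat conduction, dT/dt = alpha * d2T/dx2, explicit scheme.
import numpy as np

alpha = 1.4e-7            # thermal diffusivity of a moist food, m^2/s (assumed)
L, nx = 0.02, 41          # 2 cm slab, grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha  # stability requires dt <= dx^2 / (2*alpha)

T = np.full(nx, 20.0)     # initial product temperature, deg C
T_bath = 65.0             # LT water-bath temperature (assumed)

for _ in range(int(1800 / dt)):              # simulate 30 minutes
    T[0] = T[-1] = T_bath                    # surfaces held at bath temperature
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"core temperature after 30 min: {T[nx // 2]:.1f} C")
```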

  18. A biologically inspired model of bat echolocation in a cluttered environment with inputs designed from field Recordings

    NASA Astrophysics Data System (ADS)

    Loncich, Kristen Teczar

    Bat echolocation strategies and the neural processing of acoustic information, with a focus on cluttered environments, are investigated in this study. How a bat processes the dense field of echoes received while navigating and foraging in the dark is not well understood. While several models have been developed to describe the mechanisms behind bat echolocation, most are based in mathematics rather than biology, and focus on either peripheral or neural processing, without exploring how these two levels of processing are vitally connected. Current echolocation models also do not use habitat-specific acoustic input, or account for field observations of echolocation strategies. Here, a new approach to echolocation modeling is described, capturing the full picture of echolocation from signal generation to a neural picture of the acoustic scene. A biologically inspired echolocation model is developed using field measurements of the interpulse interval timing used by a frequency-modulating (FM) bat in the wild, with a whole-method approach to modeling echolocation including habitat-specific acoustic inputs, a biologically accurate peripheral model of sound processing by the outer, middle and inner ear, and finally a neural model incorporating established auditory pathways and neuron types with echolocation adaptations. The field recordings analyzed underscore bat sonar design differences observed in the laboratory and in the wild, and suggest a correlation between interpulse interval groupings and increased clutter. The scenario model provides habitat- and behavior-specific echoes and is a useful tool for both modeling and behavioral studies, and the peripheral and neural models show that spike-time information and echolocation-specific neuron types can produce target localization in the midbrain.

  19. The Quality of Home Environment in Brazil: An Ecological Model

    ERIC Educational Resources Information Center

    de Oliveira, Ebenezer A.; Barros, Fernando C.; Anselmi, Luciana D. da Silva; Piccinini, Cesar A.

    2006-01-01

    Based on Bronfenbrenner's (1999) ecological perspective, a longitudinal, prospective model of individual differences in the quality of home environment (Home Observation for Measurement of the Environment--HOME) was tested in a sample of 179 Brazilian children and their families. Perinatal measures of family socioeconomic status (SES) and child…

  20. IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System

    PubMed Central

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-01-01

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153

  1. IoT-based user-driven service modeling environment for a smart space management system.

    PubMed

    Choi, Hoan-Suk; Rhee, Woo-Seop

    2014-11-20

    The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service.

  2. Predicting Material Performance in the Space Environment from Laboratory Test Data, Static Design Environments, and Space Weather Models

    NASA Technical Reports Server (NTRS)

    Minow, Josep I.; Edwards, David L.

    2008-01-01

    Qualifying materials for use in the space environment is typically accomplished with laboratory exposures to simulated UV/EUV, atomic oxygen, and charged particle radiation environments, with in-situ or subsequent measurements of the material properties of interest to the particular application. Choices of environment exposure levels are derived from static design environments intended to represent either mean or extreme conditions that are anticipated to be encountered during a mission. The real space environment, however, is quite variable. Predictions of the on-orbit performance of a material qualified to laboratory environments can be made using information on 'space weather' variations in the real environment. This presentation will first review the variability of space environments of concern for material degradation and then demonstrate techniques for using test data to predict material performance in a variety of space environments, from low Earth orbit to interplanetary space, using historical measurements and space weather models.

  3. A Virtual Environment for Process Management. A Step by Step Implementation

    ERIC Educational Resources Information Center

    Mayer, Sergio Valenzuela

    2003-01-01

    This paper presents a virtual organizational environment conceived by integrating three computer programs: a manufacturing simulation package, business process automation (workflows), and business intelligence (Balanced Scorecard) software. It was created as a supporting tool for teaching IE; its purpose is to give…

  4. Student involvement in the Geospace Environment Modeling (GEM) workshop

    NASA Astrophysics Data System (ADS)

    Allen, R. C.; Cohen, I. J.

    2014-12-01

    The Geospace Environment Modeling (GEM) workshop is a unique venue for students to begin to integrate into the magnetospheric community. GEM, an annual workshop funded by the NSF, allows students to present their research in a collaborative atmosphere and to engage with senior scientists as peers. This builds confidence in the students, while also allowing them to share ideas and strengthen their research. Each GEM workshop starts with a student-run and organized "student day", in which older students volunteer to present tutorials on different magnetospheric systems and processes. These tutorials strive to put the upcoming week of talks and posters in context while providing an overarching base understanding of the magnetospheric system. By starting the week with student taught tutorials, as well as icebreaker activities, the students become comfortable with asking questions and set the tone for the less formal student and discussion-oriented workshop.

  5. Application of overlay modeling and control with Zernike polynomials in an HVM environment

    NASA Astrophysics Data System (ADS)

    Ju, JaeWuk; Kim, MinGyu; Lee, JuHan; Nabeth, Jeremy; Jeon, Sanghuck; Heo, Hoyoung; Robinson, John C.; Pierson, Bill

    2016-03-01

    Shrinking technology nodes and smaller process margins require improved photolithography overlay control. Generally, overlay measurement results are modeled with Cartesian polynomial functions for both intra-field and inter-field models, and the model coefficients are sent to an advanced process control (APC) system operating in an XY Cartesian basis. Dampened overlay corrections, typically via an exponentially or linearly weighted moving average in time, are then retrieved from the APC system to apply on the scanner in XY Cartesian form for subsequent lot exposure. The goal of the above method is to process lots with corrections that target the least possible overlay misregistration in steady state as well as in change point situations. In this study, we model overlay errors on product using Zernike polynomials with the same fitting capability as the process of reference (POR) to represent the wafer-level terms, and use the standard Cartesian polynomials to represent the field-level terms. APC calculations for wafer-level correction are performed in the Zernike basis while field-level calculations use the standard XY Cartesian basis. Finally, weighted wafer-level correction terms are converted to XY Cartesian space in order to be applied on the scanner, along with field-level corrections, for future wafer exposures. Since Zernike polynomials have the property of being orthogonal in the unit disk, we are able to reduce the amount of collinearity between terms and improve overlay stability. Our real-time Zernike modeling and feedback evaluation was performed on a 20-lot dataset in a high volume manufacturing (HVM) environment. The measured on-product results were compared to POR and showed a 7% reduction in overlay variation, including a 22% reduction in term variation. This led to an on-product raw overlay Mean + 3Sigma X&Y improvement of 5% and resulted in a 0.1% yield improvement.
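
    A stripped-down illustration of the wafer-level step follows: a handful of low-order Zernike terms fitted by least squares to synthetic overlay data. The coefficients and noise are invented, and production overlay models use many more terms; the closing comment states the orthogonality argument the paper relies on.

```python
# Least-squares fit of low-order Zernike terms to synthetic overlay data.
import numpy as np

rng = np.random.default_rng(3)
r = np.sqrt(rng.uniform(0, 1, 500))        # uniform sampling over the unit disk
t = rng.uniform(0, 2 * np.pi, 500)
x, y = r * np.cos(t), r * np.sin(t)

# Piston, x-tilt, y-tilt, defocus (unnormalized radial forms).
basis = np.column_stack([np.ones_like(r), x, y, 2 * r**2 - 1])

overlay_x = 2.0 + 1.5 * x - 0.8 * (2 * r**2 - 1) + rng.normal(0, 0.3, 500)
coeffs, *_ = np.linalg.lstsq(basis, overlay_x, rcond=None)
print("fitted coefficients:", np.round(coeffs, 2))
# Because Zernike terms are orthogonal over the disk, the fitted
# coefficients are nearly uncorrelated, which stabilizes the feedback loop.
```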

  6. Mid- and long-term debris environment projections using the EVOLVE and CHAIN models

    NASA Astrophysics Data System (ADS)

    Eichler, Peter; Reynolds, Robert C.

    1995-06-01

    Results of debris environment projections are of great importance for evaluating the necessity and effectiveness of debris mitigation measures. EVOLVE and CHAIN are two models for debris environment projection that have been developed independently using different conceptual approaches. A comparison of results from these two models therefore provides a means of validating the debris environment projections they have made. EVOLVE is a model that requires mission model projections to describe future space operations; these projections include launch date, mission orbit altitude and inclination, mission duration, vehicle size and mass, and classification as an object capable of experiencing breakup from on-board stored energy. EVOLVE describes the orbital debris environment by the orbital elements of the objects in the environment. CHAIN is an analytic model that bins the debris environment in size and altitude rather than following the orbit evolution of individual debris fragments. The altitude/size bins are coupled by the initial spreading of fragments by collisions and the subsequent orbital decay behavior. A set of test cases covering a variety of space usage scenarios has been defined for the two models. In this paper, a comparison of the results is presented and sources of disagreement are identified and discussed. One major finding is that, despite differences in the results of the two models, the basic tendencies of the environment projections are independent of modeled uncertainties, leading to the demand for debris mitigation measures: explosion suppression and de-orbiting of rocket bodies and payloads after mission completion.

  7. Principal process analysis of biological models.

    PubMed

    Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc

    2018-06-14

    Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
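
    The decomposition is easy to caricature. In the sketch below, the right-hand side of a one-variable toy model (not the circadian model of the paper) is split into named processes, and each process is declared active wherever its relative contribution along the simulated trajectory exceeds a threshold.

```python
# Toy Principal-Process-Analysis-style activity decomposition.
import numpy as np
from scipy.integrate import solve_ivp

processes = {
    "production":  lambda x: 1.0 / (1.0 + x**4),    # repressible synthesis
    "degradation": lambda x: -0.5 * x,
    "leak":        lambda x: 0.01 * np.ones_like(x),
}

def rhs(t, x):
    return sum(f(x) for f in processes.values())

sol = solve_ivp(rhs, (0, 20), [0.1], t_eval=np.linspace(0, 20, 200))
x = sol.y[0]
weights = {name: np.abs(f(x)) for name, f in processes.items()}
total = sum(weights.values())
for name, w in weights.items():
    print(f"{name:12s} active on {np.mean(w / total > 0.1):.0%} of the trajectory")
```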

  8. Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables.

    PubMed

    Heck, Daniel W; Erdfelder, Edgar; Kieslich, Pascal J

    2018-05-24

    Multinomial processing tree models assume that discrete cognitive states determine observed response frequencies. Generalized processing tree (GPT) models extend this conceptual framework to continuous variables such as response times, process-tracing measures, or neurophysiological variables. GPT models assume finite-mixture distributions, with weights determined by a processing tree structure, and continuous components modeled by parameterized distributions such as Gaussians with separate or shared parameters across states. We discuss identifiability, parameter estimation, model testing, a modeling syntax, and the improved precision of GPT estimates. Finally, a GPT version of the feature comparison model of semantic categorization is applied to computer-mouse trajectories.
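
    A minimal numerical sketch of the finite-mixture idea follows: a hypothetical one-branch tree with detection probability d fixes the mixture weights, and each latent state contributes a parameterized Gaussian response-time component (all parameter values are illustrative):

        # A GPT-style likelihood: the processing tree fixes the mixture
        # weights (here a single detect/guess branch with probability d).
        import numpy as np
        from scipy.stats import norm

        def gpt_density(rt, d, mu_detect, mu_guess, sigma):
            return d * norm.pdf(rt, mu_detect, sigma) + \
                   (1.0 - d) * norm.pdf(rt, mu_guess, sigma)

        rts = np.array([0.45, 0.62, 0.80])   # response times in seconds
        loglik = np.log(gpt_density(rts, d=0.7, mu_detect=0.5,
                                    mu_guess=0.9, sigma=0.15)).sum()
        print(loglik)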

  9. Modified Process Reduces Porosity when Soldering in Reduced Gravity Environments

    NASA Technical Reports Server (NTRS)

    Watson, Kevin; Struk, Peter; Pettegrew, Richard; Downs, Robert; Haylett, Daniel

    2012-01-01

    A modified process yields lower levels of internal porosity for solder joints produced in reduced-gravity environments. The process incorporates both alternative materials and a modified procedure. The process provides the necessary cleaning action to enable effective bonding of the applied solder alloy with the materials to be joined. The modified process incorporates a commercially available liquid flux that is applied to the solder joint before heating with the soldering iron. It is subsequently heated with the soldering iron to activate the cleaning action of the flux and to evaporate most of the flux, followed by application of solder alloy in the form of commercially available solid solder wire (containing no flux). Continued heating ensures adequate flow of the solder alloy around and onto the materials to be joined. The final step is withdrawal of the soldering iron to allow alloy solidification and cooling of the solder joint.

  10. CLEW: A Cooperative Learning Environment for the Web.

    ERIC Educational Resources Information Center

    Ribeiro, Marcelo Blois; Noya, Ricardo Choren; Fuks, Hugo

    This paper outlines CLEW (collaborative learning environment for the Web). The project combines MUD (Multi-User Dimension), workflow, VRML (Virtual Reality Modeling Language) and educational concepts like constructivism in a learning environment where students actively participate in the learning process. The MUD shapes the environment structure.…

  11. Use of loglinear models to assess factors influencing concern for the natural environment.

    PubMed

    Lakhan, V Chris; Lavalle, Placido D

    2002-07-01

    Since it is necessary to isolate the most significant factors influencing personal concern for the environment, this paper uses loglinear models to identify the interactions and interrelationships underlying multidimensional environmental survey data. In a field study in Guyana, face-to-face interviews were conducted with 1600 citizens. The categorical data acquired were then subjected to loglinear modeling techniques to determine what influence the factors education, age, residential location, and gender have on personal concern for the environment. The loglinear models obtained from the five-dimensional contingency table suggest that there is a direct relationship between education and personal concern for the environment. Age interacts with education and has some influence on environmental concern, with younger respondents expressing higher concern for the environment than older respondents. Other results from the loglinear model demonstrate that residential location and the gender of the respondents do not have any statistically significant association with personal concern for the environment.
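
    As a hedged illustration of the loglinear approach, the sketch below fits a Poisson GLM to a toy two-way contingency table with statsmodels; the variables and counts are invented for illustration and are not the Guyana survey data:

        # Loglinear analysis of a toy two-way table via a Poisson GLM.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        counts = pd.DataFrame({
            "education": ["low", "low", "high", "high"],
            "concern":   ["low", "high", "low", "high"],
            "n":         [120, 80, 60, 140],
        })

        # Independence model; adding an education:concern term and comparing
        # deviances tests their association.
        fit = smf.glm("n ~ education + concern", data=counts,
                      family=sm.families.Poisson()).fit()
        print(fit.deviance)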

  12. Mapping care processes within a hospital: from theory to a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2005-03-01

    Today, the economic and regulatory environment, involving activity-based and prospective payment systems, healthcare quality and risk analysis, traceability of the acts performed and evaluation of care practices, accounts for the current interest in clinical and hospital information systems. The structured gathering of information relative to users' needs and system requirements is fundamental when installing such systems. This stage takes time and is generally misconstrued by caregivers and is of limited efficacy to analysts. We used a modelling technique designed for manufacturing processes (IDEF0/SADT). We enhanced the basic model of an activity with descriptors extracted from the Ishikawa cause-and-effect diagram (methods, men, materials, machines, and environment). We proposed an object data model of a process and its components, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary of a given process from the description of its elements and to locate documents (procedures, recommendations, instructions) according to each activity or role. Aimed at structuring needs and storing information provided by directly involved teams regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make in the analysis of clinical information systems.

  13. Multi objective optimization model for minimizing production cost and environmental impact in CNC turning process

    NASA Astrophysics Data System (ADS)

    Widhiarso, Wahyu; Rosyidi, Cucuk Nur

    2018-02-01

    Minimizing production cost in a manufacturing company will increase its profit. The cutting parameters affect total processing time, which in turn affects the production cost of the machining process. Besides affecting production cost and processing time, the cutting parameters also affect the environment. An optimization model is therefore needed to determine the optimum cutting parameters. In this paper, we develop a multi-objective optimization model to minimize the production cost and the environmental impact in the CNC turning process. Cutting speed and feed rate serve as the decision variables. Constraints considered are cutting speed, feed rate, cutting force, output power, and surface roughness. The environmental impact is converted from the environmental burden using eco-indicator 99. A numerical example is given to show the implementation of the model, solved using OptQuest of Oracle Crystal Ball software. The optimization results indicate that the model can be used to optimize the cutting parameters to minimize both the production cost and the environmental impact.
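
    A common way to handle such a two-objective problem is a weighted-sum scalarization; the sketch below shows the structure with placeholder cost and impact functions (these are illustrative assumptions, not the paper's actual machining model):

        # Weighted-sum sketch of the two-objective cutting-parameter problem.
        from scipy.optimize import minimize

        def cost(x):                 # x = [cutting speed (m/min), feed (mm/rev)]
            v, f = x
            return 100.0 / (v * f) + 0.01 * v   # machining-time term + tool wear

        def impact(x):               # eco-indicator-style proxy for energy use
            v, f = x
            return 0.002 * v**1.5 / f

        w = 0.5                      # trade-off weight between the objectives
        res = minimize(lambda x: w * cost(x) + (1.0 - w) * impact(x),
                       x0=[150.0, 0.2],
                       bounds=[(50.0, 300.0), (0.05, 0.5)])  # parameter limits
        print(res.x)                 # optimum cutting speed and feed rate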

  14. Reaction norm model with unknown environmental covariate to analyze heterosis by environment interaction.

    PubMed

    Su, G; Madsen, P; Lund, M S

    2009-05-01

    Crossbreeding is currently increasing in dairy cattle production. Several studies have shown an environment-dependent heterosis [i.e., an interaction between heterosis and environment (H x E)]. An H x E interaction is usually estimated from a few discrete environment levels. The present study proposes a reaction norm model to describe H x E interaction, which can deal with a large number of environment levels using few parameters. In the proposed model, total heterosis consists of an environment-independent part, which is described as a function of heterozygosity, and an environment-dependent part, which is described as a function of heterozygosity and environmental value (e.g., herd-year effect). A Bayesian approach is developed to estimate the environmental covariates, the regression coefficients of the reaction norm, and other parameters of the model simultaneously in both linear and nonlinear reaction norms. In the nonlinear reaction norm model, the H x E is approximated using linear splines. The approach was tested using simulated data, which were generated using an animal model with a reaction norm for heterosis. The simulation study includes 4 scenarios (the combinations of moderate vs. low heritability and moderate vs. low herd-year variation) of H x E interaction in a nonlinear form. In all scenarios, the proposed model predicted total heterosis very well. The correlation between true heterosis and predicted heterosis was 0.98 in the scenarios with low herd-year variation and 0.99 in the scenarios with moderate herd-year variation. This suggests that the proposed model and method could be a good approach to analyze H x E interactions and predict breeding values in situations in which heterosis changes gradually and continuously over an environmental gradient. On the other hand, it was found that a model ignoring H x E interaction did not significantly harm the prediction of breeding value under the simulated scenarios in which the variance for environment
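
    The decomposition of total heterosis described above can be sketched in its simplest (linear) form as below; the coefficients are illustrative, and the paper's Bayesian estimation and spline machinery are not reproduced:

        # Reaction-norm sketch: an environment-independent term plus an H x E
        # term that scales with an environmental value (e.g. a herd-year
        # effect). Coefficients are illustrative assumptions.
        import numpy as np

        def total_heterosis(het, env, b0=2.0, b1=0.8):
            return b0 * het + b1 * het * env   # independent part + H x E part

        env = np.linspace(-2.0, 2.0, 5)        # environmental gradient
        print(total_heterosis(het=0.5, env=env))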

  15. Numerical model of thermo-mechanical coupling for the tensile failure process of brittle materials

    NASA Astrophysics Data System (ADS)

    Fu, Yu; Wang, Zhe; Ren, Fengyu; Wang, Daguo

    2017-10-01

    A numerical model of thermal cracking with a thermo-mechanical coupling effect was established. The theory of tensile failure and heat conduction is used to study the tensile failure process of brittle materials, such as rock and concrete, in high-temperature environments. The validity of the model is verified against thick-wall cylinders with analytical solutions. The failure modes of brittle materials under thermal stresses caused by temperature gradients and differing thermal expansion coefficients were studied using a thick-wall cylinder model and an embedded particle model, respectively. In the thick-wall cylinder model, different forms of cracks induced by the temperature gradient were obtained under different temperature boundary conditions. In the embedded particle model, radial cracks were produced in the medium part with lower tensile strength as temperature increased, because of the mismatch in thermal expansion coefficients. Model results are in good agreement with experimental results, thereby providing a new finite element method for analyzing the thermal damage process and mechanism of brittle materials.

  16. Using ecosystem services to represent the environment in hydro-economic models

    NASA Astrophysics Data System (ADS)

    Momblanch, Andrea; Connor, Jeffery D.; Crossman, Neville D.; Paredes-Arquiola, Javier; Andreu, Joaquín

    2016-07-01

    Demand for water is expected to grow in line with global human population growth, but opportunities to augment supply are limited in many places due to resource limits and expected impacts of climate change. Hydro-economic models are often used to evaluate water resources management options, commonly with a goal of understanding how to maximise water use value and reduce conflicts among competing uses. The environment is now an important factor in decision making, which has resulted in its inclusion in hydro-economic models. We reviewed 95 studies applying hydro-economic models, and documented how the environment is represented in them and the methods they use to value environmental costs and benefits. We also sought out key gaps and inconsistencies in the treatment of the environment in hydro-economic models. We found that representation of environmental values of water is patchy in most applications, and there should be systematic consideration of the scope of environmental values to include and how they should be valued. We argue that the ecosystem services framework offers a systematic approach to identify the full range of environmental costs and benefits. The main challenges to more holistic representation of the environment in hydro-economic models are the current limits to understanding of ecological functions which relate physical, ecological and economic values and critical environmental thresholds; and the treatment of uncertainty.

  17. Construction material processed using lunar simulant in various environments

    NASA Technical Reports Server (NTRS)

    Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry

    1995-01-01

    The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties; the mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high-temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.

  18. Modeling the Biodegradability of Chemical Compounds Using the Online CHEmical Modeling Environment (OCHEM).

    PubMed

    Vorberg, Susann; Tetko, Igor V

    2014-01-01

    Biodegradability describes the capacity of substances to be mineralized by free-living bacteria. It is a crucial property in estimating a compound's long-term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined in a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660.
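
    The consensus idea can be shown in miniature by averaging class probabilities from models trained on the same endpoint; the sketch below uses synthetic data and generic learners, and does not reproduce OCHEM's descriptor sets or methods:

        # Consensus classification sketch: average member probabilities.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=200, n_features=10, random_state=0)
        members = [RandomForestClassifier(random_state=0),
                   LogisticRegression(max_iter=1000)]
        probs = np.mean([m.fit(X, y).predict_proba(X)[:, 1] for m in members],
                        axis=0)
        labels = (probs > 0.5).astype(int)   # consensus "ready biodegradable" call
        print(labels[:10])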

  19. Database integration in a multimedia-modeling environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorow, Kevin E.

    2002-09-02

    Integration of data from disparate remote sources has direct applicability to modeling, which can support Brownfield assessments. To accomplish this task, a data integration framework needs to be established. A key element in this framework is the metadata that creates the relationship between the pieces of information that are important in the multimedia modeling environment and the information that is stored in the remote data source. The design philosophy is to allow modelers and database owners to collaborate by defining this metadata in such a way that allows interaction between their components. The main parts of this framework include tools to facilitate metadata definition, database extraction plan creation, automated extraction plan execution / data retrieval, and a central clearing house for metadata and modeling / database resources. Cross-platform compatibility (using Java) and standard communications protocols (http / https) allow these parts to run in a wide variety of computing environments (Local Area Networks, Internet, etc.), and, therefore, this framework provides many benefits. Because of the specific data relationships described in the metadata, the amount of data that have to be transferred is kept to a minimum (only the data that fulfill a specific request are provided as opposed to transferring the complete contents of a data source). This allows for real-time data extraction from the actual source. Also, the framework sets up collaborative responsibilities such that the different types of participants have control over the areas in which they have domain knowledge: the modelers are responsible for defining the data relevant to their models, while the database owners are responsible for mapping the contents of the database using the metadata definitions. Finally, the data extraction mechanism allows for the ability to control access to the data and what data are made available.

  20. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically has the source code repositories and other vital information and settings included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.

  1. Incorporating pushing in exclusion-process models of cell migration.

    PubMed

    Yates, Christian A; Parker, Andrew; Baker, Ruth E

    2015-05-01

    The macroscale movement behavior of a wide range of isolated migrating cells has been well characterized experimentally. Recently, attention has turned to understanding the behavior of cells in crowded environments. In such scenarios it is possible for cells to interact, inducing neighboring cells to move in order to make room for their own movements or progeny. Although the behavior of interacting cells has been modeled extensively through volume-exclusion processes, few models, thus far, have explicitly accounted for the ability of cells to actively displace each other in order to create space for themselves. In this work we consider both on- and off-lattice volume-exclusion position-jump processes in which cells are explicitly allowed to induce movements in their near neighbors in order to create space for themselves to move or proliferate into. We refer to this behavior as pushing. From these simple individual-level representations we derive continuum partial differential equations for the average occupancy of the domain. We find that, for limited amounts of pushing, comparison between the averaged individual-level simulations and the population-level model is nearly as good as in the scenario without pushing. Interestingly, we find that, in the on-lattice case, the diffusion coefficient of the population-level model is increased by pushing, whereas, for the particular off-lattice model that we investigate, the diffusion coefficient is reduced. We conclude, therefore, that it is important to consider carefully the appropriate individual-level model to use when representing complex cell-cell interactions such as pushing.
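
    The pushing mechanism can be sketched with a toy 1D on-lattice simulation: a mover blocked by a neighbor may, with some probability, displace that neighbor one site further. The sketch below illustrates the mechanism only and is not the paper's full model (all parameters are illustrative):

        # Toy 1D volume-exclusion walk with pushing.
        import numpy as np

        rng = np.random.default_rng(0)
        L, steps, p_push = 100, 10_000, 0.3
        lattice = np.zeros(L, dtype=int)
        lattice[40:60] = 1                            # initial occupied block

        for _ in range(steps):
            i = rng.choice(np.flatnonzero(lattice))   # pick a random agent
            j = i + rng.choice([-1, 1])               # target site
            if 0 <= j < L:
                if lattice[j] == 0:                   # simple exclusion move
                    lattice[i], lattice[j] = 0, 1
                else:                                 # blocked: try to push
                    k = 2 * j - i                     # site behind the neighbor
                    if 0 <= k < L and lattice[k] == 0 and rng.random() < p_push:
                        lattice[k], lattice[i] = 1, 0 # neighbor and mover advance

        print(lattice)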

  2. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    NASA Astrophysics Data System (ADS)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  3. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  4. Extreme Environment Capable, Modular and Scalable Power Processing Unit for Solar Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Carr, Gregory A.; Iannello, Christopher J.; Chen, Yuan; Hunter, Don J.; DelCastillo, Linda; Bradley, Arthur T.; Stell, Christopher; Mojarradi, Mohammad M.

    2013-01-01

    This paper is to present a concept of a modular and scalable High Temperature Boost (HTB) Power Processing Unit (PPU) capable of operating at temperatures beyond the standard military temperature range. The various extreme environments technologies are also described as the fundamental technology path to this concept. The proposed HTB PPU is intended for power processing in the area of space solar electric propulsion, where reduction of in-space mass and volume are desired, and sometimes even critical, to achieve the goals of future space flight missions. The concept of the HTB PPU can also be applied to other extreme environment applications, such as geothermal and petroleum deep-well drilling, where higher temperature operation is required.

  5. Extreme Environment Capable, Modular and Scalable Power Processing Unit for Solar Electric Propulsion

    NASA Technical Reports Server (NTRS)

    Carr, Gregory A.; Iannello, Christopher J.; Chen, Yuan; Hunter, Don J.; Del Castillo, Linda; Bradley, Arthur T.; Stell, Christopher; Mojarradi, Mohammad M.

    2013-01-01

    This paper is to present a concept of a modular and scalable High Temperature Boost (HTB) Power Processing Unit (PPU) capable of operating at temperatures beyond the standard military temperature range. The various extreme environments technologies are also described as the fundamental technology path to this concept. The proposed HTB PPU is intended for power processing in the area of space solar electric propulsion, where the reduction of in-space mass and volume are desired, and sometimes even critical, to achieve the goals of future space flight missions. The concept of the HTB PPU can also be applied to other extreme environment applications, such as geothermal and petroleum deep-well drilling, where higher temperature operation is required.

  6. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, Benny Manuel; Ballance, Robert; Haskell, Karen

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  7. A modelling framework for the transport, transformation and biouptake of manufactured nanoparticles in the aquatic environment

    NASA Astrophysics Data System (ADS)

    Lofts, Stephen; Keller, Virginie; Dumont, Egon; Williams, Richard; Praetorius, Antonia; von der Kammer, Frank

    2016-04-01

    The development of innovative new chemical products is a key aspect of the modern economy, yet society demands that such development is environmentally sustainable. Developing knowledge of how new classes of chemicals behave following release to the environment is key to understanding the hazards that will potentially result. Nanoparticles are a key example of a class of chemicals that have undergone a significant expansion in production and use in recent years, and so there is a need to develop tools to predict their potential hazard following their deliberate or incidental release to the environment. Generalising the understanding of the environmental behaviour of manufactured nanoparticles is challenging, as they are chemically and physically diverse (e.g. metals, metal oxides, carbon nanotubes, cellulose, quantum dots). Furthermore, nanoparticles may be manufactured with capping agents to modify their desired behaviour in industrial applications; such agents may also influence their environmental behaviour. Also, nanoparticles may become significantly modified from their as-manufactured forms both prior to and after the point of environmental release. Tools for predicting nanoparticle behaviour and hazard need to be able to consider a wide range of release scenarios and aspects of nanoparticle behaviour in the environment (e.g. dissolution, transformation of capping agents, agglomeration and aggregation behaviour), where such behaviours are not shared by all types of nanoparticle. This implies the need for flexible, futureproofed tools capable of being updated to take new understanding of behavioural processes into account as such knowledge emerges. This presentation will introduce the NanoFASE model system, a multimedia modelling framework for the transport, transformation and biouptake of manufactured nanoparticles. The complete system will comprise atmospheric, terrestrial and aquatic compartments to allow holistic simulation of nanoparticles; this

  8. The Simultaneous Production Model; A Model for the Construction, Testing, Implementation and Revision of Educational Computer Simulation Environments.

    ERIC Educational Resources Information Center

    Zillesen, Pieter G. van Schaick

    This paper introduces a hardware and software independent model for producing educational computer simulation environments. The model, which is based on the results of 32 studies of educational computer simulations program production, implies that educational computer simulation environments are specified, constructed, tested, implemented, and…

  9. Adapting the iSNOBAL model for improved visualization in a GIS environment

    NASA Astrophysics Data System (ADS)

    Johansen, W. J.; Delparte, D.

    2014-12-01

    Snowmelt is a primary source of crucial water resources in much of the western United States. Researchers are developing models that estimate snowmelt to aid in water resource management. One such model is the image snowcover energy and mass balance (iSNOBAL) model. It uses input climate grids to simulate the development and melting of snowpack in mountainous regions. This study applies the model to the Reynolds Creek Experimental Watershed in southwestern Idaho, utilizing novel approaches incorporating geographic information systems (GIS). To improve visualization of the iSNOBAL model, we have adapted it to run in a GIS environment. This type of environment is suited to both input grid creation and visualization of results. The data used for input grid creation can be stored locally or on a web server. Kriging interpolation embedded within Python scripts is used to create air temperature, soil temperature, humidity, and precipitation grids, while built-in GIS and existing tools are used to create solar radiation and wind grids. Additional Python scripting is then used to perform the model calculations. The final product is a user-friendly and accessible version of the iSNOBAL model, including the ability to easily visualize and interact with model results, all within a web- or desktop-based GIS environment. This environment allows for interactive manipulation of model parameters and visualization of the resulting input grids for the model calculations. Future work is moving towards adapting the model for use in a 3D gaming engine for improved visualization and interaction.
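
    As a hedged sketch of building one interpolated input grid (air temperature) with ordinary kriging, as one might do inside GIS-embedded Python scripts: the example below uses the third-party pykrige package and synthetic station data, and is not the study's actual script:

        # Ordinary kriging of synthetic station temperatures onto a grid.
        import numpy as np
        from pykrige.ok import OrdinaryKriging

        x = np.array([0.5, 2.0, 3.5, 4.0])      # station easting (km)
        y = np.array([0.8, 1.5, 2.5, 3.9])      # station northing (km)
        temp = np.array([4.2, 3.1, 1.8, 0.5])   # air temperature (deg C)

        gridx = np.linspace(0.0, 5.0, 50)
        gridy = np.linspace(0.0, 5.0, 50)

        ok = OrdinaryKriging(x, y, temp, variogram_model="spherical")
        temp_grid, variance = ok.execute("grid", gridx, gridy)
        print(temp_grid.shape)                  # (50, 50) input grid for the model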

  10. Prior Design for Dependent Dirichlet Processes: An Application to Marathon Modeling

    PubMed Central

    Pradier, Melanie F.; Ruiz, Francisco J. R.; Perez-Cruz, Fernando

    2016-01-01

    This paper presents a novel application of Bayesian nonparametrics (BNP) for marathon data modeling. We make use of two well-known BNP priors, the single-p dependent Dirichlet process and the hierarchical Dirichlet process, in order to address two different problems. First, we study the impact of age, gender and environment on the runners’ performance. We derive a fair grading method that allows direct comparison of runners regardless of their age and gender. Unlike current grading systems, our approach is based not only on top world records, but on the performances of all runners. The presented methodology for comparison of densities can be adopted in many other applications straightforwardly, providing an interesting perspective to build dependent Dirichlet processes. Second, we analyze the running patterns of the marathoners in time, obtaining information that can be valuable for training purposes. We also show that these running patterns can be used to predict finishing time given intermediate interval measurements. We apply our models to New York City, Boston and London marathons. PMID:26821155

  11. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)

  12. Combining Unsupervised and Supervised Classification to Build User Models for Exploratory Learning Environments

    ERIC Educational Resources Information Center

    Amershi, Saleema; Conati, Cristina

    2009-01-01

    In this paper, we present a data-based user modeling framework that uses both unsupervised and supervised classification to build student models for exploratory learning environments. We apply the framework to build student models for two different learning environments and using two different data sources (logged interface and eye-tracking data).…

  13. Evaluating Galactic Cosmic Ray Environment Models Using RaD-X Flight Data

    NASA Technical Reports Server (NTRS)

    Norman, R. B.; Mertens, C. J.; Slaba, T. C.

    2016-01-01

    Galactic cosmic rays enter Earth's atmosphere after interacting with the geomagnetic field. The primary galactic cosmic ray spectrum is fundamentally changed as it interacts with Earth's atmosphere through nuclear and atomic interactions. At points deeper in the atmosphere, such as at airline altitudes, the radiation environment is a combination of the primary galactic cosmic rays and the secondary particles produced through nuclear interactions. The RaD-X balloon experiment measured the atmospheric radiation environment above 20 km during 2 days in September 2015. These experimental measurements were used to validate and quantify uncertainty in physics-based models used to calculate exposure levels for commercial aviation. In this paper, the Badhwar-O'Neill 2014, the International Organization for Standardization 15390, and the German Aerospace Center galactic cosmic ray environment models are used as input into the same radiation transport code to predict and compare dosimetric quantities to RaD-X measurements. In general, the various model results match the measured tissue equivalent dose well, with results generated by the German Aerospace Center galactic cosmic ray environment model providing the best comparison. For dose equivalent and dose measured in silicon, however, the models compared less favorably with the measurements.

  14. Cognitive Virtualization: Combining Cognitive Models and Virtual Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuan Q. Tran; David I. Gertman; Donald D. Dudenhoeffer

    2007-08-01

    3D manikins are often used in visualizations to model human activity in complex settings. Manikins assist in developing understanding of human actions, movements and routines in a variety of different environments representing new conceptual designs. One such environment is a nuclear power plant control room, where they have the potential to be used to simulate more precise ergonomic assessments of human work stations. Next generation control rooms will pose numerous challenges for system designers. The manikin modeling approach by itself, however, may be insufficient for dealing with the desired technical advancements and challenges of next generation automated systems. Uncertainty regarding effective staffing levels and the potential for negative human performance consequences in the presence of advanced automated systems (e.g., reduced vigilance, poor situation awareness, mistrust or blind faith in automation, higher information load and increased complexity) call for further research. Baseline assessment of novel control room equipment and configurations needs to be conducted. These design uncertainties can be reduced through complementary analysis that merges ergonomic manikin models with models of higher cognitive functions, such as attention, memory, decision-making, and problem-solving. This paper will discuss recent advancements in merging a theory-driven cognitive modeling framework with a 3D visualization modeling tool for the evaluation of next generation control room human factors and ergonomic assessment. Though this discussion primarily focuses on control room design, the application of such a merger between 3D visualization and cognitive modeling can be extended to various areas of focus such as training and scenario planning.

  15. The use of a high resolution model in a private environment.

    NASA Astrophysics Data System (ADS)

    van Dijke, D.; Malda, D.

    2009-09-01

    The commercial organisation MeteoGroup uses high resolution modelling for multiple purposes. MeteoGroup uses the Weather Research and Forecasting Model (WRF®1). WRF is used in the operational environment of several MeteoGroup companies across Europe. It is also used in hindcast studies, for example hurricane tracking, wind climate computation and deriving boundary conditions for air quality models. A special operational service was set up for our tornado chasing team, which uses high resolution flexible WRF data to chase supercells and tornadoes in the USA during spring. Much effort is put into the development and improvement of the pre- and post-processing of the model. At MeteoGroup the static land-use data has been extended and adjusted to improve temperature and wind forecasts. The system has been modified such that sigma level input data from the global ECMWF model can be used for initialisation; by default, only pressure level data could be used. During the spin-up of the model, synoptic observations are nudged. A program to adjust possible initialisation errors of several surface parameters in coastal areas has been implemented. We developed an algorithm that computes cloud fractions using multiple direct model output variables. Forecasters prefer to use weather codes for their daily forecasts to detect severe weather. For this purpose we developed model weather codes using a variety of direct model output and our own derived variables. 1 WRF® is a registered trademark of the University Corporation for Atmospheric Research (UCAR)

  16. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
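
    For flavor, a minimal usage sketch in the spirit of CLIMLAB's documented energy-balance-model interface (treat the details as indicative rather than a guaranteed API):

        # Build and run a 1D energy balance model; CLIMLAB composes it from
        # subprocesses (insolation, albedo, diffusion, outgoing longwave).
        import climlab

        model = climlab.EBM(num_lat=90)   # latitudinally resolved EBM
        print(model)                      # lists the coupled subprocesses
        model.integrate_years(5)          # step the coupled processes forward
        print(float(model.Ts.mean()))     # global-mean surface temperature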

  17. [Analytic methods for seed models with genotype x environment interactions].

    PubMed

    Zhu, J

    1996-01-01

    Genetic models with genotype effect (G) and genotype x environment interaction effect (GE) are proposed for analyzing generation means of seed quantitative traits in crops. The total genetic effect (G) is partitioned into seed direct genetic effect (G0), cytoplasm genetic effect (C), and maternal plant genetic effect (Gm). Seed direct genetic effect (G0) can be further partitioned into direct additive (A) and direct dominance (D) genetic components. Maternal genetic effect (Gm) can also be partitioned into maternal additive (Am) and maternal dominance (Dm) genetic components. The total genotype x environment interaction effect (GE) can also be partitioned into direct genetic by environment interaction effect (G0E), cytoplasm genetic by environment interaction effect (CE), and maternal genetic by environment interaction effect (GmE). G0E can be partitioned into direct additive by environment interaction (AE) and direct dominance by environment interaction (DE) genetic components. GmE can also be partitioned into maternal additive by environment interaction (AmE) and maternal dominance by environment interaction (DmE) genetic components. Partitions of genetic components are listed for parent, F1, F2 and backcrosses. A set of parents, their reciprocal F1 and F2 seeds is applicable for efficient analysis of seed quantitative traits. The MINQUE(0/1) method can be used for estimating variance and covariance components. Unbiased estimation for covariance components between two traits can also be obtained by the MINQUE(0/1) method. Random genetic effects in seed models are predictable by the Adjusted Unbiased Prediction (AUP) approach with the MINQUE(0/1) method. The jackknife procedure is suggested for estimation of sampling variances of estimated variance and covariance components and of predicted genetic effects, which can be further used in t-tests of parameters. Unbiasedness and efficiency for estimating variance components and predicting genetic effects are tested by
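
    The partitions stated in this abstract can be written compactly (a LaTeX rendering of the decomposition exactly as described, with no terms beyond those in the text):

        \begin{aligned}
        G  &= G_0 + C + G_m,     &  G_0  &= A + D,   &  G_m  &= A_m + D_m,\\
        GE &= G_0E + CE + G_mE,  &  G_0E &= AE + DE, &  G_mE &= A_mE + D_mE.
        \end{aligned}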

  18. Self-assembly processes in the prebiotic environment

    PubMed Central

    Deamer, David; Singaram, Sara; Rajamani, Sudha; Kompanichenko, Vladimir; Guggenheim, Stephen

    2006-01-01

    An important question guiding research on the origin of life concerns the environmental conditions where molecular systems with the properties of life first appeared on the early Earth. An appropriate site would require liquid water, a source of organic compounds, a source of energy to drive polymerization reactions and a process by which the compounds were sufficiently concentrated to undergo physical and chemical interactions. One such site is a geothermal setting, in which organic compounds interact with mineral surfaces to promote self-assembly and polymerization reactions. Here, we report an initial study of two geothermal sites where mixtures of representative organic solutes (amino acids, nucleobases, a fatty acid and glycerol) and phosphate were mixed with high-temperature water in clay-lined pools. Most of the added organics and phosphate were removed from solution with half-times measured in minutes to a few hours. Analysis of the clay, primarily smectite and kaolin, showed that the organics were adsorbed to the mineral surfaces at the acidic pH of the pools, but could subsequently be released in basic solutions. These results help to constrain the range of possible environments for the origin of life. A site conducive to self-assembly of organic solutes would be an aqueous environment relatively low in ionic solutes, at an intermediate temperature range and neutral pH ranges, in which cyclic concentration of the solutes can occur by transient dry intervals. PMID:17008220

  19. Modeling Physiological Processes That Relate Toxicant Exposure and Bacterial Population Dynamics

    PubMed Central

    Klanjscek, Tin; Nisbet, Roger M.; Priester, John H.; Holden, Patricia A.

    2012-01-01

    Quantifying effects of toxicant exposure on metabolic processes is crucial to predicting microbial growth patterns in different environments. Mechanistic models, such as those based on Dynamic Energy Budget (DEB) theory, can link physiological processes to microbial growth. Here we expand the DEB framework to include explicit consideration of the role of reactive oxygen species (ROS). Extensions considered are: (i) additional terms in the equation for the “hazard rate” that quantifies mortality risk; (ii) a variable representing environmental degradation; (iii) a mechanistic description of toxic effects linked to increase in ROS production and aging acceleration, and to non-competitive inhibition of transport channels; (iv) a new representation of the “lag time” based on energy required for acclimation. We estimate model parameters using calibrated Pseudomonas aeruginosa optical density growth data for seven levels of cadmium exposure. The model reproduces growth patterns for all treatments with a single common parameter set, and bacterial growth for treatments of up to 150 mg(Cd)/L can be predicted reasonably well using parameters estimated from cadmium treatments of 20 mg(Cd)/L and lower. Our approach is an important step towards connecting levels of biological organization in ecotoxicology. The presented model reveals possible connections between processes that are not obvious from purely empirical considerations, enables validation and hypothesis testing by creating testable predictions, and identifies research required to further develop the theory. PMID:22328915

  20. Modeling Electrostatic Fields Generated by Internal Charging of Materials in Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.

    2011-01-01

    Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and by ease of user implementation; additional development work is needed to improve the models.

  1. Antibiotic Resistance in Listeria Species Isolated from Catfish Fillets and Processing Environment

    USDA-ARS?s Scientific Manuscript database

    The susceptibility of 221 Listeria spp. (86 Listeria monocytogenes, 41 Listeria innocua and 94 Listeria seeligeri-Listeria welshimeri-Listeria ivanovii) isolated from catfish fillets and processing environment to 15 antibiotics was determined. Listeria isolates were analysed by disc-diffusion assay...

  2. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    This project applies computational process and material modeling to powder bed additive manufacturing of IN 718. The goals are to optimize material build parameters with reduced time and cost through modeling, increase understanding of build properties, increase reliability of builds, and decrease time to adoption of the process for critical hardware, with the potential to decrease post-build heat treatments. The approach is to conduct single-track and coupon builds at various build parameters; record build parameter information and QM Meltpool data; refine the Applied Optimization powder bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report STK modeling results; and validate the modeling with an additional build. Findings to date: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated to melt pool size; and melt pool size and intensity increase with power. Applied Optimization will use the data to develop a powder bed additive manufacturing process model.

  3. Model of areas for identifying risks influencing the compliance of technological processes and products

    NASA Astrophysics Data System (ADS)

    Misztal, A.; Belu, N.

    2016-08-01

    Operation of every company is associated with the risk of interfering with proper performance of its fundamental processes. This risk is associated with various internal areas of the company, as well as the environment in which it operates. From the point of view of ensuring compliance of the course of specific technological processes and, consequently, product conformity with requirements, it is important to identify these threats and eliminate or reduce the risk of their occurrence. The purpose of this article is to present a model of areas for identifying risks affecting the compliance of processes and products, which is based on multiregional targeted monitoring of typical places of interference and risk management methods. The model is based on the verification of risk analyses carried out in small and medium-sized manufacturing companies in various industries.

  4. Model of melting (crystallization) process of the condensed disperse phase in the smoky plasmas

    NASA Astrophysics Data System (ADS)

    Dragan, G. S.; Kolesnikov, K. V.; Kutarov, V. V.

    2018-01-01

    The paper presents an analysis of the causes of the formation of spatially ordered grain structures in a smoky plasma. We model the process of melting (crystallization) of a condensed phase in this environment, taking into account the screened electrostatic interaction and the diffusion-drift force. We discuss the influence of the grain charge on the melting temperature.
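
    The screened electrostatic interaction invoked here is commonly written as a Yukawa-type pair potential; the sketch below evaluates it for two charged grains (parameter values are illustrative only, not taken from the paper):

        # Screened electrostatic (Yukawa-type) pair energy between two grains.
        import numpy as np

        EPS0 = 8.854e-12          # vacuum permittivity (F/m)
        E = 1.602e-19             # elementary charge (C)

        def screened_energy(r, z_grain=1e4, debye=50e-6):
            """Energy (J) of two grains with z_grain elementary charges at
            separation r (m), screened over the Debye length."""
            q = z_grain * E
            return q**2 / (4.0 * np.pi * EPS0 * r) * np.exp(-r / debye)

        r = np.linspace(10e-6, 200e-6, 5)   # grain separations (m)
        print(screened_energy(r))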

  5. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.

  6. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  7. SEPEM: A tool for statistical modeling the solar energetic particle environment

    NASA Astrophysics Data System (ADS)

    Crosby, Norma; Heynderickx, Daniel; Jiggens, Piers; Aran, Angels; Sanahuja, Blai; Truscott, Pete; Lei, Fan; Jacobs, Carla; Poedts, Stefaan; Gabriel, Stephen; Sandberg, Ingmar; Glover, Alexi; Hilgers, Alain

    2015-07-01

    Solar energetic particle (SEP) events are a serious radiation hazard for spacecraft as well as a severe health risk to humans traveling in space. Indeed, accurate modeling of the SEP environment constitutes a priority requirement for astrophysics and solar system missions and for human exploration in space. The European Space Agency's Solar Energetic Particle Environment Modelling (SEPEM) application server is a World Wide Web interface to a complete set of cross-calibrated data ranging from 1973 to 2013 as well as new SEP engineering models and tools. Both statistical and physical modeling techniques have been included, in order to cover the environment not only at 1 AU but also in the inner heliosphere ranging from 0.2 AU to 1.6 AU using a newly developed physics-based shock-and-particle model to simulate particle flux profiles of gradual SEP events. With SEPEM, SEP peak flux and integrated fluence statistics can be studied, as well as durations of high SEP flux periods. Furthermore, effects tools are also included to allow calculation of single event upset rate and radiation doses for a variety of engineering scenarios.

  8. A Process-Based Transport-Distance Model of Aeolian Transport

    NASA Astrophysics Data System (ADS)

    Naylor, A. K.; Okin, G.; Wainwright, J.; Parsons, A. J.

    2017-12-01

    We present a new approach to modeling aeolian transport based on transport distance. Particle fluxes are based on statistical probabilities of particle detachment and distributions of transport lengths, which are functions of particle size classes. A computational saltation model is used to simulate transport distances over a variety of sizes. These are fit to an exponential distribution, which has the advantages of computational economy, concordance with current field measurements, and a meaningful relationship to theoretical assumptions about mean and median particle transport distance. This novel approach includes particle-particle interactions, which are important for sustaining aeolian transport and dust emission. Results from this model are compared with results from both bulk- and particle-sized-specific transport equations as well as empirical wind tunnel studies. The transport-distance approach has been successfully used for hydraulic processes, and extending this methodology from hydraulic to aeolian transport opens up the possibility of modeling joint transport by wind and water using consistent physics. Particularly in nutrient-limited environments, modeling the joint action of aeolian and hydraulic transport is essential for understanding the spatial distribution of biomass across landscapes and how it responds to climatic variability and change.
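
    The transport-distance bookkeeping can be illustrated in miniature: particles detach with a size-class probability, hop exponentially distributed distances, and the flux past a downwind plane is the number of crossings. All numbers below are illustrative assumptions, not fitted values from the study:

        # Exponential transport-distance sketch for two size classes.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000                                    # particles per size class
        p_detach = {"fine": 0.20, "coarse": 0.05}     # detachment probabilities
        mean_hop = {"fine": 0.50, "coarse": 0.10}     # mean hop length (m)
        x_plane = 0.3                                 # downwind plane (m)

        for size, p in p_detach.items():
            n_moving = int(rng.binomial(n, p))                # detached particles
            hops = rng.exponential(mean_hop[size], size=n_moving)
            print(size, "flux:", int(np.sum(hops > x_plane)), "crossings")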

  9. Longitudinal monitoring of Listeria monocytogenes and Listeria phages in seafood processing environments in Thailand.

    PubMed

    Vongkamjan, Kitiya; Benjakul, Soottawat; Kim Vu, Hue Thi; Vuddhakul, Varaporn

    2017-09-01

    Listeria monocytogenes is a foodborne pathogen commonly found in seafood processing environments, thus presenting a challenge for eradication from seafood processing facilities. Monitoring the prevalence and subtype diversity of L. monocytogenes together with phages that are specific to Listeria spp. ("Listeria phages") will provide knowledge on the bacteria-phage ecology in food processing plants. In this work, a total of 595 samples were collected from raw materials, finished seafood products and environmental samples from different sites of a seafood processing plant during 17 sampling visits over the 1.5 years of the study. L. monocytogenes and Listeria spp. (non-monocytogenes) were found in 22 (3.7%) and 43 (7.2%) samples, respectively, whereas 29 Listeria phages were isolated from 9 (1.5%) phage-positive samples. DNA fingerprint analysis of L. monocytogenes isolates revealed 11 Random Amplified Polymorphic DNA (RAPD) profiles, with two subtypes frequently observed over time. Our data reveal the presence of Listeria phages within the same seafood processing environments where a diverse set of L. monocytogenes subtypes was also found. Although serotype 4b was observed at lower frequency, the data indicate that isolates from this seafood processing plant belonged to both epidemiologically important serotypes 1/2a and 4b, which may suggest a potential public health risk. Phages (all showing a uniform genome size of 65 ± 2 kb) were classified into 9 host range groups, representing both broad and narrow host ranges. While most L. monocytogenes isolates from this facility were susceptible to phages, five isolates showed resistance to 12-20 phages. Variations in host range among the Listeria phages isolated from the food processing plant may affect the persistence of a diverse set of L. monocytogenes isolates within the same processing environment in Thailand. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Molecular recognition of the environment and mechanisms of the origin of species in quantum-like modeling of evolution.

    PubMed

    Melkikh, Alexey V; Khrennikov, Andrei

    2017-11-01

    A review of the mechanisms of speciation is performed. The mechanisms of the evolution of species, taking into account feedback from the state of the environment and mechanisms of the emergence of complexity, are considered. It is shown that these mechanisms, at the molecular level, cannot work steadily in terms of classical mechanics. Quantum mechanisms of changes in the genome, based on a long-range interaction potential between biologically important molecules, are proposed as one possible explanation. Different variants of interactions between the organism and the environment, based on molecular recognition and leading to the origin of new species, are considered. Experiments to verify the model are proposed. This biophysical study is complemented by a general operational model based on quantum information theory. We briefly present the basics of the quantum-like approach to modeling bio-informational processes and illustrate it with a quantum-like model of epigenetic evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. A space radiation shielding model of the Martian radiation environment experiment (MARIE)

    NASA Technical Reports Server (NTRS)

    Atwell, W.; Saganti, P.; Cucinotta, F. A.; Zeitlin, C. J.

    2004-01-01

    The 2001 Mars Odyssey spacecraft was launched towards Mars on April 7, 2001. Onboard the spacecraft is the Martian radiation environment experiment (MARIE), which is designed to measure the background radiation environment due to galactic cosmic rays (GCR) and solar protons in the 20-500 MeV/n energy range. We present an approach for developing a space radiation-shielding model of the spacecraft that includes the MARIE instrument in the current mapping phase orientation. A discussion is presented describing the development and methodology used to construct the shielding model. For a given GCR model environment, using the current MARIE shielding model and the high-energy particle transport codes, dose rate values are compared with MARIE measurements during the early mapping phase in Mars orbit. The results show good agreement between the model calculations and the MARIE measurements as presented for the March 2002 dataset. © 2003 COSPAR. Published by Elsevier Ltd. All rights reserved.

  12. User modeling for distributed virtual environment intelligent agents

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    1999-07-01

    This paper emphasizes the requirement for user modeling by presenting the information necessary to motivate the need for and use of user modeling in intelligent agent development. The paper presents information on our current intelligent agent development program, the Symbiotic Information Reasoning and Decision Support (SIRDS) project. We then discuss the areas of intelligent agents and user modeling, which form the foundation of the SIRDS project. Included in the discussion of user modeling are its major components, cognitive modeling and behavioral modeling. We next motivate the need for and use of a methodology for developing user models that encompasses work within cognitive task analysis. We close the paper by drawing conclusions from our current intelligent agent research project and discussing avenues of future research in the utilization of user modeling for the development of intelligent agents for virtual environments.

  13. Kinetic Modeling of Microbiological Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chongxuan; Fang, Yilin

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also offers future directions for modeling research best suited to petroleum and environmental biotechnologies.
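
    As a concrete, minimal sketch of the Monod-type kinetics this editorial refers to, the snippet below integrates the classic Monod growth law with explicit Euler steps; all rate constants and initial conditions are hypothetical, chosen only for illustration.

```python
# Monod growth kinetics: mu(S) = mu_max * S / (Ks + S).
# All parameter values are hypothetical, for illustration only.
mu_max = 0.4   # maximum specific growth rate, 1/h
Ks = 0.5       # half-saturation constant, g/L
Y = 0.45       # biomass yield, g biomass per g substrate

X, S = 0.05, 10.0          # initial biomass and substrate, g/L
dt, t_end = 0.01, 48.0     # Euler step (h) and horizon (h)

for _ in range(int(t_end / dt)):
    mu = mu_max * S / (Ks + S)   # Monod law: growth saturates in substrate
    dX = mu * X * dt             # biomass produced this step
    X += dX
    S = max(S - dX / Y, 0.0)     # substrate consumed per unit biomass formed

print(f"final biomass = {X:.2f} g/L, residual substrate = {S:.2f} g/L")
```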

  14. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-01-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normal pointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normal pointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. The station's earlier two-computer arrangement meant that the lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  15. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Astrophysics Data System (ADS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-06-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normal pointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normal pointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. The station's earlier two-computer arrangement meant that the lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  16. Students' perception of the psycho-social clinical learning environment: an evaluation of placement models.

    PubMed

    Henderson, Amanda; Twentyman, Michelle; Heel, Alison; Lloyd, Belinda

    2006-10-01

    Nursing is a practice-based discipline. A supportive environment has been identified as important for the transfer of learning in the clinical context. The aim of the paper was to assess undergraduate nurses' perceptions of the psycho-social characteristics of clinical learning environments within three different clinical placement models. Three hundred and eighty-nine undergraduate nursing students rated their perceptions of the psycho-social learning environment using the Clinical Learning Environment Inventory. There were 16 respondents in the preceptor model category, 269 respondents in the facilitation model category and 114 respondents in the clinical education unit model across 25 different clinical areas in one tertiary facility. The most positive social climate was associated with the preceptor model; on all subscales its median score was higher than for the two other models. When clinical education units were compared with the standard facilitation model, the median score was higher on all of the subscales of the Clinical Learning Environment Inventory. These results suggest that while preceptoring is an effective clinical placement strategy that provides psycho-social support for students, clinical education units, which are more sustainable because they place greater numbers of students, can provide greater psycho-social support for students than traditional models.

  17. A three-dimensional virtual environment for modeling mechanical cardiopulmonary interactions.

    PubMed

    Kaye, J M; Primiano, F P; Metaxas, D N

    1998-06-01

    We have developed a real-time computer system for modeling mechanical physiological behavior in an interactive, 3-D virtual environment. Such an environment can be used to facilitate exploration of cardiopulmonary physiology, particularly in situations that are difficult to reproduce clinically. We integrate 3-D deformable body dynamics with new, formal models of (scalar) cardiorespiratory physiology, associating the scalar physiological variables and parameters with the corresponding 3-D anatomy. Our framework enables us to drive a high-dimensional system (the 3-D anatomical models) from one with fewer parameters (the scalar physiological models) because of the nature of the domain and our intended application. Our approach is amenable to modeling patient-specific circumstances in two ways. First, using CT scan data, we apply semi-automatic methods for extracting and reconstructing the anatomy to use in our simulations. Second, our scalar physiological models are defined in terms of clinically measurable, patient-specific parameters. This paper describes our approach, problems we have encountered and a sample of results showing normal breathing and acute effects of pneumothoraces.

  18. Brief Report: Preliminary Proposal of a Conceptual Model of a Digital Environment for Developing Mathematical Reasoning in Students with Autism Spectrum Disorders.

    PubMed

    Santos, Maria Isabel; Breda, Ana; Almeida, Ana Margarida

    2015-08-01

    There is clear evidence that in typically developing children reasoning and sense-making are essential in all mathematical learning and understanding processes. In children with autism spectrum disorders (ASD), however, these become much more significant, considering their importance to successful independent living. This paper presents a preliminary proposal of a digital environment, specifically targeted to promote the development of mathematical reasoning in students with ASD. Given the diversity of ASD, the prototyping of this environment requires the study of dynamic adaptation processes and the development of activities adjusted to each user's profile. We present the results obtained during the first phase of this ongoing research, describing a conceptual model of the proposed digital environment. Guidelines for future research are also discussed.

  19. Physics-based statistical model and simulation method of RF propagation in urban environments

    DOEpatents

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  20. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    NASA Astrophysics Data System (ADS)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  1. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    PubMed Central

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  2. An Integrated Product Environment

    NASA Technical Reports Server (NTRS)

    Higgins, Chuck

    1997-01-01

    Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things - performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process - not the controller of it. Mechanical Advantage is a performance modeler allowing engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. If you should desire an industry standard solid model, we would produce an ACIS-based solid model. If you should desire an ANSI/ISO standard drawing, we would produce this as well with a virtual push of the button. For more information on this and other Advantage Series products, please contact the author.

  3. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
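
    The gamma-Poisson construction at the heart of the NB process can be checked numerically: drawing a gamma-distributed rate and then a Poisson count reproduces negative binomial statistics. The sketch below does exactly that; the parameter values are arbitrary, and only the marginal NB relationship (not the full nonparametric process) is exercised.

```python
import numpy as np

rng = np.random.default_rng(0)
r, p = 2.0, 0.3   # NB dispersion and probability parameters (illustrative)

# Gamma-Poisson mixture: draw a gamma-distributed rate, then Poisson counts.
n = 100_000
rates = rng.gamma(shape=r, scale=p / (1 - p), size=n)
counts = rng.poisson(rates)

# Direct NB draws for comparison. NumPy parameterizes the negative binomial
# by the probability of success, which corresponds to 1 - p here.
nb_counts = rng.negative_binomial(r, 1 - p, size=n)

# Both should be close to mean r*p/(1-p) and variance r*p/(1-p)^2.
print("gamma-Poisson mean/var:", counts.mean(), counts.var())
print("negative binomial mean/var:", nb_counts.mean(), nb_counts.var())
```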

  4. Modeling the human body/seat system in a vibration environment.

    PubMed

    Rosen, Jacob; Arcan, Mircea

    2003-04-01

    The vibration environment is a common man-made surrounding that humans have only a limited tolerance to cope with, owing to their body dynamics. This research studied the dynamic characteristics of a seated human body/seat system in a vibration environment. The main result is a multi-degrees-of-freedom lumped parameter model that synthesizes two basic dynamics: (i) global human dynamics, the apparent mass phenomenon, including a systematic set of model parameters for simulating various conditions such as body posture, backrest, footrest, muscle tension, and vibration directions, and (ii) local human dynamics, represented by the human pelvis/vibrating seat contact, using a cushioning interface. The model and its selected parameters successfully described the main effects of the apparent mass phenomenon compared to experimental data documented in the literature. The model provided an analytical tool for human body dynamics research and a primary tool for seat and cushioning design. The model was further used to develop design guidelines for a composite cushion using the principle of quasi-uniform body/seat contact force distribution. In terms of evenly distributing the contact forces, the best result for the different materials and cushion geometries simulated in the current study was achieved using a two-layer, shaped-geometry cushion built from three materials. Combining the geometry and the mechanical characteristics of a structure under large deformation into a lumped parameter model enables successful analysis of the human/seat interface system and provides practical results for body protection in dynamic environments.
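
    A minimal sketch of the lumped-parameter idea: a single mass on a spring-damper above a vibrating seat, for which the complex apparent mass (transmitted force divided by seat acceleration) can be written in closed form. The published model has more degrees of freedom and a cushioning interface; the parameter values below are invented.

```python
import numpy as np

# Illustrative 1-DOF lumped-parameter approximation of the seated body.
m = 60.0      # sprung body mass, kg (invented)
k = 9.0e4     # stiffness, N/m (invented)
c = 1.5e3     # damping, N s/m (invented)

f = np.linspace(0.5, 20.0, 400)        # excitation frequency, Hz
w = 2.0 * np.pi * f

# Complex apparent mass of a base-excited mass-spring-damper:
# force transmitted to the seat divided by seat acceleration.
am = m * (k + 1j * w * c) / (k - m * w**2 + 1j * w * c)

f_peak = f[np.argmax(np.abs(am))]
print(f"apparent-mass resonance near {f_peak:.1f} Hz "
      f"(|AM| = {np.abs(am).max():.0f} kg)")
```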

  5. Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Starr, David (Technical Monitor)

    2002-01-01

    One of the most promising methods to test the representation of cloud processes used in climate models is to use observations together with Cloud Resolving Models (CRMs). The CRMs use more sophisticated and realistic representations of cloud microphysical processes, and they can resolve reasonably well the time evolution, structure, and life cycles of clouds and cloud systems (sizes of about 2-200 km). The CRMs also allow explicit interaction between outgoing longwave (cooling) and incoming solar (heating) radiation and clouds. Observations can provide the initial conditions and validation for CRM results. The Goddard Cumulus Ensemble (GCE) Model, a CRM, has been developed and improved at NASA/Goddard Space Flight Center over the past two decades. The GCE model has been used to understand the following: 1) water and energy cycles and their roles in the tropical climate system; 2) the vertical redistribution of ozone and trace constituents by individual clouds and well-organized convective systems over various spatial scales; 3) the relationship between the vertical distribution of latent heating (phase change of water) and the large-scale (pre-storm) environment; 4) the validity of assumptions used in the representation of cloud processes in climate and global circulation models; and 5) the representation of cloud microphysical processes and their interaction with radiative forcing over tropical and midlatitude regions. Four-dimensional cloud and latent heating fields simulated from the GCE model have been provided to the TRMM Science Data and Information System (TSDIS) to develop and improve algorithms for retrieving rainfall and latent heating rates for TRMM and the NASA Earth Observing System (EOS). More than 90 refereed papers using the GCE model have been published in the last two decades, and more than 10 national and international universities are currently using the GCE model for research and teaching. This talk presents five specific major GCE improvements.

  6. Hidden Process Models

    DTIC Science & Technology

    2009-12-18

    cannot be detected with univariate techniques, but require multivariate analysis instead (Kamitani and Tong [2005]). Two other time series analysis ... learning for time series analysis. The historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991], with ... Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging

  7. A Virtual Environment for Resilient Infrastructure Modeling and Design

    DTIC Science & Technology

    2015-09-01

    ... responses to disruptive events (e.g., cascading failure behavior) in a context-rich, controlled environment for exercises, education, and training ... The general attacker-defender (AD) and defender-attacker-defender (DAD) models for critical infrastructure (CI) are defined in Brown et al. (2006). These models help ...

  8. A Simple Model for the Orbital Debris Environment in GEO

    NASA Astrophysics Data System (ADS)

    Anilkumar, A. K.; Ananthasayanam, M. R.; Subba Rao, P. V.

    The increase of space debris and its threat to commercial space activities in the Geosynchronous Earth Orbit (GEO) cause predictable concern regarding the environment over the long term. A variety of space debris studies, covering detection, modeling, protection and mitigation measures, has been pursued over the past couple of decades. Due to the absence of atmospheric drag to remove debris in GEO and the increasing number of utility satellites therein, the number of objects in GEO will continue to increase. The characterization of the GEO environment is critical for risk assessment and protection of future satellites, and also for incorporating effective debris mitigation measures into design and operations. Debris measurements in GEO have been limited to objects larger than about 60 cm. This paper provides an engineering model of the GEO environment by utilizing the philosophy and approach laid out in the SIMPLE model recently proposed for LEO by the authors. The present study analyses the statistical characteristics of the GEO catalogued objects in order to arrive at a model for the GEO space debris environment. The roughly 800 objects catalogued by USSPACECOM across the years 1998 to 2004 share a semi-major-axis mode (highest number density) around 35750 km above the Earth. After removing the objects in the small bin around the mode, (35700, 35800) km, which contains around 40 percent of the objects (a fraction that is nearly constant across the years), the number density of the remaining objects follows a single Laplace distribution with two parameters, location and scale. Across the years the location parameter of this distribution does not vary significantly, but the scale parameter shows a definite trend. These observations are utilized in proposing a simple model for the GEO debris environment.
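
    The Laplace fit described in this record is straightforward to reproduce: the maximum-likelihood location is the sample median and the scale is the mean absolute deviation from it. The sketch below applies this to synthetic semi-major-axis values standing in for the catalogued data, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for catalogued GEO semi-major axes (km), after
# removing the dense bin around the mode as the abstract describes;
# real values would come from the USSPACECOM catalogue.
sma = 35750.0 + rng.laplace(loc=0.0, scale=120.0, size=500)

# Maximum-likelihood Laplace fit: location is the sample median,
# scale is the mean absolute deviation from that median.
loc = np.median(sma)
scale = np.mean(np.abs(sma - loc))
print(f"location = {loc:.1f} km, scale = {scale:.1f} km")
```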

  9. Distributed Lag Models: Examining Associations between the Built Environment and Health

    PubMed Central

    Baek, Jonggyu; Sánchez, Brisa N.; Berrocal, Veronica J.; Sanchez-Vaznaugh, Emma V.

    2016-01-01

    Built environment factors constrain individual-level behaviors and choices, and thus are receiving increasing attention to assess their influence on health. Traditional regression methods have been widely used to examine associations between built environment measures and health outcomes, where a fixed, pre-specified spatial scale (e.g., a 1 mile buffer) is used to construct environment measures. However, the spatial scale for these associations remains largely unknown, and misspecifying it introduces bias. We propose the use of distributed lag models (DLMs) to describe the association between built environment features and health as a function of distance from the locations of interest, and to circumvent a priori selection of a spatial scale. Based on simulation studies, we demonstrate that traditional regression models produce associations biased away from the null when there is spatial correlation among the built environment features. Inference based on DLMs is robust under a range of scenarios of the built environment. We use this innovative application of DLMs to examine the association between the availability of convenience stores near California public schools, which may affect children’s dietary choices both through direct access to junk food and through exposure to advertisement, and children’s body mass index z-scores (BMIz). PMID:26414942
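
    A minimal sketch of the distributed-lag idea as described: exposures are counted in concentric distance rings, the lag curve beta(d) is expressed in a smooth low-order basis, and the basis weights are estimated by least squares. The data, ring definitions, and basis choice below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_rings = 500, 20          # schools and distance rings (hypothetical)

# Counts of convenience stores in each distance ring around each school;
# synthetic exposures stand in for real GIS counts.
W = rng.poisson(lam=1.0, size=(n, n_rings)).astype(float)

# True association declines with distance and reaches zero by ring 10.
true_beta = np.maximum(0.1 * (1 - np.arange(n_rings) / 10.0), 0.0)
y = W @ true_beta + rng.normal(scale=0.5, size=n)   # e.g. a BMIz outcome

# DLM: express beta(d) in a low-order polynomial basis in distance so the
# lag curve is smooth, then fit the basis weights by least squares.
d = np.arange(n_rings) / n_rings
B = np.vander(d, 4, increasing=True)                # cubic polynomial basis
theta, *_ = np.linalg.lstsq(W @ B, y, rcond=None)
beta_hat = B @ theta

print("estimated lag curve:", np.round(beta_hat, 3))
```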

  10. Factors Affecting the Institutional Development of CBTE Programs. Vol. 2, Process Environment.

    ERIC Educational Resources Information Center

    Bystydzienski, Jill M.; And Others

    This document examines three process environment factors (communications networks, morale of consortia members, and teacher's union attitudes) and their influence on the responses of educational institutions to a New York State Department of Education mandate on performance-based teacher education (PBTE). In studying the communications networks,…

  11. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has focused almost exclusively on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  12. An agent-vector-host-environment model for controlling small arms and light weapons.

    PubMed

    Pinto, Andrew D; Sharma, Malika; Muggah, Robert

    2011-05-01

    Armed violence is a significant public health problem. It results in fatal and non-fatal injuries and disrupts social and economic processes that are essential to the health of individuals and communities. We argue that an agent-vector-host-environment model can be helpful in understanding and describing the availability and misuse of small arms and light weapons. Moreover, such a model can assist in identifying potential control points and in developing mitigation strategies. These concepts have been developed from analogous vector control programs and are applied to controlling arms to reduce their misuse. So-called 'denormalization' and 'de-legitimization' campaigns that focus on the vector - including the industry producing these commodities - can be based on the experience of public health in controlling tobacco use and exposure. This model can assist health professionals, civil society and governments in developing comprehensive strategies to limit the production, distribution and misuse of small arms and light weapons.

  13. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, interpreting the data is just as important as collecting it. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  14. Performance of redundant disk array organizations in transaction processing environments

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine N.; Fuchs, W. K.; Saab, Daniel G.

    1993-01-01

    A performance evaluation is conducted for two redundant disk-array organizations in a transaction-processing environment, relative to the performance of both mirrored disk organizations and organizations using neither striping nor redundancy. The proposed parity-striping alternative to striping with rotated parity is shown to furnish rapid recovery from failure at the same low storage cost without interleaving the data over multiple disks. Both noncached systems and systems using a nonvolatile cache in the controller are considered.
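
    Both organizations compared in this record rely on the same XOR parity arithmetic and differ in where the parity lives; the toy sketch below shows the arithmetic itself, rebuilding a lost block from the survivors and the parity block. Block sizes and contents are arbitrary.

```python
import numpy as np

# Parity-based redundancy in miniature: the parity block is the XOR of
# the data blocks, so any single lost block can be rebuilt from the rest.
rng = np.random.default_rng(3)
data = [rng.integers(0, 256, size=8, dtype=np.uint8) for _ in range(4)]

parity = np.bitwise_xor.reduce(data)          # parity across 4 "disks"

lost = 2                                      # pretend disk 2 failed
survivors = [blk for i, blk in enumerate(data) if i != lost]
rebuilt = np.bitwise_xor.reduce(survivors + [parity])

assert np.array_equal(rebuilt, data[lost])
print("block on failed disk recovered:", rebuilt)
```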

  15. Health, Supportive Environments, and the Reasonable Person Model

    Treesearch

    Stephen Kaplan; Rachel Kaplan

    2003-01-01

    The Reasonable Person Model is a conceptual framework that links environmental factors with human behavior. People are more reasonable, cooperative, helpful, and satisfied when the environment supports their basic informational needs. The same environmental supports are important factors in enhancing human health. We use this framework to identify the informational...

  16. A model of adaptive decision-making from representation of information environment by quantum fields.

    PubMed

    Bagarello, F; Haven, E; Khrennikov, A

    2017-11-13

    We present a mathematical model of the decision-making (DM) of agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). The quantum fields are of a purely informational nature. The QFT model can be treated as a far relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. An operator representation in Hilbert space is used, and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large times. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'. © 2017 The Author(s).

  17. A model of adaptive decision-making from representation of information environment by quantum fields

    NASA Astrophysics Data System (ADS)

    Bagarello, F.; Haven, E.; Khrennikov, A.

    2017-10-01

    We present a mathematical model of the decision-making (DM) of agents acting in a complex and uncertain environment (combining a huge variety of economic, financial, behavioural and geopolitical factors). To describe the interaction of agents with it, we apply the formalism of quantum field theory (QFT). The quantum fields are of a purely informational nature. The QFT model can be treated as a far relative of expected utility theory, where the role of utility is played by adaptivity to an environment (bath). However, this sort of utility-adaptivity cannot be represented simply as a numerical function. An operator representation in Hilbert space is used, and adaptivity is described as in quantum dynamics. We are especially interested in the stabilization of solutions for sufficiently large times. The outputs of this stabilization process, probabilities for possible choices, are treated in the framework of classical DM. To connect classical and quantum DM, we appeal to Quantum Bayesianism. We demonstrate the quantum-like interference effect in DM, which is exhibited as a violation of the formula of total probability and hence of the classical Bayesian inference scheme. This article is part of the themed issue 'Second quantum revolution: foundational questions'.

  18. Using {sup 222}Rn as a tracer of geophysical processes in underground environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacerda, T.; Anjos, R. M.; Valladares, D. L.

    2014-11-11

    Radon levels in two old mines in San Luis, Argentina, are reported and analyzed. These mines are today open for tourist visits. Our goal was to assess the potential use of this radioactive noble gas as a tracer of geological processes in underground environments. CR-39 nuclear track detectors were used during the winter and summer seasons. The findings show that the significant radon concentrations reported in this environment are subject to large seasonal modulations, due to the strong dependence of natural ventilation on variations in the outside temperature. The results also indicate that the radon distribution pattern is a good method for localizing unknown ducts, fissures or secondary tunnels in subterranean environments.

  19. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-07

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.

  20. NCWin — A Component Object Model (COM) for processing and visualizing NetCDF data

    USGS Publications Warehouse

    Liu, Jinxun; Chen, J.M.; Price, D.T.; Liu, S.

    2005-01-01

    NetCDF (Network Common Data Form) is a data sharing protocol and library that is commonly used in large-scale atmospheric and environmental data archiving and modeling. The NetCDF tool described here, named NCWin and coded with Borland C++ Builder, was built as a standard executable as well as a COM (component object model) for the Microsoft Windows environment. COM is a powerful technology that enhances the reuse of applications (as components). Environmental model developers from different modeling environments, such as Python, JAVA, VISUAL FORTRAN, VISUAL BASIC, VISUAL C++, and DELPHI, can reuse NCWin in their models to read, write and visualize NetCDF data. Some Windows applications, such as ArcGIS and Microsoft PowerPoint, can also call NCWin within the application. NCWin has three major components: 1) The data conversion part is designed to convert binary raw data to and from NetCDF data. It can process six data types (unsigned char, signed char, short, int, float, double) and three spatial data formats (BIP, BIL, BSQ); 2) The visualization part is designed for displaying grid map series (playing forward or backward) with simple map legend, and displaying temporal trend curves for data on individual map pixels; and 3) The modeling interface is designed for environmental model development by which a set of integrated NetCDF functions is provided for processing NetCDF data. To demonstrate that the NCWin can easily extend the functions of some current GIS software and the Office applications, examples of calling NCWin within ArcGIS and MS PowerPoint for showing NetCDF map animations are given.
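
    NCWin itself is a Windows COM component, but the kind of read/write workflow its modeling interface provides can be sketched with the separate netCDF4 Python library; the file name, dimensions, and variable name below are invented for illustration.

```python
import numpy as np
from netCDF4 import Dataset

# Write a tiny grid series to a NetCDF file (hypothetical names/sizes).
with Dataset("demo.nc", "w") as ds:
    ds.createDimension("time", None)          # unlimited time axis
    ds.createDimension("y", 4)
    ds.createDimension("x", 5)
    var = ds.createVariable("ndvi", "f4", ("time", "y", "x"))
    var.units = "dimensionless"
    var[0:3] = np.random.rand(3, 4, 5).astype("f4")

# Read it back, as a client model would through a NetCDF interface.
with Dataset("demo.nc") as ds:
    ndvi = ds.variables["ndvi"][:]
    print(ndvi.shape, ds.variables["ndvi"].units)
```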

  1. Quality and Safety in Health Care, Part XIV: The External Environment and Research for Diagnostic Processes.

    PubMed

    Harolds, Jay A

    2016-09-01

    The work system in which diagnosis takes place is affected by the external environment, which includes requirements such as certification, accreditation, and regulations. How errors are reported, malpractice, and the system for payment are some other aspects of the external environment. Improving the external environment is expected to decrease errors in diagnosis. More research on improving the diagnostic process is needed.

  2. Information-Processing Models and Curriculum Design

    ERIC Educational Resources Information Center

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  3. A Model Supported Interactive Virtual Environment for Natural Resource Sharing in Environmental Education

    ERIC Educational Resources Information Center

    Barbalios, N.; Ioannidou, I.; Tzionas, P.; Paraskeuopoulos, S.

    2013-01-01

    This paper introduces a realistic 3D model supported virtual environment for environmental education, that highlights the importance of water resource sharing by focusing on the tragedy of the commons dilemma. The proposed virtual environment entails simulations that are controlled by a multi-agent simulation model of a real ecosystem consisting…

  4. Application of bacteriophages to reduce Salmonella contamination on workers' boots in rendering-processing environment.

    PubMed

    Gong, C; Jiang, X; Wang, J

    2017-10-01

    Workers' boots are considered one of the re-contamination routes of Salmonella for rendered meals in the rendering-processing environment. This study was conducted to evaluate the efficacy of a bacteriophage cocktail for reducing Salmonella on workers' boots and ultimately for preventing Salmonella re-contamination of rendered meals. Under laboratory conditions, biofilms of Salmonella Typhimurium avirulent strain 8243 formed on rubber templates or boots were treated with a bacteriophage cocktail of 6 strains (ca. 9 log PFU/mL) for 6 h at room temperature. Bacteriophage treatments combined with sodium hypochlorite (400 ppm) or 30-second brush scrubbing were also investigated for a synergistic effect on reducing Salmonella biofilms. Sodium magnesium (SM) buffer and sodium hypochlorite (400 ppm) were used as controls. To reduce indigenous Salmonella on workers' boots, a field study was conducted applying a bacteriophage cocktail and other combined treatments 3 times within one wk in a rendering-processing environment. Prior to and after bacteriophage treatments, Salmonella populations on the soles of rubber boots were swabbed and enumerated on XLT-4, Miller-Mallinson or CHROMagar™ plates. Under laboratory conditions, Salmonella biofilms formed on rubber templates and boots were reduced by 95.1 to 99.999% and 91.5 to 99.2%, respectively. In a rendering-processing environment (ave. temperature: 19.3°C; ave. relative humidity: 48%), indigenous Salmonella populations on workers' boots were reduced by 84.2, 92.9, and 93.2% after being treated with bacteriophages alone, bacteriophages + sodium hypochlorite, and bacteriophages + scrubbing for one wk, respectively. Our results demonstrate the effectiveness of bacteriophage treatments in reducing Salmonella contamination on boots in both the laboratory and the rendering-processing environment. © 2017 Poultry Science Association Inc.

  5. Occurrence of Alicyclobacillus in the fruit processing environment--a review.

    PubMed

    Steyn, Catharina E; Cameron, Michelle; Witthuhn, R Corli

    2011-05-14

    Concentrated fruit products have a significant place in modern consumption markets and are valuable semi-prepared food components to the bakery, dairy, confectionary, canning, baby food, frozen food, distilling and beverage industries. There is continuous pressure on the beverage industry to improve the quality of concentrated fruit products in order for reconstituted fruit beverages to compete with beverages that are made from fresh fruits. In recent years, Alicyclobacillus spp. have become a major concern to the beverage industry worldwide as many high-acid, concentrated fruit products have been found to be contaminated with these spoilage microbes. The thermo-acidophilic nature of alicyclobacilli and highly resistant endospores allows for their survival during the production of concentrated fruit products. Under favourable conditions, endospores can germinate and multiply to numbers high enough to cause spoilage and product deterioration through the production of chemical taint compounds. It is imperative to understand the nature of Alicyclobacillus within the fruit concentrate processing environment so as to develop effective control strategies and to prevent spoilage in juice and beverage products that are reconstituted from fruit concentrates. This paper reviews the occurrence of alicyclobacilli in the fruit processing environment, control measures, as well as detection, identification and standardised test methods that are currently used for Alicyclobacillus in concentrated fruit products. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Modelling between Epistemological Beliefs and Constructivist Learning Environment

    ERIC Educational Resources Information Center

    Çetin-Dindar, Ayla; Kirbulut, Zübeyde Demet; Boz, Yezdan

    2014-01-01

    The purpose of this study was to model the relationship between pre-service chemistry teachers' epistemological beliefs and their preference to use constructivist-learning environment in their future class. The sample was 125 pre-service chemistry teachers from five universities in Turkey. Two instruments were used in this study. One of the…

  7. Modeling Student Cognition in Digital and Nondigital Assessment Environments

    ERIC Educational Resources Information Center

    DiCerbo, Kristen E.; Xu, Yuning; Levy, Roy; Lai, Emily; Holland, Laura

    2017-01-01

    Inferences about student knowledge, skills, and attributes based on digital activity still largely come from whether students ultimately get a correct result or not. However, the ability to collect activity stream data as individuals interact with digital environments provides information about students' processes as they progress through learning…

  8. Model medication management process in Australian nursing homes using business process modeling.

    PubMed

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons end users avoid or reject health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of a thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, the major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  9. The Joint UK Land Environment Simulator (JULES), Model description - Part 2: Carbon fluxes and vegetation

    NASA Astrophysics Data System (ADS)

    Clark, D. B.; Mercado, L. M.; Sitch, S.; Jones, C. D.; Gedney, N.; Best, M. J.; Pryor, M.; Rooney, G. G.; Essery, R. L. H.; Blyth, E.; Boucher, O.; Harding, R. J.; Cox, P. M.

    2011-03-01

    The Joint UK Land Environment Simulator (JULES) is a process-based model that simulates the fluxes of carbon, water, energy and momentum between the land surface and the atmosphere. Past studies with JULES have demonstrated the important role of the land surface in the Earth System. Different versions of JULES have been employed to quantify the effects on the land carbon sink of separately changing atmospheric aerosols and tropospheric ozone, and the response of methane emissions from wetlands to climate change. There was a need to consolidate these and other advances into a single model code so as to be able to study interactions in a consistent manner. This paper describes the consolidation of these advances into the modelling of carbon fluxes and stores, in the vegetation and soil, in version 2.2 of JULES. Features include a multi-layer canopy scheme for light interception, including a sunfleck penetration scheme, a coupled scheme of leaf photosynthesis and stomatal conductance, representation of the effects of ozone on leaf physiology, and a description of methane emissions from wetlands. JULES represents the carbon allocation, growth and population dynamics of five plant functional types. The turnover of carbon from living plant tissues is fed into a 4-pool soil carbon model. The process-based descriptions of key ecological processes and trace gas fluxes in JULES mean that this community model is well-suited for use in carbon cycle, climate change and impacts studies, either in standalone mode or as the land component of a coupled Earth system model.
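
    As a toy illustration of the soil-carbon piece described above, the sketch below steps a four-pool model forward in time with first-order decomposition, litter inputs split between fast and slow pools, and a fixed respired fraction. The pool names follow the RothC convention, but all rates and fractions are invented rather than the calibrated JULES values.

```python
import numpy as np

# Minimal sketch of a 4-pool soil carbon model of the kind JULES couples
# to vegetation; every number here is illustrative, not a JULES value.
pools = np.array([0.2, 1.0, 0.3, 3.0])   # kg C/m^2: DPM, RPM, BIO, HUM
k = np.array([10.0, 0.3, 0.66, 0.02])    # first-order decay rates, 1/yr

litter_in = 0.5                          # kg C/m^2/yr from vegetation
frac_to_dpm = 0.6                        # litter split between DPM and RPM
resp_frac = 0.55                         # decomposed C respired as CO2;
                                         # the remainder re-enters BIO/HUM

dt = 1.0 / 365.0
for _ in range(365 * 20):                # spin forward 20 years
    dec = k * pools * dt                 # first-order decomposition
    pools -= dec
    recycled = (1 - resp_frac) * dec.sum()
    pools[0] += frac_to_dpm * litter_in * dt
    pools[1] += (1 - frac_to_dpm) * litter_in * dt
    pools[2] += 0.46 * recycled          # split recycled C between BIO
    pools[3] += 0.54 * recycled          # and HUM

print("pools after 20 yr (kg C/m^2):", np.round(pools, 2))
```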

  10. Saint: a lightweight integration environment for model annotation.

    PubMed

    Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil

    2009-11-15

    Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).

  11. The Dimethylsulfide Cycle in the Eutrophied Southern North Sea: A Model Study Integrating Phytoplankton and Bacterial Processes

    PubMed Central

    Gypens, Nathalie; Borges, Alberto V.; Speeckaert, Gaelle; Lancelot, Christiane

    2014-01-01

    We developed a module describing dimethylsulfoniopropionate (DMSP) and dimethylsulfide (DMS) dynamics, including biological transformations by phytoplankton and bacteria and physico-chemical processes (including DMS air-sea exchange). This module was integrated in the MIRO ecological model and applied in a 0D frame in the Southern North Sea (SNS). The DMS(P) module is built on parameterizations derived from available knowledge on DMS(P) sources, transformations and sinks, and provides an explicit representation of bacterial activity, in contrast to most existing models that include only phytoplankton processes (and abiotic transformations). The model is tested in a highly productive coastal ecosystem (the Belgian coastal zone, BCZ) dominated by diatoms and the Haptophyceae Phaeocystis, respectively low and high DMSP producers. On an annual basis, the particulate DMSP (DMSPp) production simulated in 1989 is mainly related to Phaeocystis colonies (78%) rather than diatoms (13%) and nanoflagellates (9%). Accordingly, sensitivity analysis shows that the model responds most to changes in the sulfur:carbon (S:C) quota and lyase yield of Phaeocystis. DMS originates equally from phytoplankton and bacterial DMSP-lyase activity, and only 3% of the DMS is emitted to the atmosphere. Model analysis demonstrates the sensitivity of DMS emission to the atmosphere to the description and parameterization of biological processes, emphasizing the need to adequately represent in models both the phytoplankton and the bacterial processes affecting DMS(P) dynamics. This is particularly important in eutrophied coastal environments such as the SNS, which are dominated by high non-diatom blooms and where empirical models developed from data sets biased towards open-ocean conditions do not satisfactorily predict the timing and amplitude of the DMS seasonal cycle. In order to predict future feedbacks of DMS emissions on climate, it is necessary to account for hotspots of DMS emissions from coastal

  12. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process model and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  13. Models and signal processing for an implanted ethanol bio-sensor.

    PubMed

    Han, Jae-Joon; Doerschuk, Peter C; Gelfand, Saul B; O'Connor, Sean J

    2008-02-01

    The understanding of drinking patterns leading to alcoholism has been hindered by an inability to unobtrusively measure ethanol consumption over periods of weeks to months in the community environment. An implantable ethanol sensor is under development using microelectromechanical systems technology. For safety and user acceptability issues, the sensor will be implanted subcutaneously and, therefore, measure peripheral-tissue ethanol concentration. Determining ethanol consumption and kinetics in other compartments from the time course of peripheral-tissue ethanol concentration requires sophisticated signal processing based on detailed descriptions of the relevant physiology. A statistical signal processing system based on detailed models of the physiology and using extended Kalman filtering and dynamic programming tools is described which can estimate the time series of ethanol concentration in blood, liver, and peripheral tissue and the time series of ethanol consumption based on peripheral-tissue ethanol concentration measurements.
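
    The estimation pattern described here can be sketched with a linear two-compartment state (blood and peripheral-tissue ethanol) observed only through the peripheral compartment and filtered with a standard Kalman filter; the published system is more detailed and nonlinear (hence the extended Kalman filter), and every rate constant below is invented.

```python
import numpy as np

# Toy linear two-compartment model: blood <-> peripheral tissue exchange
# plus first-order elimination from blood. All constants are invented.
dt, k_bp, k_pb, k_el = 1.0, 0.10, 0.08, 0.05   # minutes, 1/min rates
A = np.array([[1 - (k_bp + k_el) * dt, k_pb * dt],
              [k_bp * dt, 1 - k_pb * dt]])
H = np.array([[0.0, 1.0]])                     # sensor sees peripheral tissue
Q = np.eye(2) * 1e-4                           # process noise covariance
R = np.array([[4e-3]])                         # measurement noise covariance

rng = np.random.default_rng(5)
x_true = np.array([1.0, 0.0])                  # g/L just after a drink
x_est, P = np.zeros(2), np.eye(2)

for _ in range(120):
    # Simulate truth and a noisy peripheral-tissue measurement.
    x_true = A @ x_true + rng.multivariate_normal([0, 0], Q)
    z = H @ x_true + rng.normal(scale=np.sqrt(R[0, 0]), size=1)

    # Kalman predict / update.
    x_est = A @ x_est
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - H @ x_est)
    P = (np.eye(2) - K @ H) @ P

print(f"true blood ethanol {x_true[0]:.3f} g/L, estimate {x_est[0]:.3f} g/L")
```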

  14. Kinetic model of continuous ethanol fermentation in closed-circulating process with pervaporation membrane bioreactor by Saccharomyces cerevisiae.

    PubMed

    Fan, Senqing; Chen, Shiping; Tang, Xiaoyu; Xiao, Zeyi; Deng, Qing; Yao, Peina; Sun, Zhaopeng; Zhang, Yan; Chen, Chunyan

    2015-02-01

    Unstructured kinetic models were proposed to describe the principal kinetics involved in ethanol fermentation in a continuous and closed-circulating fermentation (CCCF) process with a pervaporation membrane bioreactor. After ethanol was removed in situ from the broth by membrane pervaporation, the secondary metabolites accumulating in the broth became inhibitors of cell growth. The cell death rate related to the deterioration of the culture environment was described as a function of the cell concentration and fermentation time. In the CCCF process, ethanol productions of 609.8 g L⁻¹ and 750.1 g L⁻¹ were obtained in the first and second runs, respectively. The modified Gompertz model, correlating ethanol production with the fermentation period, could be used to describe ethanol production during the CCCF process. The fitting results showed good agreement with the experimental data. These models could be employed in developing CCCF process technology for ethanol fermentation. Copyright © 2014 Elsevier Ltd. All rights reserved.
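
    For reference, a common modified (Zwietering-type) Gompertz form is sketched below; the parameter values are invented, and the paper's exact parameterization may differ.

        import math

        def gompertz_ethanol(t, p_max, r_max, lag):
            """Modified (Zwietering-type) Gompertz curve: cumulative ethanol
            production p(t) with asymptote p_max, maximum rate r_max and lag
            time lag. Values used below are made up for illustration."""
            return p_max * math.exp(-math.exp(r_max * math.e / p_max * (lag - t) + 1.0))

        for day in range(0, 15, 2):
            print(day, round(gompertz_ethanol(day, p_max=750.0, r_max=120.0, lag=1.5), 1))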

  15. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  16. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.
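
    A minimal sketch of a timed Petri net simulator of the kind such an approach rests on; the two-transition process and its delays are invented for illustration, not the paper's model.

        import heapq, itertools

        # Toy timed Petri net of a two-step software process (design -> code);
        # places, transitions and delays are invented for illustration.
        marking = {"todo": 3, "designed": 0, "done": 0}
        transitions = [
            ("design", {"todo": 1},     {"designed": 1}, 2.0),  # (name, pre, post, delay)
            ("code",   {"designed": 1}, {"done": 1},     5.0),
        ]

        clock, pending, seq = 0.0, [], itertools.count()
        while True:
            # fire every enabled transition once (single-server semantics)
            for name, pre, post, delay in transitions:
                if all(marking[p] >= n for p, n in pre.items()):
                    for p, n in pre.items():
                        marking[p] -= n                     # consume input tokens
                    heapq.heappush(pending, (clock + delay, next(seq), name, post))
            if not pending:
                break
            clock, _, name, post = heapq.heappop(pending)   # advance to next completion
            for p, n in post.items():
                marking[p] += n                             # produce output tokens
            print(f"t={clock:4.1f}  {name} finished  marking={marking}")
        print("makespan:", clock)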

  17. Corporate corruption of the environment: sustainability as a process of compromise.

    PubMed

    Nyberg, Daniel; Wright, Christopher

    2013-09-01

    A key response to environmental degradation, climate change and declining biodiversity has been the growing adoption of market principles in an effort to better value the social good of nature. Through concepts such as 'natural capitalism' and 'corporate environmentalism', nature is increasingly viewed as a domain of capitalist endeavour. In this article, we use convention theory and a pluralist understanding of social goods to investigate how the social good of the environment is usurped by the alternate social good of the market. Through analysis of interviews with sustainability managers and corporate documentation, we highlight how organizational actors employ compromise to temporarily settle disputes between competing claims about environmental activities. Our findings contribute to an understanding of the processes of empirically grounded critique and the under-theorized concept of compromise between social goods. Rather than protecting the environment, the corporate promotion of sustainability facilitates the corruption of the social good of the environment and its conversion into a market commodity. © London School of Economics and Political Science 2013.

  18. A Process for Technology Prioritization in a Competitive Environment

    NASA Technical Reports Server (NTRS)

    Stephens, Karen; Herman, Melody; Griffin, Brand

    2006-01-01

    This slide presentation reviews NASA's process for prioritizing technology requirements in a competitive environment. The In-Space Propulsion Technology (ISPT) project is used to exemplify the process. The ISPT project focuses development on the mid-level Technology Readiness Levels (TRLs), i.e., TRLs 4 through 6 (Technology Development and Technology Demonstration). The objective of the planning activity is to identify the current most likely date each technology is needed and to create ISPT technology development schedules based on these dates. There is a minimum of 4 years between the flight and the pacing mission. The ISPT project needed to identify the "pacing mission" for each technology in order to provide funding for each area. Graphic representations show the development of the process. A matrix shows which missions are currently receiving pull from both the Solar System Exploration and the Sun-Solar System Connection roadmaps. The timeframes of the pacing missions are shown for various types of propulsion technology. A pacing mission in the near future increases the priority for funding. Adaptations were made when budget reductions precluded total implementation of the plan.

  19. Towards high fidelity numerical wave tanks for modelling coastal and ocean engineering processes

    NASA Astrophysics Data System (ADS)

    Cozzuto, G.; Dimakopoulos, A.; de Lataillade, T.; Kees, C. E.

    2017-12-01

    With the increasing availability of computational resources, the engineering and research community is gradually moving towards high-fidelity Computational Fluid Dynamics (CFD) models to perform numerical tests for improving the understanding of physical processes pertaining to wave propagation and interaction with the coastal environment and morphology, either natural or man-made. It is therefore important to be able to reproduce in these models the conditions that drive these processes. So far, the norm in CFD models has been to use regular (linear or nonlinear) waves for numerical tests; however, only random waves exist in nature. In this work, we initially present the verification and validation of numerical wave tanks based on Proteus, an open-source computational toolkit based on finite element analysis, with respect to the generation, propagation and absorption of random sea states comprising long non-repeating wave sequences. Statistical and spectral processing of the results demonstrates that the methodologies employed (including relaxation-zone methods and moving wave paddles) are capable of producing results of similar quality to the wave tanks used in laboratories (Figure 1). Subsequently, case studies of modelling complex processes relevant to coastal defences and floating structures, such as sliding and overturning of composite breakwaters and the heave and roll response of floating caissons, are presented. Figure 1: Wave spectra in the numerical wave tank (coloured symbols), compared against the JONSWAP distribution.
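
    Since the results are checked against the JONSWAP distribution above, a minimal sketch of that target spectrum may help; the Hm0/Tp values and the normalisation used below are illustrative assumptions, not Proteus code.

        import numpy as np

        def jonswap(f, hs=2.0, tp=8.0, gamma=3.3):
            """One-dimensional JONSWAP variance density spectrum S(f).
            hs, tp are illustrative; the scaling normalises the spectrum
            so that 4*sqrt(m0) ~ hs."""
            fp = 1.0 / tp
            sigma = np.where(f <= fp, 0.07, 0.09)          # peak-width parameter
            r = np.exp(-((f - fp) ** 2) / (2.0 * sigma ** 2 * fp ** 2))
            s = f ** -5.0 * np.exp(-1.25 * (fp / f) ** 4.0) * gamma ** r
            m0 = np.trapz(s, f)
            return s * (hs / 4.0) ** 2 / m0                # rescale to match hs

        f = np.linspace(0.04, 0.5, 200)
        S = jonswap(f)
        print("Hm0 ~", 4.0 * np.sqrt(np.trapz(S, f)))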

  20. Improved thermodynamic modeling of the no-vent fill process and correlation with experimental data

    NASA Technical Reports Server (NTRS)

    Taylor, William J.; Chato, David J.

    1991-01-01

    The United States' plans to establish a permanent manned presence in space and to explore the Solar System created the need to efficiently handle large quantities of subcritical cryogenic fluids, particularly propellants such as liquid hydrogen and liquid oxygen, in low- to zero-gravity environments. One of the key technologies to be developed for fluid handling is the ability to transfer the cryogens between storage and spacecraft tanks. The no-vent fill method was identified as one way to perform this transfer. In order to understand how to apply this method, a model of the no-vent fill process is being developed and correlated with experimental data. The verified models can then be used to design and analyze configurations for tankage and subcritical fluid depots. The development of an improved macroscopic thermodynamic model of the no-vent fill process is discussed, and the analytical results from the computer-program implementation of the model are correlated with experimental results for two different test tanks.

  1. Journey into the Problem-Solving Process: Cognitive Functions in a PBL Environment

    ERIC Educational Resources Information Center

    Chua, B. L.; Tan, O. S.; Liu, W. C.

    2016-01-01

    In a PBL environment, learning results from learners engaging in cognitive processes pivotal in the understanding or resolution of the problem. Using Tan's cognitive function disc, this study examines the learner's perceived cognitive functions at each stage of PBL, as facilitated by the PBL schema. The results suggest that these learners…

  2. HexSim: a modeling environment for ecology and conservation.

    EPA Science Inventory

    HexSim is a powerful and flexible new spatially-explicit, individual based modeling environment intended for use in ecology, conservation, genetics, epidemiology, toxicology, and other disciplines. We describe HexSim, illustrate past applications that contributed to our >10 year ...

  3. Marketing the use of the space environment for the processing of biological and pharmaceutical materials

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The perceptions of U.S. biotechnology and pharmaceutical companies concerning the potential use of the space environment for the processing of biological substances were examined. Physical phenomena that may be important in space-based processing of biological materials are identified and discussed in the context of past and current experiment programs. The capabilities of NASA to support future research and development, and to engage in cooperative risk-sharing programs with industry, are discussed. Meetings were held with several biotechnology and pharmaceutical companies to provide data for an analysis of the attitudes and perceptions of these industries toward the use of the space environment. Recommendations are made for actions that might be taken by NASA to facilitate the marketing of the use of the space environment, and in particular the Space Shuttle, to the biotechnology and pharmaceutical industries.

  4. Modeling alpine grasslands with two integrated hydrologic models: a comparison of the different process representation in CATHY and GEOtop

    NASA Astrophysics Data System (ADS)

    Camporese, M.; Bertoldi, G.; Bortoli, E.; Wohlfahrt, G.

    2017-12-01

    Integrated hydrologic surface-subsurface models (IHSSMs) are increasingly used as prediction tools to solve simultaneously for states and fluxes in and between multiple terrestrial compartments (e.g., snow cover, surface water, groundwater), in an attempt to tackle environmental problems holistically. Two such models, CATHY and GEOtop, are used in this study to investigate their capabilities to reproduce hydrological processes in alpine grasslands. The two models differ significantly in the complexity of their representation of the surface energy balance and in their solution of the Richards equation for water flow in the variably saturated subsurface. The main goal of this research is to show how these differences in process representation can lead to different predictions of hydrologic states and fluxes, in the simulation of an experimental site located in the Venosta Valley (South Tyrol, Italy). Here, a large set of relevant hydrological data (e.g., evapotranspiration, soil moisture) has been collected with ground and remote-sensing observations. The area of interest is part of a Long-Term Ecological Research (LTER) site, a steep, heterogeneous mountain slope where the predominant land-use types are meadow, pasture, and forest. The comparison between data and model predictions, as well as between simulations with the two IHSSMs, contributes to advancing our understanding of the tradeoffs between different complexities in models' process representation, model accuracy, and the ability to explain observed hydrological dynamics in alpine environments.
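
    For reference, the variably saturated flow that both codes solve is governed by the Richards equation, which in mixed form can be written as follows (a standard statement; the two models discretize and couple it differently):

        \frac{\partial \theta(\psi)}{\partial t} = \nabla \cdot \left[ K(\psi)\, \nabla (\psi + z) \right] + q

    where \theta is the volumetric water content, \psi the pressure head, K(\psi) the unsaturated hydraulic conductivity, z the elevation head, and q a source/sink term.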

  5. Modelling of Sub-daily Hydrological Processes Using Daily Time-Step Models: A Distribution Function Approach to Temporal Scaling

    NASA Astrophysics Data System (ADS)

    Kandel, D. D.; Western, A. W.; Grayson, R. B.

    2004-12-01

    erosion models. The statistical description of sub-daily variability is thus propagated through the model, allowing the effects of variability to be captured in the simulations. This results in cdfs of various fluxes, the integration of which over a day gives respective daily totals. Using 42-plot-years of surface runoff and soil erosion data from field studies in different environments from Australia and Nepal, simulation results from this cdf approach are compared with the sub-hourly (2-minute for Nepal and 6-minute for Australia) and daily models having similar process descriptions. Significant improvements in the simulation of surface runoff and erosion are achieved, compared with a daily model that uses average daily rainfall intensities. The cdf model compares well with a sub-hourly time-step model. This suggests that the approach captures the important effects of sub-daily variability while utilizing commonly available daily information. It is also found that the model parameters are more robustly defined using the cdf approach compared with the effective values obtained at the daily scale. This suggests that the cdf approach may offer improved model transferability spatially (to other areas) and temporally (to other periods).
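
    A minimal sketch of the distribution-function idea: rather than driving the flux model with a single daily mean intensity, integrate it over an assumed within-day intensity distribution (exponential here; the paper's distribution and runoff function may differ). The toy case shows why it matters: a mean intensity below the infiltration capacity yields zero runoff, while the cdf approach does not.

        import numpy as np

        def runoff_flux(i, infil_cap=5.0):
            # toy infiltration-excess rule: runoff = max(intensity - capacity, 0)
            return np.maximum(i - infil_cap, 0.0)

        def daily_runoff_cdf(mean_intensity, rain_hours=6.0, n=2000):
            """Integrate the runoff flux over an assumed exponential pdf of
            sub-daily rainfall intensity instead of using the daily mean."""
            i = np.linspace(0.0, 10.0 * mean_intensity, n)
            pdf = np.exp(-i / mean_intensity) / mean_intensity
            return rain_hours * np.trapz(runoff_flux(i) * pdf, i)

        mean_i = 4.0  # mm/h, below the assumed infiltration capacity
        print("mean-intensity model:", 6.0 * runoff_flux(np.array(mean_i)))  # 0.0
        print("cdf model           :", round(daily_runoff_cdf(mean_i), 2))   # > 0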

  6. Finite Birth-and-Death Models in Randomly Changing Environments.

    DTIC Science & Technology

    1982-02-01

    Naval Postgraduate School, Monterey, California. Technical Report NPS55-82-007: "Finite Birth-and-Death Models in Randomly Changing Environments," by D. P. Gaver, P. A. Jacobs and G. Latouche, February 1982. [Scanned cover-page text; no abstract recoverable.]

  7. Strengthening the weak link: Built Environment modelling for loss analysis

    NASA Astrophysics Data System (ADS)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures. Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high-resolution…

  8. Simplified model of statistically stationary spacecraft rotation and associated induced gravity environments

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1978-01-01

    A stochastic model of spacecraft motion was developed based on the assumption that the net torque vector due to crew activity and rocket thruster firings is a statistically stationary Gaussian vector process. The process had zero ensemble mean value, and the components of the torque vector were mutually stochastically independent. The linearized rigid-body equations of motion were used to derive the autospectral density functions of the components of the spacecraft rotation vector. The cross-spectral density functions of the components of the rotation vector vanish for all frequencies so that the components of rotation were mutually stochastically independent. The autospectral and cross-spectral density functions of the induced gravity environment imparted to scientific apparatus rigidly attached to the spacecraft were calculated from the rotation rate spectral density functions via linearized inertial frame to body-fixed principal axis frame transformation formulae. The induced gravity process was a Gaussian one with zero mean value. Transformation formulae were used to rotate the principal axis body-fixed frame to which the rotation rate and induced gravity vector were referred to a body-fixed frame in which the components of the induced gravity vector were stochastically independent. Rice's theory of exceedances was used to calculate expected exceedance rates of the components of the rotation and induced gravity vector processes.
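
    The exceedance-rate step rests on Rice's classical result: for a zero-mean stationary Gaussian process with one-sided spectral density G(\omega), the expected rate of up-crossings of a level u is (a textbook form, stated with conventional notation rather than the report's own):

        \nu^{+}(u) = \frac{1}{2\pi} \sqrt{\frac{m_2}{m_0}} \exp\!\left( -\frac{u^{2}}{2 m_0} \right), \qquad m_k = \int_0^{\infty} \omega^{k} G(\omega)\, d\omega .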

  9. Cognitive Styles and Virtual Environments.

    ERIC Educational Resources Information Center

    Ford, Nigel

    2000-01-01

    Discussion of navigation through virtual information environments focuses on the need for robust user models that take into account individual differences. Considers Pask's information processing styles and strategies; deep (transformational) and surface (reproductive) learning; field dependence/independence; divergent/convergent thinking;…

  10. Modeling biochemical transformation processes and information processing with Narrator.

    PubMed

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

    Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is…
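
    Among the simulation backends named is Gillespie's direct method; a minimal generic implementation is sketched below (the reaction system is a toy example, unrelated to Narrator's internals).

        import math, random

        def gillespie_direct(x, rates, stoich, t_end, seed=1):
            """Gillespie's direct method for a well-mixed reaction system.
            x: list of species counts; rates(x): list of reaction propensities;
            stoich[j]: {species index: change} applied when reaction j fires."""
            rng = random.Random(seed)
            t, path = 0.0, [(0.0, list(x))]
            while t < t_end:
                a = rates(x)
                a0 = sum(a)
                if a0 == 0.0:
                    break                                   # nothing can fire
                t += -math.log(1.0 - rng.random()) / a0     # exponential wait
                r, acc = rng.random() * a0, 0.0
                for j, aj in enumerate(a):                  # pick reaction j
                    acc += aj
                    if r <= acc:
                        for k, dk in stoich[j].items():
                            x[k] += dk
                        break
                path.append((t, list(x)))
            return path

        # Toy irreversible isomerization A -> B with rate constant 0.5 (made up)
        trace = gillespie_direct([100, 0], lambda x: [0.5 * x[0]], [{0: -1, 1: 1}], 20.0)
        print("final state (A, B):", tuple(trace[-1][1]))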

  11. Modeling biochemical transformation processes and information processing with Narrator

    PubMed Central

    Mandel, Johannes J; Fuß, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-01-01

    Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact with and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport and transformation as well as biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a flexible and intuitive systems…

  12. Risk attitudes in a changing environment: An evolutionary model of the fourfold pattern of risk preferences.

    PubMed

    Mallpress, Dave E W; Fawcett, Tim W; Houston, Alasdair I; McNamara, John M

    2015-04-01

    A striking feature of human decision making is the fourfold pattern of risk attitudes, involving risk-averse behavior in situations of unlikely losses and likely gains, but risk-seeking behavior in response to likely losses and unlikely gains. Current theories to explain this pattern assume particular psychological processes to reproduce empirical observations, but do not address whether it is adaptive for the decision maker to respond to risk in this way. Here, drawing on insights from behavioral ecology, we build an evolutionary model of risk-sensitive behavior, to investigate whether particular types of environmental conditions could favor a fourfold pattern of risk attitudes. We consider an individual foraging in a changing environment, where energy is needed to prevent starvation and build up reserves for reproduction. The outcome, in terms of reproductive value (a rigorous measure of evolutionary success), of a one-off choice between a risky and a safe gain, or between a risky and a safe loss, determines the risk-sensitive behavior we should expect to see in this environment. Our results show that the fourfold pattern of risk attitudes may be adaptive in an environment in which conditions vary stochastically but are autocorrelated in time. In such an environment the current options provide information about the likely environmental conditions in the future, which affect the optimal pattern of risk sensitivity. Our model predicts that risk preferences should be both path dependent and affected by the decision maker's current state. (c) 2015 APA, all rights reserved.

  13. Computational Modeling in Structural Materials Processing

    NASA Technical Reports Server (NTRS)

    Meyyappan, Meyya; Arnold, James O. (Technical Monitor)

    1997-01-01

    High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in the aerospace, automotive and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.

  14. Modeling of acetone biofiltration process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsiu-Mu Tang; Shyh-Jye Hwang; Wen-Chuan Wang

    1996-12-31

    The objective of this research was to investigate the kinetic behavior of the biofiltration process for the removal of acetone, which was used as a model compound for highly water-soluble gas pollutants. A mathematical model was developed by taking into account diffusion and biodegradation of acetone and oxygen in the biofilm, mass-transfer resistance in the gas film, and the flow pattern of the bulk gas phase. The simulated results obtained by the proposed model indicated that mass-transfer resistance in the gas phase was negligible for this biofiltration process. Analysis of the relative importance of the various rate steps indicated that the overall acetone removal process was primarily limited by the oxygen diffusion rate. 11 refs., 6 figs., 1 tab.
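
    A minimal illustration of the diffusion-versus-reaction balance inside a flat biofilm, reduced to first-order kinetics (the study's model couples acetone and oxygen with more complex kinetics); all numbers are assumed.

        import math

        def effectiveness(k=2.0, D=1e-9, L=100e-6):
            """Effectiveness factor of a flat biofilm with first-order
            degradation: eta = tanh(phi)/phi with Thiele modulus
            phi = L*sqrt(k/D). k [1/s], D [m^2/s], L [m] are made-up values."""
            phi = L * math.sqrt(k / D)
            return math.tanh(phi) / phi, phi

        eta, phi = effectiveness()
        print(f"Thiele modulus {phi:.2f} -> effectiveness {eta:.2f}")
        # phi >> 1 means removal is diffusion-limited, consistent with the
        # finding above that oxygen diffusion limits overall acetone removal.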

  15. Construction of integrated case environments.

    PubMed

    Losavio, Francisca; Matteo, Alfredo; Pérez, María

    2003-01-01

    The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. The tendency of the CASE approach, technically speaking, is the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration was first posed two decades ago. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields in the construction of an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive-systems domain. In addition, a process is given for constructing the integrated CASE environment.
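
    A minimal sketch of the intermediary/Mediator idea applied to tool integration, assuming hypothetical tool names: components exchange events only through the hub, never directly.

        class Mediator:
            """Central intermediary: tools register with the hub and exchange
            events only through it, mirroring the PAC control perspective."""
            def __init__(self):
                self.tools = {}
            def register(self, name, tool):
                self.tools[name] = tool
                tool.mediator = self
            def notify(self, sender, event, payload):
                for name, tool in self.tools.items():
                    if name != sender:           # broadcast to everyone else
                        tool.on_event(event, payload)

        class Tool:
            def __init__(self, name):
                self.name, self.mediator = name, None
            def publish(self, event, payload):
                self.mediator.notify(self.name, event, payload)
            def on_event(self, event, payload):
                print(f"{self.name} received {event}: {payload}")

        hub = Mediator()
        cad, analyzer = Tool("cad-editor"), Tool("structural-analyzer")
        hub.register("cad-editor", cad)
        hub.register("structural-analyzer", analyzer)
        cad.publish("model-saved", "bridge-v2.dwg")   # only the analyzer reacts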

  16. Space Shuttle Main Engine Low Pressure Oxidizer Turbo-Pump Inducer Dynamic Environment Characterization through Water Model and Hot-Fire Testing

    NASA Technical Reports Server (NTRS)

    Arellano, Patrick; Patton, Marc; Schwartz, Alan; Stanton, David

    2006-01-01

    The Low Pressure Oxidizer Turbopump (LPOTP) inducer on the Block II configuration Space Shuttle Main Engine (SSME) experienced blade leading edge ripples during hot firing. This undesirable condition led to a minor redesign of the inducer blades. This resulted in the need to evaluate the performance and the dynamic environment of the redesign, relative to the current configuration, as part of the design acceptance process. Sub-scale water model tests of the two inducer configurations were performed, with emphasis on the dynamic environment due to cavitation induced vibrations. Water model tests were performed over a wide range of inlet flow coefficient and pressure conditions, representative of the scaled operating envelope of the Block II SSME, both in flight and in ground hot-fire tests, including all power levels. The water test hardware, facility set-up, type and placement of instrumentation, the scope of the test program, specific test objectives, data evaluation process and water test results that characterize and compare the two SSME LPOTP inducers are discussed. In addition, dynamic characteristics of the two water models were compared to hot fire data from specially instrumented ground tests. In general, good agreement between the water model and hot fire data was found, which confirms the value of water model testing for dynamic characterization of rocket engine turbomachinery.

  17. The Impact of Field Trips and Family Involvement on Mental Models of the Desert Environment

    NASA Astrophysics Data System (ADS)

    Judson, Eugene

    2011-07-01

    This study examined the mental models of the desert environment held by fourth- and seventh-grade students in the USA and whether those mental models could be affected by: (1) classroom field trips to a desert riparian preserve, and (2) interaction with family members at the same preserve. Results generally indicated that students in this study were resolute in their models and that field trips did not impact the types of models students adhered to. Twenty-three seventh-grade students who self-selected to participate in a Family Science Club with their parents did demonstrate a shift in their mental models and developed significantly more sophisticated models over time. A critical implication of the study is that unless transformation of mental models of the environment is an explicit goal of instruction, simple exposure to the environment (even within the context of life science instruction) will not transform understandings of how organisms within an environment act and interact interdependently.

  18. Introduction to a special section on ecohydrology of semiarid environments: Confronting mathematical models with ecosystem complexity

    NASA Astrophysics Data System (ADS)

    Svoray, Tal; Assouline, Shmuel; Katul, Gabriel

    2015-11-01

    Current literature provides a large number of publications about ecohydrological processes and their effects on the biota in drylands. Given the limited laboratory and field experiments in such systems, many of these publications are based on mathematical models of varying complexity. The underlying implicit assumption is that the data sets used to evaluate these models cover the parameter space of conditions that characterize drylands and that the models represent the actual processes with acceptable certainty. However, a question raised is to what extent these mathematical models are valid when confronted with observed ecosystem complexity. This Introduction reviews the 16 papers that comprise the Special Section on Eco-hydrology of Semiarid Environments: Confronting Mathematical Models with Ecosystem Complexity. The subjects studied in these papers include rainfall regime, infiltration and preferential flow, evaporation and evapotranspiration, annual net primary production, dispersal and invasion, and vegetation greening. The findings in the papers published in this Special Section show that innovative mathematical modeling approaches can represent actual field measurements. Hence, there are strong grounds for suggesting that mathematical models can contribute to greater understanding of ecosystem complexity through characterization of space-time dynamics of biomass and water storage as well as their multiscale interactions. However, the generality of the models and their low-dimensional representation of many processes may also be a "curse" that results in failures when particulars of an ecosystem are required. It is envisaged that the search for a unifying "general" model, while seductive, may remain elusive in the foreseeable future. It is for this reason that improving the merger between experiments and models of various degrees of complexity continues to shape the future research agenda.

  19. ARSENIC UPTAKE PROCESSES IN REDUCING ENVIRONMENTS: IMPLICATIONS FOR ACTIVE REMEDIATION AND NATURAL ATTENUATION

    EPA Science Inventory

    Reductive dissolution of iron oxyhydr(oxides) and release of adsorbed or coprecipitated arsenic is often implicated as a key process that controls the mobility and bioavailability of arsenic in anoxic environments. Yet a complete assessment of arsenic transport and fate requires...

  20. Overcoming the Subject-Object Dichotomy in Urban Modeling: Axial Maps as Geometric Representations of Affordances in the Built Environment.

    PubMed

    Marcus, Lars

    2018-01-01

    The world is witnessing unprecedented urbanization, bringing extreme challenges to contemporary practices in urban planning and design. This calls for improved urban models that can generate new knowledge and enhance practical skill. Importantly, any urban model embodies a conception of the relation between humans and the physical environment. In urban modeling this is typically conceived of as a relation between human subjects and an environmental object, thereby reproducing a humans-environment dichotomy. Alternative modeling traditions, such as space syntax, which originates in architecture rather than geography, have tried to overcome this dichotomy. Central to this effort is the development of new representations of urban space, such as, in the case of space syntax, the axial map. This form of representation aims to integrate both human behavior and the physical environment into one and the same description. Interestingly, models based on these representations have proved to better capture pedestrian movement than regular models. Pedestrian movement, as well as other kinds of human flows in urban space, is essential for urban modeling, since such flows are increasingly understood as the driver of urban processes. Critical for a full understanding of space syntax modeling is the ontology of its representations, such as the axial map. Space syntax theory here often refers to James Gibson's "Theory of affordances," where the concept of affordances, in a manner similar to axial maps, aims to bridge the subject-object dichotomy by constituting neither physical properties of the environment nor human behavior, but rather what emerges in the meeting between the two. By extension, the axial map can be interpreted as a representation of how the physical form of the environment affords human accessibility and visibility in urban space. This paper presents a close examination of the form of representations developed in space syntax methodology, in particular…

  1. Overcoming the Subject-Object Dichotomy in Urban Modeling: Axial Maps as Geometric Representations of Affordances in the Built Environment

    PubMed Central

    Marcus, Lars

    2018-01-01

    The world is witnessing unprecedented urbanization, bringing extreme challenges to contemporary practices in urban planning and design. This calls for improved urban models that can generate new knowledge and enhance practical skill. Importantly, any urban model embodies a conception of the relation between humans and the physical environment. In urban modeling this is typically conceived of as a relation between human subjects and an environmental object, thereby reproducing a humans-environment dichotomy. Alternative modeling traditions, such as space syntax, which originates in architecture rather than geography, have tried to overcome this dichotomy. Central to this effort is the development of new representations of urban space, such as, in the case of space syntax, the axial map. This form of representation aims to integrate both human behavior and the physical environment into one and the same description. Interestingly, models based on these representations have proved to better capture pedestrian movement than regular models. Pedestrian movement, as well as other kinds of human flows in urban space, is essential for urban modeling, since such flows are increasingly understood as the driver of urban processes. Critical for a full understanding of space syntax modeling is the ontology of its representations, such as the axial map. Space syntax theory here often refers to James Gibson's “Theory of affordances,” where the concept of affordances, in a manner similar to axial maps, aims to bridge the subject-object dichotomy by constituting neither physical properties of the environment nor human behavior, but rather what emerges in the meeting between the two. By extension, the axial map can be interpreted as a representation of how the physical form of the environment affords human accessibility and visibility in urban space. This paper presents a close examination of the form of representations developed in space syntax methodology, in…

  2. MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation

    NASA Technical Reports Server (NTRS)

    Charest, Leonard

    1994-01-01

    This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in MESA through native support for causal modeling and discrete event simulation.
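
    A minimal sketch of the discrete event simulation core that such an environment builds on: a time-ordered event queue drives state changes, and handlers schedule future consequences. Event names and times are invented, not MESA's actual API.

        import heapq, itertools

        events, seq = [], itertools.count()
        clock = 0.0

        def schedule(t, action):
            heapq.heappush(events, (t, next(seq), action))   # counter breaks ties

        def pass_start():
            print(f"t={clock:5.1f}  telemetry pass begins")
            schedule(clock + 8.0, pass_end)                  # causal consequence

        def pass_end():
            print(f"t={clock:5.1f}  telemetry pass ends")

        schedule(0.0, pass_start)
        schedule(3.0, lambda: print(f"t={clock:5.1f}  command uplink"))
        while events:                                        # main event loop
            clock, _, action = heapq.heappop(events)
            action()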

  3. Modeling users, context and devices for ambient assisted living environments.

    PubMed

    Castillejo, Eduardo; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2014-03-17

    The participation of users within AAL environments is increasing thanks to the capabilities of current wearable devices. Furthermore, considering users' preferences, context conditions and devices' capabilities helps smart environments personalize services and resources for them. Being aware of the different characteristics of the entities participating in these situations is vital for efficiently reaching the main goals of the corresponding systems. To collect information from these entities, it is necessary to design formal models which help designers organize and give meaning to the gathered data. In this paper, we analyze several literature solutions for modeling users, context and devices, considering different approaches in the Ambient Assisted Living domain. We also note several ongoing standardization efforts in this area. Finally, we discuss the techniques used, the characteristics modeled, and the advantages and drawbacks of each approach, and draw several conclusions about the reviewed works.

  4. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article is research on the differences between these techniques: for each technique, the definition and structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented at Somerleyton Animal Park. The treatment of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serve as a basis for evaluating further modelling techniques.

  5. Butterfly valve in a virtual environment

    NASA Astrophysics Data System (ADS)

    Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.

    2017-11-01

    Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of the components of a simple butterfly valve in a virtual environment. The assembly has been carried out using virtual reality software by trial-and-error methods. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then imported into the virtual environment for assembly.

  6. Natural and Induced Environment in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Kim, Myung-Hee Y.; Clowdsley, Martha S.; Heinbockel, John H.; Cucinotta, Francis A.; Badhwar, Gautam D.; Atwell, William; Huston, Stuart L.

    2002-01-01

    The long-term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind which varies over the solar cycle. The neutron environment within the Shuttle in low Earth orbit has two sources. A time dependent model for the ambient environment is used to evaluate the natural and induced environment. The induced neutron environment is evaluated using measurements on STS-31 and STS-36 near the 1990 solar maximum.

  7. Gene × Environment Interactions in Schizophrenia: Evidence from Genetic Mouse Models

    PubMed Central

    Marr, Julia; Bock, Gavin; Desbonnet, Lieve; Waddington, John

    2016-01-01

    The study of gene × environment, as well as epistatic interactions in schizophrenia, has provided important insight into the complex etiopathologic basis of schizophrenia. It has also increased our understanding of the role of susceptibility genes in the disorder and is an important consideration as we seek to translate genetic advances into novel antipsychotic treatment targets. This review summarises data arising from research involving the modelling of gene × environment interactions in schizophrenia using preclinical genetic models. Evidence for synergistic effects on the expression of schizophrenia-relevant endophenotypes will be discussed. It is proposed that valid and multifactorial preclinical models are important tools for identifying critical areas, as well as underlying mechanisms, of convergence of genetic and environmental risk factors, and their interaction in schizophrenia. PMID:27725886

  8. Modeling particle dispersion and deposition in indoor environments

    NASA Astrophysics Data System (ADS)

    Gao, N. P.; Niu, J. L.

    Particle dispersion and deposition in man-made enclosed environments are closely related to the well-being of occupants. The present study developed a three-dimensional drift-flux model for particle movement in turbulent indoor airflows and combined it with Eulerian approaches. To account for the process of particle deposition at solid boundaries, a semi-empirical deposition model was adopted in which the size-dependent deposition characteristics were well resolved. After validation against experimental data in a scaled isothermal chamber and in a full-scale non-isothermal environmental chamber, the drift-flux model was used to investigate deposition rates and human exposure to particles from two different sources with three typical ventilation systems: mixing ventilation (MV), displacement ventilation (DV), and under-floor air distribution (UFAD). For particles originating from the supply air, a V-shaped curve of deposition velocity as a function of particle size was observed, with minimum deposition at 0.1-0.5 μm. For supermicron particles, the ventilation type and air-exchange rate had a negligible effect on the deposition rate. Submicron particles moved like tracer gases, while the gravitational settling effect should be taken into account for particles larger than 2.5 μm. The temporal increase of human exposure to a step-up particle release in the supply air was determined, among many factors, by the distance between the occupant and the air outlet; the larger the particle size, the lower the human exposure. For particles released from an internal heat source, concentration stratification of small particles (diameter < 10 μm) in the vertical direction appeared with DV and UFAD, and the advantageous principle found for gaseous pollutants, that a relatively less-polluted occupied zone exists in DV and UFAD, was found to apply to small particles as well.
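
    The size dependence noted above can be illustrated with the Stokes gravitational settling velocity for small particles in air (slip correction omitted for brevity; the particle density and air viscosity below are assumed values, not the study's):

        # Stokes settling velocity v_s = rho_p * d^2 * g / (18 * mu) for small
        # particles in still air; slip correction is omitted for brevity.
        G, MU, RHO_P = 9.81, 1.8e-5, 1000.0   # m/s^2, Pa*s (air), kg/m^3 (assumed)

        def settling_velocity(d_micron):
            d = d_micron * 1e-6               # convert micrometres to metres
            return RHO_P * d * d * G / (18.0 * MU)

        for d in (0.1, 0.5, 2.5, 10.0):
            print(f"{d:5.1f} um -> {settling_velocity(d):.2e} m/s")
        # settling is negligible for submicron sizes but significant above 2.5 um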

  9. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher-level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches do. At the core of ISCE is both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package and a new InSAR processing package, containing more efficient and more accurate processing algorithms being developed at Stanford for this project, based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with…

  10. Modeling the economics of landfilling organic processing waste streams

    NASA Astrophysics Data System (ADS)

    Rosentrater, Kurt A.

    2005-11-01

    As manufacturing industries become more cognizant of the ecological effects that their firms have on the surrounding environment, their waste streams are increasingly becoming viewed not only as materials in need of disposal, but also as resources that can be reused, recycled, or reprocessed into valuable products. Within the food processing sector are many examples of various liquid, sludge, and solid biological and organic waste streams that require remediation. Alternative disposal methods for food and other bio-organic manufacturing waste streams are increasingly being investigated. Direct shipping, blending, extrusion, pelleting, and drying are commonly used to produce finished human food, animal feed, industrial products, and components ready for further manufacture. Landfilling, the traditional approach to waste remediation, however, should not be dismissed entirely. It does provide a baseline to which all other recycling and reprocessing options should be compared. This paper discusses the implementation of a computer model designed to examine the economics of landfilling bio-organic processing waste streams. Not only are these results applicable to food processing operations, but any industrial or manufacturing firm would benefit from examining the trends discussed here.

  11. Effect of the Salary Model on Sustainability of a Professional Practice Environment.

    PubMed

    Hickey, Rosa G; Buchko, Barbara L; Coe, Paula F; Woods, Anne B

    2017-10-01

    This replication study examined differences in RN perception of the professional practice environment (PPE) between salary- and hourly-wage compensation models over time. A previous study demonstrated that nurses in a salary-wage model had a significantly higher perception of the PPE compared with their peers receiving hourly wages. A descriptive, comparative design was used to examine the Revised Professional Practice Environment (RPPE) scale of nurses in the same units surveyed in the previous study 2 years later. Mean scores on the RPPE continued to be significantly lower for hourly-wage RNs compared with the RNs in the salary-wage model. Nurses in an hourly-wage unit have significantly lower perceptions of the clinical practice environment than their peers in a salary-wage unit, indicating that professional practice perceptions in a salary-wage unit were sustained for a 2-year period and may provide a more effective PPE.

  12. Modelling the effects of contaminated environments on HFMD infections in mainland China.

    PubMed

    Wang, Jinyan; Xiao, Yanni; Cheke, Robert A

    2016-02-01

    Hand-foot-mouth disease (HFMD) has spread widely in mainland China, increasing in prevalence in most years, with serious consequences for child health. The HFMD virus can survive for a long period outside the host in suitable conditions, and hence contaminated environments may play important roles in HFMD infection. A new mathematical model was proposed and used to investigate the roles that asymptomatic individuals and contaminated environments play in HFMD dynamics. The model includes both direct transmission between susceptible and infected individuals and indirect transmission via free-living infectious units in the environment. Theoretical analysis shows that the disease goes to extinction if the basic reproduction number is less than unity, whilst otherwise the disease persists. By fitting the proposed model to surveillance data we estimated the basic reproduction number as 1.509. Numerical simulations show that increasing the rate of virus clearance and decreasing transmission rates can delay epidemic outbreaks and weaken the severity of HFMD. Sensitivity analysis indicated that the basic reproduction number is sensitive to the transmission rate induced by asymptomatic infectious individuals and to parameters associated with contaminated environments such as the indirect transmission rate, the rate of clearance and the virus shedding rates. This implies that asymptomatic infectious individuals and contaminated environments contribute substantially to new HFMD infections, and so would be targets for effective control measures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
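
    A minimal sketch of an SIR-type model extended with an environmental virus compartment, in the spirit of the model described; the functional form and every parameter value are illustrative assumptions, not the paper's fitted model.

        from scipy.integrate import solve_ivp

        def hfmd(t, y, beta=0.3, beta_e=0.05, gamma=1/14, xi=10.0, c=0.4):
            s, i, r, w = y   # susceptible, infectious, recovered, env. virus
            new_inf = beta * s * i + beta_e * s * w   # direct + indirect routes
            return [-new_inf,
                    new_inf - gamma * i,
                    gamma * i,
                    xi * i - c * w]                   # shedding minus clearance

        sol = solve_ivp(hfmd, (0, 365), [0.99, 0.01, 0.0, 0.0], max_step=1.0)
        print("peak infectious fraction:", round(float(sol.y[1].max()), 3))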

  13. Nurses Improving Care for Healthsystem Elders – a model for optimising the geriatric nursing practice environment

    PubMed Central

    Capezuti, Elizabeth; Boltz, Marie; Cline, Daniel; Dickson, Victoria Vaughn; Rosenberg, Marie-Claire; Wagner, Laura; Shuluk, Joseph; Nigolian, Cindy

    2012-01-01

    Aims and objectives To explain the relationship between a positive nurse practice environment (NPE) and implementation of evidence-based practices. To describe the components of NICHE (Nurses Improving Care for Healthsystem Elders) programmes that contribute to a positive geriatric nursing practice environment. Background The NPE is a system-level intervention for promoting quality and patient safety; however, there are population-specific factors that influence the nurses' perception of their practice and its relationship with patient outcomes. Favourable perceptions of the geriatric-specific NPE are associated with better perceptions of geriatric care quality. Design Discursive paper. Method In this selective critical analysis of the descriptive and empirical literature, we present the implementation of geriatric models in relation to the NPE and the components of the NICHE programme that support hospitals' systemic capacity to effectively integrate and sustain evidence-based geriatric knowledge in practice. Results Although several geriatric models and chronic care models are available, NICHE has been the most successful in recruiting hospital membership as well as contributing to the depth of geriatric hospital programming. Conclusions Although all geriatric care models require significant nursing input, only NICHE focuses on the nursing staff's perception of the care environment for geriatric practice. Studies in NICHE hospitals demonstrate that quality geriatric care requires an NPE in which the structure and processes of hospital services focus on specific patient care needs. Relevance to clinical practice The implementation of evidence-based models addressing the unique needs of hospitalised older adults requires programmes such as NICHE that serve as a technical resource centre and a catalyst for networking among facilities committed to quality geriatric care. Unprecedented international growth in the ageing population compels us to examine how

  14. Process model economics of xanthan production from confectionery industry wastewaters.

    PubMed

    Bajić, Bojana Ž; Vučurović, Damjan G; Dodić, Siniša N; Grahovac, Jovana A; Dodić, Jelena M

    2017-12-01

    In this research, a process and cost model for a xanthan production facility was developed using process simulation software (SuperPro Designer®). This work represents a novelty in the field for two reasons. One is that xanthan gum has been produced from several wastes but never from confectionery-industry wastewaters. The other, more important, reason is that the aforementioned software, which is intended exclusively for bioprocesses, is used to generate a base case, i.e. a starting point for transferring the technology to industrial scales. Previously acquired experimental knowledge about using confectionery wastewaters from five different factories as substitutes for the commercially used cultivation medium has been incorporated into the process model in order to assess the economic viability of implementing such substrates. A lower initial sugar content in the wastewater-based medium (28.41 g/L) compared to the synthetic medium (30.00 g/L) gave a lower xanthan content at the end of cultivation (23.98 and 26.27 g/L, respectively). Although this resulted in somewhat poorer economic parameters, they were still in the range of being an investment of interest. Also, the possibility of utilizing a cheap resource (waste) and reducing the pollution that would result from its disposal has a positive effect on the environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    Literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support management and maintenance of the processes extracting, transforming and loading (ETL) data into CDWHs, and to ease reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques; it was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  16. A New Time-dependent Model for the Martian Radiation Environment

    NASA Technical Reports Server (NTRS)

    DeAngelis, G.; Clowdsley, M. S.; Singleterry, R. C., Jr.; Wilson, J. W.

    2003-01-01

    Manned space activities have until the present time been limited to the near-Earth environment, mostly to low Earth orbit (LEO) scenarios, with only some of the Apollo missions targeted to the Moon. At present, most human exploration and development of space (HEDS) activities are related to the development of the International Space Station (ISS), and therefore take place in the LEO environment. A natural extension of HEDS activities will be to go beyond LEO and reach asteroids, Mars, Jupiter, Saturn, the Kuiper belt and the outskirts of the Solar System. Such long journeys onboard spacecraft, outside the protective umbrella of the geomagnetic field, will require higher levels of protection from the radiation environment found in deep space, for both astronauts and equipment. It is therefore important to have available a radiation shielding tool that takes into account the radiation environments found throughout interplanetary space and at the different bodies encountered in the Solar System. Moreover, radiation protection is one of NASA's two highest concerns and priorities, and a tool integrating different radiation environments with shielding computation techniques especially tailored for deep space mission scenarios is instrumental in view of this exigency. For manned missions targeted to Mars, for which radiation exposure is one of the greatest problems and challenges to be tackled, it is of fundamental importance to have a tool that provides the particle fluxes and spectra at any time and at any point on the Martian surface. With this goal in mind, a new model for the radiation environment on the planet Mars due to Galactic Cosmic Rays (GCR) has been developed. Solar-modulated primary particles, rescaled for Mars conditions, are transported within the Martian atmosphere down to the surface, with temporal properties modeled on variable timescales and with altitude and backscattering patterns taken into account.
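
    The abstract does not spell out how the solar modulation is applied, but a standard technique in time-dependent GCR models is the force-field approximation, in which a single time-varying potential rescales the local interstellar spectrum (LIS). The Python sketch below shows that approximation; the power-law LIS and the potential values are illustrative assumptions, not parameters of this model.

        import math

        M0 = 938.27  # proton rest energy, MeV

        def force_field(E, phi, lis, z=1, a=1):
            """Force-field approximation: J(E) = J_LIS(E + (Z/A)*phi)
            * E*(E + 2*M0) / ((E + Phi)*(E + Phi + 2*M0)), with E the
            kinetic energy per nucleon in MeV and phi in MV."""
            E_lis = E + (z / a) * phi
            return lis(E_lis) * (E * (E + 2 * M0)) / (E_lis * (E_lis + 2 * M0))

        # Hypothetical power-law LIS for protons, for illustration only.
        lis = lambda E: 1.0e4 * (E / 1000.0) ** -2.7

        for phi in (400.0, 1100.0):  # roughly solar minimum vs maximum
            print(f"phi = {phi:.0f} MV: J(1 GeV) = {force_field(1000.0, phi, lis):.1f}")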

  17. A MODELING AND SIMULATION LANGUAGE FOR BIOLOGICAL CELLS WITH COUPLED MECHANICAL AND CHEMICAL PROCESSES

    PubMed Central

    Somogyi, Endre; Glazier, James A.

    2017-01-01

    Biological cells are the prototypical example of active matter. Cells sense and respond to mechanical, chemical and electrical environmental stimuli with a range of behaviors, including dynamic changes in morphology and mechanical properties, chemical uptake and secretion, cell differentiation, proliferation, death, and migration. Modeling and simulation of such dynamic phenomena poses a number of computational challenges. A modeling language describing cellular dynamics must naturally represent complex intra- and extra-cellular spatial structures and coupled mechanical, chemical and electrical processes. Domain experts will find a modeling language most useful when it is based on concepts, terms and principles native to the problem domain. A compiler must then be able to generate an executable model from this physically motivated description. Finally, an executable model must efficiently calculate the time evolution of such dynamic and inhomogeneous phenomena. We present a spatial hybrid systems modeling language, compiler and mesh-free, Lagrangian-based simulation engine which will enable domain experts to define models using natural, biologically motivated constructs and to simulate the time evolution of coupled cellular, mechanical and chemical processes acting on a time-varying number of cells and their environment. PMID:29303160
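
    As a toy illustration of what a mesh-free Lagrangian engine integrates, the Python sketch below moves "cells" as particles under short-range repulsion while each carries a simple secretion/decay chemistry, updated in the same time step. Everything here (the force law, rate constants, one spatial dimension) is an assumption made for illustration; the paper's language and engine are far richer.

        import random

        class Cell:
            def __init__(self, x, c):
                self.x, self.c = x, c  # 1-D position, chemical amount

        def step(cells, dt=0.01, rest=1.0, k=5.0, secrete=0.1, decay=0.05):
            for i, a in enumerate(cells):
                f = 0.0
                for j, b in enumerate(cells):
                    if i == j:
                        continue
                    d = a.x - b.x
                    r = abs(d) or 1e-9
                    if r < rest:                     # short-range repulsion
                        f += k * (rest - r) * (d / r)
                a.x += dt * f                        # overdamped mechanics
                a.c += dt * (secrete - decay * a.c)  # coupled chemistry

        cells = [Cell(random.uniform(0.0, 3.0), 0.0) for _ in range(5)]
        for _ in range(200):
            step(cells)
        print([(round(c.x, 2), round(c.c, 3)) for c in cells])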

  18. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    NASA Technical Reports Server (NTRS)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high-definition cameras. The rapid proliferation of fielded sensors and the supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real-time model of the analyst's task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks, such as identifying caravanning vehicles, in a simulated vehicle traffic environment. We compare agent performance between MIL-aided and unaided trials.
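
    A minimal sketch of the standard MIL assumption, in Python: a bag (say, a set of candidate image regions from one surveillance frame) is labeled positive if at least one of its instances scores above a threshold. The linear weights and feature vectors below are hypothetical; the abstract does not specify which MIL algorithm the paper uses.

        def instance_score(x, w, b):
            """Hypothetical linear instance scorer."""
            return sum(wi * xi for wi, xi in zip(w, x)) + b

        def bag_is_positive(bag, w, b, threshold=0.0):
            """Standard MIL assumption: a bag is positive iff its
            best-scoring instance exceeds the threshold."""
            return max(instance_score(x, w, b) for x in bag) > threshold

        w, b = [1.0, -0.5], -0.2  # made-up "learned" weights
        bags = {
            "caravan_candidate": [[0.9, 0.1], [0.2, 0.8]],
            "background":        [[0.1, 0.9], [0.0, 0.7]],
        }
        for name, bag in bags.items():
            print(name, bag_is_positive(bag, w, b))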

  19. A Bayesian Poisson-lognormal Model for Count Data for Multiple-Trait Multiple-Environment Genomic-Enabled Prediction

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Toledo, Fernando H.; Montesinos-López, José C.; Singh, Pawan; Juliana, Philomin; Salinas-Ruiz, Josafhat

    2017-01-01

    When a plant scientist wishes to make genomic-enabled predictions of multiple traits measured in multiple individuals in multiple environments, the most common strategy is to analyze a single trait at a time while taking into account genotype × environment interaction (G × E), because comprehensive models that simultaneously account for correlated count traits and G × E have been lacking. For this reason, in this study we propose a multiple-trait and multiple-environment model for count data. The proposed model was developed under the Bayesian paradigm, for which we developed a Markov chain Monte Carlo (MCMC) scheme with noninformative priors. This allows all the required full conditional distributions of the parameters to be obtained, leading to an exact Gibbs sampler for the posterior distribution. Our model was tested with simulated data and a real data set. Results show that the proposed multi-trait, multi-environment model is an attractive alternative for modeling multiple count traits measured in multiple environments. PMID:28364037
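
    To see why the Poisson-lognormal form suits over-dispersed counts with trait and environment structure, the following Python sketch simulates counts whose log-rate is a sum of trait effects, environment effects and lognormal noise. All effect sizes are made up for illustration; the paper's full Bayesian model, priors and Gibbs sampler are not reproduced here.

        import math
        import random

        random.seed(1)
        trait_eff = [0.5, -0.2]     # hypothetical trait effects
        env_eff = [0.0, 0.3, -0.4]  # hypothetical environment effects
        sigma = 0.25                # lognormal over-dispersion

        def poisson(lam):
            """Knuth's method; adequate for the small rates used here."""
            l, k, p = math.exp(-lam), 0, 1.0
            while True:
                p *= random.random()
                if p <= l:
                    return k
                k += 1

        for t, te in enumerate(trait_eff):
            for e, ee in enumerate(env_eff):
                eta = 1.0 + te + ee + random.gauss(0.0, sigma)
                counts = [poisson(math.exp(eta)) for _ in range(4)]
                print(f"trait {t}, env {e}: {counts}")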