Sample records for process modeling environment

  1. Research environments that promote integrity.

    PubMed

    Jeffers, Brenda Recchia; Whittemore, Robin

    2005-01-01

    The body of empirical knowledge about research integrity and the factors that promote research integrity in nursing research environments remains small. The aim is to propose an internal control model as an innovative framework for the design and structure of nursing research environments that promote integrity. An internal control model is adapted to illustrate its use for conceptualizing and designing research environments that promote integrity. The internal control model integrates both the organizational elements necessary to promote research integrity and the processes needed to assess research environments. The model provides five interrelated process components within which any number of research integrity variables and processes may be used and studied: internal control environment, risk assessment, internal control activities, monitoring, and information and communication. The components of the proposed research integrity internal control model comprise an integrated conceptualization of the processes that provide reasonable assurance that research integrity will be promoted within the nursing research environment. Schools of nursing can use the model to design, implement, and evaluate systems that promote research integrity. The model process components need further exploration to substantiate the use of the model in nursing research environments.

  2. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if Simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from specific process modelling techniques, which enables automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
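As an illustration of the kind of what-if support described here, a minimal staffing simulation can be sketched in Python. The two-step process, its service-time ranges, and the staffing levels are all hypothetical; in a real BPM environment these parameters would be monitored from executed process instances.

```python
import random

random.seed(7)

def simulate(n_jobs, clerks, approvers):
    """Tiny discrete-event sketch of a 'review' step followed by 'approve'."""
    clerk_free = [0.0] * clerks     # time at which each clerk becomes free
    appr_free = [0.0] * approvers   # likewise for approvers
    total_cycle = 0.0
    for i in range(n_jobs):
        arrival = float(i)          # one job arrives per time unit
        # Review: route the job to the earliest-available clerk.
        c = min(range(clerks), key=clerk_free.__getitem__)
        review_done = max(arrival, clerk_free[c]) + random.uniform(1.5, 3.5)
        clerk_free[c] = review_done
        # Approval: then to the earliest-available approver.
        a = min(range(approvers), key=appr_free.__getitem__)
        approve_done = max(review_done, appr_free[a]) + random.uniform(0.2, 0.8)
        appr_free[a] = approve_done
        total_cycle += approve_done - arrival
    return total_cycle / n_jobs

baseline = simulate(200, clerks=2, approvers=1)   # review step is overloaded
what_if = simulate(200, clerks=3, approvers=1)    # proposed staffing change
print(round(baseline, 1), "->", round(what_if, 1))
```

Comparing the two runs shows the average cycle time dropping once the overloaded review step is given a third clerk, which is the kind of answer a what-if facility would surface automatically.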

  3. Integrated approaches to the application of advanced modeling technology in process development and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  4. Conceptualization of Approaches and Thought Processes Emerging in Validating of Model in Mathematical Modeling in Technology Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Bukova Güzel, Esra

    2013-01-01

    The aim of the present study is to conceptualize the approaches displayed for validating the model and the thought processes involved in the mathematical modeling process performed in a technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…

  5. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  6. Performance implications of leader briefings and team-interaction training for team adaptation to novel environments.

    PubMed

    Marks, M A; Zaccaro, S J; Mathieu, J E

    2000-12-01

    The authors examined how leader briefings and team-interaction training influence team members' knowledge structures concerning processes related to effective performance in both routine and novel environments. Two-hundred thirty-seven undergraduates from a large mid-Atlantic university formed 79 three-member tank platoon teams and participated in a low-fidelity tank simulation. Team-interaction training, leader briefings, and novelty of performance environment were manipulated. Findings indicated that both leader briefings and team-interaction training affected the development of mental models, which in turn positively influenced team communication processes and team performance. Mental models and communication processes predicted performance more strongly in novel than in routine environments. Implications for the role of team-interaction training, leader briefings, and mental models as mechanisms for team adaptation are discussed.

  7. Systems view on spatial planning and perception based on invariants in agent-environment dynamics

    PubMed Central

    Mettler, Bérénice; Kong, Zhaodan; Li, Bin; Andersh, Jonathan

    2015-01-01

    Modeling agile and versatile spatial behavior remains a challenging task, due to the intricate coupling of planning, control, and perceptual processes. Previous results have shown that humans plan and organize their guidance behavior by exploiting patterns in the interactions between agent or organism and the environment. These patterns, described under the concept of Interaction Patterns (IPs), capture invariants arising from equivalences and symmetries in the interaction with the environment, as well as effects arising from intrinsic properties of human control and guidance processes, such as perceptual guidance mechanisms. The paper takes a systems perspective, considering the IP as a unit of organization, and builds on its properties to present a hierarchical model that delineates the planning, control, and perceptual processes and their integration. The model's planning process is further elaborated by showing that the IP can be abstracted, using spatial time-to-go functions. The perceptual processes are elaborated from the hierarchical model. The paper provides experimental support for the model's ability to predict the spatial organization of behavior and the perceptual processes. PMID:25628524

  8. Engineered Barrier System: Physical and Chemical Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  9. A cluster expansion model for predicting activation barrier of atomic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rehman, Tafizur; Jaipal, M.; Chatterjee, Abhijit, E-mail: achatter@iitk.ac.in

    2013-06-15

    We introduce a procedure based on cluster expansion models for predicting the activation barrier of atomic processes encountered while studying the dynamics of a material system using the kinetic Monte Carlo (KMC) method. Starting with an interatomic potential description, a mathematical derivation is presented to show that the local environment dependence of the activation barrier can be captured using cluster interaction models. Next, we develop a systematic procedure for training the cluster interaction model on-the-fly, which involves: (i) obtaining activation barriers for a handful of local environments using nudged elastic band (NEB) calculations, (ii) identifying the local environment by analyzing the NEB results, and (iii) estimating the cluster interaction model parameters from the activation barrier data. Once a cluster expansion model has been trained, it is used to predict activation barriers without requiring any additional NEB calculations. Numerical studies are performed to validate the cluster expansion model by studying hop processes in Ag/Ag(100). We show that the use of cluster expansion model with KMC enables efficient generation of an accurate process rate catalog.
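The idea of fitting activation barriers to a local-environment descriptor can be illustrated with a deliberately small sketch. The four-site neighbourhood, the linear pair-interaction form, and the coefficients below are hypothetical stand-ins for the cluster interaction models and NEB-derived training data the abstract describes.

```python
# Toy cluster-expansion fit for hop activation barriers:
# E(env) = E0 + sum_i J_i * n_i, where n_i is the occupancy (0/1)
# of the i-th neighbouring site. Fit by ordinary least squares.

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_cluster_model(envs, barriers):
    """Least-squares fit of E0 and J_i via the normal equations."""
    X = [[1.0] + [float(n) for n in env] for env in envs]  # bias + occupancies
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * e for row, e in zip(X, barriers)) for i in range(k)]
    return gauss_solve(A, b)

def predict(params, env):
    return params[0] + sum(J * n for J, n in zip(params[1:], env))

# Synthetic "NEB" training data from a known model:
# E = 0.5 + 0.1*n1 + 0.2*n2 - 0.05*n3 + 0.0*n4 (eV, illustrative)
true_params = [0.5, 0.1, 0.2, -0.05, 0.0]
envs = [(a, b, c, d) for a in (0, 1) for b in (0, 1)
        for c in (0, 1) for d in (0, 1)]
barriers = [predict(true_params, e) for e in envs]

params = fit_cluster_model(envs, barriers)
print(round(predict(params, (1, 1, 0, 0)), 3))  # 0.8 = 0.5 + 0.1 + 0.2
```

Once fitted, `predict` plays the role of the trained model: new environments get a barrier estimate without any further NEB calculation, which is what makes the on-the-fly KMC catalog cheap to extend.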

  10. RELATIVE CONTRIBUTIONS OF THE WEAK, MAIN, AND FISSION-RECYCLING r-PROCESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shibagaki, S.; Kajino, T.; Mathews, G. J.

    There has been a persistent conundrum in attempts to model the nucleosynthesis of heavy elements by rapid neutron capture (the r-process). Although the locations of the abundance peaks near nuclear mass numbers 130 and 195 identify an environment of rapid neutron capture near closed nuclear shells, the abundances of elements just above and below those peaks are often underproduced by more than an order of magnitude in model calculations. At the same time, there is a debate in the literature as to what degree the r-process elements are produced in supernovae or the mergers of binary neutron stars. In this paper we propose a novel solution to both problems. We demonstrate that the underproduction of nuclides above and below the r-process peaks in main or weak r-process models (like magnetohydrodynamic jets or neutrino-driven winds in core-collapse supernovae) can be supplemented via fission fragment distributions from the recycling of material in a neutron-rich environment such as that encountered in neutron star mergers (NSMs). In this paradigm, the abundance peaks themselves are well reproduced by a moderately neutron-rich, main r-process environment such as that encountered in the magnetohydrodynamical jets in supernovae supplemented with a high-entropy, weakly neutron-rich environment such as that encountered in the neutrino-driven-wind model to produce the lighter r-process isotopes. Moreover, we show that the relative contributions to the r-process abundances in both the solar system and metal-poor stars from the weak, main, and fission-recycling environments required by this proposal are consistent with estimates of the relative Galactic event rates of core-collapse supernovae for the weak and main r-process and NSMs for the fission-recycling r-process.

  11. The ANISA Model of Education: A Critique. Issues in Native Education.

    ERIC Educational Resources Information Center

    Four Worlds Development Project, Lethbridge (Alberta).

    The ANISA model of education (D. Streets and D. Jordan) classifies curriculum content into four areas--the physical environment, the human environment, the unknown environment, and the self--and encourages horizontal integration between content areas. The ANISA model holds that the process of learning consists of differentiation, integration, and…

  12. Ada (Trade name) Compiler Validation Summary Report: Rational. Rational Environment (Trademark) A952. Rational Architecture (R1000 (Trade name) Model 200).

    DTIC Science & Technology

    1987-05-06

    Rational. Rational Environment A952. Rational Architecture (R1000 Model 200). ... validation testing performed on the Rational Environment, A952, using Version 1.8 of the Ada Compiler Validation Capability (ACVC). The Rational Environment is hosted on a Rational Architecture (R1000 Model 200) operating under Rational Environment, Release A952. Programs processed by this

  13. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  14. EvoBuild: A Quickstart Toolkit for Programming Agent-Based Models of Evolutionary Processes

    NASA Astrophysics Data System (ADS)

    Wagh, Aditi; Wilensky, Uri

    2018-04-01

    Extensive research has shown that one of the benefits of programming to learn about scientific phenomena is that it facilitates learning about mechanisms underlying the phenomenon. However, using programming activities in classrooms is associated with costs such as requiring additional time to learn to program or students needing prior experience with programming. This paper presents a class of programming environments that we call quickstart: Environments with a negligible threshold for entry into programming and a modest ceiling. We posit that such environments can provide benefits of programming for learning without incurring associated costs for novice programmers. To make this claim, we present a design-based research study conducted to compare programming models of evolutionary processes with a quickstart toolkit with exploring pre-built models of the same processes. The study was conducted in six seventh grade science classes in two schools. Students in the programming condition used EvoBuild, a quickstart toolkit for programming agent-based models of evolutionary processes, to build their NetLogo models. Students in the exploration condition used pre-built NetLogo models. We demonstrate that although students came from a range of academic backgrounds without prior programming experience, and all students spent the same number of class periods on the activities including the time students took to learn programming in this environment, EvoBuild students showed greater learning about evolutionary mechanisms. We discuss the implications of this work for design research on programming environments in K-12 science education.
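The kind of agent-based evolutionary mechanism such quickstart toolkits let students assemble can be sketched in a few lines of Python. This is not EvoBuild or NetLogo code; the single heritable trait, the fitness function, and the mutation scheme are illustrative assumptions.

```python
import random

random.seed(42)

class Agent:
    """One organism carrying a single heritable trait value."""
    def __init__(self, trait):
        self.trait = trait

    def offspring(self, mutation_sd=0.1):
        # The child inherits the parent's trait plus a small random mutation.
        return Agent(self.trait + random.gauss(0, mutation_sd))

def generation(population, optimum=1.0, size=100):
    """Fitness-proportional reproduction toward an environmental optimum."""
    # Fitness falls off with distance from the optimum trait value.
    weights = [1.0 / (1.0 + abs(a.trait - optimum)) for a in population]
    parents = random.choices(population, weights=weights, k=size)
    return [p.offspring() for p in parents]

pop = [Agent(0.0) for _ in range(100)]
for _ in range(40):
    pop = generation(pop)

mean_trait = sum(a.trait for a in pop) / len(pop)
print(round(mean_trait, 2))  # mean trait shifts from 0 toward the optimum
```

The pedagogical point mirrors the study: the mechanism (variation, inheritance, differential reproduction) is explicit in the code, rather than hidden inside a pre-built model.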

  15. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    ERIC Educational Resources Information Center

    Sun, Daner; Looi, Chee-Kit

    2013-01-01

    The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions, and the science inquiry process, as well as…

  16. A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow

    NASA Astrophysics Data System (ADS)

    Pater, Flavius; Rosu, Serban

    2011-09-01

    This article presents a new approach to some fundamental techniques for solving dynamic programming problems using functional equations. We analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.
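The functional-equation approach behind such a finite-horizon problem can be sketched as a backward Bellman recursion. The patient states, treatment actions, and cost figures below are hypothetical; they only illustrate the structure of the optimization, not the article's actual model.

```python
# Finite-horizon dynamic program solved via its functional (Bellman) equation:
#   V_t(s) = min_a [ cost(s, a) + V_{t+1}(next(s, a)) ],  V_T(s) = terminal(s)
# Hypothetical setup: patient condition levels 0 (recovered) .. 3 (worst),
# actions are treatment intensities with increasing daily cost.

STATES = range(4)
ACTIONS = (0, 1, 2)      # none, mild, intensive treatment
HORIZON = 5              # days of planning

def cost(s, a):
    return 1.0 * s + 2.0 * a          # care cost grows with severity and treatment

def transition(s, a):
    return max(0, min(3, s + 1 - a))  # untreated patients worsen, treatment helps

def solve(horizon):
    V = {s: 5.0 * s for s in STATES}  # terminal cost of discharging in state s
    policy = []
    for _ in range(horizon):
        new_V, act = {}, {}
        for s in STATES:
            best = min(ACTIONS, key=lambda a: cost(s, a) + V[transition(s, a)])
            act[s] = best
            new_V[s] = cost(s, best) + V[transition(s, best)]
        V = new_V
        policy.insert(0, act)          # policy[t][s] = best action at day t
    return V, policy

V, policy = solve(HORIZON)
print(V[2], policy[0][2])  # minimal expected cost from state 2, and first action
```

The recursion is exactly the functional-equation idea: each stage's value function is defined in terms of the next stage's, so a single backward pass yields both the optimal cost and the optimal treatment policy.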

  17. Accounting for the influence of salt water in the physics required for processing underwater UXO EMI signals

    NASA Astrophysics Data System (ADS)

    Shubitidze, Fridon; Barrowes, Benjamin E.; Shamatava, Irma; Sigman, John; O'Neill, Kevin A.

    2018-05-01

    Processing electromagnetic induction signals from subsurface targets, for purposes of discrimination, requires accurate physical models. To date, successful approaches for on-land cases have entailed advanced modeling of responses by the targets themselves, with quite adequate treatment of instruments as well. Responses from the environment were typically slight and/or were treated very simply. When objects are immersed in saline solutions, however, more sophisticated modeling of the diffusive EMI physics in the environment is required. One needs to account for the response of the environment itself as well as the environment's frequency and time-dependent effects on both primary and secondary fields, from sensors and targets, respectively. Here we explicate the requisite physics and identify its effects quantitatively via analytical, numerical, and experimental investigations. Results provide a path for addressing the quandaries posed by previous underwater measurements and indicate how the environmental physics may be included in more successful processing.

  18. EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.

    PubMed

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.

  19. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  20. Exploring Undergraduate Students' Mental Models of the Environment: Are They Related to Environmental Affect and Behavior?

    ERIC Educational Resources Information Center

    Liu, Shu-Chiu; Lin, Huann-shyang

    2015-01-01

    A draw-and-explain task and questionnaire were used to explore Taiwanese undergraduate students' mental models of the environment and whether and how they relate to their environmental affect and behavioral commitment. We found that students generally held incomplete mental models of the environment, focusing on objects rather than on processes or…

  1. Scenario Analysis: An Integrative Study and Guide to Implementation in the United States Air Force

    DTIC Science & Technology

    1994-09-01

    Table-of-contents excerpt: Environmental Analysis; Classifications of Environments; Characteristics of Environments; Components of the Environmental Analysis Process; Forecasting; Model of the Industry Environment; Model of the Macroenvironment.

  2. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  3. DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...

  4. A Distributed Snow Evolution Modeling System (SnowModel)

    NASA Astrophysics Data System (ADS)

    Liston, G. E.; Elder, K.

    2004-12-01

    A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.
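A drastically simplified sketch of the accumulation-and-melt bookkeeping such a model performs (ignoring wind redistribution, canopy processes, and full energy-balance detail) might look like the following; the degree-day factor and forcing series are illustrative values, not SnowModel parameters.

```python
# Toy snow-water-equivalent (SWE) evolution: snowfall accumulates when the
# air is at or below freezing; a degree-day factor melts the pack when above.

DEGREE_DAY_FACTOR = 2.5  # mm of melt per degree-day above 0 C (illustrative)

def evolve_swe(precip_mm, temp_c, swe=0.0):
    """Step daily precipitation (mm) and temperature (C) into SWE (mm)."""
    history = []
    for p, t in zip(precip_mm, temp_c):
        if t <= 0.0:
            swe += p  # precipitation falls as snow and accumulates
        else:
            # Warm days melt the pack; rain-on-snow is ignored here.
            swe = max(0.0, swe - DEGREE_DAY_FACTOR * t)
        history.append(swe)
    return history

# One synthetic week: cold snowy days, then a warm spell.
precip = [10, 5, 0, 0, 0, 0, 0]
temps = [-5, -2, -1, 2, 4, 6, 8]
print(evolve_swe(precip, temps))  # SWE builds to 15 mm, then melts away
```

A distributed model runs this kind of mass balance (with far richer physics) in every grid cell, every hour, which is why sub-models for forcing, energy exchange, and wind transport are separated out.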

  5. The interaction of wind and water in the desertification environment

    NASA Technical Reports Server (NTRS)

    Jacobberger, P. A.

    1987-01-01

    An appropriate process/response model for the physical basis of desertification is provided by the interactions of wind and water in the desert fringe environment. Essentially, the process of desertification can be thought of as a progressive environmental transition from predominantly fluvial to aeolian processes. This is a simple but useful way of looking at desertification; in this context, desertification is morphogenetic in character. To illustrate the model, a study of drought-related changes in central Mali will serve to trace the interrelated responses of geomorphologic processes to drought conditions.

  6. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST receives increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environment. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Then based on resource meta-model, XML schemas and documents are derived, which represent resources and their states in Petri-net. Thereafter, XML-net, a high-level Petri net, is employed for modeling control and data flow of process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Therefore, unified resource representation and RESTful services description are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach and the desirable features of the approach are discussed.
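The token-passing semantics underlying such Petri-net process models can be sketched minimally in Python. The place and transition names below are hypothetical supply-chain steps, and this plain place/transition net deliberately omits the colored-token and XML-document features of XML-nets.

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds a token; firing consumes one token per input place and
# produces one per output place.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)  # place name -> token count
        self.transitions = {}         # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical order-fulfilment fragment of a supply-chain process.
net = PetriNet({"order_received": 1, "stock_available": 1})
net.add_transition("reserve", ["order_received", "stock_available"], ["reserved"])
net.add_transition("ship", ["reserved"], ["shipped"])

net.fire("reserve")
net.fire("ship")
print(net.marking["shipped"])  # 1
```

The marking acts as the shared resource state across partners; in the paper's framework, tokens would carry XML documents describing resource states, and firings would map onto RESTful service invocations.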

  7. EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing

    PubMed Central

    Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott

    2011-01-01

    We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590

  8. A Novel Petri Nets-Based Modeling Method for the Interaction between the Sensor and the Geographic Environment in Emerging Sensor Networks

    PubMed Central

    Zhang, Feng; Xu, Yuetong; Chou, Jarong

    2016-01-01

    The service of a sensor device in Emerging Sensor Networks (ESNs) is an extension of traditional Web services. Through the sensor network, the service of a sensor device can communicate directly with entities in the geographic environment, and can even impact a geographic entity directly. The interaction between sensor devices in ESNs and the geographic environment is very complex, and modeling this interaction is a challenging problem. This paper proposes a novel Petri Nets-based method for modeling the interaction between the sensor device and the geographic environment. Sensor device services in ESNs are more easily affected by the geographic environment than traditional Web services. Therefore, response time, fault-tolerant ability, and resource consumption become important factors in the performance of the whole sensor application system. Thus, this paper classifies IoT services as Sensing services and Controlling services according to the interaction between an IoT service and a geographic entity, and classifies GIS services as data services and processing services. It then designs and analyzes a service algebra and a Colored Petri Nets model to model the geo-features, IoT services, GIS services, and the interaction process between the sensor and the geographic environment. Finally, the modeling process is discussed through examples. PMID:27681730

  9. Interplanetary Radiation and Internal Charging Environment Models for Solar Sails

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Altstatt, Richard L.; NeegaardParker, Linda

    2005-01-01

    A Solar Sail Radiation Environment (SSRE) model has been developed for defining charged-particle environments over an energy range from 0.01 keV to 1 MeV for hydrogen ions, helium ions, and electrons. The SSRE model provides the free-field charged-particle environment required for characterizing energy deposition per unit mass, charge deposition, and the dose-rate-dependent conductivity processes needed to evaluate radiation dose and internal (bulk) charging in the solar sail membrane in interplanetary space. Solar wind and energetic particle measurements from instruments aboard the Ulysses spacecraft, in a solar near-polar orbit, provide particle data over a range of heliospheric latitudes; these data are used to derive radiation and charging environments applicable both to a high-inclination 0.5 AU Solar Polar Imager mission and to 1.0 AU L1 solar missions. This paper describes the techniques used to model comprehensive electron, proton, and helium spectra over the range of particle energies of significance to energy and charge deposition in thin (less than 25 micrometers) solar sail materials.

  10. Systems Modeling at Multiple Levels of Regulation: Linking Systems and Genetic Networks to Spatially Explicit Plant Populations

    PubMed Central

    Kitchen, James L.; Allaby, Robin G.

    2013-01-01

    Selection and adaptation of individuals to their underlying environments are highly dynamical processes, encompassing interactions between the individual and its seasonally changing environment, synergistic or antagonistic interactions between individuals and interactions amongst the regulatory genes within the individual. Plants are useful organisms to study within systems modeling because their sedentary nature simplifies interactions between individuals and the environment, and many important plant processes such as germination or flowering are dependent on annual cycles which can be disrupted by climate behavior. Sedentism makes plants relevant candidates for spatially explicit modeling that is tied in with dynamical environments. We propose that in order to fully understand the complexities behind plant adaptation, a system that couples aspects from systems biology with population and landscape genetics is required. A suitable system could be represented by spatially explicit individual-based models where the virtual individuals are located within time-variable heterogeneous environments and contain mutable regulatory gene networks. These networks could directly interact with the environment, and should provide a useful approach to studying plant adaptation. PMID:27137364

  11. Macro Level Simulation Model Of Space Shuttle Processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  12. A Multiagent Modeling Environment for Simulating Work Practice in Organizations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; vanHoof, Ron

    2004-01-01

    In this paper we position Brahms as a tool for simulating organizational processes. Brahms is a modeling and simulation environment for analyzing human work practice, and for using such models to develop intelligent software agents to support the work practice in organizations. Brahms is the result of more than ten years of research at the Institute for Research on Learning (IRL), NYNEX Science & Technology (the former R&D institute of the Baby Bell telephone company in New York, now Verizon), and for the last six years at NASA Ames Research Center, in the Work Systems Design and Evaluation group, part of the Computational Sciences Division (Code IC). Brahms has been used on more than ten modeling and simulation research projects, and recently has been used as a distributed multiagent development environment for developing work practice support tools for human in-situ science exploration on planetary surfaces, in particular a human mission to Mars. Brahms was originally conceived of as a business process modeling and simulation tool that incorporates the social systems of work, by illuminating how formal process flow descriptions relate to people's actual situated activities in the workplace. Our research started in the early nineties as a reaction to experiences with work process modeling and simulation. Although an effective tool for convincing management of the potential cost savings of newly designed work processes, the modeling and simulation environment was only able to describe work as a normative workflow. However, the social systems uncovered in the work practices studied by the design team played a significant role in how work actually got done: actual lived work. Multi-tasking, informal assistance, and circumstantial work interactions could not easily be represented in a tool with a strict workflow modeling paradigm. In response, we began to develop a tool that would have the benefits of work process modeling and simulation, but be distinctively able to represent the relations of people, locations, systems, artifacts, communication, and information content.

  13. Differential Susceptibility to the Environment: Are Developmental Models Compatible with the Evidence from Twin Studies?

    ERIC Educational Resources Information Center

    Del Giudice, Marco

    2016-01-01

    According to models of differential susceptibility, the same neurobiological and temperamental traits that determine increased sensitivity to stress and adversity also confer enhanced responsivity to the positive aspects of the environment. Differential susceptibility models have expanded to include complex developmental processes in which genetic…

  14. Automating an integrated spatial data-mining model for landfill site selection

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling caused by the programming stages and their several limitations. An automation process helps avoid these limitations and improves the interoperability between the integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia, and 22 criteria were selected as input data to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
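
    The 10-fold cross-validation scheme reported above can be sketched as follows. This hedged example replaces the study's neural network and 22 spatial criteria with a trivial nearest-centroid classifier on synthetic two-class data, so only the validation scheme itself is faithful to the abstract.

```python
import random

def kfold_indices(n, k=10, seed=0):
    """Shuffle indices 0..n-1 and deal them into k nearly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def centroid(rows):
    dim = len(rows[0])
    return [sum(r[d] for r in rows) / len(rows) for d in range(dim)]

def nearest_centroid(x, centroids):
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Synthetic stand-in for the site data: two well-separated classes
# ("unsuitable" = 0, "suitable" = 1) in a 4-dimensional feature space.
rng = random.Random(1)
data = [([rng.gauss(3 * label, 0.5) for _ in range(4)], label)
        for label in (0, 1) for _ in range(50)]

folds = kfold_indices(len(data), k=10)
correct = total = 0
for i in range(len(folds)):
    # Train on the other 9 folds, test on fold i.
    train_idx = [j for m, fold in enumerate(folds) if m != i for j in fold]
    cents = {label: centroid([data[j][0] for j in train_idx
                              if data[j][1] == label])
             for label in (0, 1)}
    for j in folds[i]:
        total += 1
        correct += int(nearest_centroid(data[j][0], cents) == data[j][1])

accuracy = correct / total
```

    Each sample is tested exactly once, on a model trained without it, which is what makes the reported 98.2% an out-of-sample figure rather than a fit to the training data.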

  15. 78 FR 6269 - Amendment to the International Traffic in Arms Regulations: Revision of U.S. Munitions List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... remain subject to USML control are modeling or simulation tools that model or simulate the environments... USML revision process, the public is asked to provide specific examples of nuclear-related items whose...) Modeling or simulation tools that model or simulate the environments generated by nuclear detonations or...

  16. A model framework to describe growth-linked biodegradation of trace-level pollutants in the presence of coincidental carbon substrates and microbes.

    PubMed

    Liu, Li; Helbling, Damian E; Kohler, Hans-Peter E; Smets, Barth F

    2014-11-18

    Pollutants such as pesticides and their degradation products occur ubiquitously in natural aquatic environments at trace concentrations (μg L⁻¹ and lower). Microbial biodegradation processes have long been known to contribute to the attenuation of pesticides in contaminated environments. However, challenges remain in developing engineered remediation strategies for pesticide-contaminated environments because the fundamental processes that regulate growth-linked biodegradation of pesticides in natural environments remain poorly understood. In this research, we developed a model framework to describe growth-linked biodegradation of pesticides at trace concentrations. We used experimental data reported in the literature or novel simulations to explore three fundamental kinetic processes in isolation. We then combined these kinetic processes into a unified model framework. The three kinetic processes described were: the growth-linked biodegradation of micropollutants at environmentally relevant concentrations; the effect of coincidental assimilable organic carbon substrates; and the effect of coincidental microbes that compete for assimilable organic carbon substrates. We used Monod kinetic models to describe substrate utilization and microbial growth rates for specific pesticide and degrader pairs. We then extended the model to include terms for utilization of assimilable organic carbon substrates by the specific degrader and coincidental microbes, growth on assimilable organic carbon substrates by the specific degrader and coincidental microbes, and endogenous metabolism. The proposed model framework enables interpretation and description of a range of experimental observations on micropollutant biodegradation. The model provides a useful tool to identify environmental conditions with respect to the occurrence of assimilable organic carbon and coincidental microbes that may result in enhanced or reduced micropollutant biodegradation.
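
    The Monod terms at the core of the framework can be sketched numerically. The parameter values below are illustrative assumptions, not the paper's calibrated estimates, and the sketch covers only a single degrader on a single substrate (no assimilable organic carbon pool or coincidental microbes).

```python
# Monod-type growth-linked biodegradation sketch: one degrader
# population X grows on a trace pollutant S. All parameters are
# illustrative, not fitted values.
mu_max = 0.2   # 1/h, maximum specific growth rate
Ks     = 5.0   # ug/L, half-saturation constant
Y      = 0.4   # biomass yield (ug biomass per ug substrate)
b      = 0.01  # 1/h, endogenous decay rate

S, X = 50.0, 1.0   # initial pollutant and biomass concentrations (ug/L)
dt = 0.1
history = [(S, X)]
for _ in range(2000):           # 200 h of simulated time, explicit Euler
    mu = mu_max * S / (Ks + S)  # Monod specific growth rate
    dS = -(mu / Y) * X * dt     # substrate consumed for growth
    dX = (mu - b) * X * dt      # growth minus endogenous metabolism
    S = max(S + dS, 0.0)
    X = max(X + dX, 0.0)
    history.append((S, X))

final_S, final_X = history[-1]
```

    With these values the pollutant is driven toward zero while biomass first grows and then slowly decays through endogenous metabolism, the qualitative behavior the growth-linked framework is built to describe.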

  17. Determining the potential productivity of food crops in controlled environments

    NASA Technical Reports Server (NTRS)

    Bugbee, Bruce

    1992-01-01

    The quest to determine the maximum potential productivity of food crops is greatly benefitted by crop growth models. Many models have been developed to analyze and predict crop growth in the field, but it is difficult to predict biological responses to stress conditions. Crop growth models for the optimal environments of a Controlled Environment Life Support System (CELSS) can be highly predictive. This paper discusses the application of a crop growth model to CELSS; the model is used to evaluate factors limiting growth. The model separately evaluates the following four physiological processes: absorption of PPF by photosynthetic tissue, carbon fixation (photosynthesis), carbon use (respiration), and carbon partitioning (harvest index). These constituent processes determine potentially achievable productivity. An analysis of each process suggests that low harvest index is the factor most limiting to yield. PPF absorption by plant canopies and respiration efficiency are also of major importance. Research concerning productivity in a CELSS should emphasize: (1) the development of gas exchange techniques to continuously monitor plant growth rates and (2) environmental techniques to reduce plant height in communities.
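
    The four constituent processes combine multiplicatively, which can be sketched as an energy cascade. All coefficient values below are hypothetical illustration numbers, not the model's calibrated parameters.

```python
# Back-of-envelope sketch of the four multiplicative processes the model
# evaluates: PPF absorption -> carbon fixation -> carbon use (respiration)
# -> carbon partitioning (harvest index). Coefficients are hypothetical.
def potential_yield(ppf_mol_m2_d, f_absorbed, photosynthetic_eff,
                    carbon_use_eff, harvest_index, g_biomass_per_mol=15.0):
    """Edible yield (g m^-2 d^-1) as a chain of efficiency factors."""
    fixed = ppf_mol_m2_d * f_absorbed * photosynthetic_eff * g_biomass_per_mol
    biomass = fixed * carbon_use_eff   # what survives respiration
    return biomass * harvest_index     # edible fraction of total biomass

base = potential_yield(60, f_absorbed=0.9, photosynthetic_eff=0.10,
                       carbon_use_eff=0.6, harvest_index=0.4)

# Raising harvest index scales yield linearly, which is why the abstract
# flags a low harvest index as the factor most limiting to yield.
improved = potential_yield(60, 0.9, 0.10, 0.6, harvest_index=0.6)
```

    Because the factors multiply, improving the smallest one gives the largest proportional gain in yield, matching the abstract's conclusion about harvest index.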

  18. Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Simpson, J.; Baker, D.; Braun, S.; Chou, M.-D.; Ferrier, B.; Johnson, D.; Khain, A.; Lang, S.; Lynn, B.

    2001-01-01

    The response of cloud systems to their environment is an important link in a chain of processes responsible for monsoons, frontal depressions, El Nino Southern Oscillation (ENSO) episodes and other climate variations (e.g., 30-60 day intra-seasonal oscillations). Numerical models of cloud properties provide essential insights into the interactions of clouds with each other, with their surroundings, and with land and ocean surfaces. Significant advances are currently being made in the modeling of rainfall and rain-related cloud processes, ranging in scales from the very small up to the simulation of an extensive population of raining cumulus clouds in a tropical- or midlatitude-storm environment. The Goddard Cumulus Ensemble (GCE) model is a multi-dimensional nonhydrostatic dynamic/microphysical cloud resolving model. It has been used to simulate many different mesoscale convective systems that occurred in various geographic locations. In this paper, recent GCE model improvements (microphysics, radiation and surface processes) will be described as well as their impact on the development of precipitation events from various geographic locations. The performance of these new physical processes will be examined by comparing the model results with observations. In addition, the explicit interactive processes between cloud, radiation and surface processes will be discussed.

  19. Space - A unique environment for process modeling R&D

    NASA Technical Reports Server (NTRS)

    Overfelt, Tony

    1991-01-01

    Process modeling, the application of advanced computational techniques to simulate real processes as they occur in regular use (e.g., welding, casting, and semiconductor crystal growth), is discussed. Using the low-gravity environment of space will accelerate the technical validation of the procedures and enable extremely accurate determinations of the many necessary thermophysical properties. Attention is given to NASA's centers for the commercial development of space: joint ventures of universities, industries, and government agencies to study the unique attributes of space that offer potential for applied R&D and eventual commercial exploitation.

  20. Using of Group-Modeling in Predesign Phase of New Healthcare Environments: Stakeholders Experiences.

    PubMed

    Elf, Marie; Eldh, Ann Catrine; Malmqvist, Inga; Öhrn, Kerstin; von Koch, Lena

    2016-01-01

    Current research shows a relationship between healthcare architecture and patient-related outcomes. The planning and designing of new healthcare environments is a complex process. The needs of the various end users of the environment must be considered, including the patients, the patients' significant others, and the staff. The aim of this study was to explore the experiences of healthcare professionals participating in group modeling utilizing system dynamics in the predesign phase of new healthcare environments. We engaged healthcare professionals in a series of workshops using system dynamics to discuss the planning of healthcare environments at the beginning of a construction project and then interviewed them about their experience. An explorative and qualitative design was used to describe participants' experiences of participating in the group-modeling projects. Participants (N = 20) were recruited from a larger intervention study using group modeling and system dynamics in planning and designing projects. The interviews were analyzed by qualitative content analysis. Two themes were formed, representing the experiences in the group-modeling process: "Participation in the group modeling generated knowledge and was empowering" and "Participation in the group modeling differed from what was expected and required the dedication of time and skills." The method can help participants in design teams focus more on their healthcare organization, their care activities, and their aims rather than on detailed layout solutions. This clarification is important when decisions about the design are discussed and prepared, and will most likely lead to greater readiness for the future building process. © The Author(s) 2015.

  1. Integrated Modeling Environment

    NASA Technical Reports Server (NTRS)

    Mosier, Gary; Stone, Paul; Holtery, Christopher

    2006-01-01

    The Integrated Modeling Environment (IME) is a software system that establishes a centralized Web-based interface for integrating people (who may be geographically dispersed), processes, and data involved in a common engineering project. The IME includes software tools for life-cycle management, configuration management, visualization, and collaboration.

  2. Toward an Integration of Cognitive and Genetic Models of Risk for Depression

    PubMed Central

    Gibb, Brandon E.; Beevers, Christopher G.; McGeary, John E.

    2012-01-01

    There is growing interest in integrating cognitive and genetic models of depression risk. We review two ways in which these models can be meaningfully integrated. First, information-processing biases may represent intermediate phenotypes for specific genetic influences. These genetic influences may represent main effects on specific cognitive processes or may moderate the impact of environmental influences on information-processing biases. Second, cognitive and genetic influences may combine to increase reactivity to environmental stressors, increasing risk for depression in a gene × cognition × environment model of risk. There is now growing support for both of these ways of integrating cognitive and genetic models of depression risk. Specifically, there is support for genetic influences on information-processing biases, particularly the link between 5-HTTLPR and attentional biases, from both genetic association and gene × environment (G × E) studies. There is also initial support for gene × cognition × environment models of risk in which specific genetic influences contribute to increased reactivity to environmental influences. We review this research and discuss important areas of future research, particularly the need for larger samples that allow for a broader examination of genetic and epigenetic influences as well as the combined influence of variability across a number of genes. PMID:22920216

  3. Use of an uncertainty analysis for genome-scale models as a prediction tool for microbial growth processes in subsurface environments.

    PubMed

    Klier, Christine

    2012-03-06

    The integration of genome-scale, constraint-based models of microbial cell function into simulations of contaminant transport and fate in complex groundwater systems is a promising approach to help characterize the metabolic activities of microorganisms in natural environments. In constraint-based modeling, the specific uptake flux rates of external metabolites are usually determined by Michaelis-Menten kinetic theory. However, extensive data sets based on experimentally measured values are not always available. In this study, a genome-scale model of Pseudomonas putida was used to study the key issue of uncertainty arising from the parametrization of the influx of two growth-limiting substrates: oxygen and toluene. The results showed that simulated growth rates are highly sensitive to substrate affinity constants and that uncertainties in specific substrate uptake rates have a significant influence on the variability of simulated microbial growth. Michaelis-Menten kinetic theory does not, therefore, seem to be appropriate for descriptions of substrate uptake processes in the genome-scale model of P. putida. Microbial growth rates of P. putida in subsurface environments can only be accurately predicted if the processes of complex substrate transport and microbial uptake regulation are sufficiently understood in natural environments and if data-driven uptake flux constraints can be applied.
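
    The sensitivity issue can be illustrated with the Michaelis-Menten uptake expression itself. The numbers below are hypothetical, but they show how, at trace substrate levels, a plausible spread in the affinity constant Ks propagates into a large spread in the uptake flux that constrains simulated growth.

```python
# Michaelis-Menten uptake bound used to constrain a genome-scale model:
#   v = v_max * S / (Ks + S)
# Parameter values are illustrative, not measured for P. putida.
def uptake(S, v_max, Ks):
    return v_max * S / (Ks + S)

S = 0.05          # mmol/L, low (growth-limiting) substrate concentration
v_max = 10.0      # mmol/gDW/h, maximum specific uptake rate
Ks_values = [0.001, 0.01, 0.1, 1.0]   # four plausible affinity constants

fluxes = [uptake(S, v_max, Ks) for Ks in Ks_values]
spread = max(fluxes) / min(fluxes)    # >10x spread in the uptake bound
```

    Since simulated growth scales with the uptake flux it is constrained by, an order-of-magnitude uncertainty in Ks at trace concentrations translates directly into large variability in predicted growth, which is the core finding of the study.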

  4. OpenDA Open Source Generic Data Assimilation Environment and its Application in Process Models

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Verlaan, Martin; Hummel, Stef; Weerts, Albrecht; Dhondia, Juzer

    2010-05-01

    Data assimilation techniques are essential elements in the state-of-the-art development of models and their optimization with data in the fields of groundwater, surface water, and soil systems. They are essential tools in the calibration of complex modelling systems and the improvement of model forecasts. OpenDA is a new and generic open-source data assimilation environment for application to a choice of physical process models, applied to case-dependent domains. OpenDA was introduced recently when the developers of COSTA, an open-source TU Delft project [http://www.costapse.org; Van Velzen and Verlaan, 2007], and those of DATools from the former WL|Delft Hydraulics [El Serafy et al. 2007; Weerts et al. 2009] decided to join forces. OpenDA makes use of a set of interfaces that describe the interaction between models, observations, and data assimilation algorithms. It focuses on flexible applications in portable systems for modelling geophysical processes. It provides a generic interfacing protocol that allows combination of the implemented data assimilation techniques with, in principle, any time-stepping model describing a process (atmospheric processes, 3D circulation, 2D water level, sea surface temperature, soil systems, groundwater, etc.). Presently, OpenDA features filtering techniques and calibration techniques. The presentation will give an overview of OpenDA and the results of some of its practical applications. References: El Serafy, G.Y., H. Gerritsen, S. Hummel, A.H. Weerts, A.E. Mynett and M. Tanaka (2007), Application of data assimilation in portable operational forecasting systems—the DATools assimilation environment, Journal of Ocean Dynamics, DOI 10.1007/s10236-007-0124-3, pp. 485-499. Van Velzen and Verlaan (2007), COSTA a problem solving environment for data assimilation applied for hydrodynamical modelling, Meteorologische Zeitschrift, Volume 16, Number 6, pp. 777-793. Weerts, A.H., G.Y.H. El Serafy, S. Hummel, J. Dhondia, and H. Gerritsen (2009), Application of generic data assimilation tools (DATools) for flood forecasting purposes, accepted by Computers & Geosciences.
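
    The filtering techniques such an environment provides can be illustrated with a minimal scalar Kalman filter applied to a toy AR(1) "process model". This is a self-contained sketch of the assimilation idea only, not OpenDA's actual interface or algorithms.

```python
import random

# Toy twin experiment: a true AR(1) state is observed with noise, and a
# Kalman filter blends each model forecast with the observation. A
# forecast-only run with no assimilation serves as the baseline.
random.seed(42)
a, q, r = 0.95, 0.1, 0.5    # dynamics, model-noise var, obs-noise var

truth, x_assim, p = 1.0, 0.0, 1.0   # true state, analysis mean, analysis var
x_free = 0.0                         # forecast-only run, never corrected
err_assim = err_free = 0.0
for _ in range(500):
    truth = a * truth + random.gauss(0, q ** 0.5)
    obs = truth + random.gauss(0, r ** 0.5)
    # forecast step: propagate mean and variance through the model
    x_assim, p = a * x_assim, a * a * p + q
    x_free = a * x_free
    # analysis step: weight forecast vs. observation by their variances
    k = p / (p + r)                 # Kalman gain
    x_assim += k * (obs - x_assim)
    p *= (1 - k)
    err_assim += (x_assim - truth) ** 2
    err_free += (x_free - truth) ** 2
```

    The assimilating run tracks the truth far more closely than the free run, which is the basic payoff that motivates coupling filters to process models through a generic interface.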

  5. Estimation of environment-related properties of chemicals for design of sustainable processes: development of group-contribution+ (GC+) property models and uncertainty analysis.

    PubMed

    Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul

    2012-11-26

    The aim of this work is to develop a group-contribution+ (GC+) method (a combined group-contribution (GC) and atom connectivity index (CI) method) for building property models that provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of the estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine the parameters of the property models and an uncertainty analysis step to establish statistical information about the quality of the parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values of a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.), taken from the database of the US Environmental Protection Agency (EPA) and from the USEtox database, were used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and the atom connectivity index method have been considered. In total, 22 environment-related properties have been modeled and analyzed, including the fathead minnow 96-h LC50, Daphnia magna 48-h LC50, oral rat LD50, aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, and emissions (carcinogenic and noncarcinogenic) to urban air, continental rural air, continental fresh water, continental seawater, continental natural soil, and continental agricultural soil. The application of the developed property models for the estimation of environment-related properties, and of the uncertainties of the estimated property values, is highlighted through an illustrative example. The developed property models provide reliable estimates of the environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes, and allow one to evaluate the effect of uncertainties in estimated property values on the calculated performance of processes, giving useful insights into the quality and reliability of the design of sustainable processes.
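
    The backbone of any GC method is a linear sum of group contributions, sketched below with invented groups and contribution values; the actual parameters are regressed from the EPA and USEtox data sets along with their covariance and confidence intervals.

```python
# Toy group-contribution estimate: a (transformed) property is modeled
# as a universal constant plus occurrence counts times contributions,
#   f(P) = c0 + sum_i n_i * C_i
# The groups and all numbers below are invented for illustration only.
contributions = {"CH3": 0.35, "CH2": 0.22, "OH": -0.80}   # hypothetical C_i
c0 = 1.10                                                 # hypothetical constant

def estimate(groups):
    """Sum contributions over the molecule's group occurrence counts."""
    return c0 + sum(n * contributions[g] for g, n in groups.items())

# e.g. a 1-alcohol sketched as CH3-(CH2)3-OH
prop = estimate({"CH3": 1, "CH2": 3, "OH": 1})
```

    In the full method the fitted contribution values come with standard errors, so each estimate like `prop` carries a confidence interval rather than being a bare point value.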

  6. An Instructional Design Framework for Fostering Student Engagement in Online Learning Environments

    ERIC Educational Resources Information Center

    Czerkawski, Betul C.; Lyman, Eugene W.

    2016-01-01

    Many approaches, models and frameworks exist when designing quality online learning environments. These approaches assist and guide instructional designers through the process of analysis, design, development, implementation and evaluation of instructional processes. Some of these frameworks are concerned with student participation, some with…

  7. Automatic mathematical modeling for real time simulation system

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1988-01-01

    A methodology for automatic mathematical modeling and simulation-model generation is described. The models are verified by running in a test environment using standard profiles, with the results compared against known results. The major objective is to create a user-friendly environment in which engineers can design, maintain, and verify their models, and to automatically convert the mathematical model into conventional code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine Simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp machine. The program provides a very friendly and well-organized environment in which engineers can build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine Simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. The system is then able to automatically generate the model and FORTRAN code. A future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
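
    The "mathematical model to FORTRAN" step can be sketched with a tiny expression-tree emitter. This toy stands in for the LISP/MACSYMA machinery described above, and the rate equation is made up purely for illustration.

```python
# Toy sketch of automatic code generation: a symbolic expression, held
# as nested (operator, left, right) tuples, is emitted as a FORTRAN
# assignment. The rate equation below is hypothetical.

def to_fortran(expr):
    if not isinstance(expr, tuple):
        return str(expr)           # leaf: a variable name or constant
    op, lhs, rhs = expr
    if op == "pow":
        return f"({to_fortran(lhs)}**{to_fortran(rhs)})"
    return f"({to_fortran(lhs)} {op} {to_fortran(rhs)})"

# dP/dt = K * (PIN - P): first-order relaxation toward an inlet value
rate = ("*", "K", ("-", "PIN", "P"))
statement = "      DPDT = " + to_fortran(rate)  # fixed-form 6-column indent
```

    A real generator would also emit the surrounding SUBROUTINE, declarations, and the time-stepping loop, but the essential idea is the same: the symbolic model is the single source of truth, and the conventional code is derived from it mechanically.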

  8. A decision-making model based on a spiking neural circuit and synaptic plasticity.

    PubMed

    Wei, Hui; Bu, Yijie; Dai, Dawei

    2017-10-01

    To adapt to the environment and survive, most animals can control their behaviors by making decisions. The process of decision-making and responding according to cues in the environment is stable, sustainable, and learnable. Understanding how behaviors are regulated by neural circuits, and the encoding and decoding mechanisms from stimuli to responses, are important goals in neuroscience. From results observed in Drosophila experiments, the underlying decision-making process is discussed, and a neural circuit that implements a two-choice decision-making model is proposed to explain and reproduce the observations. Compared with previous two-choice decision-making models, our model uses synaptic plasticity to explain changes in decision output given the same environment. Moreover, the biological meanings of the parameters of our decision-making model are discussed. In this paper, we explain at the micro-level (i.e., neurons and synapses) how observable decision-making behavior at the macro-level is acquired and achieved.

  9. Dynamical nexus of water supply, hydropower and environment based on the modeling of multiple socio-natural processes: from socio-hydrological perspective

    NASA Astrophysics Data System (ADS)

    Liu, D.; Wei, X.; Li, H. Y.; Lin, M.; Tian, F.; Huang, Q.

    2017-12-01

    In a socio-hydrological system, the ecological functions and environmental services that society chooses to maintain are determined by societal preferences, which reflect trade-offs among the values of riparian vegetation, fish, river landscape, water supply, hydropower, navigation, and so on. As society develops, these preferences change, and with them the ecological functions and environmental services that are maintained. The aim of the study is to reveal the feedback relationships of water supply, hydropower, and environment and the dynamical feedback mechanisms at the macro scale, and to establish a socio-hydrological evolution model of the watershed based on the modeling of multiple socio-natural processes. The study focuses on the Han River in China, analyzing the impact of water supply and hydropower on ecology, hydrology, and other environmental elements, and studying the effect on water supply and hydropower of ensuring ecological and environmental water at different levels. Water supply and ecology are usually competitive. In some reservoirs hydropower and ecology are synergistic, while in others they are competitive. The study will analyze the multiple mechanisms that implement the dynamical feedbacks of the environment on hydropower, set up quantitative descriptions of the feedback mechanisms, recognize the dominant processes in the feedback relationships of hydropower and environment, and then analyze the positive and negative feedbacks in the feedback networks. The socio-hydrological evolution model at the watershed scale will be built and applied to simulate the long-term evolution processes of the watershed under the current situation. The dynamical nexus of water supply, hydropower, and environment will be investigated.

  10. CLEW: A Cooperative Learning Environment for the Web.

    ERIC Educational Resources Information Center

    Ribeiro, Marcelo Blois; Noya, Ricardo Choren; Fuks, Hugo

    This paper outlines CLEW (collaborative learning environment for the Web). The project combines MUD (Multi-User Dimension), workflow, VRML (Virtual Reality Modeling Language) and educational concepts like constructivism in a learning environment where students actively participate in the learning process. The MUD shapes the environment structure.…

  11. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models.
The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.« less

  12. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Sonnenthal; N. Spycher

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are required to fully document and address the requirements of the TWPs.

  13. .NET INTEROPERABILITY GUIDELINES

    EPA Science Inventory

    The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...

  14. Aspects of the BPRIM Language for Risk Driven Process Engineering

    NASA Astrophysics Data System (ADS)

    Sienou, Amadou; Lamine, Elyes; Pingaud, Hervé; Karduck, Achim

    Nowadays organizations are exposed to frequent changes in the business environment, requiring continuous alignment of business processes with business strategies. This agility calls for the methods promoted in enterprise engineering approaches. Risk consideration in enterprise engineering is becoming more important as the business environment grows more competitive and unpredictable. Business processes are subject to the same quality requirements as material and human resources. Thus, process management is expected to tackle not only the challenges of value creation but also those of value preservation. Our research considers risk-driven business process design as an integral part of enterprise engineering. A graphical modelling language for risk-driven business process engineering was introduced in earlier research. This paper extends the language and addresses questions related to modelling risk in an organisational context.

  15. A validated agent-based model to study the spatial and temporal heterogeneities of malaria incidence in the rainforest environment.

    PubMed

    Pizzitutti, Francesco; Pan, William; Barbieri, Alisson; Miranda, J Jaime; Feingold, Beth; Guedes, Gilvan R; Alarcon-Valenzuela, Javiera; Mena, Carlos F

    2015-12-22

    The Amazon environment has been exposed in recent decades to radical changes that have been accompanied by a remarkable rise of both Plasmodium falciparum and Plasmodium vivax malaria. The malaria transmission process is highly influenced by factors such as spatial and temporal heterogeneities of the environment and individual-based characteristics of mosquito and human populations. All these determinant factors can be simulated effectively through agent-based models. This paper presents a validated agent-based model of local-scale malaria transmission. The model reproduces the environment of a typical riverine village in the northern Peruvian Amazon, where malaria transmission is highly seasonal and apparently associated with the flooding of large areas by the neighbouring river. Agents representing humans, mosquitoes and the two species of Plasmodium (P. falciparum and P. vivax) are simulated in a spatially explicit representation of the environment around the village. The model environment includes climate, the positions of houses and elevation. A representation of changes in the extent of mosquito breeding areas caused by river flooding is also included in the simulation environment. A calibration process was carried out to reproduce the variations in monthly malaria incidence over a period of 3 years. The calibrated model is also able to reproduce the spatial heterogeneities of local-scale malaria transmission. A "what if" eradication strategy scenario is proposed: if the mosquito breeding sites are eliminated through larval habitat management in a buffer area extending at least 200 m around the village, malaria transmission is eradicated from the village. Agent-based models can effectively reproduce the spatiotemporal variations of malaria transmission in a low-endemicity environment dominated by river flooding, as in the Amazon.
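The transmission mechanism this record describes can be reduced to a minimal agent-based loop in which each agent's monthly infection risk is scaled by a seasonal flooding factor. A sketch under stated assumptions: the wet-season months, infection and recovery probabilities below are illustrative placeholders, not the paper's calibrated inputs.

```python
import random

def simulate(months=36, n_humans=200, seed=1):
    """Toy agent-based malaria sketch: monthly infection risk scales with a
    seasonal 'flooded breeding area' factor (illustrative values only)."""
    rng = random.Random(seed)
    infected = [False] * n_humans
    incidence = []                                  # new cases per month
    for t in range(months):
        wet = (t % 12) in (0, 1, 2, 3)              # assumed wet season
        season = 1.0 if wet else 0.6                # breeding-area scaling
        p_infect = 0.05 * season                    # per-month infection risk
        p_recover = 0.3                             # per-month recovery chance
        new_cases = 0
        for i in range(n_humans):
            if infected[i]:
                if rng.random() < p_recover:
                    infected[i] = False
            elif rng.random() < p_infect:
                infected[i] = True
                new_cases += 1
        incidence.append(new_cases)
    return incidence
```

Even at this toy scale, the seeded run shows the seasonal clustering of incidence that the calibrated model reproduces against three years of field data.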

  16. Learning Environment, Learning Process, Academic Outcomes and Career Success of University Graduates

    ERIC Educational Resources Information Center

    Vermeulen, Lyanda; Schmidt, Henk G.

    2008-01-01

    This study expands on literature covering models on educational productivity, student integration and effectiveness of instruction. An expansion of the literature concerning the impact of higher education on workplace performance is also covered. Relationships were examined between the quality of the academic learning environment, the process of…

  17. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, the system development framework will take advantage of an integrated operating environment to effectively automate management of the software development process, so that costly mistakes during the development phase can be eliminated.

  18. An Empirical Verification of a-priori Learning Models on Mailing Archives in the Context of Online Learning Activities of Participants in Free\\Libre Open Source Software (FLOSS) Communities

    ERIC Educational Resources Information Center

    Mukala, Patrick; Cerone, Antonio; Turini, Franco

    2017-01-01

    Free\\Libre Open Source Software (FLOSS) environments are increasingly dubbed learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…

  19. Modeling socio-cultural processes in network-centric environments

    NASA Astrophysics Data System (ADS)

    Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh

    2012-05-01

    The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, with simplifications made for the human-in-the-loop. However, the human element has a large impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops and uncertainty in the modeling data. We propose an overarching framework to represent, model and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network that can represent incomplete and uncertain socio-cultural information. We leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer, such as node mobility and transmission parameters. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.
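The intent-inference step described above can be illustrated with a toy probabilistic chain computed by exhaustive enumeration. This is not the BKB or CISN implementation: the chain temperament → intent → observed behavior and every probability below are invented for illustration.

```python
# Toy network: P(temperament), P(intent | temperament), P(behavior | intent).
# All states and probabilities are illustrative assumptions.
p_t = {"calm": 0.7, "volatile": 0.3}
p_i_given_t = {"calm":     {"cooperate": 0.8, "defect": 0.2},
               "volatile": {"cooperate": 0.3, "defect": 0.7}}
p_b_given_i = {"cooperate": {"share_info": 0.9, "withhold": 0.1},
               "defect":    {"share_info": 0.2, "withhold": 0.8}}

def posterior_intent(behavior):
    """P(intent | observed behavior), marginalizing temperament by enumeration."""
    joint = {}
    for intent in ("cooperate", "defect"):
        joint[intent] = sum(p_t[t] * p_i_given_t[t][intent]
                            * p_b_given_i[intent][behavior]
                            for t in p_t)
    z = sum(joint.values())                 # normalizing constant P(behavior)
    return {i: p / z for i, p in joint.items()}
```

Observing "withhold" shifts posterior mass toward the "defect" intent, which is the kind of behavior-to-intent mapping the framework formalizes at scale.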

  20. Monitoring Biogeochemical Processes in Coral Reef Environments with Remote Sensing: A Cross-Disciplinary Approach.

    NASA Astrophysics Data System (ADS)

    Perez, D.; Phinn, S. R.; Roelfsema, C. M.; Shaw, E. C.; Johnston, L.; Iguel, J.; Camacho, R.

    2017-12-01

    Primary production and calcification are important to measure and monitor over time because of their fundamental roles in carbon cycling and the accretion of habitat structure in reef ecosystems. However, monitoring biogeochemical processes in coastal environments has been difficult due to complications in resolving differences in water optical properties arising from biological productivity and other sources (sediment, dissolved organics, etc.). This complicates the application of algorithms developed for satellite image data under open-ocean conditions and requires alternative approaches. This project applied a cross-disciplinary approach, using established methods for monitoring productivity in terrestrial environments on coral reef systems. The availability of regularly acquired high spatial resolution (< 5 m pixels), multispectral satellite imagery has improved mapping and monitoring capabilities for shallow marine environments such as seagrass and coral reefs. There is potential to further develop optical models for remote sensing applications to estimate and monitor reef system processes such as primary productivity and calcification. This project collected field measurements of spectral absorptance and of primary productivity and calcification rates for two reef systems: Heron Reef, southern Great Barrier Reef, and Saipan Lagoon, Commonwealth of the Northern Mariana Islands. Field data were used to parameterize a light-use efficiency (LUE) model, estimating productivity from absorbed photosynthetically active radiation. The LUE model has been successfully applied in terrestrial environments for the past 40 years and could potentially be used in shallow marine environments. The model was used in combination with a map of benthic community composition produced from object-based image analysis of WorldView-2 imagery. Light-use efficiency was measured for the functional groups coral, algae, seagrass, and sediment. However, LUE was overestimated for sediment, which led to overestimation of productivity for the mapped area. This was due to differences in the spatial and temporal resolution of the field data used in the model. The limitations and application of the LUE model to coral reef environments will be presented.
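The LUE approach the record describes reduces to GPP = ε × APAR, where APAR is the product of spectral absorptance and incident PAR. A minimal per-pixel sketch follows; the ε and absorptance values per functional group are placeholders, not the project's field measurements.

```python
# Illustrative light-use-efficiency productivity estimate:
#   GPP = epsilon * APAR,  APAR = absorptance * incident PAR.
# Per-group values below are assumed, not measured.
epsilon = {"coral": 0.020, "algae": 0.030,
           "seagrass": 0.025, "sediment": 0.005}   # g C per mol photons
absorptance = {"coral": 0.6, "algae": 0.7,
               "seagrass": 0.65, "sediment": 0.2}  # fraction of PAR absorbed

def gpp_per_pixel(cover_class, par_mol_m2_d):
    """Daily gross primary production (g C m^-2 d^-1) for one mapped pixel."""
    apar = absorptance[cover_class] * par_mol_m2_d
    return epsilon[cover_class] * apar

def gpp_map(class_map, par_mol_m2_d):
    """Apply the per-pixel estimate over a benthic community class map."""
    return [[gpp_per_pixel(c, par_mol_m2_d) for c in row] for row in class_map]
```

The record's sediment overestimate corresponds to the ε entry for "sediment" being too high relative to its true light use, which propagates directly into every sediment pixel of the mapped product.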

  1. Cardea: Providing Support for Dynamic Resource Access in a Distributed Computing Environment

    NASA Technical Reports Server (NTRS)

    Lepro, Rebekah

    2003-01-01

    The environment framing the modern authorization process spans domains of administration, relies on many different authentication sources, and manages complex attributes as part of the authorization process. Cardea facilitates dynamic access control within this environment as a central function of an interoperable authorization framework. The system departs from the traditional authorization model by separating the authentication and authorization processes, distributing the responsibility for authorization data and allowing collaborating domains to retain control over their implementation mechanisms. Critical features of the system architecture and its handling of the authorization process differentiate the system from existing authorization components by addressing common needs not adequately addressed by existing systems. Continuing system research seeks to enhance the implementation of the current authorization model employed in Cardea, increase the robustness of current features, further the framework for establishing trust and promote interoperability with existing security mechanisms.

  2. Mapping care processes within a hospital: from theory to a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2005-03-01

    Today, the economic and regulatory environment, involving activity-based and prospective payment systems, healthcare quality and risk analysis, traceability of the acts performed and evaluation of care practices, accounts for the current interest in clinical and hospital information systems. The structured gathering of information relative to users' needs and system requirements is fundamental when installing such systems. This stage takes time and is generally misconstrued by caregivers and is of limited efficacy to analysts. We used a modelling technique designed for manufacturing processes (IDEF0/SADT). We enhanced the basic model of an activity with descriptors extracted from the Ishikawa cause-and-effect diagram (methods, men, materials, machines, and environment). We proposed an object data model of a process and its components, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary of a given process from the description of its elements and to locate documents (procedures, recommendations, instructions) according to each activity or role. Aimed at structuring needs and storing information provided by directly involved teams regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make in the analysis of clinical information systems.
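The object data model described in this record, an activity enhanced with the five Ishikawa descriptor groups and aggregated into a process, might be sketched as below. Class and field names are illustrative, not the authors' actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """One IDEF0/SADT activity carrying Ishikawa descriptor groups."""
    name: str
    methods: list = field(default_factory=list)      # procedures, instructions
    people: list = field(default_factory=list)       # roles involved ("men")
    materials: list = field(default_factory=list)    # inputs consumed
    machines: list = field(default_factory=list)     # devices, software
    environment: list = field(default_factory=list)  # context constraints

@dataclass
class Process:
    """A care process as an ordered collection of described activities."""
    name: str
    activities: list = field(default_factory=list)

    def data_dictionary(self):
        """Extract every descriptor item, keyed by activity name."""
        return {a.name: a.methods + a.people + a.materials
                        + a.machines + a.environment
                for a in self.activities}

    def documents_for_role(self, role):
        """Locate procedure documents attached to activities a role performs."""
        return [doc for a in self.activities
                if role in a.people for doc in a.methods]
```

The two query methods mirror the two uses the abstract names: extracting the data dictionary of a process and locating documents (procedures, instructions) by activity or role.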

  3. Neutron Environment Calculations for Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Clowdsley, M. S.; Wilson, J. W.; Shinn, J. L.; Badavi, F. F.; Heinbockel, J. H.; Atwell, W.

    2001-01-01

    The long term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind, which varies over the solar cycle. The HZETRN high charge and energy transport code developed at NASA Langley Research Center can be used to evaluate the neutron environment on ISS. A time dependent model for the ambient environment in low earth orbit is used. This model includes GCR radiation moderated by the Earth's magnetic field, trapped protons, and a recently completed model of the albedo neutron environment formed through the interaction of galactic cosmic rays with the Earth's atmosphere. Using this code, the neutron environments for space shuttle missions were calculated and comparisons were made to measurements by the Johnson Space Center with onboard detectors. The models discussed herein are being developed to evaluate the natural and induced environment data for the Intelligence Synthesis Environment Project and eventual use in spacecraft optimization.

  4. Study on intelligent processing system of man-machine interactive garment frame model

    NASA Astrophysics Data System (ADS)

    Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian

    2018-05-01

    A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical moving parts and a centralized data acquisition device. The sensor devices collect information on environmental changes caused by a person approaching the garment frame model; the data acquisition device gathers the information sensed by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve human-machine interaction; and the mechanical moving parts make the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices have a one-way connection to the data acquisition device, the data acquisition device has a two-way connection with the voice processing module, and the data acquisition device has a one-way connection to the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with the customer, realizing man-machine interaction in place of the current rigid frame model.

  5. Rocks in the River: The Challenge of Piloting the Inquiry Process in Today's Learning Environment

    ERIC Educational Resources Information Center

    Lambusta, Patrice; Graham, Sandy; Letteri-Walker, Barbara

    2014-01-01

    School librarians in Newport News, Virginia, are meeting the challenges of integrating an Inquiry Process Model into instruction. In the original model the process began by asking students to develop questions to start their inquiry journey. As this model was taught it was discovered that students often did not have enough background knowledge to…

  6. System Level Uncertainty Assessment for Collaborative RLV Design

    NASA Technical Reports Server (NTRS)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources by both the government and industry, strategic decision makers need more than just traditional point designs, they need to be aware of the likelihood of these future designs to meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system level output metrics of interest for a RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.
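The probabilistic step described above can be sketched by sampling input uncertainty distributions and propagating them through a simple sizing relation to a system-level metric. The distributions, the rocket-equation closure and the 100 t gross-mass target below are illustrative assumptions, not ACRE-92 data or the ModelCenter toolchain.

```python
import math
import random

def monte_carlo_mass(n=10000, seed=42):
    """Propagate input uncertainty to a vehicle-level metric via Monte Carlo.
    All numbers are illustrative placeholders, not ACRE-92 values."""
    rng = random.Random(seed)
    dv = 9200.0                    # required delta-v, m/s (assumed)
    payload = 9000.0               # payload mass, kg (assumed)
    results = []
    for _ in range(n):
        isp = rng.gauss(455.0, 10.0)          # engine specific impulse, s
        dry_frac = rng.gauss(0.10, 0.01)      # structural mass fraction
        mr = math.exp(dv / (isp * 9.80665))   # rocket-equation mass ratio
        gross = payload * mr * (1.0 + dry_frac)  # toy vehicle closure, kg
        results.append(gross)
    p_success = sum(g < 100000.0 for g in results) / n  # P(gross < 100 t)
    return results, p_success
```

The returned distribution, rather than a single point design, is what lets a decision maker read off the likelihood that the concept meets its mass target, which is the kind of system-level output metric the paper's PDA process generates.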

  7. A Phenomena-Oriented Environment for Teaching Process Modeling: Novel Modeling Software and Its Use in Problem Solving.

    ERIC Educational Resources Information Center

    Foss, Alan S.; Geurts, Kevin R.; Goodeve, Peter J.; Dahm, Kevin D.; Stephanopoulos, George; Bieszczad, Jerry; Koulouris, Alexandros

    1999-01-01

    Discusses a program that offers students a phenomenon-oriented environment expressed in the fundamental concepts and language of chemical engineering such as mass and energy balancing, phase equilibria, reaction stoichiometry and rate, modes of heat, and species transport. (CCM)

  8. Landlab: A numerical modeling framework for evolving Earth surfaces from mountains to the coast

    NASA Astrophysics Data System (ADS)

    Gasparini, N. M.; Adams, J. M.; Tucker, G. E.; Hobley, D. E. J.; Hutton, E.; Istanbulluoglu, E.; Nudurupati, S. S.

    2016-02-01

    Landlab is an open-source, user-friendly, component-based modeling framework for exploring the evolution of Earth's surface. Landlab itself is not a model. Instead, it is a computational framework that facilitates the development of numerical models of coupled Earth surface processes. The Landlab Python library includes a gridding engine and process components, along with support functions for tasks such as reading in DEM data and input variables, setting boundary conditions, and plotting and outputting data. Each user of Landlab builds his or her own unique model. The first step in building a Landlab model is generally initializing a grid, either regular (raster) or irregular (e.g. Delaunay or radial), and process components. This initialization process involves reading in relevant parameter values and data. The process components act on the grid to alter grid properties over time. For example, a component exists that can track the growth, death, and succession of vegetation over time. There are also several components that evolve surface elevation, through processes such as fluvial sediment transport and linear diffusion, among others. Users can also build their own process components, taking advantage of existing functions in Landlab such as those that identify grid connectivity and calculate gradients and flux divergence. The general nature of the framework makes it applicable to diverse environments - from bedrock rivers to a pile of sand - and processes acting over a range of spatial and temporal scales. In this poster we illustrate how a user builds a model using Landlab and propose a number of ways in which Landlab can be applied in coastal environments - from dune migration to channelization of barrier islands. We seek input from the coastal community as to how the process component library can be expanded to explore the diverse phenomena that act to shape coastal environments.
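The gradient/flux-divergence pattern that a Landlab process component applies can be illustrated with a plain-Python linear hillslope diffuser on a 1-D elevation profile. This is a conceptual sketch of what such a component does, not Landlab's actual grid or component API.

```python
def linear_diffuse(z, dx, k, dt, steps):
    """Explicit linear diffusion dz/dt = k * d2z/dx2 on a 1-D elevation
    profile with fixed-elevation boundary nodes (Landlab-style gradient
    then flux-divergence update; conceptual sketch, not the real API)."""
    z = list(z)
    assert k * dt / dx**2 <= 0.5, "explicit-scheme stability limit"
    for _ in range(steps):
        # sediment flux on each 'link' between adjacent nodes: q = -k dz/dx
        qs = [-k * (z[i + 1] - z[i]) / dx for i in range(len(z) - 1)]
        # update interior nodes from flux divergence; boundaries stay fixed
        for i in range(1, len(z) - 1):
            z[i] -= dt * (qs[i] - qs[i - 1]) / dx
    return z
```

In Landlab the same two steps (compute gradients on links, take the flux divergence back onto nodes) are provided as grid methods, so a user-written component reduces to roughly this loop body.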

  9. Analysing Students' Shared Activity while Modeling a Biological Process in a Computer-Supported Educational Environment

    ERIC Educational Resources Information Center

    Ergazaki, M.; Zogza, V.; Komis, V.

    2007-01-01

    This paper reports on a case study with three dyads of high school students (age 14 years) each collaborating on a plant growth modeling task in the computer-supported educational environment "ModelsCreator". Following a qualitative line of research, the present study aims at highlighting the ways in which the collaborating students as well as the…

  10. Technique for experimental determination of radiation interchange factors in solar wavelengths

    NASA Technical Reports Server (NTRS)

    Bobco, R. P.; Nolte, L. J.; Wensley, J. R.

    1971-01-01

    Process obtains solar heating data which support analytical design. Process yields quantitative information on local solar exposure of models which are geometrically and reflectively similar to prototypes under study. Models are tested in a shirtsleeve environment.

  11. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part I. Model validation

    USDA-ARS?s Scientific Manuscript database

    Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...

  12. A spacecraft's own ambient environment: The role of simulation-based research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ketsdever, Andrew D.; Gimelshein, Sergey

    2014-12-09

    Spacecraft contamination has long been a subject of study in the rarefied gas dynamics community. Professor Mikhail Ivanov coined the term a spacecraft's 'own ambient environment' to describe the effects of natural and satellite-driven processes on the conditions encountered by a spacecraft in orbit. Outgassing, thruster firings, and gas and liquid dumps all contribute to the spacecraft's contamination environment. Rarefied gas dynamic modeling techniques, such as Direct Simulation Monte Carlo, are well suited to investigate these space-based environments. However, many advances were necessary to fully characterize the extent of this problem. A better understanding of modeling flows over large pressure ranges, for example hybrid continuum and rarefied numerical schemes, was required. Two-phase flow modeling under rarefied conditions was necessary. And the ability to model plasma flows for a new era of propulsion systems was also required. Through the work of Professor Ivanov and his team, we now have a better understanding of the processes that create a spacecraft's own ambient environment and are able to better characterize these environments. Advances in numerical simulation have also spurred the development of experimental facilities to study these effects. The relationship between numerical results and experimental advances will be explored in this manuscript.

  13. Computer model for the cardiovascular system: development of an e-learning tool for teaching of medical students.

    PubMed

    Warriner, David Roy; Bayley, Martin; Shi, Yubing; Lawford, Patricia Victoria; Narracott, Andrew; Fenner, John

    2017-11-21

    This study combined themes in cardiovascular modelling, clinical cardiology and e-learning to create an on-line environment that would assist undergraduate medical students in understanding key physiological and pathophysiological processes in the cardiovascular system. An interactive on-line environment was developed incorporating a lumped-parameter mathematical model of the human cardiovascular system. The model outputs were used to characterise the progression of key disease processes and allowed students to classify disease severity, with the aim of improving their understanding of abnormal physiology in a clinical context. Access to the on-line environment was offered to students at all stages of undergraduate training as an adjunct to routine lectures and tutorials in cardiac pathophysiology. Student feedback on this novel on-line material was collected in the course of routine audits of teaching delivery. Medical students, irrespective of their stage of undergraduate training, reported that they found the models and the environment interesting and a positive experience. After exposure to the environment, there was a statistically significant improvement in student performance on a series of 6 questions based on cardiovascular medicine, with a 33% and 22% increase in the number of questions answered correctly, p < 0.0001 and p < 0.001 respectively. Considerable improvement was found in students' knowledge and understanding during assessment after exposure to the e-learning environment. Opportunities exist for development of similar environments in other fields of medicine, refinement of the existing environment and further engagement with student cohorts. This work combines some exciting and developing fields in medical education, but routine adoption of these types of tool will be possible only with the engagement of all stakeholders, from educationalists, clinicians and modellers to, most importantly, medical students.
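A lumped-parameter model of the kind this record describes can be as small as a two-element Windkessel, C dP/dt = Q(t) − P/R, integrated with forward Euler. The parameter values and the sinusoidal systolic inflow below are illustrative textbook-style numbers, not the model used in the study.

```python
import math

def windkessel(steps_per_beat=1000, beats=10, R=1.0, C=1.5, hr=60):
    """Two-element Windkessel sketch: C dP/dt = Q(t) - P/R, forward Euler.
    Units mmHg and mL/s; all values are illustrative assumptions."""
    T = 60.0 / hr                    # beat period, s
    dt = T / steps_per_beat
    P = 80.0                         # initial arterial pressure, mmHg
    trace = []
    for _ in range(beats * steps_per_beat):
        t = len(trace) * dt
        phase = (t % T) / T
        # half-sine inflow during the first 35% of the beat (systole)
        Q = 300.0 * math.sin(math.pi * phase / 0.35) if phase < 0.35 else 0.0
        P += dt * (Q - P / R) / C    # Euler step of the Windkessel ODE
        trace.append(P)
    return trace
```

Varying R and C in such a model reproduces pressure-waveform changes characteristic of disease progression, which is the classification exercise the on-line environment presents to students.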

  14. Brief Report: Preliminary Proposal of a Conceptual Model of a Digital Environment for Developing Mathematical Reasoning in Students with Autism Spectrum Disorders.

    PubMed

    Santos, Maria Isabel; Breda, Ana; Almeida, Ana Margarida

    2015-08-01

    There is clear evidence that in typically developing children reasoning and sense-making are essential in all mathematical learning and understanding processes. In children with autism spectrum disorders (ASD), however, these become much more significant, considering their importance to successful independent living. This paper presents a preliminary proposal of a digital environment, specifically targeted to promote the development of mathematical reasoning in students with ASD. Given the diversity of ASD, the prototyping of this environment requires the study of dynamic adaptation processes and the development of activities adjusted to each user's profile. We present the results obtained during the first phase of this ongoing research, describing a conceptual model of the proposed digital environment. Guidelines for future research are also discussed.

  15. [Epistemic injustice during the medical education process in the hospital context].

    PubMed

    Consejo-Y Chapela, Carolina; Viesca-Treviño, Carlos Alfonso

    2017-01-01

    The educational model adopted by the Universidad Nacional Autónoma de México (UNAM) Faculty of Medicine is constructivist; it is a model based on competence development. It aims to provide learning environments that incorporate real activities: it helps students to develop social negotiation skills as part of their integral learning; it encourages them to take a critical and reflexive approach; and it is a student-centered model. However, many challenges arise when this model is implemented in hospital environments. Therefore, our aim was to analyse the hospital as a hermeneutical community and as a scenario of power relations, contrary to the constructivist model. In the analysis of a conflict between the chief of a medical department and an undergraduate medical intern, we use Miranda Fricker's categories of discriminatory epistemic injustice and testimonial injustice, as well as Foucault's concepts of power relations and knowledge. The program implementation takes place in a context of power relations and disciplinary methods that can affect the training process of students whose educational background belongs to the constructivist model. This is due in part to informal normative structures hidden in the process of medical knowledge construction in the hospital setting. Practices of discriminatory epistemic injustice in the hospital environment increase the vulnerability of medical students during their education process.

  16. Knowledge environments representing molecular entities for the virtual physiological human.

    PubMed

    Hofmann-Apitius, Martin; Fluck, Juliane; Furlong, Laura; Fornes, Oriol; Kolárik, Corinna; Hanser, Susanne; Boeker, Martin; Schulz, Stefan; Sanz, Ferran; Klinger, Roman; Mevissen, Theo; Gattermayer, Tobias; Oliva, Baldo; Friedrich, Christoph M

    2008-09-13

    In essence, the virtual physiological human (VPH) is a multiscale representation of human physiology spanning from the molecular level via cellular processes and multicellular organization of tissues to complex organ function. The different scales of the VPH deal with different entities, relationships and processes, and in consequence the models used to describe and simulate biological functions vary significantly. Here, we describe methods and strategies to generate knowledge environments representing molecular entities that can be used for modelling the molecular scale of the VPH. Our strategy to generate knowledge environments representing molecular entities is based on the combination of information extraction from scientific text and the integration of information from biomolecular databases. We introduce @neuLink, a first prototype of an automatically generated, disease-specific knowledge environment combining biomolecular, chemical, genetic and medical information. Finally, we provide a perspective for the future implementation and use of knowledge environments representing molecular entities for the VPH.

  17. Computational characterization of fracture healing under reduced gravity loading conditions.

    PubMed

    Gadomski, Benjamin C; Lerner, Zachary F; Browning, Raymond C; Easley, Jeremiah T; Palmer, Ross H; Puttlitz, Christian M

    2016-07-01

    The literature is deficient with regard to how the localized mechanical environment of skeletal tissue is altered during reduced gravitational loading and how these alterations affect fracture healing. Thus, a finite element model of the ovine hindlimb was created to characterize the local mechanical environment responsible for the inhibited fracture healing observed under experimental simulated hypogravity conditions. Following convergence and verification studies, hydrostatic pressure and strain within a diaphyseal fracture of the metatarsus were evaluated for models under both 1 and 0.25 g loading environments and compared to results of a related in vivo study. Results of the study suggest that reductions in hydrostatic pressure and strain of the healing fracture for animals exposed to reduced gravitational loading conditions contributed to an inhibited healing process, with animals exposed to the simulated hypogravity environment subsequently initiating an intramembranous bone formation process rather than the typical endochondral ossification healing process experienced by animals healing in a 1 g gravitational environment. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 34:1206-1215, 2016.

  18. A sampling model of social judgment.

    PubMed

    Galesic, Mirta; Olsson, Henrik; Rieskamp, Jörg

    2018-04-01

    Studies of social judgments have demonstrated a number of diverse phenomena that have so far been difficult to explain within a single theoretical framework. Prominent examples are false consensus and false uniqueness, as well as self-enhancement and self-depreciation. Here we show that these seemingly complex phenomena can be a product of an interplay between basic cognitive processes and the structure of social and task environments. We propose and test a new process model of social judgment, the social sampling model (SSM), which provides a parsimonious quantitative account of different types of social judgments. In the SSM, judgments about characteristics of broader social environments are based on sampling of social instances from memory, where instances receive activation if they belong to a target reference class and have a particular characteristic. These sampling processes interact with the properties of social and task environments, including homophily, shapes of frequency distributions, and question formats. For example, in line with the model's predictions, we found that whether false consensus or false uniqueness occurs depends on the level of homophily in people's social circles and on the way questions are asked. The model also explains some previously unaccounted-for patterns of self-enhancement and self-depreciation. People seem to be well informed about many characteristics of their immediate social circles, which in turn influence how they evaluate broader social environments and their position within them.
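
    The sampling mechanism described in the abstract can be illustrated with a toy simulation (a sketch under assumed parameters, not the authors' implementation): an ego estimates how common a trait is by sampling members of a homophilous social circle, and homophily biases the estimate toward the ego's own trait, producing a false-consensus-like effect.

    ```python
    import random

    def estimate_share(ego_has_trait, population_share, homophily,
                       circle_size=10, rng=None):
        """Estimate a trait's population share by sampling one's social circle.

        With probability `homophily`, a circle member matches the ego's trait
        (a homophilous tie); otherwise the member is drawn at random from the
        broader population. The parameterization is illustrative, not the
        SSM's exact form.
        """
        rng = rng or random.Random()
        circle = [ego_has_trait if rng.random() < homophily
                  else rng.random() < population_share
                  for _ in range(circle_size)]
        return sum(circle) / len(circle)

    # An ego holding a minority trait (20% of the population) with strongly
    # homophilous ties judges the trait far more common than it really is:
    rng = random.Random(42)
    runs = [estimate_share(True, 0.2, 0.7, rng=rng) for _ in range(1000)]
    print(round(sum(runs) / len(runs), 2))  # well above the true 0.2
    ```

    With homophily 0.7 the expected estimate is 0.7 + 0.3 x 0.2 = 0.76, so the direction and size of the bias follow directly from the sampling structure, which is the kind of parsimonious account the abstract describes.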

  19. Controlled English to facilitate human/machine analytical processing

    NASA Astrophysics Data System (ADS)

    Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien

    2013-06-01

    Controlled English (CE) is a human-readable information representation format implemented using a restricted subset of the English language, but one which is unambiguous and directly accessible to simple machine processes. We have been researching the capabilities of CE in a number of contexts, exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment, especially where the agents are a mixture of human users and machine processes aimed at assisting them. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports the execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the CE language and CE Store environment, and demonstrate the environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.

  20. Characterizing Space Environments with Long-Term Space Plasma Archive Resources

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Miller, J. Scott; Diekmann, Anne M.; Parker, Linda N.

    2009-01-01

    A significant scientific benefit of establishing and maintaining long-term space plasma data archives is the ready access the archives afford to resources required for characterizing spacecraft design environments. Space systems must be capable of operating in the mean environments driven by climatology as well as the extremes that occur during individual space weather events. Long-term time series are necessary to obtain quantitative information on environment variability and extremes that characterize the mean and worst-case environments that may be encountered during a mission. In addition, analyses of large data sets are important to scientific studies of flux-limiting processes that provide a basis for establishing upper limits to environment specifications used in radiation or charging analyses. We present applications using data from existing archives and highlight their contributions to space environment models developed at Marshall Space Flight Center, including the Chandra Radiation Model, ionospheric plasma variability models, and plasma models of the L2 space environment.

  1. GREENSCOPE: A Method for Modeling Chemical Process Sustainability

    EPA Science Inventory

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...

  2. Developing a multi-systemic fall prevention model, incorporating the physical environment, the care process and technology: a systematic review.

    PubMed

    Choi, Young-Seon; Lawler, Erin; Boenecke, Clayton A; Ponatoski, Edward R; Zimring, Craig M

    2011-12-01

    This paper reports a review that assessed the effectiveness and characteristics of fall prevention interventions implemented in hospitals. A multi-systemic fall prevention model that establishes a practical framework was developed from the evidence. Falls occur through complex interactions between patient-related and environmental risk factors, suggesting a need for multifaceted fall prevention approaches that address both factors. We searched Medline, CINAHL, PsycInfo and the Web of Science databases for references published between January 1990 and June 2009 and scrutinized secondary references from acquired papers. Due to the heterogeneity of interventions and populations, we conducted a quantitative systematic review without a meta-analysis and used a narrative summary to report findings. From the review, three distinct characteristics of fall prevention interventions emerged: (1) the physical environment, (2) the care process and culture and (3) technology. While clinically significant evidence shows the efficacy of environment-related interventions in reducing falls and fall-related injuries, the literature identified few hospitals that had introduced environment-related interventions in their multifaceted fall intervention strategies. Using the multi-systemic fall prevention model, hospitals should promote a practical strategy that benefits from the collective effects of the physical environment, the care process and culture and technology to prevent falls and fall-related injuries. By doing so, they can more effectively address the various risk factors for falling and therefore, prevent falls. Studies that test the proposed model need to be conducted to establish the efficacy of the model in practice. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  3. [Watershed water environment pollution models and their applications: a review].

    PubMed

    Zhu, Yao; Liang, Zhi-Wei; Li, Wei; Yang, Yi; Yang, Mu-Yi; Mao, Wei; Xu, Han-Li; Wu, Wei-Xiang

    2013-10-01

    Watershed water environment pollution models are important tools for studying watershed environmental problems. Through quantitative description of the complicated pollution processes of a whole watershed system and its parts, such models can identify the main sources and migration pathways of pollutants, estimate pollutant loadings, and evaluate their impacts on the water environment, providing a basis for watershed planning and management. This paper reviewed the watershed water environment models widely applied in China and abroad, with a focus on models of pollutant loading (GWLF and PLOAD), water quality of receiving water bodies (QUAL2E and WASP), and watershed models that integrate pollutant loadings and water quality (HSPF, SWAT, AGNPS, AnnAGNPS, and SWMM), and introduced the structures, principles, and main characteristics of these models as well as their limitations in practical applications. Other water quality models (CE-QUAL-W2, EFDC, and AQUATOX) and watershed models (GLEAMS and MIKE SHE) were also briefly introduced. Through case analyses of the applications of single and integrated models, the development trends and application prospects of watershed water environment pollution models were discussed.

  4. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part II. Seedling emergence timing

    USDA-ARS?s Scientific Manuscript database

    Predictions of seedling emergence timing for spring wheat are facilitated by process-based modeling of the microsite environment in the shallow seedling recruitment zone. Hourly temperature and water profiles within the recruitment zone for 60 days after planting were simulated from the process-base...

  5. An extended car-following model considering the acceleration derivative in some typical traffic environments

    NASA Astrophysics Data System (ADS)

    Zhou, Tong; Chen, Dong; Liu, Weining

    2018-03-01

    Based on the full velocity difference and acceleration car-following model, an extended car-following model is proposed by considering the derivative of the vehicle's acceleration. The stability condition is derived by applying control theory. Considering some typical traffic environments, the results of theoretical analysis and numerical simulation show that the extended model reproduces the acceleration of a string of vehicles more realistically than previous models during starting, stopping, and sudden braking. Meanwhile, traffic jams occur more easily as the coefficient of the acceleration derivative increases, as shown by the space-time evolution of the flow. The results confirm that the derivative of the vehicle's acceleration plays an important role in the traffic jamming transition and the evolution of traffic congestion.
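
    The class of model described above can be sketched as follows (a generic form under assumed notation, not the paper's exact equations). In a full-velocity-difference-type model, the acceleration of vehicle n is

    $$\frac{dv_n(t)}{dt} = \kappa\,[V(\Delta x_n(t)) - v_n(t)] + \lambda\,\Delta v_n(t),$$

    where $V(\cdot)$ is an optimal-velocity function of the headway $\Delta x_n$ and $\Delta v_n$ is the velocity difference to the leading vehicle. The extension adds a feedback term proportional to the acceleration derivative (jerk), e.g.

    $$\frac{dv_n(t)}{dt} = \kappa\,[V(\Delta x_n(t)) - v_n(t)] + \lambda\,\Delta v_n(t) + \gamma\,\frac{da_n(t)}{dt},$$

    with the coefficient $\gamma$ controlling the jerk feedback; the control-theoretic stability analysis then bounds the admissible range of $\gamma$.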

  6. A Competence-Based Service for Supporting Self-Regulated Learning in Virtual Environments

    ERIC Educational Resources Information Center

    Nussbaumer, Alexander; Hillemann, Eva-Catherine; Gütl, Christian; Albert, Dietrich

    2015-01-01

    This paper presents a conceptual approach and a Web-based service that aim at supporting self-regulated learning in virtual environments. The conceptual approach consists of four components: 1) a self-regulated learning model for supporting a learner-centred learning process, 2) a psychological model for facilitating competence-based…

  7. Estimation of Environment-Related Properties of Chemicals for Design of Sustainable Processes: Development of Group-Contribution+ (GC+) Property Models and Uncertainty Analysis

    EPA Science Inventory

    The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...

  8. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    ERIC Educational Resources Information Center

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  9. Simulation of Plant Physiological Process Using Fuzzy Variables

    Treesearch

    Daniel L. Schmoldt

    1991-01-01

    Qualitative modelling can help us understand and project effects of multiple stresses on trees. It is not practical to collect and correlate empirical data for all combinations of plant/environments and human/climate stresses, especially for mature trees in natural settings. Therefore, a mechanistic model was developed to describe ecophysiological processes. This model...

  10. The Conceptualization of the Mathematical Modelling Process in Technology-Aided Environment

    ERIC Educational Resources Information Center

    Hidiroglu, Çaglar Naci; Güzel, Esra Bukova

    2017-01-01

    The aim of the study is to conceptualize the technology-aided mathematical modelling process in the frame of cognitive modelling perspective. The grounded theory approach was adopted in the study. The research was conducted with seven groups consisting of nineteen prospective mathematics teachers. The data were collected from the video records of…

  11. Experimenting with C2 Applications and Federated Infrastructures for Integrated Full-Spectrum Operational Environments in Support of Collaborative Planning and Interoperable Execution

    DTIC Science & Technology

    2004-06-01

    Topics covered include Situation Understanding, Common Operational Pictures, Planning & Decision Support Capabilities, Message & Order Processing, Common Languages & Data Models, and Modeling & Simulation.

  12. Wind Sensing, Analysis, and Modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in, and demonstrate the use of, prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  14. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, it was necessary to develop a simulation tool that would provide a simulation environment acceptable for doing SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can be used to support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). A MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated, threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  15. Putting FLEXPART to REST: The Provision of Atmospheric Transport Modeling Services

    NASA Astrophysics Data System (ADS)

    Morton, Don; Arnold, Dèlia

    2015-04-01

    We are developing a RESTful set of modeling services for the FLEXPART modeling system. FLEXPART (FLEXible PARTicle dispersion model) is a Lagrangian transport and dispersion model used by a growing international community. It has been used to simulate and forecast the atmospheric transport of wildfire smoke, volcanic ash and radionuclides, and may be run in backwards mode to provide information for the determination of emission sources such as nuclear emissions and greenhouse gases. This open source software is distributed in source code form, and has several compiler and library dependencies that users need to address. Although well-documented, getting it compiled, set up, running, and post-processed is often tedious, making it difficult for the inexperienced or casual user. Well-designed modeling services lower the entry barrier for scientists to perform simulations, allowing them to create and execute their models from a variety of devices and programming environments. This world of Service Oriented Architectures (SOA) has progressed to a REpresentational State Transfer (REST) paradigm, in which the pervasive and mature HTTP environment is used as a foundation for providing access to model services. With such an approach, sound software engineering practices are adhered to in order to deploy service modules exhibiting very loose coupling with the clients. In short, services are accessed and controlled through the formation of properly-constructed Uniform Resource Identifiers (URIs), processed in an HTTP environment. In this way, any client or combination of clients - whether a bash script, Python program, web GUI, or even Unix command line - that can interact with an HTTP server can run the modeling environment. This loose coupling allows for the deployment of a variety of front ends, all accessing a common modeling backend system.
Furthermore, it is generally accepted in the cloud computing community that RESTful approaches constitute a sound approach towards successful deployment of services. Through the design of a RESTful, cloud-based modeling system, we provide the ubiquitous access to FLEXPART that allows scientists to focus on modeling processes instead of tedious computational details. In this work, we describe the modeling services environment, and provide examples of access via command-line, Python programs, and web GUI interfaces.
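
    The URI-driven access pattern described above can be sketched in a few lines of Python; the service root and endpoint names below are hypothetical illustrations, not the actual API of the system described:

    ```python
    from urllib.parse import urlencode, urljoin

    BASE = "http://flexpart.example.org/api/"  # hypothetical service root

    def run_uri(run_id, action=None, **params):
        """Build a RESTful URI addressing a (hypothetical) FLEXPART run resource.

        Any HTTP-capable client -- bash plus curl, a Python script, or a web
        GUI -- can then GET or POST this URI to drive the modeling backend.
        """
        path = f"runs/{run_id}" + (f"/{action}" if action else "")
        query = "?" + urlencode(sorted(params.items())) if params else ""
        return urljoin(BASE, path) + query

    print(run_uri(42, "status"))
    # http://flexpart.example.org/api/runs/42/status
    print(run_uri(7, "output", format="netcdf"))
    # http://flexpart.example.org/api/runs/7/output?format=netcdf
    ```

    Because all state is addressed through the URI, the same call works identically from any front end, which is precisely the loose coupling between clients and the modeling backend that the abstract emphasizes.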

  16. Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed

    NASA Astrophysics Data System (ADS)

    Sethi, H. R.; Ralph, John E.

    1989-03-01

    The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally all aspects of the system and environment should be integrated into a complete end-to-end simulation, but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make end-to-end simulation possible. However, the complexity and speed of such simulations mean difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user-friendly interface and a very flexible, high-power, high-resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator, and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of databases needs to be generated by the component models, and these databases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provides the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence will be discussed.
Parallel processing will make real time end-to-end simulation possible and will greatly improve the graphical visualisation of the model output data. Artificial intelligence should help to enhance the man-machine interface.

  17. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
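
    As an illustration of the kind of pre-processing method such a Python library can provide (the function below is a generic sketch, not OAT's actual API), consider filling interior gaps in a sensor time-series by linear interpolation before feeding it to a model:

    ```python
    def fill_gaps(times, values):
        """Linearly interpolate interior None gaps in a sensor time-series.

        `times` are sample timestamps (numbers), `values` the readings with
        None marking missing data; the first and last values must be present.
        """
        out = list(values)
        for i, v in enumerate(out):
            if v is None:
                # nearest known neighbours on each side of the gap
                left = next(j for j in range(i - 1, -1, -1) if out[j] is not None)
                right = next(j for j in range(i + 1, len(out)) if out[j] is not None)
                frac = (times[i] - times[left]) / (times[right] - times[left])
                out[i] = out[left] + frac * (out[right] - out[left])
        return out

    print(fill_gaps([0, 1, 2, 3], [10.0, None, None, 16.0]))
    # [10.0, 12.0, 14.0, 16.0]
    ```

    Because the interpolation uses the timestamps rather than sample indices, it also handles unevenly spaced sensor readings, a common situation with the in situ data the tool targets.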

  18. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  19. A Conceptual Model of the Cognitive Processing of Environmental Distance Information

    NASA Astrophysics Data System (ADS)

    Montello, Daniel R.

    I review theories and research on the cognitive processing of environmental distance information by humans, particularly that acquired via direct experience in the environment. The cognitive processes I consider for acquiring and thinking about environmental distance information include working-memory, nonmediated, hybrid, and simple-retrieval processes. Based on my review of the research literature, and additional considerations about the sources of distance information and the situations in which it is used, I propose an integrative conceptual model to explain the cognitive processing of distance information that takes account of the plurality of possible processes and information sources, and describes conditions under which particular processes and sources are likely to operate. The mechanism of summing vista distances is identified as widely important in situations with good visual access to the environment. Heuristics based on time, effort, or other information are likely to play their most important role when sensory access is restricted.

  20. Administrative Decision Making and Resource Allocation.

    ERIC Educational Resources Information Center

    Sardy, Susan; Sardy, Hyman

    This paper considers selected aspects of the systems analysis of administrative decisionmaking regarding resource allocations in an educational system. A model of the instructional materials purchase system is presented. The major components of this model are: environment, input, decision process, conversion structure, conversion process, output,…

  1. M-and-C Domain Map Maker: an environment complimenting MDE with M-and-C knowledge and ensuring solution completeness

    NASA Astrophysics Data System (ADS)

    Patwari, Puneet; Choudhury, Subhrojyoti R.; Banerjee, Amar; Swaminathan, N.; Pandey, Shreya

    2016-07-01

    Model Driven Engineering (MDE) as a key driver to reduce development cost of M&C systems is beginning to find acceptance across scientific instruments such as Radio Telescopes and Nuclear Reactors. Such projects are adopting it to reduce time to integrate, test and simulate their individual controllers and increase reusability and traceability in the process. The creation and maintenance of models is still a significant challenge to realizing MDE benefits. Creating domain-specific modelling environments reduces the barriers, and we have been working along these lines, creating a domain-specific language and environment based on an M&C knowledge model. However, large projects involve several such domains, and there is still a need to interconnect the domain models, in order to ensure modelling completeness. This paper presents a knowledge-centric approach to doing that, by creating a generic system model that underlies the individual domain knowledge models. We present our vision for M&C Domain Map Maker, a set of processes and tools that enables explication of domain knowledge in terms of domain models with mutual consistency relationships to aid MDE.

  2. Offering a Framework for Value Co-Creation in Virtual Academic Learning Environments

    ERIC Educational Resources Information Center

    Ranjbarfard, Mina; Heidari Sureshjani, Mahboobeh

    2018-01-01

    Purpose: This research aims to convert the traditional teacher-student models, in which teachers determine the learning resources, into a flexible structure and an active learning environment so that students can participate in the educational processes and value co-creation in virtual academic learning environments (VALEs).…

  3. The lifecycle of e-learning course in the adaptive educational environment

    NASA Astrophysics Data System (ADS)

    Gustun, O. N.; Budaragin, N. V.

    2017-01-01

    In this article we consider the lifecycle model of an e-learning course in an electronic educational environment. The model consists of three stages and nine phases. To implement adaptive control of the learning process, we determine the actions that are necessary at different phases of the e-learning course lifecycle. The general characteristics of the SPACEL technology are given for creating adaptive educational environments of the next generation.

  4. A BDI Approach to Infer Student's Emotions in an Intelligent Learning Environment

    ERIC Educational Resources Information Center

    Jaques, Patricia Augustin; Vicari, Rosa Maria

    2007-01-01

    In this article we describe the use of a mental-states approach, more specifically the belief-desire-intention (BDI) model, to implement the process of affective diagnosis in an educational environment. We use the psychological OCC model, which is based on the cognitive theory of emotions and can be implemented computationally, in order…

  5. Modeling emerald ash borer dispersal using percolation theory: estimating the rate of range expansion in a fragmented landscape

    Treesearch

    Robin A. J. Taylor; Daniel A. Herms; Louis R. Iverson

    2008-01-01

    The dispersal of organisms is rarely random, although diffusion processes can be useful models for movement in approximately homogeneous environments. However, the environments through which all organisms disperse are far from uniform at all scales. The emerald ash borer (EAB), Agrilus planipennis, is obligate on ash (Fraxinus spp...

  6. A stochastic vision-based model inspired by zebrafish collective behaviour in heterogeneous environments

    PubMed Central

    Collignon, Bertrand; Séguret, Axel; Halloy, José

    2016-01-01

    Collective motion is one of the most ubiquitous behaviours displayed by social organisms and has led to the development of numerous models. Recent advances in the understanding of sensory systems and information processing by animals impel us to revise the classical assumptions made in decisional algorithms. In this context, we present a model describing the three-dimensional visual sensory system of fish, which adjust their trajectory according to their perception field. Furthermore, we introduce a stochastic process based on a probability distribution function to move in targeted directions, rather than on a summation of influential vectors as is classically assumed by most models. In parallel, we present experimental results of zebrafish (alone or in groups of 10) swimming in both homogeneous and heterogeneous environments. We use these experimental data to set the parameter values of our model and show that this perception-based approach can simulate the collective motion of species showing cohesive behaviour in heterogeneous environments. Finally, we discuss the advances of this multilayer model and its possible outcomes in biological, physical and robotic sciences. PMID:26909173
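
The directional choice described above can be sketched as sampling a heading from a discrete probability distribution over candidate directions, rather than summing influence vectors. The sketch below is illustrative only; the candidate angles and weights are hypothetical stand-ins for the perception-field values the actual model computes.

```python
import math
import random

def sample_heading(candidate_angles, weights, rng=random.random):
    """Draw one heading from a discrete probability distribution over
    candidate directions, instead of averaging influence vectors."""
    total = sum(weights)
    cumulative = 0.0
    r = rng()
    for angle, w in zip(candidate_angles, weights):
        cumulative += w / total
        if r <= cumulative:
            return angle
    return candidate_angles[-1]

# Eight candidate directions on the unit circle; the weights (purely
# illustrative) would come from the fish's 3D perception field, e.g.
# favouring directions towards perceived neighbours and away from walls.
angles = [2 * math.pi * k / 8 for k in range(8)]
weights = [1.0, 3.0, 6.0, 3.0, 1.0, 0.5, 0.2, 0.5]
heading = sample_heading(angles, weights)
```

Because each heading is a sample rather than a deterministic average, repeated draws reproduce the behavioural variability that vector-summation models tend to suppress.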

  7. Adsorption Processes of Lead Ions on the Mixture Surface of Bentonite and Bottom Sediments.

    PubMed

    Hegedűsová, Alžbeta; Hegedűs, Ondrej; Tóth, Tomáš; Vollmannová, Alena; Andrejiová, Alena; Šlosár, Miroslav; Mezeyová, Ivana; Pernyeszi, Tímea

    2016-12-01

    The adsorption of contaminants plays an important role in the process of their elimination from a polluted environment. This work describes the issue of loading the environment with lead, Pb(II), and the resulting negative impact on plants and living organisms. It also focuses on bentonite as a natural adsorbent and on the adsorption of Pb(II) ions on a mixture of bentonite and bottom sediment from the water reservoir in Kolíňany (SR). The equilibrium and kinetic experimental data were evaluated using the Langmuir isotherm, the pseudo-first- and pseudo-second-order kinetic rate equations, and the intraparticle and surface diffusion models. The Langmuir isotherm model successfully characterized the lead-ion adsorption equilibrium on the mixture of bentonite and bottom sediment. The pseudo-second-order model and the intraparticle and surface (film) diffusion models could be simultaneously fitted to the experimental kinetic data.
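
As a worked illustration of this kind of kinetic fitting, the pseudo-second-order model in its linearised form, t/q_t = 1/(k2·qe²) + t/qe, can be fitted by ordinary least squares of t/q_t against t. The data below are synthetic, not the Kolíňany measurements.

```python
def fit_pseudo_second_order(times, q):
    """Fit the linearised pseudo-second-order model
        t/q_t = 1/(k2 * qe**2) + t/qe
    by least squares of y = t/q_t against t; returns (qe, k2)."""
    ys = [t / qt for t, qt in zip(times, q)]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    intercept = my - slope * mx
    qe = 1.0 / slope               # slope = 1/qe
    k2 = slope ** 2 / intercept    # intercept = 1/(k2 * qe**2)
    return qe, k2

# Synthetic sorption data generated with qe = 2.0 mg/g, k2 = 0.5 g/(mg*min):
times = [1.0, 2.0, 3.0, 4.0]
q = [t / (0.5 + 0.5 * t) for t in times]
qe, k2 = fit_pseudo_second_order(times, q)   # recovers qe ≈ 2.0, k2 ≈ 0.5
```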

  8. Pathogen survival trajectories: an eco-environmental approach to the modeling of human campylobacteriosis ecology.

    PubMed Central

    Skelly, Chris; Weinstein, Phil

    2003-01-01

    Campylobacteriosis, like many human diseases, has its own ecology in which the propagation of human infection and disease depends on pathogen survival and finding new hosts in order to replicate and sustain the pathogen population. The complexity of this process, a process common to other enteric pathogens, has hampered control efforts. Many unknowns remain, resulting in a poorly understood disease ecology. To provide structure to these unknowns and help direct further research and intervention, we propose an eco-environmental modeling approach for campylobacteriosis. This modeling approach follows the pathogen population as it moves through the environments that define the physical structure of its ecology. In this paper, we term the ecologic processes and environments through which these populations move "pathogen survival trajectories." Although such a modeling approach could have veterinary applications, our emphasis is on human campylobacteriosis and focuses on human exposures to Campylobacter through feces, food, and aquatic environments. The pathogen survival trajectories that lead to human exposure include ecologic filters that limit population size, e.g., cooking food to kill Campylobacter. Environmental factors that influence the size of the pathogen reservoirs include temperature, nutrient availability, and moisture availability during the period of time the pathogen population is moving through the environment between infected and susceptible hosts. We anticipate that the modeling approach proposed here will work symbiotically with traditional epidemiologic and microbiologic research to help guide and evaluate the acquisition of new knowledge about the ecology, eventual intervention, and control of campylobacteriosis. PMID:12515674

  9. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services, and even infrastructure services, provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds together with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and in intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed, to improve the feasibility of ISs under hybrid cloud computing environments.

  10. Coarse-grained models of key self-assembly processes in HIV-1

    NASA Astrophysics Data System (ADS)

    Grime, John

    Computational molecular simulations can elucidate microscopic information that is inaccessible to conventional experimental techniques. However, many processes occur over time and length scales that are beyond the current capabilities of atomic-resolution molecular dynamics (MD). One such process is the self-assembly of the HIV-1 viral capsid, a biological structure that is crucial to viral infectivity. The nucleation and growth of capsid structures requires the interaction of large numbers of capsid proteins within a complicated molecular environment. Coarse-grained (CG) models, where degrees of freedom are removed to produce more computationally efficient models, can in principle access large-scale phenomena such as the nucleation and growth of HIV-1 capsid lattice. We report here studies of the self-assembly behaviors of a CG model of HIV-1 capsid protein, including the influence of the local molecular environment on nucleation and growth processes. Our results suggest a multi-stage process, involving several characteristic structures, eventually producing metastable capsid lattice morphologies that are amenable to subsequent capsid dissociation in order to transmit the viral infection.

  11. Problems with Current Models of Grieving and Consequences for Older Persons.

    ERIC Educational Resources Information Center

    Horacek, Bruce J.

    Classical models of the grieving process include Freud's concept of withdrawal of ties to the love object called decathexis, and Lindemann's emancipation from the bondage to the deceased involving adjusting to the loss in one's environment and the ability to form new relationships. Most of the models and explanations of the grieving process over…

  12. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies are needed that shield the models' original execution differences, creating services that can be reused in the web environment. Although some model service standards (such as the Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to represent fully within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service to another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models that reduces the problems encountered when sharing models between model providers and model users, and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, methods for encapsulating the model-execution program as model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
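
A minimal sketch of the idea, under assumed names and a deliberately simplified schema (the paper's actual description method is far richer): the model-description interface carries deployment-dependency information alongside the usual WPS-style input/output metadata, and the model-execution interface maps a request onto the wrapped program.

```python
import json
import math

# Hypothetical, simplified model-description record: unlike a bare WPS
# DescribeProcess response, it also carries deployment dependencies.
MODEL_DESCRIPTION = {
    "identifier": "SlopeErosionModel",   # assumed example model name
    "abstract": "Estimates hillslope erosion from rainfall and slope.",
    "inputs": [{"name": "rainfall_mm", "type": "float"},
               {"name": "slope_deg", "type": "float"}],
    "outputs": [{"name": "erosion_t_ha", "type": "float"}],
    "deployment": {"runtime": "python>=3.8", "entrypoint": "run_model"},
}

def run_model(rainfall_mm, slope_deg):
    """Stand-in for the wrapped model-execution program."""
    return 0.01 * rainfall_mm * math.tan(math.radians(slope_deg))

def describe_process():
    """Model-description interface: serialise the full record."""
    return json.dumps(MODEL_DESCRIPTION)

def execute(request):
    """Model-execution interface: map a WPS-like request onto the program."""
    args = {i["name"]: request[i["name"]] for i in MODEL_DESCRIPTION["inputs"]}
    return {"erosion_t_ha": run_model(**args)}
```

Because the deployment block travels with the description, a model user who copies the service to another machine can see what runtime and entry point it needs, which is exactly the gap noted in point (2) above.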

  13. Surface complexation modeling

    USDA-ARS?s Scientific Manuscript database

    Adsorption-desorption reactions are important processes that affect the transport of contaminants in the environment. Surface complexation models are chemical models that can account for the effects of variable chemical conditions, such as pH, on adsorption reactions. These models define specific ...

  14. Mass and Environment as Drivers of Galaxy Evolution: Simplicity and its Consequences

    NASA Astrophysics Data System (ADS)

    Peng, Yingjie

    2012-01-01

    At first sight the galaxy population appears to be composed of infinitely varied types and properties; however, when large samples of galaxies are studied, it emerges that the vast majority of galaxies follow simple scaling relations and similar evolutionary modes, while the outliers are a small minority. The underlying simplicities of the interrelationships among stellar mass, star formation rate and environment are seen in SDSS and zCOSMOS. We demonstrate that the differential effects of mass and environment are completely separable out to z ≈ 1, indicating that two distinct physical processes are operating, namely "mass quenching" and "environment quenching". These two simple quenching processes, plus some additional quenching due to merging, then naturally produce the Schechter form of the galaxy stellar mass functions and make quantitative predictions for the interrelationships between the Schechter parameters of star-forming and passive galaxies in different environments. All of these detailed quantitative relationships are indeed seen, to very high precision, in SDSS, lending strong support to our simple empirically based model. The model also offers qualitative explanations for the "anti-hierarchical" age-mass relation and the alpha-enrichment patterns of passive galaxies, and makes other testable predictions, such as the mass function of the population of transitory objects that are in the process of being quenched, the galaxy major- and minor-merger rates, the galaxy stellar mass assembly history and the star formation history. Although still purely phenomenological, the model makes clear what the evolutionary characteristics of the relevant physical processes must in fact be.
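
The separability claim has a compact algebraic form: if mass quenching and environment quenching act independently, the probability of a galaxy remaining star-forming factorises. A sketch, with purely illustrative efficiency values:

```python
def quenched_fraction(eps_mass, eps_env):
    """If the two channels are independent, survival probabilities multiply:
        1 - f_q(m, rho) = (1 - eps_mass(m)) * (1 - eps_env(rho))
    where eps_mass and eps_env are the mass- and environment-quenching
    efficiencies. Returns the total quenched fraction f_q."""
    return 1.0 - (1.0 - eps_mass) * (1.0 - eps_env)

# Illustrative values only: a massive galaxy (high eps_mass) in a dense
# environment (high eps_env) is very likely to be quenched.
f_field_dwarf = quenched_fraction(0.1, 0.0)    # ≈ 0.1
f_cluster_giant = quenched_fraction(0.8, 0.5)  # ≈ 0.9
```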

  15. A model for the effect of real leaks on the transport of microorganisms into a vacuum freeze-dryer.

    PubMed

    Jennings, T A

    1990-01-01

    This paper proposes a model for determining the effect that real leaks, whose flow is viscous in nature, could have on the microorganism density in a vacuum freeze-dryer during a drying process. The model considers the entry of microorganisms to result from real leaks stemming from an environment containing a known bioburden. A means for determining the relationship between the rate of pressure rise of the system (ROR) and the density of microorganisms in a system, stemming from an environment of a known bioburden, is examined. The model also considers the change in the bioburden of the dryer with respect to variations in the primary and secondary drying process.

  16. AN OVERVIEW OF THE INTEROPERABILITY ROADMAP FOR COM/.NET-BASED CAPE-OPEN

    EPA Science Inventory

    The CAPE-OPEN standard interfaces have been designed to permit flexibility and modularization of process modelling environments (PMEs) in order to use process modeling components, such as unit operation or thermodynamic property models, across a range of tools employed in the life...

  17. IAIMS

    PubMed Central

    Frisse, Mark

    1997-01-01

    Abstract The success of IAIMSs and other information technology plans depends to a great extent on the fit between the planning process and the nature of the organization. Planning processes differ as a function of both plurality of goals and the degree to which technology or the external environment changes. If all members of an organization share a common goal and the organization is in a relatively stable environment, the classic “plan, prototype, implement, evaluate” process may be appropriate. Most health care organizations are not consistent with this model. The components of the organization may have different goals, and both the health care environment and roles for technology are changing rapidly. In these circumstances, planning takes on a different light. This paper outlines approaches to IAIMS planning in various environments and provides a framework for IAIMS planning in rapidly changing environments. PMID:9067883

  18. Informations in Models of Evolutionary Dynamics

    NASA Astrophysics Data System (ADS)

    Rivoire, Olivier

    2016-03-01

    Biological organisms adapt to changes by processing informations from different sources, most notably from their ancestors and from their environment. We review an approach to quantify these informations by analyzing mathematical models of evolutionary dynamics and show how explicit results are obtained for a solvable subclass of these models. In several limits, the results coincide with those obtained in studies of information processing for communication, gambling or thermodynamics. In the most general case, however, information processing by biological populations shows unique features that motivate the analysis of specific models.

  19. Multifactorial modelling of high-temperature treatment of timber in the saturated water steam medium

    NASA Astrophysics Data System (ADS)

    Prosvirnikov, D. B.; Safin, R. G.; Ziatdinova, D. F.; Timerbaev, N. F.; Lashkov, V. A.

    2016-04-01

    The paper analyses experimental data obtained in studies of the high-temperature treatment of softwood and hardwood in a saturated water steam environment. The data were processed in the Curve Expert software for the purpose of statistically modelling the processes and phenomena occurring during the treatment. The multifactorial modelling yielded empirical dependences that allow the main parameters of this type of hydrothermal treatment to be determined with high accuracy.

  20. Modelling Technology for Building Fire Scene with Virtual Geographic Environment

    NASA Astrophysics Data System (ADS)

    Song, Y.; Zhao, L.; Wei, M.; Zhang, H.; Liu, W.

    2017-09-01

    Building fires are risky events that can lead to disaster and massive destruction, and their management and disposal have long attracted research interest. An integrated Virtual Geographic Environment (VGE) is a good choice for building fire safety management and emergency decisions, in which a more realistic and richer fire process can be computed and obtained dynamically, and the results of fire simulations and analyses can be much more accurate as well. To model a building fire scene with VGE, this paper analyses the application requirements and modelling objectives of the building fire scene. The four core elements of modelling a building fire scene (the building space environment, the fire event, the indoor Fire Extinguishing System (FES) and the indoor crowd) are then implemented, and the relationships between these elements are discussed. Finally, following the theory and framework of VGE, a building fire scene system is designed spanning the data environment, the model environment, the expression environment and the collaborative environment. The functions and key techniques of each environment are also analysed, which may provide a reference for further development and other research on VGE.

  1. Software-as-a-Service Vendors: Are They Ready to Successfully Deliver?

    NASA Astrophysics Data System (ADS)

    Heart, Tsipi; Tsur, Noa Shamir; Pliskin, Nava

    Software as a service (SaaS) is a software sourcing option that allows organizations to remotely access enterprise applications without having to install the application in-house. In this work we study vendors' readiness to deliver SaaS, a topic scarcely studied before. The innovation classification (evolutionary vs. revolutionary) and a new Seven Fundamental Organizational Capabilities (FOCs) model are used as the theoretical frameworks. The Seven FOCs model suggests a generic yet comprehensive set of capabilities required for organizational success: 1) sensing the stakeholders, 2) sensing the business environment, 3) sensing the knowledge environment, 4) process control, 5) process improvement, 6) new process development, and 7) appropriate resolution.

  2. Run Environment and Data Management for Earth System Models

    NASA Astrophysics Data System (ADS)

    Widmann, H.; Lautenschlager, M.; Fast, I.; Legutke, S.

    2009-04-01

    The Integrating Model and Data Infrastructure (IMDI) developed and maintained by the Model and Data Group (M&D) comprises the Standard Compile Environment (SCE) and the Standard Run Environment (SRE). The IMDI software has a modular design, which allows a suite of model components to be combined and coupled, and the tasks to be executed independently on various platforms. The modular structure also enables extension to new model combinations and new platforms. The SRE presented here enables the configuration and performance of earth system model experiments, from model integration up to storage and visualization of data. We focus on recently implemented tasks such as synchronous database filling, graphical monitoring and automatic generation of metadata in XML form during run time. We also address the capability to run experiments in heterogeneous IT environments with different computing systems for model integration, data processing and storage. These features are demonstrated for model configurations and platforms used in current or upcoming projects, e.g. MILLENNIUM or IPCC AR5.

  3. System Engineering Issues for Avionics Survival in the Space Environment

    NASA Technical Reports Server (NTRS)

    Pavelitz, Steven

    1999-01-01

    This paper examines how the system engineering process influences the design of a spacecraft's avionics by considering the space environment. Avionics are susceptible to the thermal, radiation, plasma, and meteoroids/orbital debris environments. The environment definitions for various spacecraft mission orbits (LEO/low inclination, LEO/Polar, MEO, HEO, GTO, GEO and High Apogee Elliptical) are discussed. NASA models and commercial software used for environment analysis are reviewed. Applicability of technical references, such as NASA TM-4527 "Natural Orbital Environment Guidelines for Use in Aerospace Vehicle Development", is discussed. System engineering references, such as the MSFC System Engineering Handbook, are reviewed to determine how the environments are accounted for in the system engineering process. Tools and databases to assist the system engineer and avionics designer in addressing space environment effects on avionics are described and their usefulness assessed.

  4. Adaptive Process Control with Fuzzy Logic and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision-making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, an analysis element to recognize changes in the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.
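
A minimal sketch of the GA half of such a system, assuming a toy fitness function in place of the real closed-loop pH simulation: the GA searches for a fuzzy membership-function width that minimises control error. All names and numbers here are illustrative, not the Bureau of Mines implementation.

```python
import random

def fitness(width):
    """Toy stand-in for negative closed-loop control error; a real system
    would run the FLC against the acid-base plant model. The (arbitrary)
    optimum here is a membership-function width of 2.5."""
    return -abs(width - 2.5)

def genetic_search(fitness_fn, lo=0.0, hi=10.0, pop_size=20,
                   generations=40, seed=42):
    """Minimal GA over one gene: tournament selection + Gaussian mutation."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = rng.choice(pop), rng.choice(pop)
            parent = a if fitness_fn(a) > fitness_fn(b) else b
            child = min(hi, max(lo, parent + rng.gauss(0.0, 0.3)))
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness_fn)

best_width = genetic_search(fitness)   # converges near 2.5
```

In the adaptive-control framing above, this search is the learning element: the control element (the FLC) and the analysis element would supply the real fitness signal.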

  5. Parallel plan execution with self-processing networks

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.

    1989-01-01

    A critical issue for space operations is how to develop and apply advanced automation techniques to reduce the cost and complexity of working in space. In this context, it is important to examine how recent advances in self-processing networks can be applied for planning and scheduling tasks. For this reason, the feasibility of applying self-processing network models to a variety of planning and control problems relevant to spacecraft activities is being explored. Goals are to demonstrate that self-processing methods are applicable to these problems, and that MIRRORS/II, a general purpose software environment for implementing self-processing models, is sufficiently robust to support development of a wide range of application prototypes. Using MIRRORS/II and marker passing modelling techniques, a model of the execution of a Spaceworld plan was implemented. This is a simplified model of the Voyager spacecraft which photographed Jupiter, Saturn, and their satellites. It is shown that plan execution, a task usually solved using traditional artificial intelligence (AI) techniques, can be accomplished using a self-processing network. The fact that self-processing networks were applied to other space-related tasks, in addition to the one discussed here, demonstrates the general applicability of this approach to planning and control problems relevant to spacecraft activities. It is also demonstrated that MIRRORS/II is a powerful environment for the development and evaluation of self-processing systems.

  6. Land Use Change on Household Farms in the Ecuadorian Amazon: Design and Implementation of an Agent-Based Model.

    PubMed

    Mena, Carlos F; Walsh, Stephen J; Frizzelle, Brian G; Xiaozheng, Yao; Malanson, George P

    2011-01-01

    This paper describes the design and implementation of an Agent-Based Model (ABM) used to simulate land use change on household farms in the Northern Ecuadorian Amazon (NEA). The ABM simulates decision-making processes at the household level, which are examined through a longitudinal socio-economic and demographic survey conducted in 1990 and 1999. Geographic Information Systems (GIS) are used to establish spatial relationships between farms and their environment, while classified Landsat Thematic Mapper (TM) imagery is used to set initial land use/land cover conditions for the spatial simulation, assess from-to land use/land cover change patterns, and describe trajectories of land use change at the farm and landscape levels. Results from prior studies in the NEA provide insights into the key social and ecological variables, describe human behavioral functions, and examine population-environment interactions that are linked to deforestation and agricultural extensification, population migration, and demographic change. Within the architecture of the model, agents are classified as active or passive. The model comprises four modules, i.e., initialization, demography, agriculture, and migration, that operate individually but are linked through key household processes. The main outputs of the model include a spatially-explicit representation of the land use/land cover on survey and non-survey farms and at the landscape level for each annual time-step, as well as simulated socio-economic and demographic characteristics of households and communities. The work describes the design and implementation of the model and how population-environment interactions can be addressed in a frontier setting. The paper contributes to land change science by examining important pattern-process relations, advocating a spatial modeling approach that is capable of synthesizing fundamental relationships at the farm level, and linking people and environment in complex ways.

  7. UAV Swarm Behavior Modeling for Early Exposure of Failure Modes

    DTIC Science & Technology

    2016-09-01

    …emergence of new failure modes? The MP modeling environment provides a breakdown of all potential event traces. Given that the research questions call for the revelation of potential failure modes, MP was selected as the modeling environment because it provides a substantial set of results and data

  8. Enhanced Vehicle Beddown Approximations for the Improved Theater Distribution Model

    DTIC Science & Technology

    2014-03-27

    processed utilizing a heuristic routing and scheduling procedure the authors called the Airlift Planning Algorithm (APA). The linear programming model … LINGO 13 environment. The model is then solved by LINGO 13 and solution data is passed back to the Excel environment in a readable format. All original … DSS is relatively unchanged when solutions to the ITDM are referenced for comparison testing. Readers are encouraged to see Appendix I for ITDM VBA

  9. Predictors for Chinese Students' Management of Study Environment in Online Groupwork

    ERIC Educational Resources Information Center

    Du, Jianxia

    2016-01-01

    Management of the study environment is crucial to the learning process, and this management in an online class setting is even more challenging. This study investigates models of environmental structuring in online groupwork in China, as reported by 307 graduate students in 80 groups. At the group level, environment management was positively…

  10. Compositional schedulability analysis of real-time actor-based systems.

    PubMed

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
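
As an illustrative sketch (not the paper's automata-based analysis, which covers all arrival patterns), a per-actor earliest-deadline-first policy and the deadline check for one concrete message trace can be written as:

```python
import heapq

class Actor:
    """Actor whose mailbox is ordered by an application-level policy,
    here earliest deadline first via a heap keyed on absolute deadline."""
    def __init__(self):
        self.mailbox = []   # entries: (deadline, arrival, duration)

    def send(self, deadline, arrival, duration):
        heapq.heappush(self.mailbox, (deadline, arrival, duration))

    def schedulable(self):
        """Process the queued messages in EDF order on one logical clock;
        return True iff every task finishes by its deadline."""
        clock = 0
        while self.mailbox:
            deadline, arrival, duration = heapq.heappop(self.mailbox)
            clock = max(clock, arrival) + duration
            if clock > deadline:
                return False
        return True

a = Actor()
a.send(deadline=4, arrival=0, duration=3)
a.send(deadline=10, arrival=0, duration=3)
ok = a.schedulable()   # True: EDF runs the deadline-4 task first
```

This simulates a single trace; the compositional approach in the paper instead checks all behaviours permitted by the behavioral interfaces, which is why model checking and testing are needed.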

  11. Design as Knowledge Construction: Constructing Knowledge of Design

    ERIC Educational Resources Information Center

    Cennamo, Katherine C.

    2004-01-01

    In this article, I present a model of instructional design that has evolved from analysis and reflection on the process of designing materials for constructivist learning environments. I observed how we addressed the critical questions for instructional design, comparing the process to traditional instructional design models and to my emerging…

  12. Business Process Elicitation, Modeling, and Reengineering: Teaching and Learning with Simulated Environments

    ERIC Educational Resources Information Center

    Jeyaraj, Anand

    2010-01-01

    The design of enterprise information systems requires students to master technical skills for eliciting, modeling, and reengineering business processes, as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students; rather, they must be experienced and learned by the students themselves. This…

  13. Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment.

    PubMed

    Berkes, Pietro; Orbán, Gergo; Lengyel, Máté; Fiser, József

    2011-01-07

    The brain maintains internal models of its environment to interpret sensory inputs and to prepare actions. Although behavioral studies have demonstrated that these internal models are optimally adapted to the statistics of the environment, the neural underpinning of this adaptation is unknown. Using a Bayesian model of sensory cortical processing, we related stimulus-evoked and spontaneous neural activities to inferences and prior expectations in an internal model and predicted that they should match if the model is statistically optimal. To test this prediction, we analyzed visual cortical activity of awake ferrets during development. Similarity between spontaneous and evoked activities increased with age and was specific to responses evoked by natural scenes. This demonstrates the progressive adaptation of internal models to the statistics of natural stimuli at the neural level.
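    A minimal way to quantify the reported match between evoked and spontaneous activity is a divergence between the two distributions of activity patterns, as in the paper's analysis. The pattern frequencies below are invented for illustration only:

```python
from math import log

def kl_divergence(p, q):
    """D_KL(P || Q) in bits for two discrete distributions over the same
    set of activity patterns; 0 means the distributions match exactly."""
    return sum(pi * log(pi / qi, 2) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical pattern frequencies: evoked activity (stimulus-driven
# "inferences") vs. spontaneous activity ("prior expectations").
evoked      = [0.5, 0.3, 0.2]
spontaneous = [0.5, 0.3, 0.2]   # an adapted, statistically optimal model
naive       = [0.8, 0.1, 0.1]   # an unadapted model

assert kl_divergence(evoked, spontaneous) == 0.0
assert kl_divergence(evoked, naive) > 0.0
```

    In the paper's terms, the divergence between evoked and spontaneous pattern distributions should shrink with development if the internal model adapts to natural-scene statistics.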

  14. Multispectral simulation environment for modeling low-light-level sensor systems

    NASA Astrophysics Data System (ADS)

    Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.

    1998-11-01

    Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model which is a first principles based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms.
This paper discusses specific aspects of the DIRSIG radiance prediction for low-light-level conditions including the incorporation of natural and man-made sources which emphasizes the importance of accurate BRDF. A description of the implementation of each stage in the image processing and capture chain for the LLL model is also presented. Finally, simulated images are presented and qualitatively compared to lab acquired imagery from a commercial system.
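    The multi-stage sensor model described above can be caricatured as a pipeline stage that amplifies the signal, adds noise, and clips at saturation. This sketch is our own simplification, not the model in the paper: gain, full-well, and noise values are placeholders, and the MTF blur stage is omitted:

```python
import random

def lll_sensor_stage(radiance, gain=1000.0, full_well=1e6, noise_sd=50.0):
    """One hypothetical stage of a low-light-level sensor chain:
    amplify the input signal, add Gaussian read noise, then clip at the
    saturation level (the 'blooming' spill to neighbors is not modeled)."""
    out = []
    for r in radiance:
        amplified = r * gain                         # intensifier gain
        noisy = amplified + random.gauss(0.0, noise_sd)
        out.append(min(max(noisy, 0.0), full_well))  # saturation clip
    return out
```

    Chaining several such stages, each with its own noise and transfer function, mirrors the paper's stage-by-stage treatment of the image processing chain.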

  15. How Is the Learning Environment in Physics Lesson with Using 7E Model Teaching Activities

    ERIC Educational Resources Information Center

    Turgut, Umit; Colak, Alp; Salar, Riza

    2017-01-01

    The aim of this research is to reveal the results in the planning, implementation and evaluation of the process for learning environments to be designed in compliance with 7E learning cycle model in physics lesson. "Action research", which is a qualitative research pattern, is employed in this research in accordance with the aim of the…

  16. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    NASA Astrophysics Data System (ADS)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.

  17. Meteorological Processes Affecting Air Quality – Research and Model Development Needs

    EPA Science Inventory

    Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...

  18. Linking Nurse Leadership and Work Characteristics to Nurse Burnout and Engagement.

    PubMed

    Lewis, Heather Smith; Cunningham, Christopher J L

    2016-01-01

    Burnout and engagement are critical conditions affecting patient safety and the functioning of healthcare organizations; the areas of worklife model suggests that work environment characteristics may impact employee burnout and general worklife quality. The purpose of this study was to present and test a conditional process model linking perceived transformational nurse leadership to nurse staff burnout and engagement via important work environment characteristics. Working nurses (N = 120) provided perceptions of the core study variables via Internet- or paper-based survey. The hypothesized model was tested using the PROCESS analysis tool, which enables simultaneous testing of multiple, parallel, indirect effects within the SPSS statistical package. Findings support the areas of worklife model and suggest that transformational leadership is strongly associated with work environment characteristics that are further linked to nurse burnout and engagement. Interestingly, different work characteristics appear to be critical channels through which transformational leadership impacts nurse burnout and engagement. There are several methodological and practical implications of this work for researchers and practitioners interested in preventing burnout and promoting occupational health within healthcare organizations. These implications are tied to the connections observed between transformational leadership, specific work environment characteristics, and burnout and engagement outcomes.
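    In the simplest single-mediator case, the indirect effects estimated by tools like PROCESS reduce to a product of regression slopes. The bare-bones illustration below is not the PROCESS macro itself (which also partials out the predictor and bootstraps confidence intervals); it only shows the arithmetic of an indirect effect:

```python
def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

def indirect_effect(leadership, work_env, burnout):
    """Simple-mediation sketch: the indirect effect of leadership on
    burnout through one work-environment characteristic is a*b, where
    a = slope of the mediator on the predictor and b = slope of the
    outcome on the mediator (unadjusted here, for transparency)."""
    a = ols_slope(leadership, work_env)
    b = ols_slope(work_env, burnout)
    return a * b
```

    A negative product, as in the hypothetical data tested below, would correspond to leadership improving the work environment and thereby reducing burnout.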

  19. Performance analysis of no-vent fill process for liquid hydrogen tank in terrestrial and on-orbit environments

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yanzhong; Zhang, Feini; Ma, Yuan

    2015-12-01

    Two finite difference computer models, aiming at the process predictions of no-vent fill in normal gravity and microgravity environments respectively, are developed to investigate the filling performance in a liquid hydrogen (LH2) tank. In the normal gravity case model, the tank/fluid system is divided into five control volumes, including ullage, bulk liquid, gas-liquid interface, ullage-adjacent wall, and liquid-adjacent wall. In the microgravity case model, vapor-liquid thermal equilibrium state is maintained throughout the process, and only two nodes representing fluid and wall regions are applied. To capture the liquid-wall heat transfer accurately, a series of heat transfer mechanisms are considered and modeled successively, including film boiling, transition boiling, nucleate boiling and liquid natural convection. The two models are validated by comparing their predictions with experimental data, which show good agreement. The two models are then used to investigate the performance of no-vent fill in different conditions, and several conclusions are obtained. The results show that in the normal gravity environment the no-vent fill experiences a continuous pressure rise during the whole process and the maximum pressure occurs at the end of the operation, while the maximum pressure of the microgravity case occurs at the beginning stage of the process. Moreover, increasing inlet mass flux has an apparent influence on the pressure evolution of the no-vent fill process in normal gravity but little influence in microgravity. A larger initial wall temperature brings about more significant liquid evaporation during the filling operation, and thus causes higher pressure evolution, whether the filling process occurs under normal gravity or microgravity conditions. Reducing inlet liquid temperature can improve the filling performance in normal gravity, but cannot significantly reduce the maximum pressure in microgravity.
The presented work benefits the understanding of the no-vent fill performance and may guide the design of on-orbit no-vent fill system.
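    The cascade of liquid-wall heat transfer mechanisms listed in the abstract amounts to selecting a regime from the wall superheat. A schematic selector is sketched below; the threshold superheats are illustrative placeholders, not the correlations used in the paper:

```python
def wall_heat_transfer_regime(wall_temp, liquid_temp, t_onb=2.0,
                              t_chf=6.0, t_leidenfrost=30.0):
    """Pick the liquid-wall heat transfer mechanism from the wall
    superheat (K), mirroring the cascade used in tank chilldown models:
    natural convection -> nucleate -> transition -> film boiling.
    Threshold values here are placeholders, not values from the paper."""
    superheat = wall_temp - liquid_temp
    if superheat <= t_onb:           # below onset of nucleate boiling
        return "natural convection"
    if superheat <= t_chf:           # up to critical heat flux
        return "nucleate boiling"
    if superheat <= t_leidenfrost:   # up to the Leidenfrost point
        return "transition boiling"
    return "film boiling"
```

    During a real fill, the wall cools through these regimes in reverse order, which is why the models treat them "successively."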

  20. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level and methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure which is itself the output of the model.
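    Processes (1) through (7) can be caricatured as a small compartment model stepped forward in time: photosynthesis fills a leaf carbohydrate pool, translocation moves carbohydrate to stem and root pools, and respiration and growth drain each pool. The rate constants below are invented for illustration and are not those of the phytotron model:

```python
def step_plant(pools, par, temp_factor=1.0, dt=1.0):
    """One Euler step of a toy carbon-pool model following the listed
    processes: photosynthesis (PAR -> soluble carbohydrate in leaves),
    translocation to stem and root pools, and losses from each pool to
    respiration (rate 0.02) and growth (rate 0.05). Growth flux is
    accumulated as structural dry matter. All constants are illustrative."""
    leaf, stem, root, dry = pools
    photo = 0.05 * par * temp_factor          # processes 1-3
    trans = 0.10 * leaf                       # process 4: translocation
    leaf += dt * (photo - trans - 0.02 * leaf - 0.05 * leaf)
    stem += dt * (0.5 * trans - 0.02 * stem - 0.05 * stem)
    root += dt * (0.5 * trans - 0.02 * root - 0.05 * root)
    dry  += dt * 0.05 * (leaf + stem + root)  # process 6: growth
    return leaf, stem, root, dry
```

    Driving such a step with measured radiation and temperature traces is what connects the whole-plant inputs to the partitioned dry-matter output described above.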

  1. Inquiry Based Learning and Meaning Generation through Modelling on Geometrical Optics in a Constructionist Environment

    ERIC Educational Resources Information Center

    Kotsari, Constantina; Smyrnaiou, Zacharoula

    2017-01-01

    The central roles that modelling plays in the processes of scientific enquiry and that models play as the outcomes of that enquiry are well established (Gilbert & Boulter, 1998). Besides, there are considerable similarities between the processes and outcomes of science and technology (Cinar, 2016). In this study, we discuss how the use of…

  2. Application of Chemistry in Materials Research at NASA GRC

    NASA Technical Reports Server (NTRS)

    Kavandi, Janet L.

    2016-01-01

    Overview of NASA GRC Materials Development. New materials enabled by new chemistries offering unique properties and chemical processing techniques. Durability of materials in harsh environments requires understanding and modeling of chemical interaction of materials with the environment.

  3. Cognitive Styles and Virtual Environments.

    ERIC Educational Resources Information Center

    Ford, Nigel

    2000-01-01

    Discussion of navigation through virtual information environments focuses on the need for robust user models that take into account individual differences. Considers Pask's information processing styles and strategies; deep (transformational) and surface (reproductive) learning; field dependence/independence; divergent/convergent thinking;…

  4. Defining the Environment in Gene–Environment Research: Lessons From Social Epidemiology

    PubMed Central

    Daw, Jonathan; Freese, Jeremy

    2013-01-01

    In this article, we make the case that social epidemiology provides a useful framework to define the environment within gene–environment (G×E) research. We describe the environment in a multilevel, multidomain, longitudinal framework that accounts for upstream processes influencing health outcomes. We then illustrate the utility of this approach by describing how intermediate levels of social organization, such as neighborhoods or schools, are key environmental components of G×E research. We discuss different models of G×E research and encourage public health researchers to consider the value of including genetic information from their study participants. We also encourage researchers interested in G×E interplay to consider the merits of the social epidemiology model when defining the environment. PMID:23927514

  5. Patient Data Synchronization Process in a Continuity of Care Environment

    PubMed Central

    Haras, Consuela; Sauquet, Dominique; Ameline, Philippe; Jaulent, Marie-Christine; Degoulet, Patrice

    2005-01-01

    In a distributed patient record environment, we analyze the processes needed to ensure exchange and access to EHR data. We propose an adapted method and the tools for data synchronization. Our study takes into account the issues of user rights management for data access and of decreasing the amount of data exchanged over the network. We describe an XML-based synchronization model that is portable and independent of specific medical data models. The implemented platform consists of several servers, local network clients, workstations running user interfaces, and data exchange and synchronization tools. PMID:16779049

  6. Implications for neurobiological research of cognitive models of psychosis: a theoretical paper.

    PubMed

    Garety, Philippa A; Bebbington, Paul; Fowler, David; Freeman, Daniel; Kuipers, Elizabeth

    2007-10-01

    Cognitive models of the positive symptoms of psychosis specify the cognitive, social and emotional processes hypothesized to contribute to their occurrence and persistence, and propose that vulnerable individuals make characteristic appraisals that result in specific positive symptoms. We describe cognitive models of positive psychotic symptoms and use this as the basis of discussing recent relevant empirical investigations and reviews that integrate cognitive approaches into neurobiological frameworks. Evidence increasingly supports a number of the hypotheses proposed by cognitive models. These are that: psychosis is on a continuum; specific cognitive processes are risk factors for the transition from subclinical experiences to clinical disorder; social adversity and trauma are associated with psychosis and with negative emotional processes; and these emotional processes contribute to the occurrence and persistence of psychotic symptoms. There is also evidence that reasoning biases contribute to the occurrence of delusions. The benefits of incorporating cognitive processes into neurobiological research include more sophisticated, bidirectional and interactive causal models, the amplification of phenotypes in neurobiological investigations by including emotional processes, and the adoption of more specific clinical phenotypes. For example, there is potential value in studying gene x environment x cognition/emotion interactions. Cognitive models and their derived phenotypes constitute the missing link in the chain between genetic or acquired biological vulnerability, the social environment and the expression of individual positive symptoms.

  7. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

    Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments, namely (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, which can communicate across multiple local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model.
We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.

  8. An integrative model linking feedback environment and organizational citizenship behavior.

    PubMed

    Peng, Jei-Chen; Chiu, Su-Fen

    2010-01-01

    Past empirical evidence has suggested that a positive supervisor feedback environment may enhance employees' organizational citizenship behavior (OCB). In this study, we aim to extend previous research by proposing and testing an integrative model that examines the mediating processes underlying the relationship between supervisor feedback environment and employee OCB. Data were collected from 259 subordinate-supervisor dyads across a variety of organizations in Taiwan. We used structural equation modeling to test our hypotheses. The results demonstrated that supervisor feedback environment influenced employees' OCB indirectly through (1) both positive affective-cognition and positive attitude (i.e., person-organization fit and organizational commitment), and (2) both negative affective-cognition and negative attitude (i.e., role stressors and job burnout). Theoretical and practical implications are discussed.

  9. The Web Measurement Environment (WebME): A Tool for Combining and Modeling Distributed Data

    NASA Technical Reports Server (NTRS)

    Tesoriero, Roseanne; Zelkowitz, Marvin

    1997-01-01

    Many organizations have incorporated data collection into their software processes for the purpose of process improvement. However, in order to improve, interpreting the data is just as important as the collection of data. With the increased presence of the Internet and the ubiquity of the World Wide Web, the potential for software processes being distributed among several physically separated locations has also grown. Because project data may be stored in multiple locations and in differing formats, obtaining and interpreting data from this type of environment becomes even more complicated. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data in a distributed environment. The WebME system will permit the analysis of development data in distributed, heterogeneous environments. This paper provides an overview of the system and its capabilities.

  10. Mission Assurance in a Distributed Environment

    DTIC Science & Technology

    2009-06-01

    Business Process Modeling Notation (BPMN) – Graphical representation of business processes in a workflow • Unified Modeling Language (UML) – Use standard UML diagrams to model the system – Component, sequence, activity diagrams

  11. [Market-based medicine or patient-based medicine?].

    PubMed

    Justich, Pablo R

    2015-04-01

    Health care has evolved over the centuries from a theocentric model to one centered on people, the environment, and society. The process of neoliberal globalization has changed the relationship between the components of the health system and the population. The active participation of organizations such as the World Trade Organization, the International Monetary Fund, and the World Bank, alongside the techno-medical industrial complex, tends to turn health care into a model focused on the economy. This impacts negatively on all components of the health care process and has an adverse effect on humanized care. The analysis of each sector and of their interactions shows the effects of this change. Alternatives are proposed for each sector to contribute to a model of care focused on the patient, the family, and the social environment.

  12. Conceptual Model of Iodine Behavior in the Subsurface at the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Truex, Michael J.; Lee, Brady D.; Johnson, Christian D.

    The fate and transport of 129I in the environment and potential remediation technologies are currently being studied as part of environmental remediation activities at the Hanford Site. A conceptual model describing the nature and extent of subsurface contamination, factors that control plume behavior, and factors relevant to potential remediation processes is needed to support environmental remedy decisions. Because 129I is an uncommon contaminant, relevant remediation experience and scientific literature are limited. Thus, the conceptual model needs both to describe known contaminant and biogeochemical process information and to identify aspects about which additional information is needed to effectively support remedy decisions. This document summarizes the conceptual model of iodine behavior in the subsurface environment at the Hanford Site.

  13. Model-based adaptive 3D sonar reconstruction in reverberating environments.

    PubMed

    Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le

    2015-10-01

    In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.
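    The gating idea behind the paper's adaptive filter - predicting each echo's direction of arrival from a motion model and rejecting measurements that fall outside a gate around the prediction - can be sketched with a simple alpha-beta filter. The constant-rate model and all parameter values are our simplifications, not the paper's geometrical models:

```python
def track_doa(measurements, gate=5.0, alpha=0.5, beta=0.1):
    """Toy echo tracker: an alpha-beta filter predicts the next direction
    of arrival from a constant-rate model and rejects ('gates out')
    measurements too far from the prediction, treating them as clutter
    or interfering multipath echoes. Returns the track and the number
    of rejected measurements."""
    angle, rate = measurements[0], 0.0
    track, rejected = [angle], 0
    for z in measurements[1:]:
        pred = angle + rate                   # constant-rate prediction
        residual = z - pred
        if abs(residual) > gate:              # clutter: coast on the model
            rejected += 1
            angle = pred
        else:                                 # accept: blend measurement in
            angle = pred + alpha * residual
            rate += beta * residual
        track.append(angle)
    return track, rejected
```

    Coasting through gated-out returns is what keeps false depth estimates from multipath echoes out of the reconstructed bathymetry.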

  14. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our...that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a...different event and process descriptions, ontologies, and models. 2.1.1 Logical AI In AI, formal approaches to model the ability to reason about

  15. The CLER model: thinking through change. Configurations of social relationships. Linkages to carry communications. Environment(s) inside and around systems. Resources for enabling implementation for incorporating change.

    PubMed

    Bhola, H S

    1994-05-01

    The CLER model is presented to nursing professionals as a model for planning and implementing change in interpersonal, institutional and cultural settings. It is useful for generating other models by modeling reality as actually encountered by change agents and adopters. The CLER model is related philosophically to systems thinking (that there is interdependence among social entities), dialectical thinking (that there is mutual shaping among social processes) and constructivist thinking (that human beings take part in creating their own reality).

  16. Personal Learning Environments in the Workplace: An Exploratory Study into the Key Business Decision Factors

    ERIC Educational Resources Information Center

    Chatterjee, Arunangsu; Law, Effie Lai-Chong; Mikroyannidis, Alexander; Owen, Glyn; Velasco, Karen

    2013-01-01

    Personal Learning Environments (PLEs) have emerged as a solution to the need of learners for open and easily customisable learning environments. PLEs essentially hand complete control over the learning process to the learner. However, this learning model is not fully compatible with learning in the workplace, which is influenced by certain…

  17. Improved Air Combat Awareness; with AESA and Next-Generation Signal Processing

    DTIC Science & Technology

    2002-09-01

    competence network Building techniques Software development environment Communication Computer architecture Modeling Real-time programming Radar...memory access, skewed load and store, 3.2 GB/s BW • Performance: 400 MFLOPS Runtime environment Custom runtime routines Driver routines Hardware

  18. Weathering profiles in soils and rocks on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Hausrath, E.; Adcock, C. T.; Bamisile, T.; Baumeister, J. L.; Gainey, S.; Ralston, S. J.; Steiner, M.; Tu, V.

    2017-12-01

    Interactions of liquid water with rock, soil, or sediments can result in significant chemical and mineralogical changes with depth. These changes can include transformation from one phase to another as well as translocation, addition, and loss of material. The resulting chemical and mineralogical depth profiles can record characteristics of the interacting liquid water such as pH, temperature, duration, and abundance. We use a combined field, laboratory, and modeling approach to interpret the environmental conditions preserved in soils and rocks. We study depth profiles in terrestrial field environments; perform dissolution experiments of primary and secondary phases important in soil environments; and perform numerical modeling to quantitatively interpret weathering environments. In our field studies we have measured time-integrated basaltic mineral dissolution rates, and interpreted the impact of pH and temperature on weathering in basaltic and serpentine-containing rocks and soils. These results help us interpret fundamental processes occurring in soils on Earth and on Mars, and can also be used to inform numerical modeling and laboratory experiments. Our laboratory experiments provide fundamental kinetic data to interpret processes occurring in soils. We have measured dissolution rates of Mars-relevant phosphate minerals, clay minerals, and amorphous phases, as well as dissolution rates under specific Mars-relevant conditions such as in concentrated brines. Finally, reactive transport modeling allows a quantitative interpretation of the kinetic, thermodynamic, and transport processes occurring in soil environments. Such modeling allows the testing of conditions under longer time frames and under different conditions than might be possible under either terrestrial field or laboratory conditions. 
We have used modeling to examine the weathering of basalt, olivine, carbonate, phosphate, and clay minerals, and placed constraints on the duration, pH, and solution chemistry of past aqueous alteration occurring on Mars.
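The kinetic input to such reactive transport models is typically a rate law fit to laboratory dissolution data, for example the common far-from-equilibrium form r = k * a_H+^n. The sketch below uses this generic form; the constants are illustrative, not measured values from this work:

```python
def dissolution_rate(k, ph, n):
    """Far-from-equilibrium mineral dissolution rate of the form
    r = k * a_H+^n, with rate constant k (mol m^-2 s^-1), proton
    activity a_H+ computed from pH, and reaction order n. This is one
    common form used in reactive transport codes; real rate laws often
    add neutral and basic terms and a Gibbs energy (saturation) factor."""
    a_h = 10.0 ** (-ph)          # proton activity from pH
    return k * a_h ** n

# Lower pH (more acidic water) -> faster dissolution for n > 0.
assert dissolution_rate(1e-10, 4.0, 0.5) > dissolution_rate(1e-10, 6.0, 0.5)
```

    Integrating such rates over depth and time is what lets the modeling place quantitative constraints on the duration and pH of past weathering.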

  19. Model of environmental life cycle assessment for coal mining operations.

    PubMed

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to environmental assessment of coal mining operations, which enables assessment of the factors that are both directly and indirectly affecting the environment and are associated with the production of raw materials and energy used in processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model for coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from direct influence from processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessment for various unit processes, it can be used for all hard coal mines, not only in Poland but also in the world. This development is an important step forward in the study of the impacts of fossil fuels on the environment with the potential to mitigate the impact of the coal industry on the environment. Copyright © 2016 Elsevier B.V. All rights reserved.
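    The three time frames correspond to different global warming potential (GWP) factors for converting non-CO2 gases, here mine methane, into CO2 equivalents. A minimal sketch follows; the GWP values are rounded IPCC AR4 figures, and the factors actually used in the paper's model may differ:

```python
def ghg_co2e(emissions_kg, horizon=100):
    """CO2-equivalent (kg) of a GHG inventory over a chosen time horizon.
    GWP factors for methane are rounded IPCC AR4 values (illustrative):
    the shorter the horizon, the heavier methane weighs, which matters
    for methane-dominated sources such as coal mines."""
    gwp_ch4 = {20: 72.0, 100: 25.0, 500: 7.6}   # kg CO2e per kg CH4
    return (emissions_kg.get("co2", 0.0)
            + emissions_kg.get("ch4", 0.0) * gwp_ch4[horizon])
```

    Reporting all three horizons, as the model does, shows how strongly the ranking of mining impacts depends on how methane is weighted.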

  20. A Constraints-Led Perspective to Understanding Skill Acquisition and Game Play: A Basis for Integration of Motor Learning Theory and Physical Education Praxis?

    ERIC Educational Resources Information Center

    Renshaw, Ian; Chow, Jia Yi; Davids, Keith; Hammond, John

    2010-01-01

    Background: In order to design appropriate environments for performance and learning of movement skills, physical educators need a sound theoretical model of the learner and of processes of learning. In physical education, this type of modelling informs the organisation of learning environments and effective and efficient use of practice time. An…

  1. Intellectual, Psychosocial, and Moral Development in College: Four Major Theories. Revised.

    ERIC Educational Resources Information Center

    Kurfiss, Joanne

    Four models are discussed with which to view students, educational goals, and learning environments. Each of the four theories emphasizes a unique aspect of the total development process. Piaget's model describes the development of structures and processes which characterize mature logical thinking. Perry provides a closer look at students'…

  2. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Steve

    2011-01-01

    The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.

  3. A Test of Two Alternative Cognitive Processing Models: Learning Styles and Dual Coding

    ERIC Educational Resources Information Center

    Cuevas, Joshua; Dawson, Bryan L.

    2018-01-01

    This study tested two cognitive models, learning styles and dual coding, which make contradictory predictions about how learners process and retain visual and auditory information. Learning styles-based instructional practices are common in educational environments despite a questionable research base, while the use of dual coding is less…

  4. Natural and Induced Environment in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Kim, Myung-Hee Y.; Clowdsley, Martha S.; Heinbockel, John H.; Cucinotta, Francis A.; Badhwar, Gautam D.; Atwell, William; Huston, Stuart L.

    2002-01-01

    The long-term exposure of astronauts on the developing International Space Station (ISS) requires accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind, which varies over the solar cycle. The neutron environment within the Shuttle in low Earth orbit has two sources. A time-dependent model for the ambient environment is used to evaluate the natural and induced environment. The induced neutron environment is evaluated using measurements on STS-31 and STS-36 near the 1990 solar maximum.

  5. A neural model of hierarchical reinforcement learning.

    PubMed

    Rasmussen, Daniel; Voelker, Aaron; Eliasmith, Chris

    2017-01-01

    We develop a novel, biologically detailed neural model of reinforcement learning (RL) processes in the brain. This model incorporates a broad range of biological features that pose challenges to neural RL, such as temporally extended action sequences, continuous environments involving unknown time delays, and noisy/imprecise computations. Most significantly, we expand the model into the realm of hierarchical reinforcement learning (HRL), which divides the RL process into a hierarchy of actions at different levels of abstraction. Here we implement all the major components of HRL in a neural model that captures a variety of known anatomical and physiological properties of the brain. We demonstrate the performance of the model in a range of different environments, in order to emphasize the aim of understanding the brain's general reinforcement learning ability. These results show that the model compares well to previous modelling work and demonstrates improved performance as a result of its hierarchical ability. We also show that the model's behaviour is consistent with available data on human hierarchical RL, and generate several novel predictions.
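    The core idea of dividing the RL process into levels of abstraction can be illustrated, far more crudely than the paper's neural implementation, with tabular SMDP Q-learning over temporally extended "options" on a toy corridor; the states, subgoals and parameters below are all invented for illustration:

    ```python
    import random

    # Toy hierarchical RL sketch: SMDP Q-learning over "options" (subgoal-
    # seeking macro-actions) on a 1-D corridor; reward 1 on reaching state N.
    N, GAMMA, ALPHA, EPS = 10, 0.9, 0.5, 0.2
    OPTIONS = [5, N]  # each option walks step by step toward its subgoal

    def run_option(state, subgoal):
        """Execute primitive moves until the subgoal is reached.
        Returns (next_state, discounted_reward, elapsed_steps)."""
        k, reward = 0, 0.0
        while state != subgoal:
            state += 1 if subgoal > state else -1
            k += 1
            if state == N:
                reward += GAMMA ** (k - 1)  # terminal reward, discounted
        return state, reward, k

    random.seed(0)
    Q = {s: {o: 0.0 for o in OPTIONS} for s in range(N + 1)}
    for _ in range(200):  # episodes of SMDP Q-learning
        s = 0
        while s != N:
            avail = [o for o in OPTIONS if o != s]
            if random.random() < EPS:
                o = random.choice(avail)          # explore
            else:
                o = max(avail, key=lambda a: Q[s][a])  # exploit
            s2, r, k = run_option(s, o)
            best_next = 0.0 if s2 == N else max(Q[s2].values())
            # SMDP update: discount by the option's duration k
            Q[s][o] += ALPHA * (r + GAMMA ** k * best_next - Q[s][o])
            s = s2

    print({o: round(v, 3) for o, v in Q[0].items()})
    ```

    The high-level learner only ever reasons over subgoals; the primitive left/right steps are hidden inside `run_option`, which is the essential bookkeeping trick behind hierarchical decompositions of the RL process.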

  6. Negotiation Support System’s Impact on the Socio-Emotional Environment: A Research Design Framework

    DTIC Science & Technology

    1992-03-01

    conflict environment and develop some proposed effects that Negotiation Support Systems (NSS) have on the socio-emotional climate. This introduction of...assessment of current NSS structure, processes and capabilities. Section IV provides a theoretical discussion of conflict and the socio-emotional environment...model. First, strict economic rationalization does not take into account social/normative issues present in the negotiation environment. Thus, in an

  7. Models of Solar Wind Structures and Their Interaction with the Earth's Space Environment

    NASA Astrophysics Data System (ADS)

    Watermann, J.; Wintoft, P.; Sanahuja, B.; Saiz, E.; Poedts, S.; Palmroth, M.; Milillo, A.; Metallinou, F.-A.; Jacobs, C.; Ganushkina, N. Y.; Daglis, I. A.; Cid, C.; Cerrato, Y.; Balasis, G.; Aylward, A. D.; Aran, A.

    2009-11-01

    The discipline of “Space Weather” is built on the scientific foundation of solar-terrestrial physics but with a strong orientation toward applied research. Models describing the solar-terrestrial environment are therefore at the heart of this discipline, for both physical understanding of the processes involved and establishing predictive capabilities of the consequences of these processes. Depending on the requirements, purely physical models, semi-empirical or empirical models are considered to be the most appropriate. This review focuses on the interaction of solar wind disturbances with geospace. We cover interplanetary space, the Earth’s magnetosphere (with the exception of radiation belt physics), the ionosphere (with the exception of radio science), the neutral atmosphere and the ground (via electromagnetic induction fields). Space weather relevant state-of-the-art physical and semi-empirical models of the various regions are reviewed. They include models for interplanetary space, its quiet state and the evolution of recurrent and transient solar perturbations (corotating interaction regions, coronal mass ejections, their interplanetary remnants, and solar energetic particle fluxes). Models of coupled large-scale solar wind-magnetosphere-ionosphere processes (global magnetohydrodynamic descriptions) and of inner magnetosphere processes (ring current dynamics) are discussed. Achievements in modeling the coupling between magnetospheric processes and the neutral and ionized upper and middle atmospheres are described. Finally we mention efforts to compile comprehensive and flexible models from selections of existing modules applicable to particular regions and conditions in interplanetary space and geospace.

  8. Managing Algorithmic Skeleton Nesting Requirements in Realistic Image Processing Applications: The Case of the SKiPPER-II Parallel Programming Environment's Operating Model

    NASA Astrophysics Data System (ADS)

    Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel

    2005-12-01

    SKiPPER is a SKeleton-based Parallel Programming EnviRonment developed since 1996 at the LASMEA Laboratory, Blaise-Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities to the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is an appearance-based 3D face-tracking algorithm.

  9. Generation of large scale urban environments to support advanced sensor and seeker simulation

    NASA Astrophysics Data System (ADS)

    Giuliani, Joseph; Hershey, Daniel; McKeown, David, Jr.; Willis, Carla; Van, Tan

    2009-05-01

    One of the key aspects of the design of a next-generation weapon system is the need to operate in cluttered and complex urban environments. Simulation systems rely on accurate representation of these environments and require automated software tools to construct the underlying 3D geometry and associated spectral and material properties, which are then formatted for various objective seeker simulation systems. Under an Air Force Small Business Innovative Research (SBIR) contract, we have developed an automated process to generate 3D urban environments with user-defined properties. These environments can be composed from a wide variety of source materials, including vector source data, pre-existing 3D models, and digital elevation models, and rapidly organized into a geo-specific visual simulation database. This intermediate representation can be easily inspected in the visible spectrum for content and organization and interactively queried for accuracy. Once the database contains the required contents, it can be exported into specific synthetic scene generation runtime formats, preserving the relationship between geometry and material properties. To date, an exporter for the Irma simulation system, developed and maintained by AFRL/Eglin, has been created, and a second exporter, to the Real-Time Composite Hardbody and Missile Plume (CHAMP) simulation system, is being developed for real-time use. This process supports significantly more complex target environments than previous approaches to database generation. In this paper we describe the capabilities for content creation for advanced seeker processing algorithm simulation and sensor stimulation, including the overall database compilation process and sample databases produced and exported for the Irma runtime system. We also discuss the addition of object dynamics and viewer dynamics within the visual simulation in the Irma runtime environment.

  10. A Methodology and Software Environment for Testing Process Model’s Sequential Predictions with Protocols

    DTIC Science & Technology

    1992-12-21

    in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O’Reilly, R. C. (1991). X3DNet: An X-Based Neural Network ...2.2.3 Trace based protocol analysis 19 2.2.4 Summary of important data features 21 2.3 Tools related to process model testing 23 2.3.1 Tools for building...algorithm 57 3. Requirements for testing process models using trace based protocol 59 analysis 3.1 Definition of trace based protocol analysis (TBPA) 59

  11. TLS for generating multi-LOD of 3D building model

    NASA Astrophysics Data System (ADS)

    Akmalia, R.; Setan, H.; Majid, Z.; Suwardhi, D.; Chong, A.

    2014-02-01

    The popularity of Terrestrial Laser Scanners (TLS) for capturing three-dimensional (3D) objects has led to their wide use in various applications. Developments in 3D modelling have also led people to visualize the environment in 3D. Visualization of objects in a city environment in 3D can be useful for many applications. However, different applications require different kinds of 3D models. Since a building is an important object, CityGML has defined a standard for 3D building models at four different levels of detail (LOD). In this research, the advantages of TLS for capturing buildings and for modelling the resulting point cloud are explored. TLS will be used to capture all the building details to generate multiple LODs. In previous work, this task has usually involved the integration of several sensors. In this research, however, the point cloud from TLS will be processed to generate the LOD3 model. LOD2 and LOD1 will then be generalized from the resulting LOD3 model. The result of this research is a guided process for generating multi-LOD 3D building models from TLS data, starting from LOD3. Lastly, the visualization of the multi-LOD models is also shown.

  12. Hydrological models as web services: Experiences from the Environmental Virtual Observatory project

    NASA Astrophysics Data System (ADS)

    Buytaert, W.; Vitolo, C.; Reaney, S. M.; Beven, K.

    2012-12-01

    Data availability in environmental sciences is expanding at a rapid pace. From the constant stream of high-resolution satellite images to the local efforts of citizen scientists, there is an increasing need to process the growing stream of heterogeneous data and turn it into useful information for decision-making. Environmental models, ranging from simple rainfall-runoff relations to complex climate models, can be very useful tools to process data, identify patterns, and help predict the potential impact of management scenarios. Recent technological innovations in networking, computing and standardization may bring a new generation of interactive models plugged into virtual environments closer to the end-user. They are the driver of major funding initiatives such as the UK's Virtual Observatory program and the U.S. National Science Foundation's EarthCube. In this study we explore how hydrological models, an important subset of environmental models, have to be adapted in order to function within a broader environment of web services and user interactions. Historically, hydrological models have been developed for very different purposes. Typically they have a rigid model structure, requiring a very specific set of input data and parameters. As such, the process of implementing a model for a specific catchment requires careful collection and preparation of the input data, extensive calibration and subsequent validation. This procedure seems incompatible with a web environment, where data availability is highly variable, heterogeneous and constantly changing in time, and where the requirements of end-users may not necessarily align with the original intention of the model developer. We present prototypes of models that are web-enabled using the web standards of the Open Geospatial Consortium, and implemented in online decision-support systems.
We identify issues related to (1) optimal use of available data; (2) the need for flexible and adaptive structures; (3) quantification and communication of uncertainties. Lastly, we present some road maps to address these issues and discuss them in the broader context of web-based data processing and "big data" science.

  13. Entrepreneurship management in health services: an integrative model.

    PubMed

    Guo, Kristina L

    2006-01-01

    This research develops an integrated systems model of entrepreneurship management as a method for achieving health care organizational survival and growth. Specifically, it analyzes current health care environment challenges, identifies the roles of managers, discusses organizational theories relevant to the health care environment, outlines the role of entrepreneurship in health care, and describes the entrepreneurial manager in the entrepreneurial management process to produce desirable organizational outcomes. The study concludes that as the current health care environment continues to show intense competition, entrepreneurial managers are responsible for creating innovations, managing change, investing in resources, and recognizing opportunities in the environment to increase organizational viability.

  14. Scaling the Information Processing Demands of Occupations

    ERIC Educational Resources Information Center

    Haase, Richard F.; Jome, LaRae M.; Ferreira, Joaquim Armando; Santos, Eduardo J. R.; Connacher, Christopher C.; Sendrowitz, Kerrin

    2011-01-01

    The purpose of this study was to provide additional validity evidence for a model of person-environment fit based on polychronicity, stimulus load, and information processing capacities. In this line of research the confluence of polychronicity and information processing (e.g., the ability of individuals to process stimuli from the environment…

  15. Pavlovian control of cross-tolerance between pentobarbital and ethanol.

    PubMed

    Cappell, H; Roach, C; Poulos, C X

    1981-01-01

    Tolerance to several effects of a number of drugs has been shown to depend on Pavlovian conditioning processes. Experiment I extended the compensatory conditioning model (Siegel 1975) to tolerance to the hypothermic effect of pentobarbital (30 mg/kg). In Experiment I, rats that acquired hypothermic tolerance in one environment did not display tolerance when tested in an environment not previously associated with drug administration. In Experiment II, rats were made tolerant to the hypothermic effect of pentobarbital (30 mg/kg) and tested for cross-tolerance to ethanol (2.5 g/kg). Cross-tolerance was observed, but it was significantly reduced if the test was in an environment different from the one in which tolerance to pentobarbital was originally acquired. Thus, the compensatory conditioning model accounts for at least part of the tolerance and cross-tolerance to the thermic effects of alcohol and pentobarbital. The physiological processes in the CNS underlying tolerance and cross-tolerance for these drugs, therefore, are controlled by associative processes.

  16. Multifluid MHD Simulations of the Plasma Environment of Comet Churyumov-Gerasimenko at Different Heliocentric Distances

    NASA Astrophysics Data System (ADS)

    Huang, Z.; Jia, X.; Rubin, M.; Fougere, N.; Gombosi, T. I.; Tenishev, V.; Combi, M. R.; Bieler, A. M.; Toth, G.; Hansen, K. C.; Shou, Y.

    2014-12-01

    We study the plasma environment of the comet Churyumov-Gerasimenko, which is the target of the Rosetta mission, by performing large scale numerical simulations. Our model is based on BATS-R-US within the Space Weather Modeling Framework that solves the governing multifluid MHD equations, which describe the behavior of the cometary heavy ions, the solar wind protons, and electrons. The model includes various mass loading processes, including ionization, charge exchange, dissociative ion-electron recombination, as well as collisional interactions between different fluids. The neutral background used in our MHD simulations is provided by a kinetic Direct Simulation Monte Carlo (DSMC) model. We will simulate how the cometary plasma environment changes at different heliocentric distances.

  17. Modelling of Indoor Environments Using Lindenmayer Systems

    NASA Astrophysics Data System (ADS)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter, data gaps and errors introduced during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Reconstruction is therefore often supported by general rules for perpendicularity and parallelism, which are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules such as symmetry and the repetition of, e.g., room sizes and corridor widths. In the context of the reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems, which were originally developed for the computer-based modelling of plant growth, to model and reproduce the layout of indoor environments in 2D.
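    At its core, a Lindenmayer system is parallel string rewriting: an axiom plus production rules applied to every symbol on each iteration. A minimal sketch, with an invented two-symbol alphabet loosely standing in for a corridor that spawns rooms:

    ```python
    # Deterministic, context-free L-system (D0L) rewriter.
    def lsystem(axiom, rules, iterations):
        for _ in range(iterations):
            # Rewrite every symbol in parallel; symbols without a rule persist.
            axiom = "".join(rules.get(symbol, symbol) for symbol in axiom)
        return axiom

    # Invented alphabet: C = corridor segment, r = room;
    # each segment gains a room on each side per iteration.
    rules = {"C": "rCr"}
    print(lsystem("C", rules, 3))  # -> "rrrCrrr"
    ```

    Modelling real floor layouts, as in the paper, additionally requires a geometric interpretation of the symbols and parametric or stochastic rules, but the rewriting engine itself stays this small.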

  18. Effects of Kinetic Processes in Shaping Io's Global Plasma Environment: A 3D Hybrid Model

    NASA Technical Reports Server (NTRS)

    Lipatov, Alexander S.; Combi, Michael R.

    2004-01-01

    The global dynamics of the ionized and neutral components in the environment of Io plays an important role in the interaction of Jupiter's corotating magnetospheric plasma with Io. Stationary simulations of this problem have been carried out with MHD and electrodynamic approaches. One of the most significant results of the simplified two-fluid model simulations was the reproduction of the double-peak structure in the magnetic field signature of the Io flyby, which could not be explained by standard MHD models. In this paper, we develop a method of kinetic ion simulation. This method employs a fluid description for electrons and neutrals, whereas for ions multilevel drift-kinetic and particle approaches are used. We also take into account charge-exchange and photoionization processes. Our model provides a much more accurate description of ion dynamics and allows us to take into account the realistic anisotropic ion distribution, which cannot be captured in fluid simulations. The first results of such a simulation of ion dynamics in Io's environment are discussed in this paper.

  19. Investigating the Use of 3d Geovisualizations for Urban Design in Informal Settlement Upgrading in South Africa

    NASA Astrophysics Data System (ADS)

    Rautenbach, V.; Coetzee, S.; Çöltekin, A.

    2016-06-01

    Informal settlements are a common occurrence in South Africa, and to improve in-situ circumstances of communities living in informal settlements, upgrades and urban design processes are necessary. Spatial data and maps are essential throughout these processes to understand the current environment, plan new developments, and communicate the planned developments. All stakeholders need to understand maps to actively participate in the process. However, previous research demonstrated that map literacy was relatively low for many planning professionals in South Africa, which might hinder effective planning. Because 3D visualizations resemble the real environment more than traditional maps, many researchers posited that they would be easier to interpret. Thus, our goal is to investigate the effectiveness of 3D geovisualizations for urban design in informal settlement upgrading in South Africa. We consider all involved processes: 3D modelling, visualization design, and cognitive processes during map reading. We found that procedural modelling is a feasible alternative to time-consuming manual modelling, and can produce high quality models. When investigating the visualization design, the visual characteristics of 3D models and relevance of a subset of visual variables for urban design activities of informal settlement upgrades were qualitatively assessed. The results of three qualitative user experiments contributed to understanding the impact of various levels of complexity in 3D city models and map literacy of future geoinformatics and planning professionals when using 2D maps and 3D models. The research results can assist planners in designing suitable 3D models that can be used throughout all phases of the process.

  20. A THREE-DIMENSIONAL MODEL ASSESSMENT OF THE GLOBAL DISTRIBUTION OF HEXACHLOROBENZENE

    EPA Science Inventory

    The distributions of persistent organic pollutants (POPs) in the global environment have been studied typically with box/fugacity models with simplified treatments of atmospheric transport processes1. Such models are incapable of simulating the complex three-dimensional mechanis...

  1. Butterfly valve in a virtual environment

    NASA Astrophysics Data System (ADS)

    Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.

    2017-11-01

    Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of the components of a simple butterfly valve in a virtual environment. The assembly was carried out in virtual reality software by trial and error. The parts were modelled using parametric software (SolidWorks), meshed accordingly, and then imported into the virtual environment for assembly.

  2. Phenotypic switching of populations of cells in a stochastic environment

    NASA Astrophysics Data System (ADS)

    Hufton, Peter G.; Lin, Yen Ting; Galla, Tobias

    2018-02-01

    In biology, phenotypic switching is a common bet-hedging strategy in the face of uncertain environmental conditions. Existing mathematical models often focus on periodically changing environments to determine the optimal phenotypic response. We focus on the case in which the environment switches randomly between discrete states. Starting from an individual-based model we derive stochastic differential equations to describe the dynamics, and obtain analytical expressions for the mean instantaneous growth rates based on the theory of piecewise-deterministic Markov processes. We show that optimal phenotypic responses are non-trivial for slow and intermediate environmental processes, and we systematically compare the cases of periodic and random environments. The best response to random switching is more likely to be heterogeneity than in the case of deterministic periodic environments, and net growth rates tend to be higher under stochastic environmental dynamics. The combined system of environment and population of cells can be interpreted as a host-pathogen interaction, in which the host tries to choose environmental switching so as to minimise the growth of the pathogen, and in which the pathogen employs phenotypic switching optimised to increase its growth rate. We discuss the existence of Nash-like mutual best-response scenarios for such host-pathogen games.
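    The benefit of a heterogeneous (bet-hedging) response under random switching can be reproduced with a crude Monte-Carlo sketch, quite apart from the paper's analytical PDMP treatment; the two-state telegraph environment and all fitness values below are invented for illustration:

    ```python
    import math
    import random

    # Mean log-growth of a population split between phenotypes A and B while
    # the environment flips between states 0 and 1 as a telegraph process.
    RATES = {0: {"A": 1.2, "B": 0.8},  # per-step growth factors (illustrative)
             1: {"A": 0.7, "B": 1.1}}

    def log_growth(frac_a, switch_prob, steps=200_000):
        """Average log growth rate when a fraction frac_a expresses phenotype A."""
        env, total = 0, 0.0
        for _ in range(steps):
            if random.random() < switch_prob:
                env = 1 - env  # random environmental switch
            growth = frac_a * RATES[env]["A"] + (1 - frac_a) * RATES[env]["B"]
            total += math.log(growth)
        return total / steps

    random.seed(1)
    results = {p: log_growth(p, switch_prob=0.05) for p in (0.0, 0.5, 1.0)}
    print({p: round(g, 4) for p, g in results.items()})
    ```

    With these numbers the 50/50 mix outgrows either pure phenotype, the signature of bet-hedging; in the paper the optimal fraction is derived exactly from the stationary behaviour of the piecewise-deterministic Markov process rather than sampled.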

  3. Nonlinear and Dissipation Characteristics of Ocean Surface Waves in Estuarine Environments

    DTIC Science & Technology

    2014-09-30

    transformation and evolution. In addition, these modules would allow for feedback between the surface wave and the energy-dissipating feature. OBJECTIVES...dissipation on wave processes. 3) Develop and test low-dimension, reduced representations of estuarine effects for inclusion into operational wave models... Sheremet (PI), Miao Tian and Cihan Sahin (Ph.D. students), who are working on modeling nonlinear wave evolution in dissipative environments (mud), and

  4. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into the development of advanced technologies for Computer Aided Software Engineering (CASE).

  5. An Integrated Product Environment

    NASA Technical Reports Server (NTRS)

    Higgins, Chuck

    1997-01-01

    Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things: performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process, not the controller of it. Mechanical Advantage is a performance modeler allowing engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimizations. If you should desire an industry-standard solid model, we would produce an ACIS-based solid model. If you should desire an ANSI/ISO standard drawing, we would produce this as well with a virtual push of a button. For more information on this and other Advantage Series products, please contact the author.

  6. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of the components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  7. Creating an inclusive mall environment with the PRECEDE-PROCEED model: a living lab case study.

    PubMed

    Ahmed, Sara; Swaine, Bonnie; Milot, Marc; Gaudet, Caroline; Poldma, Tiiu; Bartlett, Gillian; Mazer, Barbara; Le Dorze, Guylaine; Barbic, Skye; Rodriguez, Ana Maria; Lefebvre, Hélène; Archambault, Philippe; Kairy, Dahlia; Fung, Joyce; Labbé, Delphine; Lamontagne, Anouk; Kehayia, Eva

    2017-10-01

    Although public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The Rehabilitation Living Lab in the Mall is a unique, interdisciplinary, and multi-sectoral research project with an aim to transform a shopping complex in Montreal, Canada, into an inclusive environment optimizing the participation and social inclusion of all people. The PRECEDE-PROCEED Model (PPM), a community-oriented and participatory planning model, was applied as a framework. The PPM comprises nine steps divided between planning, implementation, and evaluation. The PPM is well suited as a framework for the development of an inclusive mall. Its ecological approach considers the environment, as well as the social and individual factors relating to mall users' needs and expectations. Transforming a mall to be more inclusive is a complex process involving many stakeholders. The PPM allows the synthesis of several sources of information, as well as the identification and prioritization of key issues to address. The PPM also helps to frame and drive the implementation and evaluate the components of the project. This knowledge can help others interested in using the PPM to create similar enabling and inclusive environments world-wide. Implications for rehabilitation: While public environments provide opportunities for participation and social inclusion, they are not always inclusive spaces and may not accommodate the wide diversity of people. The PRECEDE-PROCEED Model (PPM) is well suited as a framework for the development, implementation, and evaluation of an inclusive mall. Environmental barriers can negatively impact the rehabilitation process by impeding the restoration and augmentation of function.
Removing barriers to social participation and independent living by improving inclusivity in the mall and other environments positively impacts the lives of people with disabilities.

  8. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  9. Modeling snow accumulation and ablation processes in forested environments

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Storck, Pascal; Lettenmaier, Dennis P.

    2009-05-01

    The effects of forest canopies on snow accumulation and ablation processes can be very important for the hydrology of midlatitude and high-latitude areas. A mass and energy balance model for snow accumulation and ablation processes in forested environments was developed utilizing extensive measurements of snow interception and release at a maritime mountainous site in Oregon. The model was evaluated using 2 years of weighing lysimeter data and was able to reproduce the snow water equivalent (SWE) evolution throughout the winters both beneath the canopy and in a nearby clearing, with correlations to observations ranging from 0.81 to 0.99. Additionally, the model was evaluated using measurements from a Boreal Ecosystem-Atmosphere Study (BOREAS) field site in Canada to test the robustness of the canopy snow interception algorithm in a much different climate. Simulated SWE was relatively close to the observations for the forested sites, with discrepancies evident in some cases. Although the model formulation appeared robust for both types of climates, sensitivity to parameters such as snow roughness length and maximum interception capacity indicated the magnitude of improvement in SWE simulation that might be achieved by calibration.

  10. Deposition Of Thin-Film Sensors On Glass-Fiber/Epoxy Models

    NASA Technical Reports Server (NTRS)

    Tran, Sang Q.

    1995-01-01

    Direct-deposition process devised for fabrication of thin-film sensors on three-dimensional, curved surfaces of models made of stainless steel covered with glass-fiber/epoxy-matrix composite material. Models used under cryogenic conditions, and sensors used to detect on-line transitions between laminar and turbulent flows in wind tunnel environments. Sensors fabricated by process used at temperatures from minus 300 degrees F to 175 degrees F.

  11. Adaptation, Learning, and the Art of War: A Cybernetic Perspective

    DTIC Science & Technology

    2014-05-14

    Drawing from the works of William Ross Ashby and contemporary cybernetic thought, the study modeled the adaptive systems as control loops and the processes of adaptive systems... as a Markov process. Using this model, the study concluded that systems would return to the same relative equilibrium point, expressed in terms of... uncertain and ever-changing environment.

  12. Optimization of MLS receivers for multipath environments

    NASA Technical Reports Server (NTRS)

    Mcalpine, G. A.; Highfill, J. H., III

    1976-01-01

    The design of a microwave landing system (MLS) aircraft receiver, capable of optimal performance in the multipath environments found in air terminal areas, is reported. Special attention was given to the angle tracking problem of the receiver, including tracking system design considerations, the study and application of locally optimum estimation involving multipath adaptive reception and envelope processing, and microcomputer system design. Results show that envelope processing is competitive performance-wise with i-f signal processing in this application and is simpler and cheaper. A summary of the signal model is given.

  13. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further testing in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
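
    The 'quantization' idea described above can be sketched in a few lines: the sender transmits a state update only when the continuously evolving value crosses a quantum boundary. This is a minimal illustrative sketch, not the DEVS/HLA implementation; the trajectory and quantum size are hypothetical.

```python
import math

class Quantizer:
    """Forward a continuously evolving state only at quantum level crossings."""

    def __init__(self, quantum):
        self.quantum = quantum
        self.last_level = None

    def update(self, value):
        """Return the quantized value if a boundary was crossed, else None."""
        level = int(value // self.quantum)
        if level != self.last_level:
            self.last_level = level
            return level * self.quantum
        return None

def transmit_count(trajectory, quantum):
    q = Quantizer(quantum)
    return sum(1 for v in trajectory if q.update(v) is not None)

# A slowly varying trajectory of 1000 samples triggers far fewer updates,
# which is the message-traffic reduction the abstract refers to.
traj = [math.sin(2 * math.pi * t / 1000) for t in range(1000)]
updates = transmit_count(traj, quantum=0.1)
```

    The same trajectory sampled 1000 times produces only a few dozen transmissions, one per crossed level.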

  14. Sonic environment of aircraft structure immersed in a supersonic jet flow stream

    NASA Technical Reports Server (NTRS)

    Guinn, W. A.; Balena, F. J.; Soovere, J.

    1976-01-01

    Test methods for determining the sonic environment of aircraft structure that is immersed in the flow stream of a high velocity jet or that is subjected to the noise field surrounding the jet, were investigated. Sonic environment test data measured on a SCAT 15-F model in the flow field of Mach 1.5 and 2.5 jets were processed. Narrow band, lateral cross correlation and noise contour plots are presented. Data acquisition and reduction methods are depicted. A computer program for scaling the model data is given that accounts for model size, jet velocity, transducer size, and jet density. Comparisons of scaled model data and full size aircraft data are made for the L-1011, S-3A, and a V/STOL lower surface blowing concept. Sonic environment predictions are made for an engine-over-the-wing SST configuration.

  15. Quantum-like model of brain's functioning: decision making from decoherence.

    PubMed

    Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Basieva, Irina; Khrennikov, Andrei

    2011-07-21

    We present a quantum-like model of decision making in games of the Prisoner's Dilemma type. In this model the brain processes information by using a representation of mental states in a complex Hilbert space. Driven by the master equation, the mental state of a player, say Alice, approaches an equilibrium point in the space of density matrices (representing mental states). This equilibrium state determines Alice's mixed (i.e., probabilistic) strategy. We use a master equation in which quantum physics describes the process of decoherence as the result of interaction with the environment. Thus our model is a model of thinking through decoherence of the initially pure mental state. Decoherence is induced by the interaction with memory and the external mental environment. We study (numerically) the dynamics of the quantum entropy of Alice's mental state in the process of decision making. We also consider the classical entropy corresponding to Alice's choices. We introduce a measure of Alice's diffidence as the difference between the classical and quantum entropies of Alice's mental state. We see that (at least in our model example) diffidence decreases (approaching zero) in the process of decision making. Finally, we discuss the problem of the neuronal realization of quantum-like dynamics in the brain, especially the roles played by the lateral prefrontal cortex and/or orbitofrontal cortex. Copyright © 2011 Elsevier Ltd. All rights reserved.
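
    A toy numerical illustration of the decoherence mechanism (not the authors' master equation; the initial state and dephasing rate are invented): under pure dephasing, the off-diagonal terms of a 2x2 density matrix decay, the von Neumann entropy of the initially pure state grows toward the Shannon entropy of the choice probabilities, and the gap between the two entropies — the "diffidence" above — shrinks toward zero.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Entropy -sum(p log2 p) over the eigenvalues of a density matrix."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Initially pure superposition of two choices with probabilities (p, 1-p).
p = 0.7
rho0 = np.array([[p, np.sqrt(p * (1 - p))],
                 [np.sqrt(p * (1 - p)), 1 - p]])

gamma = 0.5  # dephasing rate (arbitrary units, hypothetical)
entropies = []
for t in np.linspace(0, 10, 50):
    decay = np.exp(-gamma * t)       # off-diagonal (coherence) decay
    rho_t = rho0.copy()
    rho_t[0, 1] *= decay
    rho_t[1, 0] *= decay
    entropies.append(von_neumann_entropy(rho_t))

# Classical Shannon entropy of the final mixed (probabilistic) strategy.
classical = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
```

    The quantum entropy starts at zero (pure state) and converges to the classical entropy, so the diffidence (classical minus quantum) decreases toward zero, mirroring the behaviour reported in the abstract.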

  16. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    PubMed

    Fowler, Mike S; Ruokolainen, Lasse

    2013-01-01

    The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical feedback) models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample Skewness and Kurtosis and decreasing mean Kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations. 
We must let the characteristics of known natural environmental covariates (e.g., colour and distribution shape) guide us in our choice of how to best model the impact of coloured environmental variation on population dynamics.
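
    The AR(1) construction discussed above can be sketched as follows (parameter values are illustrative). The sqrt(1 - kappa**2) factor keeps the stationary variance independent of the colour parameter kappa, so red (kappa > 0), white (kappa = 0), and blue (kappa < 0) series are directly comparable.

```python
import math
import random

def ar1_series(n, kappa, sigma=1.0, seed=1):
    """Generate a variance-normalized AR(1) environmental series:
    x[t+1] = kappa * x[t] + sigma * sqrt(1 - kappa**2) * eps[t]."""
    rng = random.Random(seed)
    x = [rng.gauss(0, sigma)]
    scale = sigma * math.sqrt(1 - kappa ** 2)
    for _ in range(n - 1):
        x.append(kappa * x[-1] + scale * rng.gauss(0, 1))
    return x

red = ar1_series(10_000, kappa=0.7)

# Lag-1 autocorrelation of a long series should recover kappa.
mean = sum(red) / len(red)
num = sum((a - mean) * (b - mean) for a, b in zip(red, red[1:]))
den = sum((a - mean) ** 2 for a in red)
lag1 = num / den
```

    Controlling the realised distribution shape of such series (e.g., via spectral mimicry, as the study does) is what separates the effect of colour itself from the distributional artefacts described above.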

  17. Near-field environment/processes working group summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, W.M.

    1995-09-01

    This article summarizes a group discussion that took place at the Workshop on the Role of Natural Analogs in Geologic Disposal of High-Level Nuclear Waste in San Antonio, Texas, on July 22-25, 1991. The working group concentrated on the near-field environment of geologic repositories for high-level nuclear waste. The near-field environment may be affected by thermal perturbations from the waste and by disturbances caused by the introduction of exotic materials during construction of the repository. The group also discussed the application of modelling of performance-related processes.

  18. Updated Model of the Solar Energetic Proton Environment in Space

    NASA Astrophysics Data System (ADS)

    Jiggens, Piers; Heynderickx, Daniel; Sandberg, Ingmar; Truscott, Pete; Raukunen, Osku; Vainio, Rami

    2018-05-01

    The Solar Accumulated and Peak Proton and Heavy Ion Radiation Environment (SAPPHIRE) model provides environment specification outputs for all aspects of the Solar Energetic Particle (SEP) environment. The model is based upon a thoroughly cleaned and carefully processed data set. Herein the evolution of the solar proton model is discussed with comparisons to other models and data. This paper discusses the construction of the underlying data set, the modelling methodology, optimisation of fitted flux distributions and extrapolation of model outputs to cover a range of proton energies from 0.1 MeV to 1 GeV. The model provides outputs in terms of mission cumulative fluence, maximum event fluence and peak flux for both solar maximum and solar minimum periods. A new method for describing maximum event fluence and peak flux outputs in terms of 1-in-x-year SPEs is also described. SAPPHIRE proton model outputs are compared with previous models including CREME96, ESP-PSYCHIC and the JPL model. Low energy outputs are compared to SEP data from ACE/EPAM whilst high energy outputs are compared to a new model based on GLEs detected by Neutron Monitors (NMs).

  19. Use of the computational-informational web-GIS system for the development of climatology students' skills in modeling and understanding climate change

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Martynova, Yulia; Shulgina, Tamara

    2015-04-01

    The current situation with the training of specialists in environmental sciences is complicated by the fact that the scientific field itself is experiencing a period of rapid development. Global change has driven the development of measurement techniques and modeling of environmental characteristics, accompanied by the expansion of the conceptual and mathematical apparatus. Understanding and forecasting processes in the Earth system requires extensive use of mathematical modeling and advanced computing technologies. As a rule, available training programs in the environmental sciences disciplines cannot adapt quickly enough to such rapid changes in the domain content. As a result, graduates do not understand the processes and mechanisms of global change and have only superficial knowledge of mathematical modeling of processes in the environment. They lack the required skills in numerical modeling, data processing, and analysis of observations and computation outputs, and are not prepared to work with meteorological data. For adequate training of future specialists in environmental sciences, we propose the following approach, which reflects the new "research" paradigm in education. We believe that such specialists should be trained not in an artificial learning environment, but on the basis of actual operating information-computational systems used in environmental studies, in the so-called virtual research environment, via the development of virtual research and learning laboratories. The report discusses the results of using the computational-informational web-GIS system "Climate" (http://climate.scert.ru/) as a prototype of such a laboratory. The approach is being implemented at Tomsk State University to prepare bachelors in meteorology. A student survey shows that their knowledge became deeper and more systematic after training in the virtual learning laboratory. The scientific team plans to assist any educators who wish to use the system in earth science education. This work is partially supported by SB RAS project VIII.80.2.1 and RFBR grants 13-05-12034 and 14-05-00502.

  20. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1987-01-01

    The results of ongoing research directed at developing a graph-theoretical model for describing the data and control flow associated with the execution of large-grained algorithms in a spatially distributed computer environment are presented. This model is identified by the acronym ATAMM (Algorithm/Architecture Mapping Model). The purpose of such a model is to provide a basis for establishing rules that relate an algorithm to its execution in a multiprocessor environment. Specifications derived from the model lead directly to the description of a data flow architecture which is a consequence of the inherent behavior of the data and control flow described by the model. The purpose of the ATAMM-based architecture is to optimize computational concurrency in the multiprocessor environment and to provide an analytical basis for performance evaluation. The ATAMM model and architecture specifications are demonstrated on a prototype system for concept validation.
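
    A hypothetical toy sketch of the data-driven execution idea (the graph and node names are invented, not taken from ATAMM): a large-grained node fires as soon as tokens are present on all of its input edges, so execution order — and hence concurrency — follows from data availability rather than an imposed schedule.

```python
from collections import deque

# A small example dataflow DAG: producer -> consumers.
graph = {
    "read": ["filter", "fft"],
    "filter": ["combine"],
    "fft": ["combine"],
    "combine": [],
}

# Count unsatisfied input edges per node.
indegree = {n: 0 for n in graph}
for src, dsts in graph.items():
    for d in dsts:
        indegree[d] += 1

# Simulate one wave of input tokens: a node becomes ready (fires) as soon
# as all of its inputs have arrived. "filter" and "fft" become ready
# together after "read", i.e., they could run concurrently.
ready = deque(n for n, d in indegree.items() if d == 0)
firing_order = []
while ready:
    node = ready.popleft()
    firing_order.append(node)
    for d in graph[node]:
        indegree[d] -= 1
        if indegree[d] == 0:
            ready.append(d)
```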

  1. Interactions of social, terrestrial, and marine sub-systems in the Galapagos Islands, Ecuador.

    PubMed

    Walsh, Stephen J; Mena, Carlos F

    2016-12-20

    Galapagos is often cited as an example of the conflicts that are emerging between resource conservation and economic development in island ecosystems, as the pressures associated with tourism threaten nature, including the iconic and emblematic species, unique terrestrial landscapes, and special marine environments. In this paper, two projects are described that rely upon dynamic systems models and agent-based models to examine human-environment interactions. We use a theoretical context rooted in complexity theory to guide the development of our models that are linked to social-ecological dynamics. The goal of this paper is to describe key elements, relationships, and processes to inform and enhance our understanding of human-environment interactions in the Galapagos Islands of Ecuador. By formalizing our knowledge of how systems operate and the manner in which key elements are linked in coupled human-natural systems, we specify rules, relationships, and rates of exchange between social and ecological features derived through statistical functions and/or functions specified in theory or practice. The processes described in our models also have practical applications in that they emphasize how political policies generate different human responses and model outcomes, many detrimental to the social-ecological sustainability of the Galapagos Islands.

  2. Putting mechanisms into crop production models.

    PubMed

    Boote, Kenneth J; Jones, James W; White, Jeffrey W; Asseng, Senthold; Lizaso, Jon I

    2013-09-01

    Crop growth models dynamically simulate processes of C, N and water balance on daily or hourly time-steps to predict crop growth and development and, at season-end, final yield. Their ability to integrate effects of genetics, environment and crop management has led to applications ranging from understanding gene function to predicting potential impacts of climate change. The history of crop models is reviewed briefly, and their level of mechanistic detail for assimilation and respiration, ranging from hourly leaf-to-canopy assimilation to daily radiation-use efficiency, is discussed. Crop models have improved steadily over the past 30-40 years, but much work remains. Improvements are needed for the prediction of transpiration response to elevated CO₂ and high temperature effects on phenology and reproductive fertility, and simulation of root growth and nutrient uptake under stressful edaphic conditions. Mechanistic improvements are needed to better connect crop growth to genetics and to soil fertility, soil waterlogging and pest damage. Because crop models integrate multiple processes and consider impacts of environment and management, they have excellent potential for linking research from genomics and allied disciplines to crop responses at the field scale, thus providing a valuable tool for deciphering genotype by environment by management effects. © 2013 John Wiley & Sons Ltd.

  3. Interactions of social, terrestrial, and marine sub-systems in the Galapagos Islands, Ecuador

    PubMed Central

    Walsh, Stephen J.; Mena, Carlos F.

    2016-01-01

    Galapagos is often cited as an example of the conflicts that are emerging between resource conservation and economic development in island ecosystems, as the pressures associated with tourism threaten nature, including the iconic and emblematic species, unique terrestrial landscapes, and special marine environments. In this paper, two projects are described that rely upon dynamic systems models and agent-based models to examine human–environment interactions. We use a theoretical context rooted in complexity theory to guide the development of our models that are linked to social–ecological dynamics. The goal of this paper is to describe key elements, relationships, and processes to inform and enhance our understanding of human–environment interactions in the Galapagos Islands of Ecuador. By formalizing our knowledge of how systems operate and the manner in which key elements are linked in coupled human–natural systems, we specify rules, relationships, and rates of exchange between social and ecological features derived through statistical functions and/or functions specified in theory or practice. The processes described in our models also have practical applications in that they emphasize how political policies generate different human responses and model outcomes, many detrimental to the social–ecological sustainability of the Galapagos Islands. PMID:27791072

  4. Robots with language.

    PubMed

    Parisi, Domenico

    2010-01-01

    Trying to understand human language by constructing robots that have language necessarily implies an embodied view of language, where the meaning of linguistic expressions is derived from the physical interactions of the organism with the environment. The paper describes a neural model of language according to which the robot's behaviour is controlled by a neural network composed of two sub-networks, one dedicated to the non-linguistic interactions of the robot with the environment and the other one to processing linguistic input and producing linguistic output. We present the results of a number of simulations using the model and we suggest how the model can be used to account for various language-related phenomena such as disambiguation, the metaphorical use of words, the pervasive idiomaticity of multi-word expressions, and mental life as talking to oneself. The model implies a view of the meaning of words and multi-word expressions as a temporal process that takes place in the entire brain and has no clearly defined boundaries. The model can also be extended to emotional words if we assume that an embodied view of language includes not only the interactions of the robot's brain with the external environment but also the interactions of the brain with what is inside the body.

  5. Pathogen transfer through environment-host contact: an agent-based queueing theoretic framework.

    PubMed

    Chen, Shi; Lenhart, Suzanne; Day, Judy D; Lee, Chihoon; Dulin, Michael; Lanzas, Cristina

    2017-11-02

    Queueing theory studies the properties of waiting queues and has been applied to investigate direct host-to-host transmitted disease dynamics, but its potential in modelling environmentally transmitted pathogens has not been fully explored. In this study, we provide a flexible and customizable queueing theory modelling framework with three major subroutines to study the in-hospital contact processes between environments and hosts and potential nosocomial pathogen transfer, where environments are servers and hosts are customers. Two types of servers with different parameters but the same utilization are investigated. We consider various forms of transfer functions that map contact duration to the amount of pathogen transfer based on existing literature. We propose a case study of simulated in-hospital contact processes and apply stochastic queues to analyse the amount of pathogen transfer under different transfer functions, and assume that pathogen amount decreases during the inter-arrival time. Different host behaviour (feedback and non-feedback) as well as initial pathogen distribution (whether in environment and/or in hosts) are also considered and simulated. We assess pathogen transfer and circulation under these various conditions and highlight the importance of the nonlinear interactions among contact processes, transfer functions and pathogen demography during the contact process. Our modelling framework can be readily extended to more complicated queueing networks to simulate more realistic situations by adjusting parameters such as the number and type of servers and customers, and adding extra subroutines. © The authors 2017. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
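
    The server/customer framing above can be sketched as follows; all parameter values and the saturating-exponential transfer function are hypothetical choices for illustration, not those used in the study. Each host "customer" arrives after an exponential inter-arrival time (during which the environmental pathogen load decays), has an exponentially distributed contact duration with the environment "server", and picks up an amount of pathogen determined by the transfer function of that duration.

```python
import math
import random

def simulate_contacts(n_hosts=1000, arrival_rate=1.0, service_rate=2.0,
                      transfer_rate=0.3, decay_rate=0.1, env_load0=100.0,
                      seed=42):
    """Simulate pathogen transfer over a sequence of host-environment contacts."""
    rng = random.Random(seed)
    env_load = env_load0
    transferred = []
    for _ in range(n_hosts):
        inter_arrival = rng.expovariate(arrival_rate)
        env_load *= math.exp(-decay_rate * inter_arrival)  # decay between arrivals
        duration = rng.expovariate(service_rate)           # contact (service) time
        # Saturating transfer function: longer contact moves more pathogen,
        # approaching complete transfer for very long contacts.
        frac = 1.0 - math.exp(-transfer_rate * duration)
        amount = env_load * frac
        env_load -= amount
        transferred.append(amount)
    return transferred, env_load

amounts, remaining = simulate_contacts()
```

    Swapping in other transfer functions (e.g., linear in duration, or thresholded) at the marked line is how the framework's nonlinear interaction between contact process and transfer function could be explored.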

  6. A stochastic model for transmission, extinction and outbreak of Escherichia coli O157:H7 in cattle as affected by ambient temperature and cleaning practices.

    PubMed

    Wang, Xueying; Gautam, Raju; Pinedo, Pablo J; Allen, Linda J S; Ivanek, Renata

    2014-08-01

    Many infectious agents transmitting through a contaminated environment are able to persist in the environment, depending on the temperature- and sanitation-determined rates of their replication and clearance, respectively. There is a need to elucidate the effect of these factors on the infection transmission dynamics in terms of infection outbreaks and extinction, while accounting for the random nature of the process. It is also important to distinguish between true and apparent extinction, where the former means pathogen extinction in both the host and the environment while the latter means extinction only in the host population. This study proposes a stochastic differential equation model as an approximation to a Markov jump process model, using Escherichia coli O157:H7 in cattle as a model system. In the model, the host population infection dynamics are described using the standard susceptible-infected-susceptible framework, and the E. coli O157:H7 population in the environment is represented by an additional variable. The backward Kolmogorov equations that determine the probability distribution and the expectation of the first passage time are provided in a general setting. The outbreak and apparent extinction of infection are investigated by numerically solving the Kolmogorov equations for the probability density function of the associated process and the expectation of the associated stopping time. The results provide insight into E. coli O157:H7 transmission and apparent extinction, and suggest ways for controlling the spread of infection in a cattle herd. Specifically, this study highlights the importance of ambient temperature and sanitation, especially during summer.
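
    An illustrative Gillespie-style simulation of the kind of Markov jump process the stochastic differential equations approximate (all rates are hypothetical, not the study's fitted values): the host dynamics are susceptible-infected-susceptible, the free-living environmental pathogen pool P is a separate variable, and apparent extinction corresponds to I reaching 0 while P > 0, true extinction to both reaching 0.

```python
import random

def gillespie_sis_env(N=100, I0=5, P0=50, beta=0.002, recovery=0.1,
                      shed=1.0, clear=0.05, t_end=200.0, seed=3):
    """SIS host dynamics coupled to an environmental pathogen pool."""
    rng = random.Random(seed)
    I, P, t = I0, P0, 0.0
    while t < t_end:
        S = N - I
        rates = [
            beta * S * P,   # environment-to-host transmission: S -> I
            recovery * I,   # recovery: I -> S
            shed * I,       # shedding into environment: P -> P + 1
            clear * P,      # clearance/cleaning: P -> P - 1
        ]
        total = sum(rates)
        if total == 0:      # true extinction: no infected hosts AND no pathogen
            break
        t += rng.expovariate(total)
        r = rng.uniform(0, total)
        if r < rates[0]:
            I += 1
        elif r < rates[0] + rates[1]:
            I -= 1
        elif r < rates[0] + rates[1] + rates[2]:
            P += 1
        else:
            P -= 1
    return I, P, t

I_end, P_end, t_reached = gillespie_sis_env()
```

    Note that I == 0 with P > 0 (apparent extinction) is not absorbing here: the environmental pool can reseed infection, which is exactly why the distinction between apparent and true extinction matters. Higher `clear` (better sanitation) and lower `shed` (cooler temperatures) push the process toward true extinction.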

  7. Cognitive Components Underpinning the Development of Model-Based Learning

    PubMed Central

    Potter, Tracey C.S.; Bryce, Nessa V.; Hartley, Catherine A.

    2016-01-01

    Reinforcement learning theory distinguishes “model-free” learning, which fosters reflexive repetition of previously rewarded actions, from “model-based” learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9–25, we examined whether the abilities to infer sequential regularities in the environment (“statistical learning”), maintain information in an active state (“working memory”) and integrate distant concepts to solve problems (“fluid reasoning”) predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. PMID:27825732

  8. Cognitive components underpinning the development of model-based learning.

    PubMed

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. The environmental zero-point problem in evolutionary reaction norm modeling.

    PubMed

    Ergon, Rolf

    2018-04-01

    There is a potential problem in present quantitative-genetics evolutionary modeling based on reaction norms. Such models are state-space models, where the multivariate breeder's equation in some form is used as the state equation that propagates the population state forward in time. These models use the implicit assumption of a constant reference environment, in many cases set to zero. This zero-point is often the environment a population is adapted to, that is, where the expected geometric mean fitness is maximized. Such environmental reference values follow from the state of the population system, and they are thus population properties. The environment the population is adapted to is, in other words, an internal population property, independent of the external environment. It is only when the external environment coincides with the internal reference environment, or vice versa, that the population is adapted to the current environment. This is formally a result of state-space modeling theory, which is an important theoretical basis for evolutionary modeling. The potential zero-point problem is present in all types of reaction norm models, parametrized as well as function-valued, and the problem does not disappear when the reference environment is set to zero. As the environmental reference values are population characteristics, they ought to be modeled as such. Whether such characteristics are evolvable is an open question, but considering the complexity of evolutionary processes, such evolvability cannot be excluded without good arguments. As a straightforward solution, I propose to model the reference values as evolvable mean traits in their own right, in addition to other reaction norm traits. However, solutions based on an evolvable G matrix are also possible.
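
    The state-space recursion and the zero-point issue can be sketched numerically (the G matrix, selection gradient, and trait values are invented for illustration): the multivariate breeder's equation, Δz̄ = Gβ, advances the mean intercept and slope of a linear reaction norm z = a + b(x - x_ref), and re-expressing the same norm against a shifted reference environment changes the intercept without changing any predicted phenotype.

```python
import numpy as np

G = np.array([[0.5, 0.1],      # additive genetic (co)variance of (a, b), hypothetical
              [0.1, 0.2]])
beta = np.array([0.3, -0.1])   # selection gradient on (a, b), hypothetical

mean_traits = np.array([1.0, 0.5])         # current mean intercept a and slope b
mean_traits_next = mean_traits + G @ beta  # one generation: delta z-bar = G beta

def phenotype(traits, x, x_ref):
    """Linear reaction norm evaluated in environment x, referenced to x_ref."""
    a, b = traits
    return a + b * (x - x_ref)

# The same reaction norm expressed against two different reference
# environments predicts the same phenotype in any given environment x:
# shifting x_ref by d requires a' = a + b*d.
z1 = phenotype(mean_traits, x=2.0, x_ref=0.0)
traits_shifted = np.array([mean_traits[0] + mean_traits[1] * 1.0,
                           mean_traits[1]])
z2 = phenotype(traits_shifted, x=2.0, x_ref=1.0)
```

    The re-parametrization invariance is the crux of the zero-point problem: since the intercept's meaning depends on x_ref, treating x_ref as a fixed constant rather than a (possibly evolvable) population property silently builds an assumption into the state equation.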

  10. Investigating evaporation of melting ice particles within a bin melting layer model

    NASA Astrophysics Data System (ADS)

    Neumann, Andrea J.

    Single column models have been used to help develop algorithms for remote sensing retrievals. Assumptions in the single-column models may affect the assumptions of the remote sensing retrievals. Studies of the melting layer that use single column models often assume environments that are near or at water saturation. This study investigates the effects of evaporation upon melting particles to determine whether the assumption of negligible mass loss still holds within subsaturated melting layers. A single column, melting layer model is modified to include the effects of sublimation and evaporation upon the particles. Other changes to the model include switching the order in which the model loops over particle sizes and model layers; including a particle sedimentation scheme; adding aggregation, accretion, and collision and coalescence processes; allowing environmental variables such as the water vapor diffusivity and the Schmidt number to vary with changes in the environment; adding explicitly calculated particle temperature; changing the particle terminal velocity parameterization; and using a newly derived effective density-dimensional relationship for particle mass calculations. Simulations of idealized melting layer environments show that significant mass loss due to evaporation during melting is possible within subsaturated environments. Short melting distances, accelerating particle fall speeds, and short melting times help constrain the amount of mass lost due to evaporation while melting is occurring, even in subsaturated profiles. Sublimation prior to melting can also be a significant source of mass loss. The trends shown on the particle scale also appear in the bulk distribution parameters such as rainfall rate and ice water content. Simulations incorporating observed melting layer environments show that significant mass loss due to evaporation during the melting process is possible under certain environmental conditions.
A profile such as the first melting layer profile on 10 May 2011 from the Midlatitude Continental Convective Clouds Experiment (MC3E), which is neither too saturated nor too subsaturated, shows considerable mass loss across all particle sizes. Most melting layer profiles sampled during MC3E were too saturated for any but the dozen or two smallest particle sizes to experience significant mass loss. The aggregation, accretion, and collision and coalescence processes also countered significant mass loss at the largest particle sizes because these particles are efficient at collecting smaller particles due to their relatively large sweep-out area. From these results, it appears that the assumption of negligible mass loss due to evaporation while melting is occurring is not always valid. Studies that use large, low-density snowflakes and high-RH environments can safely use the assumption of negligible mass loss. Studies that use small ice particles or low-RH environments (RH less than about 80%) cannot use the assumption of negligible mass loss due to evaporation. Retrieval algorithms may therefore be overestimating surface precipitation rates and intensities in subsaturated environments due to the assumptions of negligible mass loss while melting and of near-saturated melting layer environments.
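The evaporative mass loss discussed in the abstract is governed by ventilated vapor diffusion; a standard textbook form (generic symbols, not necessarily the notation of this study) is

```latex
\frac{dm}{dt} = 4\pi\, C\, D_v\, F_v \left( \rho_{v,\infty} - \rho_{v,\mathrm{sfc}} \right),
\qquad F_v \approx a + b\,\mathrm{Sc}^{1/3}\,\mathrm{Re}^{1/2}
```

where C is the particle capacitance (shape factor), D_v the water vapor diffusivity, F_v the ventilation coefficient (a function of the Schmidt number Sc and Reynolds number Re), and the parenthesized term the ambient-minus-surface vapor density difference. In subsaturated air this difference is negative, so the melting particle loses mass.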

  11. Creating Resilient Children and Empowering Families Using a Multifamily Group Process.

    ERIC Educational Resources Information Center

    Sayger, Thomas V.

    1996-01-01

    Presents a model for prevention and early intervention using a multifamily group counseling process to increase the resiliency of children and to empower families living with multiple stressors in high-risk environments. (Author)

  12. Impact of Process Protocol Design on Virtual Team Effectiveness

    ERIC Educational Resources Information Center

    Cordes, Christofer Sean

    2013-01-01

    This dissertation examined the influence of action process dimensions on team decision performance, and attitudes toward team work environment and procedures given different degrees of collaborative technology affordance. Process models were used to provide context for understanding team behavior in the experimental task, and clarify understanding…

  13. An approach to knowledge engineering to support knowledge-based simulation of payload ground processing at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mcmanus, Shawn; Mcdaniel, Michael

    1989-01-01

    Planning for processing payloads has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object-oriented knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. With the baseline system nearly complete, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.

  14. Commentary on the shifting processes model: a conceptual model for weight management.

    PubMed

    Pagoto, Sherry; Rodrigues, Stephanie

    2013-12-01

    Macchi and colleagues propose a theoretical model that merges concepts from the biopsychosocial model and family systems theory to produce a broader framework for understanding weight loss and maintenance (see record 2013-28564-001). The Shifting Processes Model views individual weight loss and maintenance in the context of family dynamics, including family eating and exercise habits, home environment, and family relationships. The authors reason that traditional models put the burden of change on the individual rather than the family system, when the latter is an important context of individual behavior.

  15. Dual-Schemata Model

    NASA Astrophysics Data System (ADS)

    Taniguchi, Tadahiro; Sawaragi, Tetsuo

    In this paper, a new machine-learning method, called the Dual-Schemata model, is presented. The Dual-Schemata model is a self-organizing machine-learning method for an autonomous robot interacting with an unknown dynamical environment. It is based on Piaget's schema model, a classical psychological model that explains memory and cognitive development in human beings. Our Dual-Schemata model is developed as a computational model of Piaget's schema model, focusing especially on the sensorimotor developmental period. This developmental process is characterized by two mutually interacting dynamics: one formed by assimilation and accommodation, and the other formed by equilibration and differentiation. Through these dynamics, the schema system enables an agent to act well in the real world. The schema differentiation process corresponds to a symbol formation process occurring within an autonomous agent as it interacts with an unknown, dynamically changing environment. Experimental results obtained from an autonomous facial robot in which our model is embedded are presented; the robot becomes able to chase a ball moving in various ways without any rewards or teaching signals from outside. Moreover, the emergence of concepts about the target movements within the robot is shown and discussed in terms of fuzzy logic on set-subset inclusion relationships.

  16. Predicting Student Performance in a Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Olsen, Jennifer K.; Aleven, Vincent; Rummel, Nikol

    2015-01-01

    Student models for adaptive systems may not model collaborative learning optimally. Past research has either focused on modeling individual learning or for collaboration, has focused on group dynamics or group processes without predicting learning. In the current paper, we adjust the Additive Factors Model (AFM), a standard logistic regression…
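The Additive Factors Model named in the abstract is a logistic regression over student proficiency, knowledge-component (KC) difficulty, and practice opportunities. A minimal sketch, with all parameter values invented for illustration:

```python
import math

def afm_probability(theta, q, beta, gamma, opportunities):
    """Additive Factors Model (AFM): logistic regression giving the
    probability a student answers a step correctly, from student
    proficiency (theta), a Q-matrix row (q), knowledge-component (KC)
    difficulties (beta), KC learning rates (gamma), and prior practice
    opportunities per KC. All parameter values used here are invented."""
    logit = theta + sum(
        q[k] * (beta[k] + gamma[k] * opportunities[k])
        for k in range(len(q))
    )
    return 1.0 / (1.0 + math.exp(-logit))

# One student, two KCs; the current step exercises KC 0 only.
p = afm_probability(
    theta=0.2,               # student proficiency
    q=[1, 0],                # Q-matrix row for this step
    beta=[-0.5, 0.1],        # KC difficulties
    gamma=[0.15, 0.08],      # KC learning rates
    opportunities=[3, 0],    # prior practice counts per KC
)
print(round(p, 3))
```

Extending AFM to collaborative settings, as the paper does, would add terms for partner or group practice; the sketch above is only the individual baseline.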

  17. Improving models for describing phosphorus cycling in agricultural soils

    USDA-ARS?s Scientific Manuscript database

    The mobility of phosphorus in the environment is controlled to a large extent by its sorption to soil. Therefore, an important component of all P loss models is how the model describes the biogeochemical processes governing P sorption and desorption to soils. The most common approach to modeling P c...

  18. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
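The regression-equation approach can be sketched as a power law in rainfall intensity and runoff rate, calibrated by least squares in log space. Everything below (functional form, synthetic event data, coefficients) is illustrative, not taken from the study:

```python
import numpy as np

# Synthetic storm events: rainfall intensity I (mm/h) and runoff rate Q (L/s).
# Observed event loads L (kg) are generated here from an assumed power law,
# purely for illustration; the study calibrates against real monitoring data.
I = np.array([5.0, 12.0, 20.0, 35.0, 50.0])
Q = np.array([30.0, 10.0, 60.0, 25.0, 90.0])
L = 0.5 * I**0.6 * Q**0.9

# Calibrate log L = log a + b log I + c log Q by ordinary least squares.
X = np.column_stack([np.ones_like(I), np.log(I), np.log(Q)])
(log_a, b, c), *_ = np.linalg.lstsq(X, np.log(L), rcond=None)

def predict_load(i, q):
    """Event pollutant load predicted by the calibrated regression."""
    return np.exp(log_a) * i**b * q**c

print(b, c)  # on this noiseless data, recovers the assumed exponents 0.6 and 0.9
print(predict_load(25.0, 40.0))
```

With real monitoring data the fit would carry residual error, but the calibration step stays the same, which is why the regression route needs far less data than a process-based model.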

  19. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ...technologies. This tool is also used to perform “high level” factory process simulation prior to full CAD model development and help define feasible

  20. Virtual Control Systems Environment (VCSE)

    ScienceCinema

    Atkins, Will

    2018-02-14

    Will Atkins, a Sandia National Laboratories computer engineer discusses cybersecurity research work for process control systems. Will explains his work on the Virtual Control Systems Environment project to develop a modeling and simulation framework of the U.S. electric grid in order to study and mitigate possible cyberattacks on infrastructure.

  1. Information Seeking in a Virtual Learning Environment.

    ERIC Educational Resources Information Center

    Byron, Suzanne M.; Young, Jon I.

    2000-01-01

    Examines the applicability of Kuhlthau's Information Search Process Model in the context of a virtual learning environment at the University of North Texas that used virtual collaborative software. Highlights include cognitive and affective aspects of information seeking; computer experience and confidence; and implications for future research.…

  2. Silicon-Carbide Power MOSFET Performance in High Efficiency Boost Power Processing Unit for Extreme Environments

    NASA Technical Reports Server (NTRS)

    Ikpe, Stanley A.; Lauenstein, Jean-Marie; Carr, Gregory A.; Hunter, Don; Ludwig, Lawrence L.; Wood, William; Del Castillo, Linda Y.; Fitzpatrick, Fred; Chen, Yuan

    2016-01-01

    Silicon-Carbide device technology has generated much interest in recent years. With superior thermal performance, power ratings, and potential switching frequencies over its Silicon counterpart, Silicon-Carbide offers a greater possibility for high-powered switching applications in extreme environments. In particular, the maturing process technology of Silicon-Carbide Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) has produced a plethora of commercially available, power-dense, low on-state resistance devices capable of switching at high frequencies. A novel hard-switched power processing unit (PPU) is implemented utilizing Silicon-Carbide power devices. Accelerated life data is captured and assessed in conjunction with a damage accumulation model of gate oxide and drain-source junction lifetime to evaluate potential system performance in high-temperature environments.

  3. Functional Fault Modeling of a Cryogenic System for Real-Time Fault Detection and Isolation

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Lewis, Mark; Perotti, Jose; Oostdyk, Rebecca; Brown, Barbara

    2010-01-01

    The purpose of this paper is to present the model development process used to create a Functional Fault Model (FFM) of a liquid hydrogen (LH2) system that will be used for real-time fault isolation in a Fault Detection, Isolation and Recovery (FDIR) system. The paper explains the steps in the model development process and the data products required at each step, including examples of how the steps were performed for the LH2 system. It also shows the relationship between the FDIR requirements and steps in the model development process. The paper concludes with a description of a demonstration of the LH2 model developed using the process and future steps for integrating the model in a live operational environment.

  4. A simple hyperbolic model for communication in parallel processing environments

    NASA Technical Reports Server (NTRS)

    Stoica, Ion; Sultan, Florin; Keyes, David

    1994-01-01

    We introduce a model for communication costs in parallel processing environments called the 'hyperbolic model,' which generalizes two-parameter dedicated-link models in an analytically simple way. Dedicated interprocessor links parameterized by a latency and a transfer rate that are independent of load are assumed by many existing communication models; such models are unrealistic for workstation networks. The communication system is modeled as a directed communication graph in which terminal nodes represent the application processes that initiate the sending and receiving of the information and in which internal nodes, called communication blocks (CBs), reflect the layered structure of the underlying communication architecture. The direction of graph edges specifies the flow of the information carried through messages. Each CB is characterized by a two-parameter hyperbolic function of the message size that represents the service time needed for processing the message. The parameters are evaluated in the limits of very large and very small messages. Rules are given for reducing a communication graph consisting of many to an equivalent two-parameter form, while maintaining an approximation for the service time that is exact in both large and small limits. The model is validated on a dedicated Ethernet network of workstations by experiments with communication subprograms arising in scientific applications, for which a tight fit of the model predictions with actual measurements of the communication and synchronization time between end processes is demonstrated. The model is then used to evaluate the performance of two simple parallel scientific applications from partial differential equations: domain decomposition and time-parallel multigrid. In an appropriate limit, we also show the compatibility of the hyperbolic model with the recently proposed LogP model.
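A sketch of the idea in code: each communication block (CB) is summarized by a small-message latency and a large-message transfer rate, a two-parameter curve interpolates between the limits, and a chain of CBs reduces to one equivalent pair. The specific curve and the store-and-forward reduction rule below are illustrative choices with the stated asymptotics, not necessarily the paper's exact formulas:

```python
import math

def service_time(m, alpha, beta):
    """Service time for one communication block (CB) as a function of
    message size m: tends to the latency alpha for tiny messages and to
    the bandwidth-limited m/beta for huge ones. This parameterization is
    an illustrative choice with those asymptotics, not necessarily the
    exact hyperbola used in the paper."""
    return math.sqrt(alpha**2 + (m / beta)**2)

def reduce_serial(cbs):
    """Collapse a chain of store-and-forward CBs, given as (alpha, beta)
    pairs, into one equivalent pair that matches both limits exactly:
    latencies add, inverse transfer rates add."""
    alpha_eq = sum(a for a, _ in cbs)
    beta_eq = 1.0 / sum(1.0 / b for _, b in cbs)
    return alpha_eq, beta_eq

# Two hypothetical CBs: 50 us latency / 10 MB/s, and 30 us / 40 MB/s.
a_eq, b_eq = reduce_serial([(50e-6, 10e6), (30e-6, 40e6)])
print(a_eq, b_eq)
print(service_time(1_000_000, a_eq, b_eq))  # ~0.125 s for a 1 MB message
```

The point of the reduction is that a deep graph of CBs can be replaced by a single two-parameter node whose service time is exact in both the small- and large-message limits and approximate in between.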

  5. Land Use Change on Household Farms in the Ecuadorian Amazon: Design and Implementation of an Agent-Based Model

    PubMed Central

    Mena, Carlos F.; Walsh, Stephen J.; Frizzelle, Brian G.; Xiaozheng, Yao; Malanson, George P.

    2010-01-01

    This paper describes the design and implementation of an Agent-Based Model (ABM) used to simulate land use change on household farms in the Northern Ecuadorian Amazon (NEA). The ABM simulates decision-making processes at the household level that are examined through a longitudinal, socio-economic and demographic survey that was conducted in 1990 and 1999. Geographic Information Systems (GIS) are used to establish spatial relationships between farms and their environment, while classified Landsat Thematic Mapper (TM) imagery is used to set initial land use/land cover conditions for the spatial simulation, assess from-to land use/land cover change patterns, and describe trajectories of land use change at the farm and landscape levels. Results from prior studies in the NEA provide insights into the key social and ecological variables, describe human behavioral functions, and examine population-environment interactions that are linked to deforestation and agricultural extensification, population migration, and demographic change. Within the architecture of the model, agents are classified as active or passive. The model comprises four modules, i.e., initialization, demography, agriculture, and migration, that operate individually but are linked through key household processes. The main outputs of the model include a spatially-explicit representation of the land use/land cover on survey and non-survey farms and at the landscape level for each annual time-step, as well as simulated socio-economic and demographic characteristics of households and communities. The work describes the design and implementation of the model and how population-environment interactions can be addressed in a frontier setting. The paper contributes to land change science by examining important pattern-process relations, advocating a spatial modeling approach that is capable of synthesizing fundamental relationships at the farm level, and linking people and environment in complex ways.
PMID:24436501

  6. The N2N instrument to evaluate healthy work environments: an Italian validation.

    PubMed

    Palese, Alvisa; Dante, Angelo; Tonzar, Laura; Balboni, Bernardo

    2014-02-01

    The aims of the study were to (a) validate the N2N Healthy Work Environment tool, (b) assess the healthiness of work environments as perceived by nurses themselves and (c) identify the factors associated with Italian nurses' perception of work environment healthiness. The linguistic and cultural adaptation of the USA-N2N Healthy Work Environments instrument was achieved through a process of forward/backward translation. Content validity was assessed by three expert nurses. The stability of the instrument was checked with a test/retest evaluation. The instrument's psychometric properties, the confirmatory factor analysis, as well as the healthiness of the work environment and its determinant factors, were evaluated with a sample of 294 nurses. The content and face validity of the N2N Healthy Work Environment instrument was confirmed. The instrument demonstrated good internal consistency (α of 0.82), excellent stability values (ρ > 0.70) and high levels of acceptability (response rate: 96.4%). The confirmatory factor analysis corroborated the existence of two factors as documented in the original instrument (Mays et al. in J Nurs Manag 19:18-26, 2011). Eighty-seven (29.6%) nurses perceived the work environment where they work as "healthy". Working under a functional model of care delivery (χ(2) 24.856, p 0.000) and being responsible for one project or more (χ(2) 5.256, p 0.021) were associated with healthy environments. The instrument--valid and reliable, short in the number of items, easy to understand and based on international standards--allows a systematic assessment of the healthiness of the environment and might provide not only the opportunity to evaluate the effects of new organizational models and interventions, but also the possibility to activate a process of self-analysis and a process of ongoing review.
The instrument can be used to systematically check the healthiness of Italian working environments, allowing for organizational diagnosis, targeted interventions and international comparisons.

  7. Methods for design and evaluation of integrated hardware-software systems for concurrent computation

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.

    1985-01-01

    Research activities and publications are briefly summarized. The major tasks reviewed are: (1) VAX implementation of the PISCES parallel programming environment; (2) Apollo workstation network implementation of the PISCES environment; (3) FLEX implementation of the PISCES environment; (4) sparse matrix iterative solver in PISCES Fortran; (5) image processing application of PISCES; and (6) a formal model of concurrent computation being developed.

  8. The Use of Deep and Surface Learning Strategies among Students Learning English as a Foreign Language in an Internet Environment

    ERIC Educational Resources Information Center

    Aharony, Noa

    2006-01-01

    Background: The learning context is learning English in an Internet environment. The examination of this learning process was based on the Biggs and Moore's teaching-learning model (Biggs & Moore, 1993). Aim: The research aims to explore the use of the deep and surface strategies in an Internet environment among EFL students who come from…

  9. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  10. Sonic Actuation of Small-Scale Robots in a Fluidic Environment

    DTIC Science & Technology

    2014-05-09

    model, calculated from (4)-(13) using dimensions and materials that are typical of the laser cutter for the milliscale and of the polyMUMPS process... propel the robot structure through the low Reynolds number environment. The flappers and robot structures were fabricated using a 30 W laser cutter... commonly used in fabrication with the laser cutter, were assigned to the flapper. Figure 15: Geometric model implemented in COMSOL of Flapper 0

  11. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model decision making processes in a macroscopic setting, capturing the particular 'context' in which decisions are taken. Several empirical findings have shown that, when making decisions, people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe decision making processes more accurately. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.
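The master equation referred to in the abstract is, in its standard Gorini-Kossakowski-Sudarshan-Lindblad form (generic operators; the specific Hamiltonian and coupling operators depend on the particular decision model),

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
  + \sum_k \gamma_k \left( L_k\,\rho\,L_k^{\dagger}
  - \tfrac{1}{2}\,\left\{ L_k^{\dagger} L_k,\; \rho \right\} \right)
```

where ρ is the density operator of the mental state, H generates its intrinsic dynamics, and the operators L_k couple it to the environmental 'bath'. Decoherence damps the off-diagonal terms of ρ, and the resulting diagonal entries are read as decision probabilities.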

  12. Technical Manual for the Geospatial Stream Flow Model (GeoSFM)

    USGS Publications Warehouse

    Asante, Kwabena O.; Artan, Guleid A.; Pervez, Md Shahriar; Bandaragoda, Christina; Verdin, James P.

    2008-01-01

    The monitoring of wide-area hydrologic events requires the use of geospatial and time series data available in near-real time. These data sets must be manipulated into information products that speak to the location and magnitude of the event. Scientists at the U.S. Geological Survey Earth Resources Observation and Science (USGS EROS) Center have implemented a hydrologic modeling system which consists of an operational data processing system and the Geospatial Stream Flow Model (GeoSFM). The data processing system generates daily forcing evapotranspiration and precipitation data from various remotely sensed and ground-based data sources. To allow for rapid implementation in data scarce environments, widely available terrain, soil, and land cover data sets are used for model setup and initial parameter estimation. GeoSFM performs geospatial preprocessing and postprocessing tasks as well as hydrologic modeling tasks within an ArcView GIS environment. The integration of GIS routines and time series processing routines is achieved seamlessly through the use of dynamically linked libraries (DLLs) embedded within Avenue scripts. GeoSFM is run operationally to identify and map wide-area streamflow anomalies. Daily model results including daily streamflow and soil water maps are disseminated through Internet map servers, flood hazard bulletins and other media.

  13. Land Ecological Security Evaluation of Underground Iron Mine Based on PSR Model

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Chen, Yong; Ruan, Jinghua; Hong, Qiang; Gan, Yong

    2018-01-01

    Iron ore mines provide an important strategic resource to the national economy while also causing many serious ecological problems for the environment. The study summed up the characteristics of the ecological environment problems of underground iron mines. Considering the mining process of an underground iron mine, we analyze the connections between mining production, resources, the environment, and the economic background. The paper proposes a land ecological security evaluation system and method for underground iron mines based on the Pressure-State-Response model. Our application to the Chengchao iron mine demonstrates its efficiency and provides promising guidance on land ecological security evaluation.
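A Pressure-State-Response evaluation of this kind typically normalizes each indicator and combines them with weights into a composite security score. The indicator names, values, and weights below are hypothetical placeholders, not those of the Chengchao case study:

```python
def normalize(x, worst, best):
    """Min-max normalize an indicator to [0, 1]; the direction is set
    by which bound counts as 'best' (clamped outside the range)."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

# (value, worst, best, weight) per indicator, grouped by PSR dimension.
# All names and numbers are hypothetical placeholders for illustration.
indicators = {
    "P_disturbed_land_ratio": (0.30, 1.0, 0.0, 0.25),  # Pressure
    "P_tailings_per_km2":     (1.8,  5.0, 0.0, 0.15),  # Pressure
    "S_vegetation_cover":     (0.55, 0.0, 1.0, 0.30),  # State
    "R_reclamation_rate":     (0.40, 0.0, 1.0, 0.30),  # Response
}

score = sum(w * normalize(v, worst, best)
            for v, worst, best, w in indicators.values())
print(round(score, 3))  # composite security score in [0, 1]
```

In practice the weights would come from a method such as the analytic hierarchy process or entropy weighting, and the score would be binned into security grades.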

  14. A neural network ActiveX based integrated image processing environment.

    PubMed

    Ciuca, I; Jitaru, E; Alaicescu, M; Moisil, I

    2000-01-01

    The paper outlines an integrated image processing environment that uses neural network ActiveX technology for object recognition and classification. The image processing environment, which is Windows-based, encapsulates a Multiple-Document Interface (MDI) and is menu driven. Object (shape) parameter extraction focuses on features that are invariant under translation, rotation and scale transformations. The neural network models that can be incorporated as ActiveX components into the environment allow both clustering and classification of objects from the analysed image. Mapping neural networks perform an input sensitivity analysis on the extracted feature measurements and thus facilitate the removal of irrelevant features and improvements in the degree of generalisation. The program has been used to evaluate the dimensions of the hydrocephalus in a study calculating the Evans index and the angle of the frontal horns of the ventricular system modifications.

  15. Astronomical Ice: The Effects of Treating Ice as a Porous Media on the Dynamics and Evolution of Extraterrestrial Ice-Ocean Environments

    NASA Astrophysics Data System (ADS)

    Buffo, J.; Schmidt, B. E.

    2015-12-01

    With the prevalence of water and ice rich environments in the solar system, and likely the universe, becoming more apparent, understanding the evolutionary dynamics and physical processes of such locales is of great importance. Piqued interest arises from the understanding that the persistence of all known life depends on the presence of liquid water. As in situ investigation is currently infeasible, accurate numerical modeling is the best technique to demystify these environments. We will discuss an evolving model of ice-ocean interaction aimed at realistically describing the behavior of the ice-ocean interface by treating basal ice as a porous media, and its possible implications for the formation of astrobiological niches. Treating ice as a porous media drastically affects the thermodynamic properties it exhibits. Thus inclusion of this phenomenon is critical in accurately representing the dynamics and evolution of all ice-ocean environments. This model utilizes equations that describe the dynamics of sea ice when it is treated as a porous media (Hunke et al. 2011), coupled with a basal melt and accretion model (Holland and Jenkins 1999). Combined, these two models produce the most accurate description of the processes occurring at the base of terrestrial sea ice and ice shelves, capable of resolving variations within the ice due to environmental pressures. While these models were designed for application to terrestrial environments, the physics occurring at any ice-water interface is identical, and these models can be used to represent the evolution of a variety of icy astronomical bodies. As terrestrial ice shelves provide a close analog to planetary ice-ocean environments, we test the model's validity against observations of ice shelves. We apply this model to the ice-ocean interface of the icy Galilean moon Europa.
We include profiles of temperature, salinity, solid fraction, and Darcy velocity, as well as temporally and spatially varying melt and accretion rates. A porous medium is an ideal place for the coalescence of nutrients and the formation of energy gradients, key controllers of biological activity. Understanding the physics that influence ice-ocean exchange is thus essential in assessing the habitability of Europa and its contemporaries.

  16. Improving orbit prediction accuracy through supervised machine learning

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Bai, Xiaoli

    2018-05-01

    Due to the lack of information such as the space environment condition and resident space objects' (RSOs') body characteristics, current orbit predictions that are solely grounded on physics-based models may fail to achieve required accuracy for collision avoidance and have led to satellite collisions already. This paper presents a methodology to predict RSOs' trajectories with higher accuracy than that of the current methods. Inspired by the machine learning (ML) theory through which the models are learned based on large amounts of observed data and the prediction is conducted without explicitly modeling space objects and space environment, the proposed ML approach integrates physics-based orbit prediction algorithms with a learning-based process that focuses on reducing the prediction errors. Using a simulation-based space catalog environment as the test bed, the paper demonstrates three types of generalization capability for the proposed ML approach: (1) the ML model can be used to improve the same RSO's orbit information that is not available during the learning process but shares the same time interval as the training data; (2) the ML model can be used to improve predictions of the same RSO at future epochs; and (3) the ML model based on a RSO can be applied to other RSOs that share some common features.
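The residual-learning idea can be sketched as follows: fit a simple model to the observed errors of a physics-based propagator, then subtract the estimated error from new predictions. The error law, features, and units below are invented for illustration; the paper instead trains on a simulation-based space catalog:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: along-track error (km) of a physics-based
# propagator, growing with prediction horizon t (days) plus a periodic
# term and measurement noise. All numbers here are invented.
t_train = rng.uniform(0.0, 10.0, 200)
err_train = 0.3 * t_train + 0.5 * np.sin(t_train) + rng.normal(0.0, 0.05, 200)

# Learn the systematic error as a linear model over simple basis features.
def features(t):
    t = np.atleast_1d(t)
    return np.column_stack([t, np.sin(t), np.cos(t)])

w, *_ = np.linalg.lstsq(features(t_train), err_train, rcond=None)

def estimate_error(t):
    """ML estimate of the propagator's error at horizon t (days)."""
    return features(t) @ w

# Correct a physics-based prediction by subtracting the estimated error.
t = 7.0
physics_error = 0.3 * t + 0.5 * np.sin(t)   # unknown in practice
remaining = physics_error - estimate_error(t)[0]
print(abs(remaining))
```

The same pattern supports the paper's three generalization tests: apply the learned error model to held-out epochs of the training object, to future epochs, or to other objects with similar features.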

  17. Effect of the Environment and Environmental Uncertainty on Ship Routes

    DTIC Science & Technology

    2012-06-01

    models consisting of basic differential equations simulating the fluid dynamic process and physics of the environment. Based on Newton’s second law of...Charles and Hazel Hall, for their unconditional love and support. They were there for me during this entire process, as they have been throughout...A simple transit across the Atlantic Ocean can easily become a rough voyage if the ship encounters high winds, which in turn will cause a high sea

  18. Utilizing Controlled Vibrations in a Microgravity Environment to Understand and Promote Microstructural Homogeneity During Floating-Zone Crystal Growth

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.

    1999-01-01

    It has been demonstrated in floating-zone configurations utilizing silicone oil and nitrate salts that mechanically induced vibration effectively minimizes detrimental, gravity independent, thermocapillary flow. The processing parameters leading to crystal improvement and aspects of the on-going modeling effort are discussed. Plans for applying the crystal growth technique to commercially relevant materials, e.g., silicon, as well as the value of processing in a microgravity environment are presented.

  19. Evaluation of the Combined AERCOARE/AERMOD Modeling Approach for Offshore Sources

    EPA Science Inventory

    ENVIRON conducted an evaluation of the combined AERCOARE/AERMOD (AERCOARE-MOD) modeling approach for offshore sources using tracer data from four field studies. AERCOARE processes overwater meteorological data for use by the AERMOD air quality dispersion model (EPA, 2004a). AERC...

  20. Geometry Modeling and Grid Generation for Design and Optimization

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    Geometry modeling and grid generation (GMGG) have played and will continue to play an important role in computational aerosciences. During the past two decades, tremendous progress has occurred in GMGG; however, GMGG is still the biggest bottleneck to routine applications for complicated Computational Fluid Dynamics (CFD) and Computational Structures Mechanics (CSM) models for analysis, design, and optimization. We are still far from incorporating GMGG tools in a design and optimization environment for complicated configurations. It is still a challenging task to parameterize an existing model in today's Computer-Aided Design (CAD) systems, and the models created are not always good enough for automatic grid generation tools. Designers may believe their models are complete and accurate, but unseen imperfections (e.g., gaps, unwanted wiggles, free edges, slivers, and transition cracks) often cause problems in gridding for CSM and CFD. Despite many advances in grid generation, the process is still the most labor-intensive and time-consuming part of the computational aerosciences for analysis, design, and optimization. In an ideal design environment, a design engineer would use a parametric model to evaluate alternative designs effortlessly and optimize an existing design for a new set of design objectives and constraints. For this ideal environment to be realized, the GMGG tools must have the following characteristics: (1) be automated, (2) provide consistent geometry across all disciplines, (3) be parametric, and (4) provide sensitivity derivatives. This paper will review the status of GMGG for analysis, design, and optimization processes, and it will focus on some emerging ideas that will advance the GMGG toward the ideal design environment.

  1. Once More with Feeling: Affect and Playing with the TGfU Model

    ERIC Educational Resources Information Center

    Pope, Clive C.

    2005-01-01

    Certainly, the process of decision-making and problem-solving in a shifting playing environment lies at the core of the Teaching Games for Understanding (TGfU) model. What is not clear is how, at the time of decision-making, players' feelings or affective factors and their subsequent influence on thinking, influence these processes. Affect has a…

  2. Towards an Intelligent Planning Knowledge Base Development Environment

    NASA Technical Reports Server (NTRS)

    Chien, S.

    1994-01-01

    This abstract describes work in developing knowledge base editing and debugging tools for the Multimission VICAR Planner (MVP) system. MVP uses artificial intelligence planning techniques to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing steps) in response to image processing requests made to the JPL Multimission Image Processing Laboratory.

  3. Device and circuit analysis of a sub 20 nm double gate MOSFET with gate stack using a look-up-table-based approach

    NASA Astrophysics Data System (ADS)

    Chakraborty, S.; Dasgupta, A.; Das, R.; Kar, M.; Kundu, A.; Sarkar, C. K.

    2017-12-01

    In this paper, we explore the possibility of mapping devices designed in a TCAD environment to their modeled versions developed in the Cadence Virtuoso environment using a look-up table (LUT) approach. Circuit simulation of newly designed devices in a TCAD environment is a very slow and tedious process involving complex scripting. Hence, the LUT-based modeling approach has been proposed as a faster and easier alternative in the Cadence environment. The LUTs are prepared by extracting data from the device characteristics obtained from device simulation in TCAD. A comparative study is shown between the TCAD simulation and the LUT-based alternative to showcase the accuracy of the modeled devices. Finally, the look-up table approach is used to evaluate the performance of circuits implemented using a 14 nm nMOSFET.
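    The abstract does not reproduce the LUT mechanics; the sketch below shows the general pattern with an invented device characteristic: tabulate current over a bias grid, as TCAD simulation would export it, then interpolate at query points the way a table-based compact model is evaluated inside a circuit simulator. All numbers are illustrative.

```python
import numpy as np

# Hypothetical I-V data as exported from a TCAD device simulation:
# drain current (A) sampled over a gate-voltage (V) grid.
vg_samples = np.linspace(0.0, 1.0, 11)               # LUT bias grid
id_samples = 1e-6 * (np.exp(4.0 * vg_samples) - 1.0) # toy device characteristic

def lut_drain_current(vg):
    """Evaluate the table-based device model by linear interpolation,
    as a circuit simulator would during transient or DC analysis."""
    return np.interp(vg, vg_samples, id_samples)

# Query the model between tabulated points.
print(lut_drain_current(0.55))
```

Exactly at a grid point the LUT returns the tabulated value; between points, accuracy depends on the grid density chosen when extracting data from TCAD.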

  4. A biologically inspired model of bat echolocation in a cluttered environment with inputs designed from field Recordings

    NASA Astrophysics Data System (ADS)

    Loncich, Kristen Teczar

    Bat echolocation strategies and the neural processing of acoustic information, with a focus on cluttered environments, are investigated in this study. How a bat processes the dense field of echoes received while navigating and foraging in the dark is not well understood. While several models have been developed to describe the mechanisms behind bat echolocation, most are based in mathematics rather than biology and focus on either peripheral or neural processing, not exploring how these two levels of processing are vitally connected. Current echolocation models also do not use habitat-specific acoustic input or account for field observations of echolocation strategies. Here, a new approach to echolocation modeling is described that captures the full picture of echolocation, from signal generation to a neural picture of the acoustic scene. A biologically inspired echolocation model is developed using field measurements of the interpulse interval timing used by a frequency-modulating (FM) bat in the wild, with a whole-method approach to modeling echolocation that includes habitat-specific acoustic inputs; a biologically accurate peripheral model of sound processing by the outer, middle, and inner ear; and finally a neural model incorporating established auditory pathways and neuron types with echolocation adaptations. The field recordings analyzed underscore differences in bat sonar design observed in the laboratory and in the wild, and suggest a correlation between interpulse interval groupings and increased clutter. The scenario model provides habitat- and behavior-specific echoes and is a useful tool for both modeling and behavioral studies, and the peripheral and neural models show that spike-time information and echolocation-specific neuron types can produce target localization in the midbrain.

  5. Modeling Pre- and Post- Wildfire Hydrologic Response to Vegetation Change in the Valles Caldera National Preserve, NM

    NASA Astrophysics Data System (ADS)

    Gregory, A. E.; Benedict, K. K.; Zhang, S.; Savickas, J.

    2017-12-01

    Large scale, high severity wildfires in forests have become increasingly prevalent in the western United States due to fire exclusion. Although past work has focused on the immediate consequences of wildfire (i.e., runoff magnitude and debris flow), little has been done to understand the post-wildfire hydrologic consequences of vegetation regrowth. Furthermore, vegetation is often characterized by static parameterizations within hydrological models. In order to understand the temporal relationship between hydrologic processes and revegetation, we modularized and partially automated the hydrologic modeling process to increase connectivity between remotely sensed data, the Virtual Watershed Platform (a data management resource, called the VWP), input meteorological data, and the Precipitation-Runoff Modeling System (PRMS). This process was used to run PRMS simulations in the Valles Caldera of NM, an area impacted by the 2011 Las Conchas Fire, before and after the fire to evaluate hydrologic process changes. The modeling environment addressed some of the existing challenges faced by hydrological modelers. At present, modelers are somewhat limited in their ability to push the boundaries of hydrologic understanding. Specific issues faced by modelers include limited computational resources to model processes at large spatial and temporal scales, data storage capacity and accessibility from the modeling platform, computational and time constraints for experimental modeling, and the skills to integrate modeling software in ways that have not been explored. By taking an interdisciplinary approach, we were able to address some of these challenges by leveraging the skills of hydrologic, data, and computer scientists, and the technical capabilities provided by a combination of on-demand/high-performance computing, distributed data, and cloud services.
The hydrologic modeling process was modularized to include options for distributing meteorological data, parameter space experimentation, data format transformation, looping, validation of models and containerization for enabling new analytic scenarios. The user interacts with the modules through Jupyter Notebooks which can be connected to an on-demand computing and HPC environment, and data services built as part of the VWP.

  6. Examining Collaborative Knowledge Construction in Microblogging-Based Learning Environments

    ERIC Educational Resources Information Center

    Luo, Tian; Clifton, Lacey

    2017-01-01

    Aim/Purpose: The purpose of the study is to provide foundational research to exemplify how knowledge construction takes place in microblogging-based learning environments, to understand learner interaction representing the knowledge construction process, and to analyze learner perception, thereby suggesting a model of delivery for microblogging.…

  7. Cognitive Presence and Effect of Immersion in Virtual Learning Environment

    ERIC Educational Resources Information Center

    Katernyak, Ihor; Loboda, Viktoriya

    2016-01-01

    This paper presents the approach to successful application of two knowledge management techniques--community of practice and eLearning, in order to create and manage a competence-developing virtual learning environment. It explains how "4A" model of involving practitioners in eLearning process (through attention, actualization,…

  8. Infusing Two Models of Evaluation into a Military Environment: A Case Study

    ERIC Educational Resources Information Center

    Aaberg, Wayne; Thompson, Carla J.

    2012-01-01

    Determining the worth and effectiveness of training used within a military environment is the same accountability responsibility that educational organizations, businesses, and social agencies are charged with for improving programs and services to society. The need for accountability implies the process of evaluation, particularly in governmental…

  9. Situations, Interaction, Process and Affordances: An Ecological Psychology Perspective.

    ERIC Educational Resources Information Center

    Young, Michael F.; DePalma, Andrew; Garrett, Steven

    2002-01-01

    From an ecological psychology perspective, a full analysis of any learning context must acknowledge the complex nonlinear dynamics that unfold as an intentionally-driven learner interacts with a technology-based purposefully designed learning environment. A full situation model would need to incorporate constraints from the environment and also…

  10. Modelling Learners' Cognitive, Affective, and Social Processes through Language and Discourse

    ERIC Educational Resources Information Center

    Dowell, Nia M. M.; Graesser, Arthur C.

    2014-01-01

    An emerging trend toward computer-mediated collaborative learning environments promotes lively exchanges between learners in order to facilitate learning. Discourse can play an important role in enhancing epistemology, pedagogy, and assessments in these environments. In this paper, we highlight some of our recent work showing the advantages using…

  11. The Contribution of Visualization to Learning Computer Architecture

    ERIC Educational Resources Information Center

    Yehezkel, Cecile; Ben-Ari, Mordechai; Dreyfus, Tommy

    2007-01-01

    This paper describes a visualization environment and associated learning activities designed to improve learning of computer architecture. The environment, EasyCPU, displays a model of the components of a computer and the dynamic processes involved in program execution. We present the results of a research program that analysed the contribution of…

  12. International Management: Creating a More Realistic Global Planning Environment.

    ERIC Educational Resources Information Center

    Waldron, Darryl G.

    2000-01-01

    Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…

  13. Modeling Student Cognition in Digital and Nondigital Assessment Environments

    ERIC Educational Resources Information Center

    DiCerbo, Kristen E.; Xu, Yuning; Levy, Roy; Lai, Emily; Holland, Laura

    2017-01-01

    Inferences about student knowledge, skills, and attributes based on digital activity still largely come from whether students ultimately get a correct result or not. However, the ability to collect activity stream data as individuals interact with digital environments provides information about students' processes as they progress through learning…

  14. A Big Data-driven Model for the Optimization of Healthcare Processes.

    PubMed

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2015-01-01

    Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to out-perform in this environment, healthcare organizations need to be agile enough to become responsive to these constantly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be utilized to extract useful insights from mountains of raw data, and make them work in ways beyond the abilities of human brains or the IT systems of just a year ago. This paper presents a framework which helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.
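    The paper's analytics layer is not described in detail in this abstract; as a hedged illustration of process-related data analysis, the sketch below computes average waiting times between consecutive activities from a hypothetical event log. The case ids, activity names, and timestamps are all invented; long average waits are exactly the kind of signal that would flag areas of potential process improvement.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical process-event log: (case id, activity, timestamp), the kind
# of record a BPM engine emits for each healthcare process instance.
log = [
    ("c1", "triage",    "2015-01-05 09:00"),
    ("c1", "lab_test",  "2015-01-05 09:40"),
    ("c1", "diagnosis", "2015-01-05 11:10"),
    ("c2", "triage",    "2015-01-05 09:15"),
    ("c2", "lab_test",  "2015-01-05 10:30"),
    ("c2", "diagnosis", "2015-01-05 11:00"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Average waiting time (minutes) between consecutive activities, per transition.
waits = defaultdict(list)
by_case = defaultdict(list)
for case, act, ts in log:
    by_case[case].append((parse(ts), act))
for events in by_case.values():
    events.sort()
    for (t0, a0), (t1, a1) in zip(events, events[1:]):
        waits[(a0, a1)].append((t1 - t0).total_seconds() / 60)

for step, minutes in waits.items():
    print(step, sum(minutes) / len(minutes))
```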

  15. Jigsaw model of the origin of life

    NASA Astrophysics Data System (ADS)

    McGowan, John F.

    2002-02-01

    It is suggested that life originated in a three-step process referred to as the jigsaw model. RNA, proteins, or similar organic molecules polymerized in a dehydrated carbon-rich environment, on surfaces in a carbon-rich environment, or in another environment where polymerization occurs. These polymers subsequently entered an aqueous environment where they folded into compact structures. It is argued that the folding of randomly generated polymers such as RNA or proteins in water tends to partition the folded polymer into domains with hydrophobic cores and matching shapes to minimize energy. In the aqueous environment, hydrolysis or other reactions fragmented the compact structures into two or more matching molecules, occasionally producing simple living systems, also known as autocatalytic sets of molecules. It is argued that the hydrolysis of folded polymers such as RNA or proteins is not random. The hydrophobic cores of the domains are rarely bisected due to the energy requirements in water. Hydrolysis preferentially fragments the folded polymers into pieces with complementary structures and chemical affinities. Thus the probability of producing a system of matched, interacting molecules in prebiotic chemistry is much higher than usually estimated. Environments where this process may occur are identified. For example, the jigsaw model suggests life may have originated at a seep of carbonaceous fluids beneath the ocean. The polymerization occurred beneath the sea floor. The folding and fragmentation occurred in the ocean. The implications of this hypothesis for seeking life or prebiotic chemistry in the Solar System are explored.

  16. The development of a collaborative virtual environment for finite element simulation

    NASA Astrophysics Data System (ADS)

    Abdul-Jalil, Mohamad Kasim

    Communication between geographically distributed designers has been a major hurdle in traditional engineering design. Conventional methods of communication, such as video conferencing, telephone, and email, are less efficient, especially when dealing with complex design models. Complex shapes, intricate features, and hidden parts are often difficult to describe verbally or even using traditional 2-D or 3-D visual representations. Virtual Reality (VR) and Internet technologies have substantial potential to bridge this communication barrier. VR technology allows designers to immerse themselves in a virtual environment to view and manipulate a model just as in real life. Fast Internet connectivity has enabled fast data transfer between remote locations. Although various collaborative virtual environment (CVE) systems have been developed in the past decade, they are limited to high-end technology that is not accessible to typical designers. The objective of this dissertation is to discover and develop a new approach to increase the efficiency of the design process, particularly for large-scale applications wherein participants are geographically distributed. A multi-platform and easily accessible collaborative virtual environment (CVRoom) is developed to accomplish the stated research objective. Geographically dispersed designers can meet in a single shared virtual environment to discuss issues pertaining to the engineering design process and to make trade-off decisions more quickly than before, thereby speeding the entire process. This 'faster' design process is achieved through the development of capabilities that better enable multidisciplinary interaction and the modeling of the trade-off decisions that are so critical before launching into a formal detailed design. 
The features of the environment developed as a result of this research include the ability to view design models, use voice interaction, and link engineering analysis modules (such as the Finite Element Analysis module demonstrated in this work). One of the major issues in developing a CVE system for engineering design purposes is obtaining pertinent simulation results in real time. This is critical so that designers can make decisions based on these results quickly. For example, in a finite element analysis, if a design model is changed or perturbed, the analysis results must be obtained in real time or near real time to make the virtual meeting environment realistic. In this research, the finite difference-based Design Sensitivity Analysis (DSA) approach is employed to approximate structural responses (e.g., stress and displacement), so as to demonstrate the applicability of CVRoom for engineering design trade-offs. This DSA approach provides fast approximation and is well suited to the virtual meeting environment, where fast response time is required. The DSA-based approach is tested on several example problems to show its applicability and limitations. This dissertation demonstrates that an increase in efficiency and a reduction of the time required for a complex design process can be accomplished using the approach developed in this dissertation research. Several implementations of CVRoom by students working on common design tasks were investigated. All participants confirmed a preference for using the collaborative virtual environment developed in this dissertation work (CVRoom) over other modes of interaction. It is proposed here that CVRoom is representative of the type of collaborative virtual environment that will be used by most designers in the future to reduce the time required in a design cycle and thereby reduce the associated cost.
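    The dissertation's actual DSA implementation is not reproduced in this abstract; the sketch below illustrates the general finite-difference DSA pattern it describes. One full analysis is run at the nominal design, the response gradient is estimated by a forward difference, and perturbed designs are then approximated by a first-order expansion, fast enough for an interactive session. The closed-form stress function is a toy stand-in for a finite element solve.

```python
# Finite difference-based Design Sensitivity Analysis (DSA) sketch.

def stress(thickness):                 # hypothetical FEA response
    return 1000.0 / (thickness ** 2)   # e.g. bending stress ~ 1/t^2

def dsa_approximation(x0, h=1e-4):
    """Run one full analysis at x0, estimate dS/dx by forward
    difference, and return a cheap first-order response model."""
    s0 = stress(x0)                          # one expensive solve
    ds = (stress(x0 + h) - s0) / h           # finite-difference sensitivity
    return lambda x: s0 + ds * (x - x0)      # first-order approximation

approx = dsa_approximation(10.0)
# Small design perturbation: compare approximation with a full re-analysis.
print(approx(10.2), stress(10.2))
```

For small perturbations the approximation tracks the full re-analysis closely, which is why the approach suits a real-time collaborative setting.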

  17. Proposal for a new CAPE-OPEN Object Model

    EPA Science Inventory

    Process simulation applications require the exchange of significant amounts of data between the flowsheet environment, unit operation model, and thermodynamic server. Packing and unpacking various data types and exchanging data using structured text-based architectures, including...

  18. Recent developments in broadly applicable structure-biodegradability relationships.

    PubMed

    Jaworska, Joanna S; Boethling, Robert S; Howard, Philip H

    2003-08-01

    Biodegradation is one of the most important processes influencing the concentration of a chemical substance after its release to the environment. It is the main process for removal of many chemicals from the environment and is therefore an important factor in risk assessments. This article reviews available methods and models for predicting the biodegradability of organic chemicals from structure. The first section of the article briefly discusses current needs for biodegradability estimation methods related to new and existing chemicals and in the context of multimedia exposure models. Following sections cover biodegradation test methods and endpoints used in modeling, with special attention given to the Japanese Ministry of International Trade and Industry test; a primer on modeling, describing the various approaches that have been used in structure/biodegradability relationship work and contrasting statistical and mechanistic approaches; and recent developments in structure/biodegradability relationships, divided into group contribution, chemometric, and artificial intelligence approaches.

  19. Utility of Small Animal Models of Developmental Programming.

    PubMed

    Reynolds, Clare M; Vickers, Mark H

    2018-01-01

    Any effective strategy to tackle the global obesity and rising noncommunicable disease epidemic requires an in-depth understanding of the mechanisms that underlie these conditions that manifest as a consequence of complex gene-environment interactions. In this context, it is now well established that alterations in the early life environment, including suboptimal nutrition, can result in an increased risk for a range of metabolic, cardiovascular, and behavioral disorders in later life, a process preferentially termed developmental programming. To date, most of the mechanistic knowledge around the processes underpinning developmental programming has been derived from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. This review will cover the utility of small animal models in developmental programming, the limitations of such models, and potential future directions that are required to fully maximize information derived from preclinical models in order to effectively translate to clinical use.

  20. Modelling for Ship Design and Production

    DTIC Science & Technology

    1991-09-01

    the physical production process. The product has to be delivered within the chain of order processing. The process “ship production” is defined by the...environment is of increasing importance. Changing product types, complexity and parallelism of order processing, short throughput times and fixed due...specialized and high quality products under manufacturing conditions which ensure economic and effective order processing. Mapping these main

  1. Adaptive process control using fuzzy logic and genetic algorithms

    NASA Technical Reports Server (NTRS)

    Karr, C. L.

    1993-01-01

    Researchers at the U.S. Bureau of Mines have developed adaptive process control systems in which genetic algorithms (GA's) are used to augment fuzzy logic controllers (FLC's). GA's are search algorithms that rapidly locate near-optimum solutions to a wide spectrum of problems by modeling the search procedures of natural genetics. FLC's are rule based systems that efficiently manipulate a problem environment by modeling the 'rule-of-thumb' strategy used in human decision making. Together, GA's and FLC's possess the capabilities necessary to produce powerful, efficient, and robust adaptive control systems. To perform efficiently, such control systems require a control element to manipulate the problem environment, and a learning element to adjust to the changes in the problem environment. Details of an overall adaptive control system are discussed. A specific laboratory acid-base pH system is used to demonstrate the ideas presented.
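    The Bureau of Mines system's actual GA and FLC details are not given in this abstract; below is a minimal hedged sketch of the pattern it describes: a GA searching for a controller gain that drives a toy first-order process (a stand-in for the acid-base pH loop) to its setpoint. The population size, genetic operators, and plant model are all invented for illustration.

```python
import random

random.seed(1)

# Toy stand-in for the pH control task: candidate gain k drives a
# first-order process toward setpoint 7.0; fitness is the accumulated
# squared error (smaller is better).
def fitness(k):
    ph, err = 4.0, 0.0
    for _ in range(20):
        ph += k * (7.0 - ph)          # proportional control step
        err += (7.0 - ph) ** 2
    return err

# Minimal genetic algorithm: tournament selection, arithmetic (blend)
# crossover, and Gaussian mutation, clipped to the search bounds [0, 2].
pop = [random.uniform(0.0, 2.0) for _ in range(20)]
for _ in range(40):
    new = []
    for _ in range(len(pop)):
        a = min(random.sample(pop, 3), key=fitness)   # tournament winner 1
        b = min(random.sample(pop, 3), key=fitness)   # tournament winner 2
        child = 0.5 * (a + b) + random.gauss(0.0, 0.05)
        new.append(min(max(child, 0.0), 2.0))
    pop = new

best = min(pop, key=fitness)
print(best, fitness(best))
```

For this plant the ideal gain is k = 1 (the setpoint is reached in one step), and the GA population converges near it without any gradient information, which is the property that makes GAs useful for tuning rule-based FLCs.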

  2. Object schemas for grounding language in a responsive robot

    NASA Astrophysics Data System (ADS)

    Hsiao, Kai-Yuh; Tellex, Stefanie; Vosoughi, Soroush; Kubat, Rony; Roy, Deb

    2008-12-01

    An approach is introduced for physically grounded natural language interpretation by robots that reacts appropriately to unanticipated physical changes in the environment and dynamically assimilates new information pertinent to ongoing tasks. At the core of the approach is a model of object schemas that enables a robot to encode beliefs about physical objects in its environment using collections of coupled processes responsible for sensorimotor interaction. These interaction processes run concurrently in order to ensure responsiveness to the environment, while co-ordinating sensorimotor expectations, action planning and language use. The model has been implemented on a robot that manipulates objects on a tabletop in response to verbal input. The implementation responds to verbal requests such as 'Group the green block and the red apple', while adapting in real time to unexpected physical collisions and taking opportunistic advantage of any new information it may receive through perceptual and linguistic channels.

  3. QuakeSim: a Web Service Environment for Productive Investigations with Earth Surface Sensor Data

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Donnellan, A.; Granat, R. A.; Lyzenga, G. A.; Glasscoe, M. T.; McLeod, D.; Al-Ghanmi, R.; Pierce, M.; Fox, G.; Grant Ludwig, L.; Rundle, J. B.

    2011-12-01

    The QuakeSim science gateway environment includes a visually rich portal interface, web service access to data and data processing operations, and the QuakeTables ontology-based database of fault models and sensor data. The integrated tools and services are designed to assist investigators by covering the entire earthquake cycle of strain accumulation and release. The Web interface now includes Drupal-based access to diverse and changing content, with new ability to access data and data processing directly from the public page, as well as the traditional project management areas that require password access. The system is designed to make initial browsing of fault models and deformation data particularly engaging for new users. Popular data and data processing include GPS time series with data mining techniques to find anomalies in time and space, experimental forecasting methods based on catalogue seismicity, faulted deformation models (both half-space and finite element), and model-based inversion of sensor data. The fault models include the CGS and UCERF 2.0 faults of California and are easily augmented with self-consistent fault models from other regions. The QuakeTables deformation data include the comprehensive set of UAVSAR interferograms as well as a growing collection of satellite InSAR data. Fault interaction simulations are also being incorporated in the web environment based on Virtual California. A sample usage scenario is presented which follows an investigation of UAVSAR data from viewing as an overlay in Google Maps, to selection of an area of interest via a polygon tool, to fast extraction of the relevant correlation and phase information from large data files, to a model inversion of fault slip followed by calculation and display of a synthetic model interferogram.

  4. Evaluation of ceramics for stator application: Gas turbine engine report

    NASA Technical Reports Server (NTRS)

    Trela, W.; Havstad, P. H.

    1978-01-01

    Current ceramic materials, component fabrication processes, and reliability prediction capability for ceramic stators in an automotive gas turbine engine environment are assessed. Simulated engine duty cycle testing of stators conducted at temperatures up to 1093 C is discussed. Materials evaluated are SiC and Si3N4 fabricated from two near-net-shape processes: slip casting and injection molding. Stators for durability cycle evaluation, test specimens for material property characterization, and a reliability prediction model prepared to predict stator performance in the simulated engine environment are considered. The status and description of the work performed for the reliability prediction modeling, stator fabrication, material property characterization, and ceramic stator evaluation efforts are reported.

  5. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
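    The separation ROSE describes can be sketched as follows: a model exposes a uniform interface, and reusable "execution processes" (here, a parameter study) work against that interface without knowing the model's internals. The class and method names below are illustrative, not the actual ROSE API.

```python
# Divorcing model execution from the model: any analysis model that
# implements the Model interface can be driven by any standard process.

class Model:
    def set_input(self, name, value): ...
    def execute(self): ...
    def get_output(self, name): ...

class ParabolaModel(Model):            # a trivial analysis model plugs in here
    def __init__(self):
        self.inputs = {}
        self.outputs = {}
    def set_input(self, name, value):
        self.inputs[name] = value
    def execute(self):
        x = self.inputs["x"]
        self.outputs["y"] = (x - 3.0) ** 2
    def get_output(self, name):
        return self.outputs[name]

def parameter_study(model, in_name, values, out_name):
    """A standard modeling process, reusable across all Model subclasses."""
    results = []
    for v in values:
        model.set_input(in_name, v)
        model.execute()
        results.append(model.get_output(out_name))
    return results

print(parameter_study(ParabolaModel(), "x", [1.0, 3.0, 5.0], "y"))
```

A Design of Experiments driver or optimizer would be written once against the same interface, which is the library-of-processes idea the abstract describes.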

  6. The efficiency of driving chemical reactions by a physical non-equilibrium is kinetically controlled.

    PubMed

    Göppel, Tobias; Palyulin, Vladimir V; Gerland, Ulrich

    2016-07-27

    An out-of-equilibrium physical environment can drive chemical reactions into thermodynamically unfavorable regimes. Under prebiotic conditions such a coupling between physical and chemical non-equilibria may have enabled the spontaneous emergence of primitive evolutionary processes. Here, we study the coupling efficiency within a theoretical model that is inspired by recent laboratory experiments, but focuses on generic effects arising whenever reactant and product molecules have different transport coefficients in a flow-through system. In our model, the physical non-equilibrium is represented by a drift-diffusion process, which is a valid coarse-grained description for the interplay between thermophoresis and convection, as well as for many other molecular transport processes. As a simple chemical reaction, we consider a reversible dimerization process, which is coupled to the transport process by different drift velocities for monomers and dimers. Within this minimal model, the coupling efficiency between the non-equilibrium transport process and the chemical reaction can be analyzed in all parameter regimes. The analysis shows that the efficiency depends strongly on the Damköhler number, a parameter that measures the relative timescales associated with the transport and reaction kinetics. Our model and results will be useful for a better understanding of the conditions for which non-equilibrium environments can provide a significant driving force for chemical reactions in a prebiotic setting.

  7. Toward a Model of Human Information Processing for Decision-Making and Skill Acquisition in Laparoscopic Colorectal Surgery.

    PubMed

    White, Eoin J; McMahon, Muireann; Walsh, Michael T; Coffey, J Calvin; O Sullivan, Leonard

    To create a human information-processing model for laparoscopic surgery based on already established literature and primary research to enhance laparoscopic surgical education in this context. We reviewed the literature for information-processing models most relevant to laparoscopic surgery. Our review highlighted the necessity for a model that accounts for dynamic environments, perception, allocation of attention resources between the actions of both hands of an operator, and skill acquisition and retention. The results of the literature review were augmented through intraoperative observations of 7 colorectal surgical procedures, supported by laparoscopic video analysis of 12 colorectal procedures. The Wickens human information-processing model was selected as the most relevant theoretical model to which we make adaptions for this specific application. We expanded the perception subsystem of the model to involve all aspects of perception during laparoscopic surgery. We extended the decision-making system to include dynamic decision-making to account for case/patient-specific and surgeon-specific deviations. The response subsystem now includes dual-task performance and nontechnical skills, such as intraoperative communication. The memory subsystem is expanded to include skill acquisition and retention. Surgical decision-making during laparoscopic surgery is the result of a highly complex series of processes influenced not only by the operator's knowledge, but also patient anatomy and interaction with the surgical team. Newer developments in simulation-based education must focus on the theoretically supported elements and events that underpin skill acquisition and affect the cognitive abilities of novice surgeons. The proposed human information-processing model builds on established literature regarding information processing, accounting for a dynamic environment of laparoscopic surgery. 
This revised model may be used as a foundation for a model describing robotic surgery. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. Uranium transport in a crushed granodiorite: Experiments and reactive transport modeling

    DOE PAGES

    Dittrich, T. M.; Reimus, P. W.

    2015-02-12

    The primary objective of this study was to develop and demonstrate an experimental method to refine and better parameterize process models for reactive contaminant transport in aqueous subsurface environments and to reduce conservatism in such models without attempting to fully describe the geochemical system.

  9. An approach for investigation of secure access processes at a combined e-learning environment

    NASA Astrophysics Data System (ADS)

    Romansky, Radi; Noninska, Irina

    2017-12-01

The article discusses an approach to investigating processes for regulating security and privacy control in a heterogeneous e-learning environment realized as a combination of traditional and cloud-based means and tools. The authors' proposal for a combined architecture of an e-learning system is presented, and the main subsystems and procedures are discussed. A formalization of the processes for using different types of resources (public, private internal, and private external) is proposed. The apparatus of Markov chains (MC) is used for modeling and analytical investigation of secure access to the resources, and some assessments are presented.
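A minimal sketch of the kind of Markov-chain formalization the abstract describes: access to resource types is modeled as states of a chain, and the stationary distribution gives the long-run share of time spent in each access state. The state names and every transition probability below are invented for illustration, not taken from the article.

```python
import numpy as np

# Hypothetical access states for a combined (traditional + cloud) e-learning
# environment; all transition probabilities are illustrative only.
STATES = ["auth", "public", "priv-internal", "priv-external", "end"]
P = np.array([
    [0.00, 0.50, 0.30, 0.15, 0.05],
    [0.05, 0.40, 0.30, 0.15, 0.10],
    [0.05, 0.25, 0.40, 0.20, 0.10],
    [0.05, 0.25, 0.25, 0.35, 0.10],
    [1.00, 0.00, 0.00, 0.00, 0.00],   # a new session starts with authentication
])

def stationary_distribution(P, iters=10_000):
    """Power iteration: long-run fraction of time spent in each access state."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

pi = stationary_distribution(P)
print({s: round(p, 3) for s, p in zip(STATES, pi)})
```

Such a stationary distribution is one of the "assessments" a Markov-chain model can deliver, e.g. how often the secured private-external (cloud) resources are touched relative to public ones.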

  10. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

The application of predictor-corrector integration algorithms developed for a digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented, and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
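For readers unfamiliar with the method family, here is a serial sketch of a predictor-corrector pair of the kind the report investigates: a two-step Adams-Bashforth predictor followed by a trapezoidal Adams-Moulton corrector (a PECE scheme), applied to a linear state model x' = Ax. The matrix A is made up; the report's turbofan model is not reproduced here.

```python
import numpy as np

# Invented stable linear system standing in for the engine model.
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])

def f(x):
    return A @ x

def pece(x0, h, steps):
    """Predict-Evaluate-Correct-Evaluate loop (AB2 predictor, AM2 corrector)."""
    xs = [x0, x0 + h * f(x0)]          # bootstrap the multistep method with Euler
    for n in range(1, steps):
        fn, fnm1 = f(xs[n]), f(xs[n - 1])
        xp = xs[n] + h * (1.5 * fn - 0.5 * fnm1)   # predictor (AB2)
        xc = xs[n] + 0.5 * h * (fn + f(xp))        # corrector (trapezoid)
        xs.append(xc)
    return np.array(xs)

h, steps = 0.01, 100                    # integrate to t = 1
traj = pece(np.array([1.0, 1.0]), h, steps)

# closed-form solution of this triangular system at t = 1, for comparison
exact = np.array([np.exp(-1.0) * (1.0 + 0.5 * (1.0 - np.exp(-1.0))),
                  np.exp(-2.0)])
print(traj[-1], exact)
```

In a parallel setting, the predictor and corrector derivative evaluations are the natural units to distribute across processors, which is why step size and processor count interact with accuracy as the report describes.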

  11. An Interactive Teaching System for Bond Graph Modeling and Simulation in Bioengineering

    ERIC Educational Resources Information Center

    Roman, Monica; Popescu, Dorin; Selisteanu, Dan

    2013-01-01

    The objective of the present work was to implement a teaching system useful in modeling and simulation of biotechnological processes. The interactive system is based on applications developed using 20-sim modeling and simulation software environment. A procedure for the simulation of bioprocesses modeled by bond graphs is proposed and simulators…

  12. An integrated model of social environment and social context for pediatric rehabilitation.

    PubMed

    Batorowicz, Beata; King, Gillian; Mishra, Lipi; Missiuna, Cheryl

    2016-01-01

This article considers the conceptualization and operationalization of "social environment" and "social context" with implications for research and practice with children and youth with impairments. We first discuss social environment and social context as constructs important for understanding interaction between external environmental qualities and the individual's experience. The article considers existing conceptualizations within psychological and sociological bodies of literature, research using these concepts, current developmental theories, and issues in the understanding of environment and participation within rehabilitation science. We then describe a model that integrates a person-focused perspective with an environment-focused perspective and that outlines the mechanisms through which children/youth and social environment interact and transact. Finally, we consider the implications of the proposed model for research and clinical practice. This conceptual model directs researchers and practitioners toward interventions that will address the mechanisms of child-environment interaction and that will build capacity within both children and their social environments, including families, peer groups, and communities. Health is created and lived by people within the settings of their everyday life; where they learn, work, play, and love [p.2]. Understanding how social environment and personal factors interact over time to affect the development of children/youth can influence the design of services for children and youth with impairments. The model described integrates the individual-focused and environment-focused perspectives and outlines the mechanisms of the ongoing reciprocal interaction between children/youth and their social environments: provision of opportunities, resources and supports, and contextual processes of choice, active engagement, and collaboration. 
Addressing these mechanisms could contribute to creating healthier environments in which all children, including children with impairments, have experiences that lead to positive developmental benefits.

  13. The Family Adaptation Model: A Life Course Perspective. Technical Report 880.

    ERIC Educational Resources Information Center

    Bowen, Gary L.

    This conceptual model for explaining the factors and processes that underlie family adaptation in the Army relies heavily upon two traditions: the "Double ABCX" model of family stress and adaptation and the "Person-Environment Fit" model. The new model has three major parts: the environmental system, the personal system, and family adaptation.…

  14. A neural model of hierarchical reinforcement learning

    PubMed Central

    Rasmussen, Daniel; Eliasmith, Chris

    2017-01-01

    We develop a novel, biologically detailed neural model of reinforcement learning (RL) processes in the brain. This model incorporates a broad range of biological features that pose challenges to neural RL, such as temporally extended action sequences, continuous environments involving unknown time delays, and noisy/imprecise computations. Most significantly, we expand the model into the realm of hierarchical reinforcement learning (HRL), which divides the RL process into a hierarchy of actions at different levels of abstraction. Here we implement all the major components of HRL in a neural model that captures a variety of known anatomical and physiological properties of the brain. We demonstrate the performance of the model in a range of different environments, in order to emphasize the aim of understanding the brain’s general reinforcement learning ability. These results show that the model compares well to previous modelling work and demonstrates improved performance as a result of its hierarchical ability. We also show that the model’s behaviour is consistent with available data on human hierarchical RL, and generate several novel predictions. PMID:28683111
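As a purely algorithmic illustration of what "dividing the RL process into a hierarchy of actions" means, here is a tabular SMDP Q-learning sketch in the options framework on a toy corridor task. Everything here (the corridor, the landmarks, the rewards) is invented; the paper itself implements these ideas in a biologically detailed spiking neural model, not a lookup table.

```python
import random

# Corridor of 10 states; start at 5, goal at 9. Options are temporally
# extended actions that walk deterministically to a landmark state.
N, GOAL, START = 10, 9, 5
LANDMARKS = [0, 9]
GAMMA, ALPHA, EPS = 0.95, 0.2, 0.1
random.seed(0)

def run_option(s, target):
    """Execute one option to completion; -1 per primitive step, +10 at goal."""
    steps, reward = 0, 0.0
    while s != target and s != GOAL:
        s += 1 if target > s else -1
        reward += GAMMA ** steps * -1.0
        steps += 1
    if s == GOAL:
        reward += GAMMA ** steps * 10.0
    return s, reward, steps

def options_at(s):
    # only offer options whose landmark differs from the current state
    return [o for o, t in enumerate(LANDMARKS) if t != s]

Q = {(s, o): 0.0 for s in range(N) for o in range(len(LANDMARKS))}
for _ in range(2000):
    s = START
    while s != GOAL:
        avail = options_at(s)
        o = (random.choice(avail) if random.random() < EPS
             else max(avail, key=lambda o: Q[s, o]))
        s2, r, k = run_option(s, LANDMARKS[o])
        nxt = 0.0 if s2 == GOAL else max(Q[s2, o2] for o2 in options_at(s2))
        # SMDP update: discount by gamma^k for the k elapsed primitive steps
        Q[s, o] += ALPHA * (r + GAMMA ** k * nxt - Q[s, o])
        s = s2

best = max(options_at(START), key=lambda o: Q[START, o])
print("preferred option at start:", best, "(0 = far landmark, 1 = goal)")
```

The key hierarchical ingredient is the gamma^k discount over the option's elapsed duration, which lets a single update credit a temporally extended action.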

  15. Fluid flow in the osteocyte mechanical environment: a fluid-structure interaction approach.

    PubMed

    Verbruggen, Stefaan W; Vaughan, Ted J; McNamara, Laoise M

    2014-01-01

Osteocytes are believed to be the primary sensors of mechanical stimuli in bone, which orchestrate osteoblasts and osteoclasts to adapt bone structure and composition to meet physiological loading demands. Experimental studies to quantify the mechanical environment surrounding bone cells are challenging, and as such, computational and theoretical approaches have modelled either the solid or fluid environment of osteocytes to predict how these cells are stimulated in vivo. The osteocyte is an elastic cellular structure that deforms in response to the external fluid flow imposed by mechanical loading. This represents a most challenging multi-physics problem in which fluid and solid domains interact, and as such, no previous study has accounted for this complex behaviour. The objective of this study is to employ fluid-structure interaction (FSI) modelling to investigate the complex mechanical environment of osteocytes in vivo. Fluorescent staining of osteocytes was performed in order to visualise their native environment and develop geometrically accurate models of the osteocyte in vivo. By simulating loading levels representative of vigorous physiological activity ([Formula: see text] compression and 300 Pa pressure gradient), we predict average interstitial fluid velocities [Formula: see text] and average maximum shear stresses [Formula: see text] surrounding osteocytes in vivo. Interestingly, these values occur in the canaliculi around the osteocyte cell processes and are within the range of stimuli known to stimulate osteogenic responses by osteoblastic cells in vitro. Significantly, our results suggest that the greatest mechanical stimulation of the osteocyte occurs in the cell processes, which, cell culture studies have indicated, is the most mechanosensitive area of the cell. These are the first computational FSI models to simulate the complex multi-physics mechanical environment of osteocytes in vivo, and they provide a deeper understanding of bone mechanobiology.

  16. Studying light-harvesting models with superconducting circuits.

    PubMed

    Potočnik, Anton; Bargerbos, Arno; Schröder, Florian A Y N; Khan, Saeed A; Collodo, Michele C; Gasparinetti, Simone; Salathé, Yves; Creatore, Celestino; Eichler, Christopher; Türeci, Hakan E; Chin, Alex W; Wallraff, Andreas

    2018-03-02

The process of photosynthesis, the main source of energy in the living world, converts sunlight into chemical energy. The high efficiency of this process is believed to be enabled by an interplay between the quantum nature of molecular structures in photosynthetic complexes and their interaction with the environment. Investigating these effects in biological samples is challenging due to their complex and disordered structure. Here we experimentally demonstrate a technique for studying photosynthetic models based on superconducting quantum circuits, which complements existing experimental, theoretical, and computational approaches. We demonstrate a high degree of freedom in the design and experimental control of our approach, based on a simplified three-site model of a pigment-protein complex with realistic parameters scaled down in energy by a factor of 10⁵. We show that the excitation transport between quantum-coherent sites disordered in energy can be enabled through the interaction with environmental noise. We also show that the efficiency of the process is maximized for structured noise resembling the intramolecular phononic environments found in photosynthetic complexes.

  17. Encapsulating model complexity and landscape-scale analyses of state-and-transition simulation models: an application of ecoinformatics and juniper encroachment in sagebrush steppe ecosystems

    USGS Publications Warehouse

    O'Donnell, Michael

    2015-01-01

State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease in computing time. With a single multicore compute node (bottom result), the computing time indicated an 81.8% decrease relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models. 
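The "embarrassingly parallel" structure the study exploits can be sketched in a few lines: each Monte Carlo replicate of a state-and-transition model runs independently, so replicates can be farmed out across cores on one machine or, in HTC/HPC settings, across nodes. The states and annual transition probabilities below are invented for illustration, not taken from the juniper-encroachment case study.

```python
import multiprocessing as mp
import random

STATES = ["sagebrush", "juniper-encroached", "juniper-woodland"]
P = {  # illustrative annual transition probabilities only
    "sagebrush": [("sagebrush", 0.97), ("juniper-encroached", 0.03)],
    "juniper-encroached": [("juniper-encroached", 0.94),
                           ("juniper-woodland", 0.05), ("sagebrush", 0.01)],
    "juniper-woodland": [("juniper-woodland", 0.99), ("sagebrush", 0.01)],
}

def simulate(seed, years=100):
    """One Monte Carlo replicate: final state after `years` annual transitions."""
    rng = random.Random(seed)
    state = "sagebrush"
    for _ in range(years):
        targets, weights = zip(*P[state])
        state = rng.choices(targets, weights=weights)[0]
    return state

if __name__ == "__main__":
    with mp.Pool() as pool:
        finals = pool.map(simulate, range(200))   # 200 independent replicates
    share = {s: finals.count(s) / len(finals) for s in STATES}
    print(share)
```

Because each replicate takes only its seed as input, the same `simulate` function can be dispatched by `multiprocessing` on one node or by an HTC scheduler across many, which is exactly the scaling behavior the study measured.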

  18. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC-developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. 
The technology development approach is based on the collaborative effort set forth in the VTB's integration of operations models, process models, systems and environment models, and cost models as a comprehensive, disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  19. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC-developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluation of enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive, disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. 
The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  20. Interface design in the process industries

    NASA Technical Reports Server (NTRS)

    Beaverstock, M. C.; Stassen, H. G.; Williamson, R. A.

    1977-01-01

    Every operator runs his plant in accord with his own mental model of the process. In this sense, one characteristic of an ideal man-machine interface is that it be in harmony with that model. With this theme in mind, the paper first reviews the functions of the process operator and compares them with human operators involved in control situations previously studied outside the industrial environment (pilots, air traffic controllers, helmsmen, etc.). A brief history of the operator interface in the process industry and the traditional methodology employed in its design is then presented. Finally, a much more fundamental approach utilizing a model definition of the human operator's behavior is presented.

  1. Scripting human animations in a virtual environment

    NASA Technical Reports Server (NTRS)

    Goldsby, Michael E.; Pandya, Abhilash K.; Maida, James C.

    1994-01-01

The current deficiencies of virtual environments (VE) are well known: annoying lag in drawing the current view, drastically simplified environments to reduce that lag, low resolution, and narrow field of view. Animation scripting is an application of VE technology which can be carried out successfully despite these deficiencies. The final product is a smoothly moving, high-resolution animation displaying detailed models. In this system, the user is represented by a human computer model with the same body proportions. Using magnetic tracking, the motions of the model's upper torso, head, and arms are controlled by the user's movements (18 degrees of freedom). The model's lower torso and global position and orientation are controlled by a spaceball and keypad (12 degrees of freedom). Using this system, human motion scripts can be extracted from the user's movements while immersed in a simplified virtual environment. Recorded data are used to define key frames; motion is interpolated between them, and post-processing adds a more detailed environment. The result is a considerable savings in time and a much more natural-looking movement of a human figure in a smooth and seamless animation.
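The key-frame step described above can be sketched compactly: captured poses become key frames, and intermediate frames are interpolated between them for smooth motion. A "pose" is simplified here to one angle per joint, and the joint names and values are invented; the actual system interpolates full-body configurations.

```python
def lerp(a, b, t):
    """Linear interpolation between two scalar values."""
    return a + (b - a) * t

def interpolate(key_a, key_b, t, ease=True):
    """Blend two key-frame poses; smoothstep easing gives ease-in/ease-out."""
    if ease:
        t = t * t * (3.0 - 2.0 * t)   # smoothstep instead of constant velocity
    return {j: lerp(key_a[j], key_b[j], t) for j in key_a}

key0 = {"shoulder": 0.0, "elbow": 10.0, "wrist": -5.0}   # recorded pose A
key1 = {"shoulder": 90.0, "elbow": 45.0, "wrist": 15.0}  # recorded pose B

frames = [interpolate(key0, key1, i / 10) for i in range(11)]
print(frames[5])  # halfway pose: {'shoulder': 45.0, 'elbow': 27.5, 'wrist': 5.0}
```

The easing term is one simple way to get the "natural-looking movement" the abstract mentions: velocity ramps up from and down to zero at each key frame rather than jumping.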

  2. Software Framework for Advanced Power Plant Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Widmann; Sorin Munteanu; Aseem Jain

    2010-08-01

This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
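The reduced-order-model idea can be illustrated in miniature: run the expensive CFD model at a handful of design points, then fit a cheap surrogate that the flowsheet can evaluate thousands of times. Here a cubic polynomial stands in for the ROM, and a made-up analytic function stands in for a CFD result; APECS's actual ROM machinery is not reproduced.

```python
import numpy as np

def cfd_model(inlet_temp):
    """Placeholder for an expensive CFD run (outlet conversion vs. inlet T)."""
    return 0.9 / (1.0 + np.exp(-(inlet_temp - 800.0) / 40.0))

train_T = np.linspace(700.0, 900.0, 9)      # nine "CFD runs"
train_y = cfd_model(train_T)

# fit in a centered, scaled coordinate for numerical conditioning
to_u = lambda T: (T - 800.0) / 40.0
rom = np.poly1d(np.polyfit(to_u(train_T), train_y, deg=3))  # the trained ROM

# flowsheet-side usage: evaluate the ROM instead of re-running CFD
test_T = np.array([725.0, 810.0, 875.0])
errors = np.abs(rom(to_u(test_T)) - cfd_model(test_T))
print("surrogate errors at held-out points:", errors)
```

The trade-off is the usual one: the surrogate is only trustworthy inside the range of conditions it was trained on, so the training design points must bracket the flowsheet's operating envelope.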

  3. AN-CASE NET-CENTRIC modeling and simulation

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Chruscicki, Mary Carol; Turck, Kurt

    2009-05-01

The objective of mission training exercises is to immerse the trainees in an environment that enables them to train as they would fight. The integration of modeling and simulation environments that can seamlessly leverage live systems and virtual or constructive models (LVC) as they become available offers a flexible and cost-effective solution for extending the "war-gaming" environment to a realistic mission experience while evolving the development of the net-centric enterprise. From concept to full production, the impact of new capabilities on the infrastructure and concept of operations can be assessed in the context of the enterprise, while also exposing them to the warfighter. Training is extended to tomorrow's tools, processes, and Tactics, Techniques and Procedures (TTPs). This paper addresses the challenges of a net-centric modeling and simulation environment that is capable of representing a net-centric enterprise. An overview of the Air Force Research Laboratory's (AFRL) Airborne Networking Component Architecture Simulation Environment (AN-CASE) is provided, as well as a discussion of how it is being used to assess technologies for the purpose of experimenting with new infrastructure mechanisms that enhance the scalability and reliability of the distributed mission operations environment.

  4. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE PAGES

    Humbird, David; Trendewicz, Anna; Braun, Robert; ...

    2017-01-12

A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.

  5. One-dimensional biomass fast pyrolysis model with reaction kinetics integrated in an Aspen Plus Biorefinery Process Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbird, David; Trendewicz, Anna; Braun, Robert

A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.

  6. Development of climate data storage and processing model

    NASA Astrophysics Data System (ADS)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes at local and regional scales. The model is based on a "shared-nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
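A minimal sketch of the metadata-database idea: each record registers a dataset's space-time coverage and its location (node plus path), so a search can route a processing request to the node that holds the data. The schema and the two example entries are invented for illustration; the article does not publish its schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE datasets (
        name TEXT, variable TEXT, node TEXT, path TEXT,
        t_start TEXT, t_end TEXT,
        lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL
    )""")
con.executemany("INSERT INTO datasets VALUES (?,?,?,?,?,?,?,?,?,?)", [
    ("ERA-Interim", "t2m", "node-1", "/data/erai/t2m/",
     "1979-01-01", "2016-12-31", -90, 90, -180, 180),
    ("APHRODITE", "precip", "node-2", "/data/aphro/pr/",
     "1951-01-01", "2007-12-31", -15, 55, 60, 150),
])

# find datasets covering a point and period of interest (ISO dates sort as text)
rows = con.execute("""
    SELECT name, node, path FROM datasets
    WHERE variable = ? AND t_start <= ? AND t_end >= ?
      AND lat_min <= ? AND lat_max >= ? AND lon_min <= ? AND lon_max >= ?""",
    ("t2m", "1990-01-01", "1999-12-31", 56.5, 56.5, 85.0, 85.0)).fetchall()
print(rows)
```

In the distributed setting the query result tells the coordinating node which worker to dispatch to, which is what makes the shared-nothing design workable: data stays put and computation moves to it.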

  7. Designing a Web-Based Science Learning Environment for Model-Based Collaborative Inquiry

    NASA Astrophysics Data System (ADS)

    Sun, Daner; Looi, Chee-Kit

    2013-02-01

The paper traces a research process in the design and development of a science learning environment called WiMVT (web-based inquirer with modeling and visualization technology). The WiMVT system is designed to help secondary school students build a sophisticated understanding of scientific conceptions and the science inquiry process, as well as develop critical learning skills, through a model-based collaborative inquiry approach. It is intended to support collaborative inquiry, real-time social interaction, and progressive modeling, and to provide multiple sources of scaffolding for students. We first discuss the theoretical underpinnings for synthesizing the WiMVT design framework, introduce the components and features of the system, and describe the proposed workflow of WiMVT instruction. We also elucidate our research approach that supports the development of the system. Finally, the findings of a pilot study are briefly presented to demonstrate the potential learning efficacy of the WiMVT implementation in science learning. Implications are drawn on how to improve the existing system, refine teaching strategies, and provide feedback to researchers, designers, and teachers. This pilot study informs designers like us on how to narrow the gap between the learning environment's intended design and its actual usage in the classroom.

  8. Simulating the conversion of rural settlements to town land based on multi-agent systems and cellular automata.

    PubMed

    Liu, Yaolin; Kong, Xuesong; Liu, Yanfang; Chen, Yiyun

    2013-01-01

    Rapid urbanization in China has triggered the conversion of land from rural to urban use, particularly the conversion of rural settlements to town land. This conversion is the result of the joint effects of the geographic environment and agents involving the government, investors, and farmers. To understand the dynamic interaction dominated by agents and to predict the future landscape of town expansion, a small town land-planning model is proposed based on the integration of multi-agent systems (MAS) and cellular automata (CA). The MAS-CA model links the decision-making behaviors of agents with the neighbor effect of CA. The interaction rules are projected by analyzing the preference conflicts among agents. To better illustrate the effects of the geographic environment, neighborhood, and agent behavior, a comparative analysis between the CA and MAS-CA models in three different towns is presented, revealing interesting patterns in terms of quantity, spatial characteristics, and the coordinating process. The simulation of rural settlements conversion to town land through modeling agent decision and human-environment interaction is very useful for understanding the mechanisms of rural-urban land-use change in developing countries. This process can assist town planners in formulating appropriate development plans.
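A toy sketch of how an MAS-CA coupling can work: each cell's conversion probability combines a CA term (share of already-converted neighbours) with an agent term (government, investor, and farmer preferences). Every weight, preference, and grid detail below is invented for illustration; the paper's calibrated decision rules are far richer.

```python
import random

random.seed(1)
N = 20
# 1 = town land, 0 = rural settlement; a town seed occupies the north edge
grid = [[1 if i < 2 else 0 for j in range(N)] for i in range(N)]

def neighbour_share(g, i, j):
    """CA term: fraction of the 8 neighbours already converted (toroidal)."""
    cells = [g[(i + di) % N][(j + dj) % N]
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return sum(cells) / 8.0

def agent_score(i, j):
    """MAS term: blended preferences of three hypothetical agent types."""
    gov = 1.0 if i < N // 2 else 0.3      # planning favours the north
    investor = 1.0 - j / N                # land is cheaper in the west
    farmer = 0.5                          # willingness to sell (constant here)
    return (gov + investor + farmer) / 3.0

for step in range(10):
    nxt = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] == 0:
                p = 0.5 * neighbour_share(grid, i, j) + 0.5 * agent_score(i, j)
                if random.random() < 0.2 * p:
                    nxt[i][j] = 1
    grid = nxt

print(sum(map(sum, grid)), "of", N * N, "cells are town land")
```

Even this crude blend reproduces the qualitative behavior the paper analyzes: growth clusters around existing town land (the CA term) but is skewed toward areas agents prefer (the MAS term), rather than spreading isotropically.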

  9. Multi-site calibration, validation, and sensitivity analysis of the MIKE SHE Model for a large watershed in northern China

    Treesearch

    S. Wang; Z. Zhang; G. Sun; P. Strauss; J. Guo; Y. Tang; A. Yao

    2012-01-01

    Model calibration is essential for hydrologic modeling of large watersheds in a heterogeneous mountain environment. Little guidance is available for model calibration protocols for distributed models that aim at capturing the spatial variability of hydrologic processes. This study used the physically-based distributed hydrologic model, MIKE SHE, to contrast a lumped...

  10. Parent- and child-driven effects during the transition to adolescence: a longitudinal, genetic analysis of the home environment.

    PubMed

    Hannigan, Laurie J; McAdams, Tom A; Plomin, Robert; Eley, Thalia C

    2017-09-01

    Theoretical models of child development typically consider the home environment as a product of bidirectional effects, with parent- and child-driven processes operating interdependently. However, the developmental structure of these processes during the transition from childhood to adolescence has not been well studied. In this study we used longitudinal genetic analyses of data from 6646 UK-representative twin pairs (aged 9-16 years) to investigate stability and change in parenting and household chaos in the context of parent-child bidirectional effects. Stability in the home environment was modest, arising mainly from parent-driven processes and family-wide influences. In contrast, change over time was more influenced by child-driven processes, indicated by significant age-specific genetic influences. Interpretations of these results and their implications for researchers are discussed. © 2016 The Authors. Developmental Science Published by John Wiley & Sons Ltd.

  11. Model of melting (crystallization) process of the condensed disperse phase in the smoky plasmas

    NASA Astrophysics Data System (ADS)

    Dragan, G. S.; Kolesnikov, K. V.; Kutarov, V. V.

    2018-01-01

    The paper presents an analysis of the causes of the formation of spatially ordered grain structures in a smoky plasma. We model the melting (crystallization) process of the condensed phase in this environment, taking into account the screened electrostatic interaction and the diffusion-drift force. We also discuss the influence of the grain charge on the melting temperature.

  12. Human Systems Integration Design Environment (HSIDE)

    DTIC Science & Technology

    2012-04-09

    quality of the resulting HSI products. 15. SUBJECT TERMS HSI, Manning Estimation and Validation, Risk Assessment, IPOE, PLM, BPMN, Workflow...business process model in Business Process Modeling Notation (BPMN) or the actual workflow template associated with the specific functional area, again...as filtered by the user settings in the high-level interface. Figure 3 shows the initial screen, which allows the user to select either the BPMN or

  13. A bio-inspired kinematic controller for obstacle avoidance during reaching tasks with real robots.

    PubMed

    Srinivasa, Narayan; Bhattacharyya, Rajan; Sundareswara, Rashmi; Lee, Craig; Grossberg, Stephen

    2012-11-01

    This paper describes a redundant robot arm that is capable of learning to reach for targets in space in a self-organized fashion while avoiding obstacles. Self-generated movement commands that activate correlated visual, spatial and motor information are used to learn forward and inverse kinematic control models while moving in obstacle-free space using the Direction-to-Rotation Transform (DIRECT). Unlike prior DIRECT models, the learning process in this work was realized using an online Fuzzy ARTMAP learning algorithm. The DIRECT-based kinematic controller is fault tolerant and can handle a wide range of perturbations such as joint locking and the use of tools despite not having experienced them during learning. The DIRECT model was extended based on a novel reactive obstacle avoidance direction (DIRECT-ROAD) model to enable redundant robots to avoid obstacles in environments with simple obstacle configurations. However, certain configurations of obstacles in the environment prevented the robot from reaching the target with purely reactive obstacle avoidance. To address this complexity, a self-organized process of mental rehearsals of movements was modeled, inspired by human and animal experiments on reaching, to generate plans for movement execution using DIRECT-ROAD in complex environments. These mental rehearsals or plans are self-generated by using the Fuzzy ARTMAP algorithm to retrieve multiple solutions for reaching each target while accounting for all the obstacles in its environment. The key aspects of the proposed novel controller were illustrated first using simple examples. Experiments were then performed on real robot platforms to demonstrate successful obstacle avoidance during reaching tasks in real-world environments. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Using gridded multimedia model to simulate spatial fate of Benzo[α]pyrene on regional scale.

    PubMed

    Liu, Shijie; Lu, Yonglong; Wang, Tieyu; Xie, Shuangwei; Jones, Kevin C; Sweetman, Andrew J

    2014-02-01

    Predicting the environmental multimedia fate is an essential step in the process of assessing the human exposure and health impacts of chemicals released into the environment. Multimedia fate models have been widely applied to calculate the fate and distribution of chemicals in the environment, which can serve as input to a human exposure model. In this study, a grid-based multimedia fugacity model at regional scale was developed, together with a case study modeling the fate and transfer of Benzo[α]pyrene (BaP) in the Bohai coastal region, China. Based on the estimated emissions and an on-site survey in 2008, the BaP concentrations in air, vegetation, soil, fresh water, fresh water sediment, and coastal water, as well as the transfer fluxes, were derived under the steady-state assumption. The model results were validated through comparison between the measured and modeled concentrations of BaP, and indicated that the predicted concentrations of BaP in air, fresh water, soil, and sediment generally agreed with field observations. Model predictions suggest that soil was the dominant sink of BaP in terrestrial systems. Flows from air to soil, vegetation, and coastal water were the three major pathways of BaP inter-media transport. Most of the BaP entering the sea was transferred by air flow, which was also the crucial driving force in the spatial distribution of BaP. The Yellow River, Liaohe River, and Daliao River played an important role in the spatial transfer processes of BaP. Compared with advection outflow, degradation was more important in the removal of BaP. Sensitivities of the model estimates to input parameters were tested. The results showed that emission rates, compartment dimensions, transport velocity, and degradation rates of BaP were the most influential parameters for the model output. Monte Carlo simulation was carried out to determine parameter uncertainty, from which the coefficients of variation for the estimated BaP concentrations in air and soil were computed as 0.46 and 1.53, respectively. The model outputs (concentrations of BaP in the multimedia environment) can be used in human exposure and risk assessment in the Bohai coastal region. The results also provide significant indicators of the likely dominant fate, the influence range of emissions, and the transport processes determining the behavior of BaP in the Bohai coastal region, which is instrumental for human exposure and risk assessment in the region. © 2013.
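As a hedged illustration of the Monte Carlo uncertainty step, the sketch below computes a coefficient of variation (std/mean) for a steady-state box-model concentration under lognormal parameter uncertainty. The single-compartment balance and every parameter value are made up for illustration; they are not the paper's calibrated fugacity model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical steady-state box model: concentration = emission / total loss,
# with loss split between degradation and advective outflow (illustrative
# values and units, not the Bohai-region parameterization).
emission = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=n)  # kg/h
k_deg = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=n)      # 1/h
advection = rng.lognormal(mean=np.log(0.01), sigma=0.4, size=n)  # 1/h
volume = 1e6                                                     # m3, fixed

# Steady-state concentration for each Monte Carlo draw
conc = emission / ((k_deg + advection) * volume)                 # kg/m3

# Coefficient of variation summarizes the propagated parameter uncertainty
cv = conc.std() / conc.mean()
print(f"coefficient of variation: {cv:.2f}")
```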

  15. Four-fluid MHD Simulations of the Plasma and Neutral Gas Environment of Comet Churyumov-Gerasimenko Near Perihelion

    NASA Astrophysics Data System (ADS)

    Huang, Z.; Toth, G.; Gombosi, T.; Jia, X.; Rubin, M.; Fougere, N.; Tenishev, V.; Combi, M.; Bieler, A.; Hansen, K.; Shou, Y.; Altwegg, K.

    2015-10-01

    We develop a 3-D four-fluid model to study the plasma environment of comet Churyumov-Gerasimenko (CG), the target of the Rosetta mission. Our model is based on BATS-R-US within the SWMF (Space Weather Modeling Framework), which solves the governing multifluid MHD equations and the Euler equations for the neutral gas fluid. These equations describe the behavior and interactions of the cometary heavy ions, the solar wind protons, the electrons, and the neutrals. The model incorporates mass loading processes, including photoionization and electron impact ionization, and also accounts for charge exchange, dissociative ion-electron recombination, and collisional interactions between the different fluids. We simulate the near-nucleus plasma and neutral gas environment with a realistic shape model of CG near perihelion and compare our simulation results with Rosetta observations.

  16. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  17. Rational Approximations to Rational Models: Alternative Algorithms for Category Learning

    ERIC Educational Resources Information Center

    Sanborn, Adam N.; Griffiths, Thomas L.; Navarro, Daniel J.

    2010-01-01

    Rational models of cognition typically consider the abstract computational problems posed by the environment, assuming that people are capable of optimally solving those problems. This differs from more traditional formal models of cognition, which focus on the psychological processes responsible for behavior. A basic challenge for rational models…

  18. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  19. Systematic analysis of signaling pathways using an integrative environment.

    PubMed

    Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard

    2007-01-01

    Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment with comprehensive capabilities. The environment should include tools for pathway design, visualization, and simulation, together with a knowledge base concerning signaling pathways. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization, and simulation, and a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure, and its functioning. The system is designed with a client-server architecture. It contains a pathway-design environment and a simulation environment as upper layers, with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework. It was also useful in defining the structure of the knowledge base. Sensitivity analysis of this specific pathway was performed, providing simulation data. The model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us to perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.

  20. Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit

    NASA Astrophysics Data System (ADS)

    Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.

    2013-12-01

    Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post-processing. Building on the work from the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The toolkit plugin currently provides readers for LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAWeb database. As work progresses, many additional tools will be added; through open-source collaboration, we hope to add readers for additional model types, as well as any other tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.

  1. PROcess Based Diagnostics PROBE

    NASA Technical Reports Server (NTRS)

    Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.

    2013-01-01

    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes, implying that if a mismatch is found, it should be much easier to identify and address specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
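The filter-then-diagnose idea behind process-based diagnostics can be sketched minimally: filter model and reference fields in time and space for the targeted process, then compute a conditional statistic on the filtered sample. Everything below (the synthetic fields, the event threshold, the diagnostic) is a hypothetical stand-in, not PROBE's actual interface:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic hourly "model" and "reanalysis" fields on a small lat-lon grid
t, ny, nx = 240, 8, 8
model = rng.gamma(shape=2.0, scale=1.0, size=(t, ny, nx))
reanalysis = rng.gamma(shape=2.0, scale=1.1, size=(t, ny, nx))

def process_diagnostic(field, threshold=3.0):
    """Filter in time/space for the targeted process (here: values above a
    made-up event threshold) and return the conditional mean and event
    frequency, rather than a plain monthly/annual mean."""
    mask = field > threshold
    return field[mask].mean(), mask.mean()

m_mean, m_freq = process_diagnostic(model)
r_mean, r_freq = process_diagnostic(reanalysis)
print(f"model:      event mean {m_mean:.2f}, frequency {m_freq:.1%}")
print(f"reanalysis: event mean {r_mean:.2f}, frequency {r_freq:.1%}")
```

The same diagnostic function applies unchanged to the model and to the reference data set, which is what makes routine model-observation comparison possible.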

  2. On-line Meteorology-Chemistry/Aerosols Modelling and Integration for Risk Assessment: Case Studies

    NASA Astrophysics Data System (ADS)

    Bostanbekov, Kairat; Mahura, Alexander; Nuterman, Roman; Nurseitov, Daniyar; Zakarin, Edige; Baklanov, Alexander

    2016-04-01

    On the regional level, and especially in areas with potentially diverse sources of industrial pollutants, the risk assessment of impact on environment and population is critically important. During normal operations, the risk is minimal. However, during accidental situations, the risk increases due to releases of harmful pollutants into different environments such as water, soil, and atmosphere, where they undergo continuous transformation and transport. In this study, the Enviro-HIRLAM (Environment High Resolution Limited Area Model) was adapted and employed to assess scenarios with accidental and continuous emissions of sulphur dioxide (SO2) for selected case studies during January 2010. The following scenarios were considered: (i) a control reference run; (ii) an accidental release (due to a short-term, one-day fire at an oil storage facility) at the city of Atyrau (Kazakhstan) near the northern part of the Caspian Sea; and (iii) a doubling of the original continuous emissions from three metallurgical enterprises on the Kola Peninsula (Russia). The implemented aerosol microphysics module M7 represents five aerosol types (sulphate, sea salt, dust, black carbon, and organic carbon) distributed over seven size modes. Removal processes for aerosols include gravitational settling and wet deposition. As Enviro-HIRLAM is an on-line integrated model, meteorological and chemical processes are modelled simultaneously at each time step. The modelled spatio-temporal variations of meteorological and chemical patterns are analyzed for both the European and Kazakhstan domains. The results of evaluating sulphur dioxide concentration and deposition over the main populated cities, selected regions, and countries are presented using GIS tools. As an outcome, the results of Enviro-HIRLAM modelling for the accidental release near the Caspian Sea are integrated into the RANDOM (Risk Assessment of Nature Detriment due to Oil spill Migration) system.

  3. Physical Conditions of Eta Car Complex Environment Revealed From Photoionization Modeling

    NASA Technical Reports Server (NTRS)

    Verner, E. M.; Bruhweiler, F.; Nielsen, K. E.; Gull, T.; Kober, G. Vieira; Corcoran, M.

    2006-01-01

    The very massive star Eta Carinae is enshrouded in an unusually complex environment of nebulosities and structures. The circumstellar gas gives rise to distinct absorption and emission components at different velocities and distances from the central source(s). Through photoionization modeling, we find that the radiation field from the more massive B-star companion supports the low-ionization structure throughout the 5.54-year period. The radiation field of an evolved O-star is required to produce the higher-ionization emission seen across the broad maximum. Our studies utilize the HST/STIS data and model calculations of various regimes, from doubly ionized species (T = 10,000 K) to the low-temperature (T = 760 K) conditions conducive to molecule formation (CH and OH). Overall analysis suggests high depletion of C and O and enrichment of He and N. The sharp molecular and ionic absorptions in this extensively CNO-processed material offer a unique environment for studying the chemistry, dust formation processes, and nucleosynthesis in the ejected layers of a highly evolved massive star.

  4. Gaussian process based modeling and experimental design for sensor calibration in drifting environments

    PubMed Central

    Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang

    2016-01-01

    It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
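A minimal sketch of a GP calibration model of this kind, assuming a squared-exponential kernel over a two-dimensional exposure condition (analyte concentration and temperature, standardized). The data, kernel settings, and noise level are illustrative, not those of the paper's chemiresistor study:

```python
import numpy as np

rng = np.random.default_rng(3)

def rbf(X1, X2, ls=1.0, var=1.0):
    """Squared-exponential kernel on standardized exposure conditions."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

# Calibration data: exposure conditions (concentration, temperature), both
# standardized; responses from a made-up sensor with small measurement noise.
X = rng.uniform(-1, 1, size=(30, 2))
y = np.sin(2 * X[:, 0]) + 0.3 * X[:, 1] + 0.05 * rng.standard_normal(30)

# Standard GP regression fit via a Cholesky factorization of the kernel matrix
noise = 0.05**2
K = rbf(X, X) + noise * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

def predict(Xs):
    """GP posterior mean and variance at new exposure conditions; the variance
    is what supports the uncertainty quantification mentioned in the abstract."""
    Ks = rbf(Xs, X)
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(rbf(Xs, Xs)) - (v**2).sum(0)
    return mean, var

mean, var = predict(np.array([[0.5, 0.0]]))
print(f"predicted response {mean[0]:.3f} +/- {np.sqrt(max(var[0], 0)):.3f}")
```

The predictive variance is also the quantity a batch-sequential design would use to decide where to sample the next calibration points.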

  5. Integration of the social environment in a mobility ontology for people with motor disabilities.

    PubMed

    Gharebaghi, Amin; Mostafavi, Mir-Abolfazl; Edwards, Geoffrey; Fougeyrollas, Patrick; Gamache, Stéphanie; Grenier, Yan

    2017-07-07

    Our contemporary understanding of disability is rooted in the idea that disability is the product of human-environment interaction processes. People may be functionally limited, but this becomes a disability only when they engage with their immediate social and physical environments. Any attempt to address issues of mobility in relation to people with disabilities should be grounded in an ontology that encompasses this understanding. The objective of this study is to provide a methodology to integrate the social and physical environments in the development of a mobility ontology for people with motor disabilities (PWMD). We propose to create subclasses of concepts based on a Nature-Development distinction rather than creating separate social and physical subclasses. This allows the relationships between social and physical elements to be modelled in a more compact and efficient way by specifying them locally within each entity, and better accommodates the complexities of the human-environment interaction as well. Based on this approach, an ontology for mobility of PWMD considering four main elements - the social and physical environmental factors, human factors, life habits related to mobility and possible goals of mobility - is presented. We demonstrate that employing the Nature-Development perspective facilitates the process of developing useful ontologies, especially for defining the relationships between the social and physical parts of the environment. This is a fundamental issue for modelling the interaction between humans and their social and physical environments for a broad range of applications, including the development of geospatial assistive technologies for navigation of PWMD. 
Implications for rehabilitation: The proposed perspective may have much broader interest beyond the issue of disability; much of the interesting dynamics in city development arises from the interaction between human-developed components (the built environment and its associated entities) and natural or organic components. The proposed approach facilitates the process of developing useful ontologies, especially for defining the relationships between the social and physical parts of the environment. This is a fundamental issue for modeling the interaction between humans, especially people with disabilities, and their social and physical environments in a broad range of domains and applications, such as Geographic Information Systems and the development of geospatial assistive technologies for the navigation of people with disabilities.

  6. Attribute classification for generating GPR facies models

    NASA Astrophysics Data System (ADS)

    Tronicke, Jens; Allroggen, Niklas

    2017-04-01

    Ground-penetrating radar (GPR) is an established geophysical tool for exploring near-surface sedimentary environments. It has been successfully used, for example, to reconstruct past depositional environments, to investigate sedimentary processes, to aid hydrogeological investigations, and to assist in hydrocarbon reservoir analog studies. Interpreting such 2D/3D GPR data usually relies on concepts known as GPR facies analysis, in which GPR facies are defined as units composed of characteristic reflection patterns (in terms of reflection amplitude, continuity, geometry, and internal configuration). The resulting facies models are then interpreted in terms of depositional processes, sedimentary environments, litho-, and hydrofacies. Typically, such GPR facies analyses are implemented as a manual workflow, which is laborious and rather inefficient, especially for 3D data sets. In addition, such a subjective strategy bears the potential for inconsistency because the outcome depends on the expertise and experience of the interpreter. In this presentation, we investigate the feasibility of delineating GPR facies in an objective and largely automated manner. Our proposed workflow relies on a three-step procedure. First, we calculate a variety of geometrical and physical attributes from processed 2D and 3D GPR data sets. Then, we analyze and evaluate this attribute data base (e.g., using statistical tools such as principal component analysis) to reduce its dimensionality and to avoid redundant information. Finally, we integrate the reduced data base using tools such as composite imaging, cluster analysis, and neural networks. Using field examples acquired across different depositional environments, we demonstrate that the resulting 2D/3D facies models ease and improve the interpretation of GPR data. We conclude that our interpretation strategy allows us to generate GPR facies models in a consistent and largely automated manner and might be helpful in a variety of near-surface applications.
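The three-step workflow (attribute calculation, dimensionality reduction, integration) can be sketched with synthetic data. The attribute values below, PCA via SVD, and a plain k-means clustering are hypothetical stand-ins for the paper's richer attribute set and integration tools:

```python
import numpy as np

rng = np.random.default_rng(4)

# Step 1 stand-in: a synthetic attribute table, n samples x 3 attributes
# (e.g., instantaneous amplitude, continuity, dip), drawn from two made-up
# facies with different statistics.
a = rng.normal([0.0, 0.0, 0.0], 0.5, size=(200, 3))
b = rng.normal([3.0, 1.0, -2.0], 0.5, size=(200, 3))
attrs = np.vstack([a, b])

# Step 2: PCA (via SVD of the centered table) to reduce attribute redundancy
X = attrs - attrs.mean(0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T  # keep the two leading principal components

# Step 3: simple k-means clustering of the PCA scores into facies
def kmeans(X, k=2, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == i].mean(0) if np.any(labels == i)
                            else centers[i] for i in range(k)])
    return labels

labels = kmeans(scores)
print("cluster sizes:", np.bincount(labels, minlength=2))
```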

  7. Electromagnetic field strength prediction in an urban environment: A useful tool for the planning of LMSS

    NASA Technical Reports Server (NTRS)

    Vandooren, G. A. J.; Herben, M. H. A. J.; Brussaard, G.; Sforza, M.; Poiaresbaptista, J. P. V.

    1993-01-01

    A model for the prediction of the electromagnetic field strength in an urban environment is presented. The ray model, which is based on the Uniform Theory of Diffraction (UTD), includes effects of the non-perfect conductivity of the obstacles and of their surface roughness. The urban environment is transformed into a list of standardized obstacles that have various shapes and material properties. The model is capable of accurately predicting the field strength in the urban environment by calculating different types of wave contributions such as reflected, edge-diffracted, and corner-diffracted waves, and combinations thereof. Also, antenna weight functions are introduced to simulate the spatial filtering by the mobile antenna. Communication channel parameters such as signal fading, time delay profiles, Doppler shifts, and delay-Doppler spectra can be derived from the ray-tracing procedure using post-processing routines. The model has been tested against results from scaled measurements at 50 GHz and proves to be accurate.

  8. Teaching Harmonic Motion in Trigonometry: Inductive Inquiry Supported by Physics Simulations

    ERIC Educational Resources Information Center

    Sokolowski, Andrzej; Rackley, Robin

    2011-01-01

    In this article, the authors present a lesson whose goal is to utilise a scientific environment to immerse a trigonometry student in the process of mathematical modelling. The scientific environment utilised during this activity is a physics simulation called "Wave on a String" created by the PhET Interactive Simulations Project at…

  9. Criteria for the Development of Complex Teaching-Learning Environments.

    ERIC Educational Resources Information Center

    Achtenhagen, Frank

    2001-01-01

    Relates aspects of the didactic tradition, especially the German didactic tradition, to the theory and practice of instructional design. Focuses on processes that are necessary to the modeling of reality and describes the design and development of a virtual enterprise as a complex teaching-learning environment in a German business school.…

  10. Some Technical Implications of Distributed Cognition on the Design on Interactive Learning Environments.

    ERIC Educational Resources Information Center

    Dillenbourg, Pierre

    1996-01-01

    Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…

  11. A Model for Field Deployment of Wireless Sensor Networks (WSNs) within the Domain of Microclimate Habitat Monitoring

    ERIC Educational Resources Information Center

    Sanborn, Mark

    2011-01-01

    Wireless sensor networks (WSNs) represent a class of miniaturized information systems designed to monitor physical environments. These smart monitoring systems form collaborative networks utilizing autonomous sensing, data-collection, and processing to provide real-time analytics of observed environments. As a fundamental research area in…

  12. Problem-Based Educational Game Becomes Student-Centered Learning Environment

    ERIC Educational Resources Information Center

    Rodkroh, Pornpimon; Suwannatthachote, Praweenya; Kaemkate, Wannee

    2013-01-01

    Problem-based educational games are able to provide a fun and motivating environment for teaching and learning of certain subjects. However, most educational game models do not address the learning elements of problem-based educational games. This study aims to synthesize and to propose the important elements to facilitate the learning process and…

  13. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulation, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  14. Research on Modeling Technology of Virtual Robot Based on LabVIEW

    NASA Astrophysics Data System (ADS)

Wang, Z.; Huo, J. L.; Sun, L. Y.; Hao, X. Y.

    2017-12-01

Because of the dangerous working environment, the underwater operation robot for nuclear power stations requires manual teleoperation, and the robot's position and orientation must be guided in real time during operation. In this paper, geometric modeling of the virtual robot and its working environment is accomplished with SolidWorks software, realizing accurate modeling and assembly of the robot. LabVIEW software is used to read the model, forward and inverse kinematics models of the manipulator are established, and hierarchical modeling of the virtual robot and computer-graphics modeling are realized. Experimental results show that the method studied in this paper can be successfully applied to a robot control system.
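The paper's kinematics are implemented in LabVIEW; as an illustration of what forward and inverse kinematics models involve, here is a minimal sketch for a hypothetical two-link planar arm in Python (the link lengths and angles are invented, not taken from the paper):

```python
import math

def forward_kinematics(theta1, theta2, l1=0.4, l2=0.3):
    """End-effector (x, y) of a planar two-link arm (link lengths in metres)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=0.4, l2=0.3):
    """Elbow-down joint angles reaching (x, y); raises ValueError if unreachable."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target outside workspace")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Round trip: joint angles -> pose -> recovered joint angles -> same pose
x, y = forward_kinematics(0.5, 0.8)
t1, t2 = inverse_kinematics(x, y)
print(all(abs(a - b) < 1e-9 for a, b in zip((x, y), forward_kinematics(t1, t2))))
```

A real teleoperation guide would layer joint limits and singularity handling on top of this round trip, but the two functions above are the core of any manipulator FK/IK pair.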

  15. NATO/CCMS PILOT STUDY - CLEAN PRODUCTS AND PROCESSES

    EPA Science Inventory

    The proposed objective of the NATO/CCMS Pilot on clean products and processes is to facilitate further gains in pollution prevention, waste minimization, and design for the environment. It is anticipated that the free exchange of knowledge, experience, data, and models will fost...

  16. Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach

    DTIC Science & Technology

    2008-06-01

    develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the

  17. Model systems for leather research and beyond

    USDA-ARS?s Scientific Manuscript database

    Animal hides and skins, the most valuable byproducts of the meat industry, are raw material for the leather, biomaterials, gelatin and glue industries. Each of these industries modifies its processing methods as concerns over safety, the environment or economics arise. Processing changes are general...

  18. Extending the granularity of representation and control for the MIL-STD CAIS 1.0 node model

    NASA Technical Reports Server (NTRS)

    Rogers, Kathy L.

    1986-01-01

The Common APSE (Ada Program Support Environment) Interface Set (CAIS) (DoD85) node model provides an excellent baseline for interfaces in a single-host development environment. To encompass the entire spectrum of computing, however, the CAIS model should be extended in four areas. It should provide the interface between the engineering workstation and the host system throughout the entire lifecycle of the system. It should provide a basis for communication and integration functions needed by distributed host environments. It should provide common interfaces for communications mechanisms to and among target processors. It should provide facilities for integration, validation, and verification of test beds extending to distributed systems on geographically separate processors with heterogeneous instruction set architectures (ISAs). Additions to the PROCESS NODE model to extend the CAIS into these four areas are proposed.

  19. The Methodology of Interactive Parametric Modelling of Construction Site Facilities in BIM Environment

    NASA Astrophysics Data System (ADS)

    Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana

    2014-11-01

Information technology is becoming a strong tool in different industries, including construction. The recent trend in building design is leading to the creation of the most comprehensive virtual building model possible (the Building Information Model) in order to solve all the problems relating to the project as early as the design phase. Building information modelling is a new way of approaching the design of building project documentation. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recently, research on designing the construction process conditions has centred on improving general practice in planning and on new approaches to construction site layout planning. The state of the art in designing the construction process conditions indicates an unexplored problem related to connecting a knowledge system with construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present the methodology for execution of a 3D construction site facility allocation model (3D CSF-IAM), based on principles of parametric and interactive modelling.

  20. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.
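The Hodgkin-Huxley-type models mentioned above can be sketched in a few lines; the forward-Euler integration, step size, and standard squid-axon parameters below are illustrative choices, not anything prescribed by the review:

```python
import numpy as np

def hh_step(V, m, h, n, I_ext, dt):
    """One forward-Euler step of the classic Hodgkin-Huxley equations
    (V in mV, time in ms, currents in uA/cm^2, Cm = 1 uF/cm^2)."""
    # Voltage-dependent gate rate constants (ms^-1)
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    # Macroscopic ionic currents: sodium, potassium, leak
    I_Na = 120.0 * m**3 * h * (V - 50.0)
    I_K = 36.0 * n**4 * (V + 77.0)
    I_L = 0.3 * (V + 54.387)
    dV = I_ext - I_Na - I_K - I_L
    return (V + dt * dV,
            m + dt * (am * (1 - m) - bm * m),
            h + dt * (ah * (1 - h) - bh * h),
            n + dt * (an * (1 - n) - bn * n))

# Integrate 50 ms with no injected current: V should settle near rest (~-65 mV)
V, m, h, n = -65.0, 0.05, 0.6, 0.32
dt = 0.01
for _ in range(int(50 / dt)):
    V, m, h, n = hh_step(V, m, h, n, 0.0, dt)
print(round(V, 1))
```

Injecting a suprathreshold `I_ext` (e.g. 10 uA/cm^2) would make this same loop produce the familiar action-potential spike train.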

  1. A Framework for Applying Point Clouds Grabbed by Multi-Beam LIDAR in Perceiving the Driving Environment

    PubMed Central

    Liu, Jian; Liang, Huawei; Wang, Zhiling; Chen, Xiangcheng

    2015-01-01

    The quick and accurate understanding of the ambient environment, which is composed of road curbs, vehicles, pedestrians, etc., is critical for developing intelligent vehicles. The road elements included in this work are road curbs and dynamic road obstacles that directly affect the drivable area. A framework for the online modeling of the driving environment using a multi-beam LIDAR, i.e., a Velodyne HDL-64E LIDAR, which describes the 3D environment in the form of a point cloud, is reported in this article. First, ground segmentation is performed via multi-feature extraction of the raw data grabbed by the Velodyne LIDAR to satisfy the requirement of online environment modeling. Curbs and dynamic road obstacles are detected and tracked in different manners. Curves are fitted for curb points, and points are clustered into bundles whose form and kinematics parameters are calculated. The Kalman filter is used to track dynamic obstacles, whereas the snake model is employed for curbs. Results indicate that the proposed framework is robust under various environments and satisfies the requirements for online processing. PMID:26404290
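The Kalman filter used for dynamic obstacles can be illustrated with a minimal constant-velocity tracker; the 1-D state model, noise covariances, and frame rate below are hypothetical stand-ins, not the paper's actual parameters:

```python
import numpy as np

# 1-D constant-velocity Kalman filter, of the kind commonly used to track
# clustered obstacle centroids from LIDAR frame to frame.
dt = 0.1                                   # assumed frame period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition over (pos, vel)
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 0.01 * np.eye(2)                       # process noise covariance
R = np.array([[0.25]])                     # measurement noise covariance (m^2)

x = np.array([[0.0], [0.0]])               # initial state estimate
P = np.eye(2)                              # initial covariance

rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 2.0              # obstacle moving at 2 m/s
for _ in range(100):
    true_pos += true_vel * dt
    z = np.array([[true_pos + rng.normal(0, 0.5)]])   # noisy detection
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new detection
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(round(float(x[0, 0]), 2), round(float(x[1, 0]), 2))
```

The real system applies the same predict/update cycle per tracked cluster, in 2-D and with data association; the snake model handles the curbs separately because they are extended contours rather than point targets.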

  2. Mathematical modeling in biological populations through branching processes. Application to salmonid populations.

    PubMed

    Molina, Manuel; Mota, Manuel; Ramos, Alfonso

    2015-01-01

    This work deals with mathematical modeling through branching processes. We consider sexually reproducing animal populations where, in each generation, the number of progenitor couples is determined in a non-predictable environment. By using a class of two-sex branching processes, we describe their demographic dynamics and provide several probabilistic and inferential contributions. They include results about the extinction of the population and the estimation of the offspring distribution and its main moments. We also present an application to salmonid populations.
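A toy version of such a two-sex branching process can be simulated directly. The Poisson offspring law, the 1/2 sex ratio, and the perfect-fidelity mating function L(f, m) = min(f, m) below are illustrative assumptions, not the authors' exact model:

```python
import random

def poisson(rng, lam):
    """Knuth's algorithm for a Poisson variate."""
    limit, k, p = pow(2.718281828459045, -lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate(couples0, mean_offspring, generations=50, rng=None):
    """One trajectory: each couple leaves a Poisson number of children,
    each child is female with probability 1/2, and the next generation's
    couples are formed by the mating function min(females, males)."""
    rng = rng or random.Random()
    couples = couples0
    for _ in range(generations):
        if couples == 0:
            break
        children = sum(poisson(rng, mean_offspring) for _ in range(couples))
        females = sum(1 for _ in range(children) if rng.random() < 0.5)
        couples = min(females, children - females)
    return couples

# Subcritical regime: 1.5 children per couple on average, so roughly 0.75
# prospective females per couple -- extinction is (almost) certain.
rng = random.Random(42)
runs = 200
extinct = sum(simulate(5, 1.5, rng=rng) == 0 for _ in range(runs))
print(extinct / runs)
```

Estimating the offspring mean from simulated (or observed) generation counts is exactly the kind of inferential question the paper addresses with these processes.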

  3. Process-based modelling of the nutritive value of forages: a review

    USDA-ARS?s Scientific Manuscript database

    Modelling sward nutritional value (NV) is of particular importance to understand the interactions between grasslands, livestock production, environment and climate-related impacts. Variables describing nutritive value vary significantly between ruminant production systems, but two types are commonly...

  4. Modeling Standards of Care for an Online Environment

    PubMed Central

    Jones-Schenk, Jan; Rossi, Julia

    1998-01-01

    At Intermountain Health Care in Salt Lake City a team was created to develop core standards for clinical practice that would enhance consistency of care across the care continuum. The newly developed Standards of Care had to meet the following criteria: electronic delivery, research-based, and support an interdisciplinary care environment along with an exception-based documentation system. The process has slowly evolved and the team has grown to include clinicians from multiple sites and disciplines who have met on a regular basis for over a year. The first challenge was to develop a model for the standards of care that would be suitable for an online environment.

  5. The component alignment model: a new approach to health care information technology strategic planning.

    PubMed

    Martin, J B; Wilkins, A S; Stawski, S K

    1998-08-01

    The evolving health care environment demands that health care organizations fully utilize information technologies (ITs). The effective deployment of IT requires the development and implementation of a comprehensive IT strategic plan. A number of approaches to health care IT strategic planning exist, but they are outdated or incomplete. The component alignment model (CAM) introduced here recognizes the complexity of today's health care environment, emphasizing continuous assessment and realignment of seven basic components: external environment, emerging ITs, organizational infrastructure, mission, IT infrastructure, business strategy, and IT strategy. The article provides a framework by which health care organizations can develop an effective IT strategic planning process.

  6. Social complexity beliefs predict posttraumatic growth in survivors of a natural disaster.

    PubMed

    Nalipay, Ma Jenina N; Bernardo, Allan B I; Mordeno, Imelu G

    2016-09-01

Most studies on posttraumatic growth (PTG) have focused on personal characteristics, interpersonal resources, and the immediate environment. There has been less attention to the dynamic internal processes related to the development of PTG and to how these processes are affected by the broader culture. Calhoun and Tedeschi's (2006) model suggests a role of distal culture in PTG development, but empirical investigations on that point are limited. The present study investigated the role of social complexity, the generalized belief about changing social environments and the inconsistency of human behavior, as a predictor of PTG. Social complexity was hypothesized to be associated with problem-solving approaches that are likely to give rise to cognitive processes that promote PTG. A sample of 446 survivors of Typhoon Haiyan, one of the strongest typhoons ever recorded at the time, answered self-report measures of social complexity, cognitive processing of trauma, and PTG. Structural equation modeling indicated a good fit between the data and the hypothesized model; belief in social complexity predicted stronger PTG, mediated by cognitive processing. The results provide evidence for how disaster survivors' beliefs about the changing nature of social environments and their corresponding behavior changes predict PTG, and suggest a psychological mechanism for how distal culture can influence PTG. Thus, assessing social complexity beliefs during the early phases of a postdisaster psychosocial intervention may provide useful information on who is likely to experience PTG. Trauma workers might consider culture-specific social themes related to social complexity in disaster-affected communities. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. The GP problem: quantifying gene-to-phenotype relationships.

    PubMed

    Cooper, Mark; Chapman, Scott C; Podlich, Dean W; Hammer, Graeme L

    2002-01-01

    In this paper we refer to the gene-to-phenotype modeling challenge as the GP problem. Integrating information across levels of organization within a genotype-environment system is a major challenge in computational biology. However, resolving the GP problem is a fundamental requirement if we are to understand and predict phenotypes given knowledge of the genome and model dynamic properties of biological systems. Organisms are consequences of this integration, and it is a major property of biological systems that underlies the responses we observe. We discuss the E(NK) model as a framework for investigation of the GP problem and the prediction of system properties at different levels of organization. We apply this quantitative framework to an investigation of the processes involved in genetic improvement of plants for agriculture. In our analysis, N genes determine the genetic variation for a set of traits that are responsible for plant adaptation to E environment-types within a target population of environments. The N genes can interact in epistatic NK gene-networks through the way that they influence plant growth and development processes within a dynamic crop growth model. We use a sorghum crop growth model, available within the APSIM agricultural production systems simulation model, to integrate the gene-environment interactions that occur during growth and development and to predict genotype-to-phenotype relationships for a given E(NK) model. Directional selection is then applied to the population of genotypes, based on their predicted phenotypes, to simulate the dynamic aspects of genetic improvement by a plant-breeding program. The outcomes of the simulated breeding are evaluated across cycles of selection in terms of the changes in allele frequencies for the N genes and the genotypic and phenotypic values of the populations of genotypes.
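The NK idea, each locus's fitness contribution depending on its own allele and those of K neighbours, can be sketched without any of the crop-model machinery. The circular neighbourhood and seeded table lookup below are illustrative choices, not the E(NK) implementation used with APSIM:

```python
import random

def nk_fitness(genotype, k, landscape_seed=0):
    """Fitness of a binary genotype under a minimal NK model: each locus
    contributes a uniform(0, 1) value determined by its own allele and the
    alleles of its k right-hand neighbours (circular neighbourhood), and
    overall fitness is the mean contribution across the N loci."""
    n = len(genotype)
    total = 0.0
    for i in range(n):
        # The allele pattern at locus i and its k neighbours is the table key
        key = tuple(genotype[(i + j) % n] for j in range(k + 1))
        # One fixed uniform(0,1) table entry per (landscape, locus, pattern),
        # realized lazily via a seeded generator rather than a stored table
        total += random.Random(hash((landscape_seed, i, key))).random()
    return total / n

g = [1, 0, 1, 1, 0, 0, 1, 0]
f = nk_fitness(g, k=2)
print(0.0 <= f <= 1.0)
```

With k = 0 every locus contributes independently (a smooth, additive landscape); increasing k introduces epistasis, so that flipping one allele perturbs k + 1 contributions at once, which is what makes selection response in such networks hard to predict.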

  8. C-C1-04: Building a Health Services Information Technology Research Environment

    PubMed Central

    Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F

    2010-01-01

    Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/ data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. 
In parallel, the extant oversight process for approving/managing access to internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes to sustain an environment for rapid product development and testing.

  9. MASCARET: creating virtual learning environments from system modelling

    NASA Astrophysics Data System (ADS)

    Querrec, Ronan; Vallejo, Paola; Buche, Cédric

    2013-03-01

The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise, that is, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. To ensure that the meta-model is capable of describing, representing, and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.

  10. Modeling and optimum time performance for concurrent processing

    NASA Technical Reports Server (NTRS)

    Mielke, Roland R.; Stoughton, John W.; Som, Sukhamoy

    1988-01-01

    The development of a new graph theoretic model for describing the relation between a decomposed algorithm and its execution in a data flow environment is presented. Called ATAMM, the model consists of a set of Petri net marked graphs useful for representing decision-free algorithms having large-grained, computationally complex primitive operations. Performance time measures which determine computing speed and throughput capacity are defined, and the ATAMM model is used to develop lower bounds for these times. A concurrent processing operating strategy for achieving optimum time performance is presented and illustrated by example.

  11. Knowledge sifters in MDA technologies

    NASA Astrophysics Data System (ADS)

    Kravchenko, Yuri; Kursitys, Ilona; Bova, Victoria

    2018-05-01

The article considers a new approach to efficient management of information processes on the basis of object models. With the help of special design tools, a generic, application-independent model is created, and the program is then implemented in a specific development environment. The development process is based entirely on a model that must contain all the information necessary for programming. A sufficiently detailed model enables the automatic creation of those typical parts of the application whose development is amenable to automation.

  12. Hbim to VR. Semantic Awareness and Data Enrichment Interoperability for Parametric Libraries of Historical Architecture

    NASA Astrophysics Data System (ADS)

    Quattrini, R.; Battini, C.; Mammoli, R.

    2018-05-01

HBIM models that are rich in geometric and informative terms have recently become increasingly available. However, there is still a lack of research implementing dedicated libraries for the architectural heritage that are semantically aware and based on parametric intelligence. Additional challenges come from their portability to non-desktop environments (such as VR). This article demonstrates the validity of a workflow applied to the architectural heritage which, starting from semantic modeling, reaches visualization in a virtual reality environment, passing through the necessary phases of export, data migration, and management. The three-dimensional modeling of the classical Doric order takes place in the BIM work environment and is the necessary starting point for the implementation of data and parametric intelligence and for the definition of ontologies that exclusively qualify the model. The study also enables an effective method for data migration from the BIM model to databases integrated into VR technologies for architectural heritage (AH). Furthermore, the process proposes a methodology, applicable on a return path, suited to achieving appropriate data enrichment of each model and to enabling interaction with the model in a VR environment.

  13. PALM-USM v1.0: A new urban surface model integrated into the PALM large-eddy simulation model

    NASA Astrophysics Data System (ADS)

    Resler, Jaroslav; Krč, Pavel; Belda, Michal; Juruš, Pavel; Benešová, Nina; Lopata, Jan; Vlček, Ondřej; Damašková, Daša; Eben, Kryštof; Derbek, Přemysl; Maronga, Björn; Kanani-Sühring, Farah

    2017-10-01

    Urban areas are an important part of the climate system and many aspects of urban climate have direct effects on human health and living conditions. This implies that reliable tools for local urban climate studies supporting sustainable urban planning are needed. However, a realistic implementation of urban canopy processes still poses a serious challenge for weather and climate modelling for the current generation of numerical models. To address this demand, a new urban surface model (USM), describing the surface energy processes for urban environments, was developed and integrated as a module into the PALM large-eddy simulation model. The development of the presented first version of the USM originated from modelling the urban heat island during summer heat wave episodes and thus implements primarily processes important in such conditions. The USM contains a multi-reflection radiation model for shortwave and longwave radiation with an integrated model of absorption of radiation by resolved plant canopy (i.e. trees, shrubs). Furthermore, it consists of an energy balance solver for horizontal and vertical impervious surfaces, and thermal diffusion in ground, wall, and roof materials, and it includes a simple model for the consideration of anthropogenic heat sources. The USM was parallelized using the standard Message Passing Interface and performance testing demonstrates that the computational costs of the USM are reasonable on typical clusters for the tested configurations. The module was fully integrated into PALM and is available via its online repository under the GNU General Public License (GPL). The USM was tested on a summer heat-wave episode for a selected Prague crossroads. The general representation of the urban boundary layer and patterns of surface temperatures of various surface types (walls, pavement) are in good agreement with in situ observations made in Prague. 
Additional simulations were performed in order to assess the sensitivity of the results to uncertainties in the material parameters, the domain size, and the general effect of the USM itself. The first version of the USM is limited to the processes most relevant to the study of summer heat waves and serves as a basis for ongoing development which will address additional processes of the urban environment and lead to improvements to extend the utilization of the USM to other environments and conditions.

  14. Effects of Electrostatic Environment on Charged Particle Transport near Lunar Holes

    NASA Astrophysics Data System (ADS)

    Miyake, Y.; Nishino, M. N.

    2017-12-01

The Moon has neither a dense atmosphere nor an intrinsic magnetic field, and solar wind interaction with the lunar surface is one of the major plasma processes there. The near-surface, dayside electrostatic environment is governed mainly by the volume charges of solar wind plasma and photoelectrons, as well as by charged lunar surfaces. In fact, the electric environment depends strongly on surface topology, which produces shaded regions whose electric environment can be very different from that of sunlit regions. Among the high-profile terrains on the Moon, we have been focusing on the lunar vertical holes (or lunar pits) identified by the KAGUYA satellite and the Lunar Reconnaissance Orbiter. To model the distinctive electric and dust environments near the holes, we have started a three-dimensional particle simulation analysis. The present study addresses the plasma environment of a lunar hole accompanied by a subsurface cavern. Besides the topographical effect of having a cavern, the investigation focuses on the following points. The first is how deeply solar wind protons can penetrate into the hole and cavern. This point is relevant not only to the electric environment but also to the possible existence of volatiles in permanently shaded regions of the hole. To examine this possibility, we implemented a proton scattering process at lunar surfaces in the simulation model. The other is the role of minor current components, such as secondary electrons, scattered protons, and charged dust grains, at the lunar surface. Such minor currents become important for the charging of shaded surfaces, because the major current components (solar wind plasma and photoelectrons) cannot reach them. We address these points based on kinetic model descriptions.

  15. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
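The Monte Carlo approach to patient wait-time questions can be illustrated with Lindley's recursion for a single-server clinic; the exponential distributions and the specific service/arrival parameters below are hypothetical, not drawn from the article:

```python
import random

def mean_wait(arrival_mean, service_mean, patients=10_000, seed=1):
    """Monte Carlo estimate of the average queue wait (minutes) at a
    single-server clinic, via Lindley's recursion:
    W[n+1] = max(0, W[n] + S[n] - A[n+1])."""
    rng = random.Random(seed)
    wait, total = 0.0, 0.0
    for _ in range(patients):
        total += wait
        service = rng.expovariate(1.0 / service_mean)   # this patient's service
        gap = rng.expovariate(1.0 / arrival_mean)       # time to next arrival
        wait = max(0.0, wait + service - gap)
    return total / patients

# 12-minute average service, a patient every 15 minutes (utilization 0.8)
w = mean_wait(arrival_mean=15.0, service_mean=12.0)
print(round(w, 1))
```

Swapping the exponential draws for empirical distributions fitted to clinic data is precisely what the spreadsheet add-ins and discrete event packages described above automate, along with resources, routing, and schedules.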

  16. Modeling of impulsive propellant reorientation

    NASA Technical Reports Server (NTRS)

    Hochstein, John I.; Patag, Alfredo E.; Chato, David J.

    1988-01-01

The impulsive propellant reorientation process is modeled using the Energy Calculations for Liquid Propellants in a Space Environment (ECLIPSE) code. A brief description of the process and the computational model is presented. Code validation is documented via comparison to experimentally derived data for small scale tanks. Predictions of reorientation performance are presented for two tanks designed for use in flight experiments and for a proposed full scale OTV tank. A new dimensionless parameter is developed to correlate reorientation performance in geometrically similar tanks. Its success is demonstrated.

  17. Life's role in environmental regulation

    NASA Astrophysics Data System (ADS)

    Kump, L. R.

    2016-12-01

The fusion of geological and biological perspectives on the operation of the Earth system is revolutionizing the way we think about the interactions of life and environment. No longer does life simply adapt to environmental change; those adaptations in turn modify the environment. Emerging from these interactions is the possibility of environmental regulation, the essence of Lovelock's Gaia Hypothesis. The long-term carbon cycle, for example, reflects a balance between the sources and sinks of carbon including volcanism, weathering of rocks exposed subaerially or on the seafloor, carbonate mineral formation, and the burial of organic carbon. The traditional view of these processes limits biological influences to the production and remineralization of organic matter and the formation of mineral skeletons. With the geobiological revolution we now also recognize the important role biological activity plays in accelerating weathering processes. Weathering rates depend on a variety of factors that we represent in numerical models with rate laws we adapt from inorganic chemistry. These can be characterized as zero-order (independent), first-order (linear), etc., and these functions are all monotonic. Yet one of the hallmark features of life is that it responds to changes in its environment parabolically: rates of physiological processes exhibit minima, optima, and maxima with respect to environment variables (temperature, pH, salinity, pO2, pCO2, ...). Incorporation of physiological-style rate laws, and in general the explicit representation of life in models of Earth surface processes, demonstrates how the biota influence environmental stability on geologic time scales.
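The contrast between monotonic inorganic rate laws and parabolic physiological responses can be made concrete; the Gaussian thermal-optimum form and all parameter values below are one common idealization, not a claim about the author's models:

```python
import math

def first_order(conc, k=0.02):
    """Abiotic, monotonic rate law: rate grows linearly with concentration."""
    return k * conc

def physiological(temp_c, r_max=1.0, t_opt=25.0, width=8.0):
    """Hypothetical biotic rate law with a thermal optimum: the rate rises to
    a maximum at t_opt and falls off on either side (Gaussian response)."""
    return r_max * math.exp(-((temp_c - t_opt) / width) ** 2)

# The abiotic law only ever increases; the biotic law peaks at the optimum
print(physiological(25.0) > physiological(5.0),
      physiological(25.0) > physiological(45.0))
```

Feedback behavior differs fundamentally between the two forms: a monotonic rate law pushes the system in one direction, while an optimum-shaped law can stabilize the environment near the optimum and destabilize it beyond, which is why the choice matters for regulation on geologic time scales.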

  18. The Spatial-Temporal Analysis of Ecological Environment of Red Bed Hills in East Sichuan - Taking LU County as a Case

    NASA Astrophysics Data System (ADS)

    Liu, H.; Liu, Y.; Wang, X.; Liu, J.

    2018-04-01

A good ecological environment is the foundation of human existence and development; social and economic development must be premised on maintaining the stability and balance of the ecological environment. RS and GIS technology are used in this paper, taking the red-bed hills of Sichuan Province (Lu County) as an example. According to the ecological environment characteristics of the study area and the principles for choosing evaluation indices, this paper selects six evaluation indicators (elevation, slope, aspect, vegetation cover, land use, gully density) to establish an evaluation index system for the ecological environment of Lu County. The weight of each evaluation index is determined by AHP (Analytic Hierarchy Process), and a comprehensive evaluation model is established by the weighted comprehensive evaluation method. This model is used to classify the ecological environment quality of Lu County into five grades (excellent, good, middle, poor, and worse) and to analyze the change in the ecological environment of Lu County over the past ten years.
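AHP weight derivation from a pairwise-comparison matrix can be sketched with the row geometric-mean approximation (the principal-eigenvector method is the classical alternative); the comparison values below are invented for illustration, not the paper's:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise-comparison
    matrix using the row geometric-mean method: take each row's geometric
    mean, then normalize so the weights sum to 1."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    s = sum(gmeans)
    return [g / s for g in gmeans]

# Hypothetical 3-criterion comparison (say, slope vs land use vs gully
# density): slope judged 3x as important as land use, 5x as gully density.
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w], round(sum(w), 6))
```

The weighted comprehensive evaluation then scores each map cell as the weight-sum of its normalized indicator values, and the scores are binned into the five grades.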

  19. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Softwaremore » tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. 
This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  20. Space Shuttle Main Engine Low Pressure Oxidizer Turbo-Pump Inducer Dynamic Environment Characterization through Water Model and Hot-Fire Testing

    NASA Technical Reports Server (NTRS)

    Arellano, Patrick; Patton, Marc; Schwartz, Alan; Stanton, David

    2006-01-01

    The Low Pressure Oxidizer Turbopump (LPOTP) inducer on the Block II configuration Space Shuttle Main Engine (SSME) experienced blade leading edge ripples during hot firing. This undesirable condition led to a minor redesign of the inducer blades. This resulted in the need to evaluate the performance and the dynamic environment of the redesign, relative to the current configuration, as part of the design acceptance process. Sub-scale water model tests of the two inducer configurations were performed, with emphasis on the dynamic environment due to cavitation induced vibrations. Water model tests were performed over a wide range of inlet flow coefficient and pressure conditions, representative of the scaled operating envelope of the Block II SSME, both in flight and in ground hot-fire tests, including all power levels. The water test hardware, facility set-up, type and placement of instrumentation, the scope of the test program, specific test objectives, data evaluation process and water test results that characterize and compare the two SSME LPOTP inducers are discussed. In addition, dynamic characteristics of the two water models were compared to hot fire data from specially instrumented ground tests. In general, good agreement between the water model and hot fire data was found, which confirms the value of water model testing for dynamic characterization of rocket engine turbomachinery.

  1. Prototype of an Integrated Hurricane Information System for Research: Description and Illustration of its Use in Evaluating WRF Model Simulations

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.

    2007-12-01

In spite of recent improvements in hurricane track forecast accuracy, there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, the ocean heat content and the aerosol loading of the environment. Our goal was to create a one-stop place that provides researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized to make it easy to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation, and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of the hurricane models, in the systematic understanding of their sensitivities and in the improvement of the physical parameterizations employed by the models. Furthermore, it will help in studying the physical processes that affect hurricane development and impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. Furthermore, we will use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  2. Modeling of space environment impact on nanostructured materials. General principles

    NASA Astrophysics Data System (ADS)

    Voronina, Ekaterina; Novikov, Lev

    2016-07-01

In accordance with the resolution of the ISO TC20/SC14 WG4/WG6 joint meeting, a Technical Specification (TS), 'Modeling of space environment impact on nanostructured materials. General principles', which describes computer simulation methods of space environment impact on nanostructured materials, is being prepared. Nanomaterials surpass traditional materials for space applications in many aspects due to their unique properties associated with the nanoscale size of their constituents. This superiority in mechanical, thermal, electrical and optical properties will evidently inspire a wide range of applications in the next generation of spacecraft intended for long-term (~15-20 years) operation in near-Earth orbits and for automatic and manned interplanetary missions. Currently, ISO activity on developing standards concerning different issues of nanomaterials manufacturing and applications is high. Most such standards are related to production and characterization of nanostructures; however, there are no ISO documents concerning nanomaterials behavior in different environmental conditions, including the space environment. The given TS deals with the peculiarities of the space environment impact on nanostructured materials (i.e. materials with structured objects whose size in at least one dimension lies within 1-100 nm). The basic purpose of the document is a general description of the methodology of applying computer simulation methods, which relate to different space and time scales, to modeling processes occurring in nanostructured materials under space environment impact. This document will emphasize the necessity of applying a multiscale simulation approach and present recommendations for the choice of the most appropriate method (or group of methods) for computer modeling of various processes that can occur in nanostructured materials under the influence of different space environment components. 
In addition, the TS includes a description of possible approximations and limitations of the proposed simulation methods, as well as of widely used software codes. This TS may be used as a basis for developing a new standard devoted to nanomaterials applications for spacecraft.

  3. A distributed Clips implementation: dClips

    NASA Technical Reports Server (NTRS)

    Li, Y. Philip

    1993-01-01

    A distributed version of the Clips language, dClips, was implemented on top of two existing generic distributed messaging systems to show that: (1) it is easy to create a coarse-grained parallel programming environment out of an existing language if a high level messaging system is used; and (2) the computing model of a parallel programming environment can be changed easily if we change the underlying messaging system. dClips processes were first connected with a simple master-slave model. A client-server model with intercommunicating agents was later implemented. The concept of service broker is being investigated.
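The master-slave computing model described above can be illustrated with a minimal messaging sketch. The Python snippet below is purely hypothetical (dClips was built on Clips and dedicated messaging systems, not on Python threads); a task queue and a result queue stand in for the high-level messaging layer:

```python
import queue
import threading

def slave(tasks, results):
    """Slave: repeatedly take a task from the master and send back a result."""
    while True:
        expr = tasks.get()
        if expr is None:                 # sentinel: no more work
            break
        results.put((expr, eval(expr)))  # stand-in for evaluating a Clips rule

def master(exprs, n_slaves=2):
    """Master: distribute tasks to the slaves, then gather every result."""
    tasks, results = queue.Queue(), queue.Queue()
    workers = [threading.Thread(target=slave, args=(tasks, results))
               for _ in range(n_slaves)]
    for w in workers:
        w.start()
    for e in exprs:
        tasks.put(e)
    for _ in workers:                    # one stop sentinel per slave
        tasks.put(None)
    out = dict(results.get() for _ in exprs)
    for w in workers:
        w.join()
    return out
```

Replacing the shared queue pair with peer-to-peer channels between processes would turn the same skeleton into the client-server model with intercommunicating agents mentioned in the abstract, which is exactly the kind of change of computing model the paper attributes to swapping the underlying messaging system.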

  4. Probabilistic Verification of Multi-Robot Missions in Uncertain Environments

    DTIC Science & Technology

    2015-11-01

has been used to measure the environment, including any dynamic obstacles. However, no matter how the model originates, this approach is based on...modeled as bivariate Gaussian distributions and estimated by calibration measurements. The Robot process model is described in prior work [13...

5. Petrological Analysis of Astrophysical Dust Analog Evolution

    NASA Technical Reports Server (NTRS)

    Rietmeijer, Frans J. M.

    1997-01-01

This project, "Petrological analysis of astrophysical dust analog evolution", was initiated to try to understand vapor phase condensation, and the nature of the reaction products, in circumstellar environments, such as the solar nebula 4,500 Myrs ago, and in the interstellar medium. Telescope-based infrared [IR] spectroscopy offers a broad-scale inventory of the various types of dust in these environments but no details on small-scale variations in terms of chemistry, morphology and petrological phase relationships. Vapor phase condensation in these environments is almost certainly a non-equilibrium process. The main challenge of this research was to document the nature of this process which, based on astrophysical observations, seems to yield compositionally consistent materials. This observation may suggest a predictable character during non-equilibrium condensation. These astrophysical environments include two chemically distinct types: oxygen-rich and carbon-rich environments. The former is characterized by silicates, the latter by carbon-bearing solids. According to cosmological models of stellar evolution, circumstellar dust accreted into protoplanets wherein thermal and/or aqueous processes will alter the dust under initially non-equilibrium conditions.

  6. Integrating Computers into the Problem-Solving Process.

    ERIC Educational Resources Information Center

    Lowther, Deborah L.; Morrison, Gary R.

    2003-01-01

    Asserts that within the context of problem-based learning environments, professors can encourage students to use computers as problem-solving tools. The ten-step Integrating Technology for InQuiry (NteQ) model guides professors through the process of integrating computers into problem-based learning activities. (SWM)

  7. Employee Communication during Crises: The Effects of Stress on Information Processing.

    ERIC Educational Resources Information Center

    Pincus, J. David; Acharya, Lalit

    Based on multidisciplinary research findings, this report proposes an information processing model of employees' response to highly stressful information environments arising during organizational crises. The introduction stresses the importance of management's handling crisis communication with employees skillfully. The second section points out…

  8. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
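The interrupted-server idea can be illustrated with a small discrete-event sketch. This is a simulation stand-in, not the paper's Laplace-transform analysis, and all parameter names are made up: background jobs arrive as a Poisson stream and are served FIFO, while the CPU is claimed by a deterministic time-critical task for `burst` time units at the start of every `period`:

```python
import random

def finish_time(start, work, period, burst):
    """Completion time of `work` units of background service beginning at
    `start`, when the CPU is unavailable during [k*period, k*period+burst)
    for every integer k (the periodic time-critical task)."""
    t = start
    while work > 1e-12:
        k = int(t // period)
        window_open = k * period + burst      # background window opens
        window_close = (k + 1) * period       # next interrupt begins
        if t < window_open:
            t = window_open                   # wait out the interrupt
        done = min(window_close - t, work)
        t += done
        work -= done
    return t

def mean_response(lam, mu, period, burst, n_jobs=20000, seed=1):
    """Estimate mean response time of background jobs (plain M/M/1 if burst=0)."""
    rng = random.Random(seed)
    arrival = server_free = total = 0.0
    for _ in range(n_jobs):
        arrival += rng.expovariate(lam)       # Poisson arrivals, rate lam
        start = max(arrival, server_free)     # FIFO, single server
        server_free = finish_time(start, rng.expovariate(mu), period, burst)
        total += server_free - arrival
    return total / n_jobs
```

With `burst=0` the estimate approaches the textbook M/M/1 mean response time 1/(mu - lam); a positive `burst` lengthens responses, mimicking the background capacity lost to the time-critical activity.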

  9. Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer-Indian Ocean METOC Imager (GIFTS-IOMI) Hyperspectral Data

    DTIC Science & Technology

    2002-09-30

Physical Modeling for Processing Geosynchronous Imaging Fourier Transform Spectrometer-Indian Ocean METOC Imager (GIFTS-IOMI) Hyperspectral Data...water quality assessment. OBJECTIVES The objective of this DoD research effort is to develop and demonstrate a fully functional GIFTS-IOMI...environment once GIFTS-IOMI is stationed over the Indian Ocean. The system will provide specialized methods for the characterization of the atmospheric

  10. Modeling gypsy moth seasonality

    Treesearch

    J. A. Logan; D. R. Gray

    1991-01-01

    Maintaining an appropriate seasonality is perhaps the most basic ecological requisite for insects living in temperate environments. The basic ecological importance of seasonality is enough to justify expending considerable effort to accurately model the processes involved. For insects of significant economic consequence, seasonality assumes additional importance...

  11. Integrated Multimedia Modeling System Response to Regional Land Management Change

    EPA Science Inventory

A multi-media system of nitrogen and co-pollutant models describing critical physical and chemical processes that cascade synergistically and competitively through the environment, the economy and society has been developed at the USEPA Office of Research and Development. It is ...

  12. The active site architecture in peroxiredoxins: a case study on Mycobacterium tuberculosis AhpE.

    PubMed

    Pedre, Brandán; van Bergen, Laura A H; Palló, Anna; Rosado, Leonardo A; Dufe, Veronica Tamu; Molle, Inge Van; Wahni, Khadija; Erdogan, Huriye; Alonso, Mercedes; Proft, Frank De; Messens, Joris

    2016-08-11

Peroxiredoxins catalyze the reduction of peroxides, a process of vital importance to survive oxidative stress. A nucleophilic cysteine, also known as the peroxidatic cysteine, is responsible for this catalytic process. We used the Mycobacterium tuberculosis alkyl hydroperoxide reductase E (MtAhpE) as a model to investigate the effect of the chemical environment on the specificity of the reaction. Using an integrative structural (R116A - PDB; F37H - PDB), kinetic and computational approach, we explain the mutational effects of key residues in its environment. This study shows that the active site residues are specifically oriented to create an environment which selectively favours a reaction with peroxides.

  13. Changing currents: a strategy for understanding and predicting the changing ocean circulation.

    PubMed

    Bryden, Harry L; Robinson, Carol; Griffiths, Gwyn

    2012-12-13

    Within the context of UK marine science, we project a strategy for ocean circulation research over the next 20 years. We recommend a focus on three types of research: (i) sustained observations of the varying and evolving ocean circulation, (ii) careful analysis and interpretation of the observed climate changes for comparison with climate model projections, and (iii) the design and execution of focused field experiments to understand ocean processes that are not resolved in coupled climate models so as to be able to embed these processes realistically in the models. Within UK-sustained observations, we emphasize smart, cost-effective design of the observational network to extract maximum information from limited field resources. We encourage the incorporation of new sensors and new energy sources within the operational environment of UK-sustained observational programmes to bridge the gap that normally separates laboratory prototype from operational instrument. For interpreting the climate-change records obtained through a variety of national and international sustained observational programmes, creative and dedicated UK scientists should lead efforts to extract the meaningful signals and patterns of climate change and to interpret them so as to project future changes. For the process studies, individual scientists will need to work together in team environments to combine observational and process modelling results into effective improvements in the coupled climate models that will lead to more accurate climate predictions.

  14. Development of techniques for processing metal-metal oxide systems

    NASA Technical Reports Server (NTRS)

    Johnson, P. C.

    1976-01-01

    Techniques for producing model metal-metal oxide systems for the purpose of evaluating the results of processing such systems in the low-gravity environment afforded by a drop tower facility are described. Because of the lack of success in producing suitable materials samples and techniques for processing in the 3.5 seconds available, the program was discontinued.

15. Issues of Spatial and Temporal Scale in Modeling the Effects of Field Operations on Soil Properties

    USDA-ARS?s Scientific Manuscript database

    Tillage is an important procedure for modifying the soil environment in order to enhance crop growth and conserve soil and water resources. Process-based models of crop production are widely used in decision support, but few explicitly simulate tillage. The Cropping Systems Model (CSM) was modified ...

  16. Evaluating the multimedia fate of organic chemicals: A level III fugacity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mackay, D.; Paterson, S.

A multimedia model is developed and applied to selected organic chemicals in evaluative and real regional environments. The model employs the fugacity concept and treats four bulk compartments: air, water, soil, and bottom sediment, which consist of subcompartments of varying proportions of air, water, and mineral and organic matter. Chemical equilibrium is assumed to apply within (but not between) each bulk compartment. Expressions are included for emissions, advective flows, degrading reactions, and interphase transport by diffusive and non-diffusive processes. Input to the model consists of a description of the environment, the physical-chemical and reaction properties of the chemical, and emission rates. For steady-state conditions the solution is a simple algebraic expression. The model is applied to six chemicals in the region of southern Ontario and the calculated fate and concentrations are compared with observations. The results suggest that the model may be used to determine the processes that control the environmental fate of chemicals in a region and provide approximate estimates of relative media concentrations.
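At steady state, a fugacity balance of this kind reduces to a small linear system: for each compartment i, emissions plus intermedia inputs equal fugacity times total losses, E_i + Σ_j D_ji f_j = f_i (D_loss,i + Σ_j D_ij). The sketch below solves that system directly; the D-values are made-up illustrative numbers, not the paper's southern Ontario parameterization:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                factor = M[r][c] / M[c][c]
                M[r] = [a - factor * m for a, m in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def level3_fugacities(D, D_loss, E):
    """Steady-state fugacities f (Pa) for compartments 0..n-1.
    D[i][j]: intermedia transfer D-value from i to j (mol/(Pa.h));
    D_loss[i]: lumped reaction + advection losses; E[i]: emission (mol/h)."""
    n = len(E)
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = D_loss[i] + sum(D[i][j] for j in range(n) if j != i)
        for j in range(n):
            if j != i:
                A[i][j] = -D[j][i]          # input to i from compartment j
    return gauss_solve(A, E)

# Illustrative compartments: air, water, soil, sediment (hypothetical values)
D = [[0, 100, 50, 0],
     [20, 0, 0, 30],
     [10, 5, 0, 0],
     [0, 10, 0, 0]]
D_loss = [500.0, 200.0, 80.0, 40.0]
E = [1000.0, 0.0, 0.0, 0.0]                  # emission into air only
f = level3_fugacities(D, D_loss, E)
```

A useful sanity check is the overall mass balance: intermedia transfers cancel when the equations are summed, so total emissions must equal total losses, Σ f_i D_loss,i = Σ E_i.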

  17. Biomimicry of quorum sensing using bacterial lifecycle model.

    PubMed

    Niu, Ben; Wang, Hong; Duan, Qiqi; Li, Li

    2013-01-01

Recent microbiologic studies have shown that quorum sensing mechanisms, which serve as one of the fundamental requirements for bacterial survival, exist widely in bacterial intra- and inter-species cell-cell communication. Many simulation models, inspired by the social behavior of natural organisms, have been presented to provide new approaches for solving realistic optimization problems. Most of these simulation models follow population-based modelling approaches, where all the individuals are updated according to the same rules; therefore, it is difficult to maintain the diversity of the population. In this paper, we present a computational model termed LCM-QS, which simulates the bacterial quorum-sensing (QS) mechanism using an individual-based modelling approach under the framework of the Agent-Environment-Rule (AER) scheme, i.e. the bacterial lifecycle model (LCM). The LCM-QS model comprises three main sub-models: a chemotaxis with QS sub-model, a reproduction and elimination sub-model, and a migration sub-model. The proposed model is used not only to imitate the bacterial evolution process at the single-cell level, but also to study bacterial macroscopic behaviour. Comparative experiments under four different scenarios have been conducted in an artificial 3-D environment with nutrient and noxious-substance distributions. Detailed studies of bacterial chemotactic processes with and without quorum sensing are compared. By using quorum sensing mechanisms, artificial bacteria working together can find the nutrient concentration (or global optimum) quickly in the artificial environment. Biomimicry of quorum sensing mechanisms using the lifecycle model endows the artificial bacteria with communication abilities, which are essential to obtain more valuable information to guide their search cooperatively towards the preferred nutrient concentrations. It can also provide inspiration for designing new swarm intelligence optimization algorithms that can be used for solving real-world problems.
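The effect reported above, communication steering the population toward the best-known nutrient concentration, can be imitated by a much simpler individual-based sketch. This is illustrative only, not the LCM-QS model: the nutrient field, step sizes and attraction weight are all invented:

```python
import random

def forage(n_bact=20, steps=60, quorum=True, seed=0):
    """Biased random walk up a nutrient field; with `quorum` on, each step
    is also pulled toward the best position 'broadcast' by the population."""
    rng = random.Random(seed)
    nutrient = lambda x, y: -(x - 3.0) ** 2 - (y - 2.0) ** 2  # peak at (3, 2)
    pos = [(rng.uniform(-10, 10), rng.uniform(-10, 10)) for _ in range(n_bact)]
    for _ in range(steps):
        best = max(pos, key=lambda p: nutrient(*p))           # the QS "signal"
        nxt = []
        for x, y in pos:
            dx, dy = rng.gauss(0, 0.3), rng.gauss(0, 0.3)     # tumble
            if quorum:
                dx += 0.1 * (best[0] - x)                     # swim toward signal
                dy += 0.1 * (best[1] - y)
            if nutrient(x + dx, y + dy) > nutrient(x, y):     # keep uphill moves
                x, y = x + dx, y + dy
            nxt.append((x, y))
        pos = nxt
    return sum(nutrient(*p) for p in pos) / n_bact            # mean "fitness"
```

With identical random draws, the quorum-on variant typically reaches higher mean nutrient levels within the same number of steps than independent hill-climbers, mirroring the with/without-QS comparison made in the abstract.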

  18. Biomimicry of quorum sensing using bacterial lifecycle model

    PubMed Central

    2013-01-01

Background Recent microbiologic studies have shown that quorum sensing mechanisms, which serve as one of the fundamental requirements for bacterial survival, exist widely in bacterial intra- and inter-species cell-cell communication. Many simulation models, inspired by the social behavior of natural organisms, have been presented to provide new approaches for solving realistic optimization problems. Most of these simulation models follow population-based modelling approaches, where all the individuals are updated according to the same rules; therefore, it is difficult to maintain the diversity of the population. Results In this paper, we present a computational model termed LCM-QS, which simulates the bacterial quorum-sensing (QS) mechanism using an individual-based modelling approach under the framework of the Agent-Environment-Rule (AER) scheme, i.e. the bacterial lifecycle model (LCM). The LCM-QS model comprises three main sub-models: a chemotaxis with QS sub-model, a reproduction and elimination sub-model, and a migration sub-model. The proposed model is used not only to imitate the bacterial evolution process at the single-cell level, but also to study bacterial macroscopic behaviour. Comparative experiments under four different scenarios have been conducted in an artificial 3-D environment with nutrient and noxious-substance distributions. Detailed studies of bacterial chemotactic processes with and without quorum sensing are compared. By using quorum sensing mechanisms, artificial bacteria working together can find the nutrient concentration (or global optimum) quickly in the artificial environment. Conclusions Biomimicry of quorum sensing mechanisms using the lifecycle model endows the artificial bacteria with communication abilities, which are essential to obtain more valuable information to guide their search cooperatively towards the preferred nutrient concentrations. It can also provide inspiration for designing new swarm intelligence optimization algorithms that can be used for solving real-world problems. PMID:23815296

  19. Electron attachment to molecules in a cluster environment: suppression and enhancement effects

    NASA Astrophysics Data System (ADS)

    Fabrikant, Ilya I.

    2018-05-01

Cluster environments can strongly influence dissociative electron attachment (DEA) processes. These effects are important in many applications, particularly in surface chemistry, radiation damage, and atmospheric physics. We review several mechanisms for DEA suppression and enhancement due to cluster environments, particularly due to microhydration. Long-range electron-molecule and electron-cluster interactions often play a significant role in these effects and can be analysed by using theoretical models. Nevertheless, many observations remain unexplained due to the complexity of the physics and chemistry of the interaction of DEA fragments with the cluster environment.

  20. Modeling multi-scale aerosol dynamics and micro-environmental air quality near a large highway intersection using the CTAG model.

    PubMed

    Wang, Yan Jason; Nguyen, Monica T; Steffens, Jonathan T; Tong, Zheming; Wang, Yungang; Hopke, Philip K; Zhang, K Max

    2013-01-15

A new methodology, referred to as the multi-scale structure, integrates "tailpipe-to-road" (i.e., on-road domain) and "road-to-ambient" (i.e., near-road domain) simulations to elucidate the environmental impacts of particulate emissions from traffic sources. The multi-scale structure is implemented in the CTAG model to 1) generate process-based on-road emission rates of ultrafine particles (UFPs) by explicitly simulating the effects of exhaust properties, traffic conditions, and meteorological conditions and 2) characterize the impacts of traffic-related emissions on micro-environmental air quality near a highway intersection in Rochester, NY. The performance of CTAG, evaluated against field measurements, shows adequate agreement in capturing the dispersion of carbon monoxide (CO) and the number concentrations of UFPs in the near-road micro-environment. As a proof-of-concept case study, we also apply CTAG to separate the relative impacts of the shutdown of a large coal-fired power plant (CFPP) and the adoption of ultra-low-sulfur diesel (ULSD) on UFP concentrations in the intersection micro-environment. Although CTAG is still computationally expensive compared to the widely used parameterized dispersion models, it has the potential to advance our capability to predict the impacts of UFP emissions and spatial/temporal variations of air pollutants in complex environments. Furthermore, for the on-road simulations, CTAG can serve as a process-based emission model; combining the on-road and near-road simulations, CTAG becomes a "plume-in-grid" model for mobile emissions. The processed emission profiles can potentially improve regional air quality and climate predictions accordingly. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Integrated wetland management: an analysis with group model building based on system dynamics model.

    PubMed

    Chen, Hsin; Chang, Yang-Chi; Chen, Kung-Chen

    2014-12-15

The wetland system possesses diverse functions such as preserving water sources, mediating flooding, providing habitats for wildlife and stabilizing coastlines. Nonetheless, rapid economic growth and the increasing population have significantly deteriorated the wetland environment. To secure the sustainability of the wetland, it is essential to introduce integrated and systematic management. This paper examines the resource management of the Jiading Wetland by applying group model building (GMB) and system dynamics (SD). We systematically identify local stakeholders' mental model regarding the impact brought by the yacht industry, and further establish an SD model to simulate the dynamic wetland environment. The GMB process improves the stakeholders' understanding of the interaction between the wetland environment and management policies. Differences between the stakeholders' perceptions and the behaviors shown by the SD model also suggest that our analysis would help the stakeholders broaden their horizons and achieve consensus on wetland resource management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. PROCRU: A model for analyzing crew procedures in approach to landing

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Lancraft, R.; Zacharias, G.

    1980-01-01

A model for analyzing crew procedures in approach to landing is developed. The model employs the information processing structure used in the optimal control model and in recent models for monitoring and failure detection. Mechanisms are added to this basic structure to model crew decision making in this multi-task environment. Decisions are based on probability assessments and potential mission impact (or gain). Sub-models for procedural activities are included. The model distinguishes among external visual, instrument visual, and auditory sources of information. The external visual scene perception models incorporate limitations in obtaining information. The auditory information channel contains a buffer to allow for storage in memory until that information can be processed.

  3. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines and developed by different teams. In order to support collaborative work involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open-source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture that allows users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and the connectivity of landscape objects, and iii) run and explore simulations in many ways: using the OpenFLUID software interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open-source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, based on an open-source toolchain including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open-source license, with a special exception allowing existing models under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. 
OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, and modelling of surface-subsurface water exchanges, … At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID website: http://www.openfluid-project.org
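The coupling idea behind such platforms, independent process models exchanging variables over connected landscape objects inside a shared time loop, can be sketched in a few lines. This is a toy illustration in Python with invented names; OpenFLUID itself is a C++ platform and its real API differs:

```python
class Unit:
    """A landscape object holding named variables (e.g. a plot or a reach)."""
    def __init__(self, uid, downstream=None):
        self.uid, self.downstream = uid, downstream
        self.vars = {}

class Simulator:
    """Base class for pluggable process models."""
    def run_step(self, units, t):
        raise NotImplementedError

class Rainfall(Simulator):
    def run_step(self, units, t):
        for u in units:
            u.vars["rain"] = 2.0          # constant rain, mm per step

class Runoff(Simulator):
    def run_step(self, units, t):
        for u in units:
            flow = 0.5 * u.vars["rain"]   # half of the rain runs off
            u.vars["runoff"] = u.vars.get("runoff", 0.0) + flow
            if u.downstream is not None:  # route along the connectivity graph
                u.downstream.vars["runoff"] = (
                    u.downstream.vars.get("runoff", 0.0) + flow)

def run_coupled(units, simulators, n_steps):
    """Shared time loop: each plugged-in simulator runs once per step."""
    for t in range(n_steps):
        for sim in simulators:            # fixed coupling order within a step
            sim.run_step(units, t)
    return {u.uid: u.vars for u in units}
```

For example, a field unit draining into an outlet unit accumulates runoff locally while the outlet receives both its own runoff and the routed flow, which is the essence of representing a landscape as a connected graph of units.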

  4. Distributed Architecture for the Object-Oriented Method for Interoperability

    DTIC Science & Technology

    2003-03-01

Collaborative Environment...Figure V-2. Distributed OOMI And The Collaboration Centric Paradigm...Figure V...of systems are formed into a system federation to resolve differences in modeling. An OOMI Integrated Development Environment (OOMI IDE) lends...space for the creation of possible distributed systems is partitioned into User Centric systems, Processing/Storage Centric systems, Implementation

  5. Integrated Perspective of Evolving Intrapsychic and Person-Environment Functions: Implications for Deaf and Hard of Hearing Individuals.

    ERIC Educational Resources Information Center

    Jung, Vivienne; Short, Robert H.

    2002-01-01

    This article reviews various theories for difficulties in socioemotional functioning experienced by many deaf persons. It then proposes a 3-level model which focuses on: (1) intrapsychic processes such as self-concept; (2) reciprocal interactions between the person and the social environment; and (3) resulting memories and expectancies that affect…

  6. Life Modeling for Nickel-Hydrogen Batteries in Geosynchronous Satellite Operation

    DTIC Science & Technology

    2005-03-25

    aerothermodynamics; chemical and electric propulsion; environmental chemistry; combustion processes; space environment effects on materials, hardening and … intelligent microinstruments for monitoring space and launch system environments. Space Science Applications Laboratory: magnetospheric, auroral and cosmic-ray … hyperspectral imagery to defense, civil space, commercial, and environmental missions; effects of solar activity, magnetic storms and nuclear explosions on the

  7. Challenges in a Physics Course: Introducing Student-Centred Activities for Increased Learning

    ERIC Educational Resources Information Center

    Hernandez, Carola; Ravn, Ole; Forero-Shelton, Manu

    2014-01-01

    This article identifies and analyses some of the challenges that arose in a development process of changing from a content-based teaching environment to a student-centred environment in an undergraduate physics course for medicine and biology students at Universidad de los Andes. Through the use of the Critical Research model proposed by Skovsmose…

  8. The Emerging Importance of Business Process Standards in the Federal Government

    DTIC Science & Technology

    2006-02-23

    delivers enough value for its commercialization into the general industry. Today, we are seeing standards such as SOA, BPMN and BPEL hit that … Process Modeling Notation (BPMN) and the Business Process Execution Language (BPEL). BPMN provides a standard representation for capturing and … execution. The combination of BPMN and BPEL offers organizations the potential to standardize processes in a distributed environment, enabling

  9. Reference Model for Project Support Environments Version 1.0

    DTIC Science & Technology

    1993-02-28

    relationship with the framework’s Process Support services and with the Lifecycle Process Engineering services. Examples: ORCA (Object-based … Design services. Examples: ORCA (Object-based Requirements Capture and Analysis); RETRAC (REquirements TRACeability). 4.3 Life-Cycle Process … "traditional" computer tools. Operations: Examples of audio and video processing operations include: create, modify, and delete sound and video data

  10. Impact of Pathogen Population Heterogeneity and Stress-Resistant Variants on Food Safety.

    PubMed

    Abee, T; Koomen, J; Metselaar, K I; Zwietering, M H; den Besten, H M W

    2016-01-01

    This review elucidates the state-of-the-art knowledge about pathogen population heterogeneity and describes the genotypic and phenotypic analyses of persister subpopulations and stress-resistant variants. The molecular mechanisms underlying the generation of persister phenotypes and genetic variants are identified. Zooming in on Listeria monocytogenes, a comparative whole-genome sequence analysis of wild types and variants is described, which enabled the identification of mutations in variants obtained after a single exposure to lethal food-relevant stresses. Genotypic and phenotypic features are compared to those of persistent strains isolated from food processing environments. Inactivation kinetics, the models used for fitting, and the concept of kinetic modeling-based schemes for the detection of variants are presented. Furthermore, robustness and fitness parameters of L. monocytogenes wild type and variants are used to model their performance in food chains. Finally, the impact of stress-resistant variants and persistence in food processing environments on food safety is discussed.
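The kind of biphasic inactivation kinetics behind such kinetic modeling-based detection schemes can be sketched as follows. This is an illustrative Python sketch of a classical two-population (Cerf-type) survivor model; the parameter values are invented and are not taken from the review, which does not prescribe this particular model.

```python
import math

def biphasic_survivors(n0, f_resistant, k_sensitive, k_resistant, t):
    """Cerf-type two-population inactivation model:
    N(t) = N0 * [(1 - f) * exp(-k_s * t) + f * exp(-k_r * t)],
    where f is the fraction of stress-resistant variants."""
    return n0 * ((1.0 - f_resistant) * math.exp(-k_sensitive * t)
                 + f_resistant * math.exp(-k_resistant * t))

# Invented parameters: 0.1% resistant subpopulation, inactivated 10x slower
n0, f, ks, kr = 1e7, 1e-3, 2.0, 0.2
early = biphasic_survivors(n0, f, ks, kr, t=1.0)
late = biphasic_survivors(n0, f, ks, kr, t=20.0)

# Early on, the sensitive majority dominates the count; in the tail of the
# curve, survivors are almost exclusively the resistant subpopulation.
resistant_late = n0 * f * math.exp(-kr * 20.0)
print(resistant_late / late)  # close to 1.0
```

Fitting such a model to observed survivor counts and checking for a pronounced tail is one way a detection scheme can flag the presence of stress-resistant variants.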

  11. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental to building successful business models and inter- and intra-organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks towards the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra- and inter-firm activities, to create value by offering innovative and personalized products/services, and to reduce transaction costs. This chapter presents the TEKNE project's Methodology of change, which guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts forming the cornerstone of this methodological framework.

  12. The Bilingual Language Interaction Network for Comprehension of Speech*

    PubMed Central

    Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602

  13. Mapping care processes within a hospital: a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2003-01-01

    Today, the economic and regulatory environment is pressuring hospitals and healthcare professionals to account for their results and methods of care delivery. The evaluation of the quality and safety of care, the traceability of the acts performed, and the evaluation of practices are some of the reasons underpinning current interest in clinical and hospital information systems. The structured collection of users' needs and system requirements is fundamental when installing such systems. This stage is time-consuming, is often misunderstood by caregivers, and is of limited analytical efficacy. We used a modelling technique designed for manufacturing processes (SADT: Structured Analysis and Design Technique). We enhanced the initial activity model of this method and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary from the description of a given process and to locate documents (procedures, recommendations, instructions). Aimed at structuring needs and storing information provided by the teams directly involved in the workings of an institution (or at least part of it), the process mapping approach has an important contribution to make in the analysis of clinical information systems.
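The idea of extracting a data dictionary from a process description, as the SADT-based tool in this record does, can be sketched in a few lines. The structures and example activities below are hypothetical, not the authors' implementation: each activity declares its inputs, controls and outputs, and the dictionary is the union of all data elements together with the activities that reference them.

```python
# Hypothetical sketch of data-dictionary extraction from an SADT-style
# process description (activities with inputs, controls and outputs).
from collections import defaultdict

activities = [
    {"name": "Collect blood sample", "inputs": ["prescription"],
     "controls": ["sampling procedure"], "outputs": ["labelled sample"]},
    {"name": "Analyse sample", "inputs": ["labelled sample"],
     "controls": ["analysis protocol"], "outputs": ["lab report"]},
]

def extract_data_dictionary(activities):
    """Map each data element to the (activity, role) pairs that use it."""
    dictionary = defaultdict(list)
    for act in activities:
        for role in ("inputs", "controls", "outputs"):
            for item in act[role]:
                dictionary[item].append((act["name"], role))
    return dict(dictionary)

dd = extract_data_dictionary(activities)
print(sorted(dd))
# ['analysis protocol', 'lab report', 'labelled sample', 'prescription', 'sampling procedure']
```

An element appearing as one activity's output and another's input (here, "labelled sample") is exactly the kind of cross-reference a process-mapping tool can surface for traceability.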

  14. A process-based standard for the Solar Energetic Particle Event Environment

    NASA Astrophysics Data System (ADS)

    Gabriel, Stephen

    For 10 years or more, there has been a lack of consensus on what the ISO standard model for the Solar Energetic Particle Event (SEPE) environment should be. Despite many technical discussions between the world experts in this field, it has been impossible to agree on which of the several available models should be selected as the standard. Most of these discussions, at ISO WG4 meetings, conferences, etc., have centred on the differences in modelling approach between the MSU model and the several remaining models from elsewhere worldwide (mainly the USA and Europe). The topic is considered timely given the inclusion of a session on reference data sets at the Space Weather Workshop in Boulder in April 2014. The original idea of a ‘process-based’ standard was conceived by Dr Kent Tobiska as a way of getting round the problems associated with the presence of different models, which could not only embody quite distinct modelling approaches but could also be based on different data sets. In essence, a process-based standard overcomes these issues by allowing there to be more than one model rather than a single standard model; however, any such model has to be completely transparent: the data set and the modelling techniques used have to be not only clearly and unambiguously defined but also subject to peer review. If a model meets all of these requirements, then it is acceptable as a standard model. So how does this process-based approach resolve the differences between the existing modelling approaches for the SEPE environment and remove the impasse? In a sense, it does not remove all of the differences, but only some of them; most importantly, however, it allows something which has so far been impossible without ambiguities and disagreement: a comparison of the results of the various models.
To date, one of the problems (if not the major one) in comparing the results of the various SEPE statistical models has been caused by two things: 1) the data set and 2) the definition of an event. Because unravelling the dependencies of the outputs of different statistical models on these two parameters is extremely difficult, if not impossible, comparison of the results from the different models is currently also extremely difficult and can lead to controversies, especially over which model is the correct one. Hence, when these models are used for engineering purposes, for example to calculate the radiation dose for a particular mission, the user, who is in all likelihood not an expert in this field, could be given two (or even more) very different environments and find it impossible to know how to select one (or even how to compare them). What is proposed, then, is a process-based standard which, in common with nearly all of the current models, is composed of three elements: a standard data set, a standard event definition and a resulting standard event list. The standard event list is the output of this standard and can then be used with any of the existing (or indeed future) models that are based on events. This standard event list is completely traceable and transparent and represents a reference event list for the whole community. When coupled with a statistical model, the results, when compared, will depend only on the statistical model and not on the data set or event definition.

  15. Teaching MBA Statistics Online: A Pedagogically Sound Process Approach

    ERIC Educational Resources Information Center

    Grandzol, John R.

    2004-01-01

    Delivering MBA statistics in the online environment presents significant challenges to educators and students alike because of varying student preparedness levels, complexity of content, difficulty in assessing learning outcomes, and faculty availability and technological expertise. In this article, the author suggests a process model that…

  16. Toward Evaluating the Predictability of Arctic-related Climate Variations: Initial Results from ArCS Project Theme 5

    NASA Astrophysics Data System (ADS)

    Hasumi, H.

    2016-12-01

    We present initial results from the theme 5 of the project ArCS, which is a national flagship project for Arctic research in Japan. The goal of theme 5 is to evaluate the predictability of Arctic-related climate variations, wherein we aim to: (1) establish the scientific basis of climate predictability; and (2) develop a method for predicting/projecting medium- and long-term climate variations. Variability in the Arctic environment remotely influences middle and low latitudes. Since some of the processes specific to the Arctic environment function as a long memory of the state of the climate, understanding of the process of remote connections would lead to higher-precision and longer-term prediction of global climate variations. Conventional climate models have large uncertainty in the Arctic region. By making Arctic processes in climate models more sophisticated, we aim to clarify the role of multi-sphere interaction in the Arctic environment. In this regard, our newly developed high resolution ice-ocean model has revealed the relationship between the oceanic heat transport into the Arctic Ocean and the synoptic scale atmospheric variability. We also aim to reveal the mechanism of remote connections by conducting climate simulations and analyzing various types of climate datasets. Our atmospheric model experiments under possible future situations of Arctic sea ice cover indicate that reduction of sea ice qualitatively alters the basic mechanism of remote connection. Also, our analyses of climate data have identified the cause of recent more frequent heat waves at Eurasian mid-to-high latitudes and clarified the dynamical process which forms the West Pacific pattern, a dominant mode of the atmospheric anomalous circulation in the West Pacific region which also exhibits a significant signal in the Arctic stratosphere.

  17. Microfluidic Experiments Studying Pore Scale Interactions of Microbes and Geochemistry

    NASA Astrophysics Data System (ADS)

    Chen, M.; Kocar, B. D.

    2016-12-01

    Understanding how physical phenomena, chemical reactions, and microbial behavior interact at the pore scale is crucial to understanding larger-scale trends in groundwater chemistry. Recent studies illustrate the utility of microfluidic devices for illuminating pore-scale physical-biogeochemical processes and their control(s) on the cycling of iron, uranium, and other important elements [1-3]. These experimental systems are ideal for examining geochemical reactions mediated by microbes, including processes governed by complex biological phenomena (e.g. biofilm formation) [4]. We present results of microfluidic experiments using a model metal-reducing bacterium and varying pore geometries, exploring the limits of the microorganisms' ability to access tight pore spaces, and examining coupled biogeochemical-physical controls on the cycling of redox-sensitive metals. Experimental results will provide an enhanced understanding of coupled physical-biogeochemical processes transpiring at the pore scale, and will constrain and complement continuum models used to predict and describe the subsurface cycling of redox-sensitive elements [5]. 1. Vrionis, H. A. et al. Microbiological and geochemical heterogeneity in an in situ uranium bioremediation field site. Appl. Environ. Microbiol. 71, 6308-6318 (2005). 2. Pearce, C. I. et al. Pore-scale characterization of biogeochemical controls on iron and uranium speciation under flow conditions. Environ. Sci. Technol. 46, 7992-8000 (2012). 3. Zhang, C., Liu, C. & Shi, Z. Micromodel investigation of transport effect on the kinetics of reductive dissolution of hematite. Environ. Sci. Technol. 47, 4131-4139 (2013). 4. Ginn, T. R. et al. Processes in microbial transport in the natural subsurface. Adv. Water Resour. 25, 1017-1042 (2002). 5. Scheibe, T. D. et al. Coupling a genome-scale metabolic model with a reactive transport model to describe in situ uranium bioremediation. Microb. Biotechnol. 2, 274-286 (2009).

  18. Performability evaluation of the SIFT computer

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.

    1979-01-01

    Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.

  19. Four-fluid MHD Simulations of the Plasma and Neutral Gas Environment of Comet Churyumov-Gerasimenko Near Perihelion

    NASA Astrophysics Data System (ADS)

    Huang, Z.; Toth, G.; Gombosi, T. I.; Jia, X.; Rubin, M.; Hansen, K. C.; Fougere, N.; Bieler, A. M.; Shou, Y.; Altwegg, K.; Combi, M. R.; Tenishev, V.

    2015-12-01

    The neutral and plasma environment is critical in understanding the interaction of comet Churyumov-Gerasimenko (CG), the target of the Rosetta mission, with the solar wind. To serve this need and support the Rosetta mission, we develop a 3-D four-fluid model, based on BATS-R-US within the SWMF (Space Weather Modeling Framework), that solves the governing multi-fluid MHD equations and the Euler equations for the neutral gas fluid. These equations describe the behavior and interactions of the cometary heavy ions, the solar wind protons, the electrons, and the neutrals. This model incorporates different mass loading processes, including photo and electron impact ionization, charge exchange, dissociative ion-electron recombination, and collisional interactions between different fluids. We simulate the near-nucleus plasma and neutral gas environment near perihelion with a realistic shape model of CG and compare our simulation results with Rosetta observations.

  20. Multiscale Modeling of Gene-Behavior Associations in an Artificial Neural Network Model of Cognitive Development.

    PubMed

    Thomas, Michael S C; Forrester, Neil A; Ronald, Angelica

    2016-01-01

    In the multidisciplinary field of developmental cognitive neuroscience, statistical associations between levels of description play an increasingly important role. One example of such associations is the observation of correlations between relatively common gene variants and individual differences in behavior. It is perhaps surprising that such associations can be detected despite the remoteness of these levels of description, and the fact that behavior is the outcome of an extended developmental process involving interaction of the whole organism with a variable environment. Given that they have been detected, how do such associations inform cognitive-level theories? To investigate this question, we employed a multiscale computational model of development, using a sample domain drawn from the field of language acquisition. The model comprised an artificial neural network model of past-tense acquisition trained using the backpropagation learning algorithm, extended to incorporate population modeling and genetic algorithms. It included five levels of description, four internal (genetic, network, neurocomputation, and behavior) and one external (environment). Since the mechanistic assumptions of the model were known and its operation was relatively transparent, we could evaluate whether cross-level associations gave an accurate picture of causal processes. We established that associations could be detected between artificial genes and behavioral variation, even under polygenic assumptions of a many-to-one relationship between genes and neurocomputational parameters, and when an experience-dependent developmental process interceded between the action of genes and the emergence of behavior. We evaluated these associations with respect to their specificity (to different behaviors, to function vs. structure), to their developmental stability, and to their replicability, as well as considering issues of missing heritability and gene-environment interactions.
We argue that gene-behavior associations can inform cognitive theory with respect to effect size, specificity, and timing. The model demonstrates a means by which researchers can undertake multiscale modeling with respect to cognition and develop highly specific and complex hypotheses across multiple levels of description. Copyright © 2015 Cognitive Science Society, Inc.
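The polygenic many-to-one mapping described in this record can be caricatured in a few lines. This is illustrative only: the actual model is a backpropagation network with population modeling, which the sketch below replaces with a direct parameter-to-behavior function plus environmental noise. All quantities are invented.

```python
import random

random.seed(0)

def correlation(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

N_INDIVIDUALS, N_GENES, CAUSAL = 500, 10, range(5)

# Population of artificial genomes (binary gene variants)
genomes = [[random.randint(0, 1) for _ in range(N_GENES)]
           for _ in range(N_INDIVIDUALS)]
# Many-to-one mapping: genes 0-4 jointly set one neurocomputational
# parameter (e.g. a learning-rate analogue); genes 5-9 are non-causal.
parameters = [sum(g[i] for i in CAUSAL) / len(CAUSAL) for g in genomes]
# "Development" in a variable environment, caricatured as additive noise.
behaviors = [p + random.gauss(0.0, 0.1) for p in parameters]

r_causal = correlation([g[0] for g in genomes], behaviors)
r_null = correlation([g[9] for g in genomes], behaviors)
print(round(r_causal, 2), round(r_null, 2))
```

Even though each causal gene contributes only one fifth of the parameter and development adds noise, its association with behavior remains detectable, while the non-causal gene's correlation hovers near zero, which is the qualitative point of the record's polygenic result.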

  1. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    NASA Technical Reports Server (NTRS)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  2. Targeted quantification of functional enzyme dynamics in environmental samples for microbially mediated biogeochemical processes: Targeted quantification of functional enzyme dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Minjing; Gao, Yuqian; Qian, Wei-Jun

    Microbially mediated biogeochemical processes are catalyzed by enzymes that control the transformation of carbon, nitrogen, and other elements in the environment. The dynamic linkage between enzymes and biogeochemical species transformation has, however, rarely been investigated because of the lack of analytical approaches to efficiently and reliably quantify enzymes and their dynamics in soils and sediments. Herein, we developed a signature peptide-based technique for sensitively quantifying dissimilatory and assimilatory enzymes, using nitrate-reducing enzymes in a hyporheic zone sediment as an example. Moreover, the measured changes in enzyme concentration were found to correlate with the nitrate reduction rate in a way different from that inferred from biogeochemical models based on biomass or functional genes as surrogates for functional enzymes. This phenomenon has important implications for understanding and modeling the dynamics of microbial community functions and biogeochemical processes in the environment. Our results also demonstrate the importance of enzyme quantification for the identification and interrogation of those biogeochemical processes with low metabolite concentrations as a result of faster enzyme-catalyzed consumption of metabolites than their production. The dynamic enzyme behaviors provide a basis for the development of enzyme-based models to describe the relationship between the microbial community and biogeochemical processes.

  3. Competition-Based Learning: A Model for the Integration of Competitions with Project-Based Learning Using Open Source LMS

    ERIC Educational Resources Information Center

    Issa, Ghassan; Hussain, Shakir M.; Al-Bahadili, Hussein

    2014-01-01

    In an effort to enhance the learning process in higher education, a new model for Competition-Based Learning (CBL) is presented. The new model utilizes two well-known learning models, namely, the Project-Based Learning (PBL) and competitions. The new model is also applied in a networked environment with emphasis on collective learning as well as…

  4. RESEARCH ACTIVITIES AT U.S. GOVERNMENT AGENCIES IN SUBSURFACE REACTIVE TRANSPORT MODELING

    EPA Science Inventory

    The fate of contaminants in the environment is controlled by both chemical reactions and transport phenomena in the subsurface. Our ability to understand the significance of these processes over time requires an accurate conceptual model that incorporates the various mechanisms ...

  5. Atomistic Modeling of Corrosion Events at the Interface between a Metal and Its Environment

    DOE PAGES

    Taylor, Christopher D.

    2012-01-01

    Atomistic simulation is a powerful tool for probing the structure and properties of materials and the nature of chemical reactions. Corrosion is a complex process that involves chemical reactions occurring at the interface between a material and its environment and is, therefore, highly suited to study by atomistic modeling techniques. In this paper, the complex nature of corrosion processes and mechanisms is briefly reviewed. Various atomistic methods for exploring corrosion mechanisms are then described, and recent applications in the literature surveyed. Several instances of the application of atomistic modeling to corrosion science are then reviewed in detail, including studies of the metal-water interface, the reaction of water on electrified metallic interfaces, the dissolution of metal atoms from metallic surfaces, and the role of competitive adsorption in controlling the chemical nature and structure of a metallic surface. Some perspectives are then given concerning the future of atomistic modeling in the field of corrosion science.

  6. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various types of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in the exploratory and experimental settings was registered as a shift towards more complex stages of problem representations in genetics. The effect of different types of reasoning could be observed as the divergent development of problem representations within the hierarchical categories.

  7. Primitive bodies - Molecular abundances in Comet Halley as probes of cometary formation environments

    NASA Technical Reports Server (NTRS)

    Lunine, Jonathan I.

    1989-01-01

    The most recent results on abundances of molecules in Halley's comet are examined in the context of various models for the environment in which comets formed. These environments include molecular clouds associated with star-forming regions, the solar nebula, gaseous disks around proto-planets, and combinations of these. Of all constituents in a cometary nucleus, the highly volatile molecules such as methane, ammonia, molecular nitrogen, and carbon monoxide are most sensitive to the final episode of cometary grain formation and incorporation in the comet's nucleus; hence they likely reflect at least some chemical processing in the solar nebula. Proper interpretation requires modeling of a number of physical processes including gas phase chemistry, chemistry on grain surfaces, and fractionation effects resulting from preferential incorporation of certain gases in proto-cometary grains. The abundance of methane in Halley's comet could be a key indicator of where that comet formed, provided the methane abundance on grains in star-forming regions can be observationally constrained.

  8. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important topic in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruiting of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of images, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulation validates that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing rate's threshold (FRT).

  9. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important topic in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruiting of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of images, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulation validates that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing rate's threshold (FRT). PMID:27597859
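The similarity-based recruiting step described in this abstract can be sketched as follows. This is a minimal Python illustration, not the authors' implementation: σ and the firing-rate threshold values are invented, and image-derived landmarks are reduced to 2-D feature points. A new place cell is recruited whenever the best Gaussian similarity to the existing cells' reference landmarks falls below the threshold (FRT).

```python
import math

SIGMA = 1.0   # width of the firing field (AFFF analogue; invented value)
FRT = 0.5     # firing rate's threshold (invented value)

def similarity(x, landmark):
    """Gaussian similarity over Euclidean distance, as in the abstract."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, landmark))
    return math.exp(-d2 / (2.0 * SIGMA ** 2))

def perceive(place_cells, observation):
    """Recruit a new VPC when no existing cell fires above the threshold."""
    firing = [similarity(observation, c) for c in place_cells]
    if not firing or max(firing) < FRT:
        place_cells.append(observation)   # recruit a new place cell
        return len(place_cells) - 1       # index of the recruited cell
    return firing.index(max(firing))      # index of the winning cell

cells = []
trajectory = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
winners = [perceive(cells, obs) for obs in trajectory]
print(winners, len(cells))  # [0, 0, 1, 1] 2
```

Raising FRT or shrinking SIGMA makes the firing fields tighter, so more cells are recruited along the same trajectory, which mirrors the flexible field adjustment the abstract describes.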

  10. Construction of integrated case environments.

    PubMed

    Losavio, Francisca; Matteo, Alfredo; Pérez, María

    2003-01-01

    The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. Technically speaking, the tendency of the CASE approach is towards the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been studied for two decades. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience, in the civil engineering and architecture fields, of the construction of an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive systems domain. In addition, a process for constructing the integrated CASE environment is given.

  11. Model-Based Analysis of Cell Cycle Responses to Dynamically Changing Environments

    PubMed Central

    Seaton, Daniel D; Krishnan, J

    2016-01-01

    Cell cycle progression is carefully coordinated with a cell’s intra- and extracellular environment. While some pathways have been identified that communicate information from the environment to the cell cycle, a systematic understanding of how this information is dynamically processed is lacking. We address this by performing dynamic sensitivity analysis of three mathematical models of the cell cycle in Saccharomyces cerevisiae. We demonstrate that these models make broadly consistent qualitative predictions about cell cycle progression under dynamically changing conditions. For example, it is shown that the models predict anticorrelated changes in cell size and cell cycle duration under different environments independently of the growth rate. This prediction is validated by comparison to available literature data. Other consistent patterns emerge, such as widespread nonmonotonic changes in cell size down generations in response to parameter changes. We extend our analysis by investigating glucose signalling to the cell cycle, showing that known regulation of Cln3 translation and Cln1,2 transcription by glucose is sufficient to explain the experimentally observed changes in cell cycle dynamics at different glucose concentrations. Together, these results provide a framework for understanding the complex responses the cell cycle is capable of producing in response to dynamic environments. PMID:26741131
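    The dynamic sensitivity analysis described above amounts to asking how a model output shifts when one parameter is perturbed. A minimal finite-difference version, with a hypothetical toy model standing in for the actual cell-cycle models, might look like:

```python
def sensitivity(model, params, name, eps=1e-6):
    """Finite-difference sensitivity of a scalar model output to one
    parameter: approximates d(output)/d(param) by applying a small
    relative bump to that parameter and differencing."""
    base = model(params)
    bumped = dict(params)
    bumped[name] = params[name] * (1 + eps)
    return (model(bumped) - base) / (params[name] * eps)
```

    Applied across parameters and time points of an ODE cell-cycle model, tables of such derivatives are what reveal patterns like the anticorrelated changes in cell size and cycle duration.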

  12. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic or oceanic region. Under Naval Oceanographic Office (NAVOCEANO) funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted the Colorado University numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its veracity. This report documents the model validation results and provides a brief description of the Graphical User Interface.

  13. The star formation history of early-type galaxies as a function of mass and environment

    NASA Astrophysics Data System (ADS)

    Clemens, M. S.; Bressan, A.; Nikolic, B.; Alexander, P.; Annibali, F.; Rampazzo, R.

    2006-08-01

    Using the third data release of the Sloan Digital Sky Survey (SDSS), we have rigorously defined a volume-limited sample of early-type galaxies in the redshift range 0.005 < z <= 0.1. We have defined the density of the local environment for each galaxy using a method which takes account of the redshift bias introduced by survey boundaries if traditional methods are used. At luminosities greater than our absolute r-band magnitude cut-off of -20.45, the mean density of environment shows no trend with redshift. We calculate the Lick indices for the entire sample and correct for aperture effects and velocity dispersion in a model-independent way. Although we find no dependence of the environment on redshift or luminosity, we do find that the mean velocity dispersion, σ, of early-type galaxies in dense environments tends to be higher than in low-density environments. Taking account of this effect, we find that several indices show small but very significant trends with environment that are not the result of the correlation between indices and velocity dispersion. The statistical significance of the data is sufficiently high to reveal that models accounting only for α-enhancement struggle to produce a consistent picture of age and metallicity of the sample galaxies, whereas a model that also includes carbon enhancement fares much better. We find that early-type galaxies in the field are younger than those in environments typical of clusters but that neither metallicity, α-enhancement nor carbon enhancement is influenced by the environment. The youngest early-type galaxies in both field and cluster environments are those with the lowest σ. However, there is some evidence that the objects with the largest σ are slightly younger, especially in denser environments. Independent of environment, both the metallicity and α-enhancement grow monotonically with σ.
This suggests that the typical length of the star formation episodes which formed the stars of early-type galaxies decreases with σ. More massive galaxies were formed in faster bursts. We argue that the timing of the process of formation of early-type galaxies is determined by the environment, while the details of the process of star formation, which has built up the stellar mass, are entirely regulated by the halo mass. These results suggest that the star formation took place after the mass assembly and favours an anti-hierarchical model. In such a model, the majority of the mergers must take place before the bulk of the stars form. This can only happen if there exists an efficient feedback mechanism which inhibits the star formation in low-mass haloes and is progressively reduced as mergers increase the mass.

  14. Enrichment of the educational environment with information and communication technologies: state of art at the Faculty of Pharmacy of Kaunas University of Medicine.

    PubMed

    Butrimiene, Edita; Stankeviciene, Nida

    2008-01-01

    Both traditional and new educational environments, the latter enriched with information and communication technologies, coexist in today's university. The goal of this article is to present the concept of an educational environment enriched with information and communication technologies, to reveal the main features of such an environment, and to present the results of an investigation on the application of information technologies in teaching/learning processes at the Faculty of Pharmacy of Kaunas University of Medicine. The discussion object of this paper is the educational environment enriched with information and communication technologies. In designing environments of this type, positive aspects of traditional teaching models are developed by integrating them into the new educational environment. The concept of an educational environment enriched with information and communication technologies is reviewed in the first part of this paper. The structure and main features of such environments are highlighted in the second part. The results of the study on the application of information technologies in teaching/learning processes at the Faculty of Pharmacy of Kaunas University of Medicine are presented in the third part.

  15. Estimating suitable environments for invasive plant species across large landscapes: a remote sensing strategy using Landsat 7 ETM+

    USGS Publications Warehouse

    Young, Kendal E.; Abbott, Laurie B.; Caldwell, Colleen A.; Schrader, T. Scott

    2013-01-01

    The key to reducing ecological and economic damage caused by invasive plant species is to locate and eradicate new invasions before they threaten native biodiversity and ecological processes. We used Landsat Enhanced Thematic Mapper Plus imagery to estimate suitable environments for four invasive plants in Big Bend National Park, southwest Texas, using a presence-only modeling approach. Giant reed (Arundo donax), Lehmann lovegrass (Eragrostis lehmanniana), horehound (Marrubium vulgare), and buffelgrass (Pennisetum ciliare) were selected for remote sensing spatial analyses. Multiple dates/seasons of imagery were used to account for habitat conditions within the study area and to capture phenological differences among targeted species and the surrounding landscape. The individual species models exhibited high discriminative ability, with average test area under the receiver operating characteristic curve (AUC) values ranging from 0.91 to 0.99, indicating that they reliably differentiated suitable environments for invasive plant species from random background locations. Omission rates ranged from <1.0 to 18%. We demonstrated that useful models estimating suitable environments for invasive plants may be created with <50 occurrence locations and that reliable modeling using presence-only datasets can provide powerful tools for land managers.
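    The AUC statistic reported above can be computed directly from model scores at presence and background locations via the Mann-Whitney formulation; this generic sketch is not tied to the paper's Landsat workflow, and the scores are hypothetical:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve, computed as the probability that a
    randomly chosen presence location outscores a randomly chosen
    background location (ties count as half a win)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

    An AUC of 0.5 means the model ranks presences no better than chance, while the 0.91 to 0.99 range reported above means nearly every presence location outscored nearly every random background location.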

  16. Influence of fractal substructures of the percolating cluster on transferring processes in macroscopically disordered environments

    NASA Astrophysics Data System (ADS)

    Kolesnikov, B. P.

    2017-11-01

    The presented work addresses the search for the effective kinetic properties of macroscopically disordered environments (MDE). These properties characterize an MDE as a whole at scales which significantly exceed the size of its macro-inhomogeneities. The structure of an MDE is considered as a complex of interpenetrating percolating and finite clusters consolidated from homonymous components, whose topological characteristics influence the properties of the whole environment. The influence of the percolating cluster's fractal substructures (backbone, skeleton of the backbone, red bonds) on transfer processes during crossover (a structural transition from the fractal to the homogeneous state) is investigated, based on the proposed mathematical approach for finding the effective conductivity of MDEs and on the percolating cluster model. The nature of the change of the critical conductivity exponent t during crossover, from the value characteristic of the region close to the percolation threshold to the value corresponding to the homogeneous state, is demonstrated. The proposed model describes transfer processes in an MDE with a finite conductivity ratio of the "conductive" and "low-conductive" phases above and below the percolation threshold and in the smearing region (an analogue of the blur region of a second-order phase transition).
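    Above the percolation threshold, the effective conductivity follows the critical scaling sigma_eff ~ (p - p_c)^t with the conductivity exponent t discussed above. A minimal illustration of that scaling law (using the square-lattice site-percolation threshold as an example value, not the paper's model) is:

```python
def effective_conductivity(p, p_c=0.5928, t=2.0, sigma0=1.0):
    """Critical scaling of the effective conductivity above the
    percolation threshold: sigma_eff = sigma0 * (p - p_c)**t.
    p_c here is the square-lattice site threshold and t an example
    exponent; both are illustrative, not fitted values."""
    if p <= p_c:
        return 0.0  # no percolating cluster, no conduction
    return sigma0 * (p - p_c) ** t
```

    The crossover studied in the paper amounts to t effectively changing as the system passes from the fractal regime near p_c to the homogeneous regime far above it; this sketch shows only the single-exponent limit.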

  17. [Remodeling simulation of human femur under bed rest and spaceflight circumstances based on three dimensional finite element analysis].

    PubMed

    Yang, Wenting; Wang, Dongmei; Lei, Zhoujixin; Wang, Chunhui; Chen, Shanguang

    2017-12-01

    Astronauts who are exposed to the weightless environment of long-term spaceflight may encounter bone density and mass loss because the mechanical stimulus is smaller than its normal value. This study built a three-dimensional model of the human femur to simulate the remodeling process of the femur during a bed rest experiment based on finite element analysis (FEA). The remodeling parameters of this finite element model were validated by comparing experimental and numerical results. Then, the remodeling process of the human femur in a weightless environment was simulated, and the remodeling function of time was derived. The loading magnitude and loading cycles on the human femur during the weightless environment were increased to simulate exercise against bone loss. Simulation results showed that increasing loading magnitude is more effective in diminishing bone loss than increasing loading cycles, which demonstrated that exercise of a certain intensity could help resist bone loss during long-term spaceflight. Finally, this study simulated the bone recovery process after spaceflight. It was found that the bone absorption rate is larger than the bone formation rate. We advise that astronauts should take exercise during spaceflight to resist bone loss.
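    Strain-driven remodeling rules of the kind used in such FEA simulations typically change bone density only when the mechanical stimulus leaves a "lazy zone" around a reference value. The following is a sketch in the spirit of Huiskes-type models, not this study's actual formulation; all parameter values are illustrative:

```python
def remodel_step(rho, stimulus, s_ref=0.004, lazy=0.1, rate=1.0, dt=1.0,
                 rho_min=0.01, rho_max=1.74):
    """One explicit time step of a stimulus-driven remodeling rule:
    density changes only when the mechanical stimulus leaves the
    'lazy zone' around the reference value s_ref."""
    lo, hi = s_ref * (1 - lazy), s_ref * (1 + lazy)
    if stimulus < lo:    # underloading (e.g. weightlessness) -> resorption
        rho += rate * (stimulus - lo) * dt
    elif stimulus > hi:  # overloading (e.g. exercise) -> formation
        rho += rate * (stimulus - hi) * dt
    return min(max(rho, rho_min), rho_max)  # clamp to physical density range
```

    Iterating such a step per finite element is what produces the remodeling-versus-time curves the study derives, and it makes the exercise recommendation concrete: only loading that pushes the stimulus back above the lazy zone halts resorption.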

  18. High resolution modeling of a small urban catchment

    NASA Astrophysics Data System (ADS)

    Skouri-Plakali, Ilektra; Ichiba, Abdellah; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2016-04-01

    Flooding is one of the most complex issues that urban environments have to deal with. In France, flooding remains the primary natural risk: 72% of the decrees declaring a state of natural disaster issued between October 1982 and mid-November 2014 concerned floods. Flooding results from meteorological extremes that are usually aggravated by the hydrological behavior of urban catchments and by human factors. The continuing urbanization process is indeed changing the whole urban water cycle by limiting infiltration and promoting runoff. Urban environments are very complex systems due to their extreme variability, the interference between human activities and natural processes, and the effect of the ongoing urbanization process that changes the landscape and strongly influences their hydrologic behavior. Moreover, many recent works highlight the need to simulate all urban water processes at their specific temporal and spatial scales. However, accounting for the heterogeneity of urban catchments remains challenging for urban hydrology, even after the advances in high-resolution data collection and computational resources. This issue relates more to the architecture of the urban models being used and how far these models can take into account the extreme variability of urban catchments. In this work, high spatio-temporal resolution modeling is performed for a small and well-equipped urban catchment. The aim of this work is to identify urban modeling needs in terms of spatial and temporal resolution, especially for a very small urban area (a 3.7 ha urban catchment located in Le Perreux-sur-Marne, southeast of Paris). The MultiHydro model was selected to carry out this work; it is a physically based, fully distributed model that couples four existing modules, each representing a portion of the water cycle in urban environments. MultiHydro was implemented at 10 m, 5 m and 2 m resolutions. Simulations were performed at different spatio-temporal resolutions and analyzed with respect to real flow measurements. First results show improvements in model performance at high spatio-temporal resolution.

  19. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by a design and implementation based on a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  20. CHIMERA II - A real-time multiprocessing environment for sensor-based robot control

    NASA Technical Reports Server (NTRS)

    Stewart, David B.; Schmitz, Donald E.; Khosla, Pradeep K.

    1989-01-01

    A multiprocessing environment for a wide variety of sensor-based robot systems, providing the flexibility, performance, and UNIX-compatible interface needed for fast development of real-time code, is addressed. The requirements imposed on the design of a programming environment for sensor-based robotic control are outlined. The details of the current hardware configuration are presented, along with the details of the CHIMERA II software. Emphasis is placed on the kernel, low-level interboard communication, the user interface, the extended file system, user-definable and dynamically selectable real-time schedulers, remote process synchronization, and generalized interprocess communication. A possible implementation of a hierarchical control model, the NASA/NBS standard reference model for telerobot control systems, is demonstrated.

  1. Simulating the Conversion of Rural Settlements to Town Land Based on Multi-Agent Systems and Cellular Automata

    PubMed Central

    Liu, Yaolin; Kong, Xuesong; Liu, Yanfang; Chen, Yiyun

    2013-01-01

    Rapid urbanization in China has triggered the conversion of land from rural to urban use, particularly the conversion of rural settlements to town land. This conversion is the result of the joint effects of the geographic environment and agents involving the government, investors, and farmers. To understand the dynamic interaction dominated by agents and to predict the future landscape of town expansion, a small town land-planning model is proposed based on the integration of multi-agent systems (MAS) and cellular automata (CA). The MAS-CA model links the decision-making behaviors of agents with the neighbor effect of CA. The interaction rules are projected by analyzing the preference conflicts among agents. To better illustrate the effects of the geographic environment, neighborhood, and agent behavior, a comparative analysis between the CA and MAS-CA models in three different towns is presented, revealing interesting patterns in terms of quantity, spatial characteristics, and the coordinating process. Simulating the conversion of rural settlements to town land by modeling agent decisions and human-environment interaction is very useful for understanding the mechanisms of rural-urban land-use change in developing countries. This process can assist town planners in formulating appropriate development plans. PMID:24244472
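    The joint effect of geographic suitability, the CA neighbor rule, and agent decisions can be illustrated with a toy synchronous update; the weights, threshold, and grid encoding below are assumptions for illustration, not the MAS-CA model's actual calibration:

```python
def conversion_probability(suitability, neighbors_converted, agent_support,
                           w_geo=0.4, w_nbr=0.3, w_agent=0.3):
    """Score for a rural-settlement cell converting to town land, mixing
    geographic suitability, the CA neighbour effect (share of converted
    cells in the 3x3 Moore neighbourhood) and the joint decision of
    government/investor/farmer agents. Weights are illustrative."""
    return (w_geo * suitability
            + w_nbr * neighbors_converted / 8.0
            + w_agent * agent_support)

def step(grid, suitability, agent_support, threshold=0.5):
    """One synchronous CA update on a 2D grid (1 = town land, 0 = rural)."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for i in range(rows):
        for j in range(cols):
            if grid[i][j]:
                continue  # already converted; conversion is one-way
            nbrs = sum(grid[x][y]
                       for x in range(max(0, i - 1), min(rows, i + 2))
                       for y in range(max(0, j - 1), min(cols, j + 2))
                       if (x, y) != (i, j))
            if conversion_probability(suitability[i][j], nbrs,
                                      agent_support[i][j]) > threshold:
                new[i][j] = 1
    return new
```

    In the full model the agent term would come from resolving preference conflicts among government, investors, and farmers rather than from a fixed per-cell number.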

  2. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

    2011-01-01

    In this paper, we describe an approach to integrate a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model was developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although it is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.

  3. geophylobuilder 1.0: an arcgis extension for creating 'geophylogenies'.

    PubMed

    Kidd, David M; Liu, Xianhua

    2008-01-01

    Evolution is inherently a spatiotemporal process; however, despite this, phylogenetic and geographical data and models remain largely isolated from one another. Geographical information systems provide a ready-made spatial modelling, analysis and dissemination environment within which phylogenetic models can be explicitly linked with their associated spatial data and subsequently integrated with other georeferenced data sets describing the biotic and abiotic environment. geophylobuilder 1.0 is an extension for the arcgis geographical information system that builds a 'geophylogenetic' data model from a phylogenetic tree and associated geographical data. Geophylogenetic database objects can subsequently be queried, spatially analysed and visualized in both 2D and 3D within a geographical information system. © 2007 The Authors.

  4. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis

    PubMed Central

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153
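    The hierarchical access control described for the honeycomb model (a grant on an ancestor object covering its whole subtree, so a handful of rules replaces one rule per object) can be sketched generically; this is not HIVE's actual API, and the path-based encoding and names below are assumptions:

```python
def can_access(user, obj_path, grants):
    """Hierarchical permission check: walk from the object up through its
    ancestors and honor the nearest explicit grant or denial, so one rule
    on a subtree root covers every object beneath it."""
    parts = obj_path.strip("/").split("/")
    for depth in range(len(parts), 0, -1):
        prefix = "/" + "/".join(parts[:depth])
        perms = grants.get(prefix, {})
        if user in perms:
            return perms[user]  # nearest ancestor rule wins
    return False  # no applicable rule: deny by default
```

    Because the nearest rule wins, a deny placed deep in the tree (for example on a raw-data folder) can coexist with a broad allow on the project root, which is the "finely granular without flooding the security subsystem with rules" property the abstract describes.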

  5. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  6. Physical environment virtualization for human activities recognition

    NASA Astrophysics Data System (ADS)

    Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen

    2015-05-01

    Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to aid as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.

  7. Stratiform chromite deposit model: Chapter E in Mineral deposit models for resource assessment

    USGS Publications Warehouse

    Schulte, Ruth F.; Taylor, Ryan D.; Piatak, Nadine M.; Seal, Robert R.

    2012-01-01

    Most environmental concerns associated with the mining and processing of chromite ore focus on the solubility of chromium and its oxidation state. Although trivalent chromium (Cr3+) is an essential micronutrient for humans, hexavalent chromium (Cr6+) is highly toxic. Chromium-bearing solid phases that occur in the chromite ore-processing residue, for example, can affect the geochemical behavior and oxidation state of chromium in the environment.

  8. Improving the Aircraft Design Process Using Web-Based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)

    2000-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  9. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  10. Integration of MATLAB Simulink(Registered Trademark) Models with the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Lewis, Emily K.; Vuong, Nghia D.

    2012-01-01

    This paper describes the integration of MATLAB Simulink(Registered TradeMark) models into the Vertical Motion Simulator (VMS) at NASA Ames Research Center. The VMS is a high-fidelity, large motion flight simulator that is capable of simulating a variety of aerospace vehicles. Integrating MATLAB Simulink models into the VMS needed to retain the development flexibility of the MATLAB environment and allow rapid deployment of model changes. The process developed at the VMS was used successfully in a number of recent simulation experiments. This accomplishment demonstrated that the model integrity was preserved, while working within the hard real-time run environment of the VMS architecture, and maintaining the unique flexibility of the VMS to meet diverse research requirements.

  11. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches do. At the core of ISCE is both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package as well as a new InSAR processing package containing more efficient and more accurate processing algorithms being developed at Stanford for this project that is based on experience gained in developing processors for missions such as SRTM and UAVSAR.
Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility and ease-of-use. We summarize the state of progress of the environment and the plans for completion of the environment and for its future introduction into the radar processing community.

  12. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), and they responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS), as part of this support. This tool allows for forecasting of communications load with the understanding that there is no single, common source for loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill the aeronautical requirements. 
Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionics systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of the CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  13. Intrusive [r] and Optimal Epenthetic Consonants

    ERIC Educational Resources Information Center

    Uffmann, Christian

    2007-01-01

    This paper argues against the view of intrusive [r] as a synchronically arbitrary insertion process. Instead, it is seen as a phonologically natural process, which can be modelled within the framework of Optimality Theory (OT). Insertion of [r] in phonologically restricted environments is a consequence of a more general theory of consonant…

  14. Leadership Development as a Dialogic Process: The Rationale and Concept of an International Leadership Institute

    ERIC Educational Resources Information Center

    Kedian, Jeremy; Giles, David; Morrison, Michele; Fletcher, Murray

    2016-01-01

    Rapidly changing educational contexts demand deft leadership responses. In this fluid environment, it is imperative that leadership learning models sound educational praxis. Such praxis necessitates the inclusion of participant voices within relational and dialogic processes that enable authentic, creative and collaborative thinking. This paper…

  15. The Stress Process of Family Caregiving in Institutional Settings.

    ERIC Educational Resources Information Center

    Whitlatch, Carol J.; Schur, Dorothy; Noelker, Linda S.; Ejaz, Farida K.; Looman, Wendy J.

    2001-01-01

    Adapts Stress Process Model (SPM) of family caregiving to examine predictors of depression in a sample of caregivers (N=133) with demented relatives residing in suburban skilled nursing facilities. Results suggest that caregiver depression is closely linked to how well resident and caregiver adjust to the nursing home environment. (BF)

  16. Applying AI to the Writer's Learning Environment.

    ERIC Educational Resources Information Center

    Houlette, Forrest

    1991-01-01

    Discussion of current applications of artificial intelligence (AI) to writing focuses on how to represent knowledge of the writing process in a way that links procedural knowledge to other types of knowledge. A model is proposed that integrates the subtasks of writing into the process of writing itself. (15 references) (LRW)

  17. Analyzing Student Inquiry Data Using Process Discovery and Sequence Classification

    ERIC Educational Resources Information Center

    Emond, Bruno; Buffett, Scott

    2015-01-01

    This paper reports on results of applying process discovery mining and sequence classification mining techniques to a data set of semi-structured learning activities. The main research objective is to advance educational data mining to model and support self-regulated learning in heterogeneous environments of learning content, activities, and…

  18. In-situ measurement of processing properties during fabrication in a production tool

    NASA Technical Reports Server (NTRS)

    Kranbuehl, D. E.; Haverty, P.; Hoff, M.; Loos, A. C.

    1988-01-01

Progress is reported on the use of frequency-dependent electromagnetic measurements (FDEMs) as a single, convenient technique for continuous in situ monitoring of polyester cure during fabrication in a laboratory and manufacturing environment. Preliminary FDEM sensor and modeling work using the Loos-Springer model to develop an intelligent closed-loop, sensor-controlled cure process is described. FDEMs using impedance bridges in the Hz to MHz region are found to be ideal for automatically monitoring polyester processing properties continuously throughout the cure cycle.

  19. Evolutionary biology through the lens of budding yeast comparative genomics.

    PubMed

    Marsit, Souhir; Leducq, Jean-Baptiste; Durand, Éléonore; Marchant, Axelle; Filteau, Marie; Landry, Christian R

    2017-10-01

    The budding yeast Saccharomyces cerevisiae is a highly advanced model system for studying genetics, cell biology and systems biology. Over the past decade, the application of high-throughput sequencing technologies to this species has contributed to this yeast also becoming an important model for evolutionary genomics. Indeed, comparative genomic analyses of laboratory, wild and domesticated yeast populations are providing unprecedented detail about many of the processes that govern evolution, including long-term processes, such as reproductive isolation and speciation, and short-term processes, such as adaptation to natural and domestication-related environments.

  20. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  1. An Object-Oriented Python Implementation of an Intermediate-Level Atmospheric Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.

    2008-12-01

    The Neelin-Zeng Quasi-equilibrium Tropical Circulation Model (QTCM1) is a Fortran-based intermediate-level atmospheric model that includes simplified treatments of several physical processes, including a GCM-like convective scheme and a land-surface scheme with representations of different surface types, evaporation, and soil moisture. This model has been used in studies of the Madden-Julian oscillation, ENSO, and vegetation-atmosphere interaction effects on climate. Through the assumption of convective quasi-equilibrium in the troposphere, the QTCM1 is able to include full nonlinearity, resolve baroclinic disturbances, and generate a reasonable climatology, all at low computational cost. One year of simulation on a PC at 5.625 × 3.75 degree longitude-latitude resolution takes under three minutes of wall-clock time. The Python package qtcm implements the QTCM1 in a mixed-language environment that retains the speed of compiled Fortran while providing the benefits of Python's object-oriented framework and robust suite of utilities and datatypes. We describe key programming constructs used to create this modeling environment: the decomposition of model runs into Python objects, providing methods so visualization tools are attached to model runs, and the use of Python's mutable datatypes (lists and dictionaries) to implement the "run list" entity, which enables total runtime control of subroutine execution order and content. The result is an interactive modeling environment where the traditional sequence of "hypothesis → modeling → visualization and analysis" is opened up and made nonlinear and flexible. In this environment, science tasks such as parameter-space exploration and testing alternative parameterizations can be easily automated, without the need for multiple versions of the model code interacting with a bevy of makefiles and shell scripts. 
The environment also simplifies interfacing of the atmospheric model to other models (e.g., hydrologic models, statistical models) and analysis tools. The tools developed for this package can be adapted to create similar environments for hydrologic models.
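The "run list" mechanism described above can be sketched in plain Python. This is an illustrative sketch of the idea only: the class and method names below are hypothetical, not the actual qtcm API. A mutable list of callables determines which subroutines run, and in what order, so both can be changed at runtime without editing or recompiling model code.

```python
# Illustrative sketch of a "run list": model steps live in an ordinary
# mutable Python list, so execution order and content can be changed at
# runtime. Names are hypothetical, not the actual qtcm API.

class ModelRun:
    def __init__(self):
        self.state = {"step_count": 0, "temperature": 288.0}
        # The run list: a plain list of callables, editable at any time.
        self.runlist = [self.advect, self.convect, self.output]

    def advect(self):
        self.state["temperature"] += 0.1

    def convect(self):
        self.state["temperature"] -= 0.05

    def output(self):
        self.state["step_count"] += 1

    def step(self):
        # Execute whatever the run list currently contains, in order.
        for routine in self.runlist:
            routine()

run = ModelRun()
run.step()

# Swap in an alternative parameterization without touching model source:
def strong_convect():
    run.state["temperature"] -= 0.2

run.runlist[1] = strong_convect
run.step()
print(run.state["step_count"])  # prints 2
```

Because the run list is just data, parameter-space sweeps or alternative parameterizations become loops over lists of callables rather than separate builds of the model.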

  2. Modeling Kanban Processes in Systems Engineering

    DTIC Science & Technology

    2012-06-01

    engineering through the services concept • Clarity in the value of SE as a knowledge broker and analysis service in brownfield evolution environments [18...Dynamics, Wiley-IEEE Press, Hoboken, NJ, 2008 [18] Boehm, B.: Applying the Incremental Commitment Model to Brownfield Systems Development, Proceedings, CSER 2009, April 2009.

  3. Using stable isotopes and models to explore estuarine linkages at multiple scales

    EPA Science Inventory

    Estuarine managers need tools to respond to dynamic stressors that occur in three linked environments – coastal ocean, estuaries and watersheds. Models have been the tool of choice for examining these dynamic systems because they simplify processes and integrate over multiple sc...

  4. A Graphical Aid for Introducing the Climatic Water Budget.

    ERIC Educational Resources Information Center

    Shelton, Marlyn L.

    1986-01-01

    The climatic water budget model provides an analytical framework to help geography students examine the processes shaping the environment. Examples illustrate how the model can be used in geography classes. Two flow diagrams are presented to help students master quantification of water budget variables. (RM)

  5. Specifications of a Simulation Model for a Local Area Network Design in Support of a Stock Point Logistics Integrated Communication Environment (SPLICE).

    DTIC Science & Technology

    1983-06-01

constrained at each step. Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the...by projecting: - the arrival of units of work at SPLICE processing facilities (workload analysis) - the amount of processing resources consumed in

  6. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  7. An approach for modelling snowcover ablation and snowmelt runoff in cold region environments

    NASA Astrophysics Data System (ADS)

    Dornes, Pablo Fernando

    Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomenon; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models, a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation. 
Application of the same modelling approach at a larger scale using the same landscape based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of an inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.

  8. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

Thanks to their different senses, human observers acquire multiple pieces of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
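The mutual-information screening step this abstract describes can be illustrated with a minimal estimator over paired discrete observations: an edge between two variables is worth keeping in a candidate Bayesian network only when their estimated mutual information is clearly above zero. The data and the helper function below are illustrative assumptions, not material from the cited study.

```python
# Minimal plug-in estimator of mutual information I(X;Y) in bits from
# paired discrete observations. Illustrative only, not the cited study's
# actual pipeline.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly correlated binary variables: I(X;Y) = H(X) = 1 bit.
x = [0, 0, 1, 1]
y = [0, 0, 1, 1]
print(mutual_information(x, y))  # prints 1.0

# Independent variables: I(X;Z) = 0 bits, so the X-Z edge would be dropped.
z = [0, 1, 0, 1]
print(mutual_information(x, z))  # prints 0.0
```

In practice the plug-in estimate is biased upward for small samples, so a significance threshold (rather than exactly zero) decides which edges survive.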

  9. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    PubMed

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  10. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology

    PubMed Central

    Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.

    2014-01-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914

  11. Comparing Computer-Supported Dynamic Modeling and "Paper & Pencil" Concept Mapping Technique in Students' Collaborative Activity

    ERIC Educational Resources Information Center

    Komis, Vassilis; Ergazaki, Marida; Zogza, Vassiliki

    2007-01-01

    This study aims at highlighting the collaborative activity of two high school students (age 14) in the cases of modeling the complex biological process of plant growth with two different tools: the "paper & pencil" concept mapping technique and the computer-supported educational environment "ModelsCreator". Students' shared activity in both cases…

  12. Modelling aqueous corrosion of nuclear waste phosphate glass

    NASA Astrophysics Data System (ADS)

    Poluektov, Pavel P.; Schmidt, Olga V.; Kascheev, Vladimir A.; Ojovan, Michael I.

    2017-02-01

A model is presented of nuclear sodium alumina phosphate (NAP) glass aqueous corrosion, accounting for dissolution of the radioactive glass and formation of a corrosion-product surface layer on glass in contact with the ground water of a disposal environment. Modelling is used to process available experimental data demonstrating the generic inhibiting role of corrosion products on the NAP glass surface.

  13. Sustainable development education, practice, and research: an indigenous model of sustainable development at the College of Menominee Nation, Keshena, WI, USA

    Treesearch

    Michael J. Dockry; Katherine Hall; William Van Lopik; Christopher M. Caldwell

    2015-01-01

The College of Menominee Nation Sustainable Development Institute's theoretical model (SDI model) conceptualizes sustainable development as the process of maintaining the balance and reconciling the inherent tensions among six dimensions of sustainability: land and sovereignty; natural environment (including human beings); institutions; technology; economy; and...

  14. A Preliminary Field Test of an Employee Work Passion Model

    ERIC Educational Resources Information Center

    Zigarmi, Drea; Nimon, Kim; Houson, Dobie; Witt, David; Diehl, Jim

    2011-01-01

    Four dimensions of a process model for the formulation of employee work passion, derived from Zigarmi, Nimon, Houson, Witt, and Diehl (2009), were tested in a field setting. A total of 447 employees completed questionnaires that assessed the internal elements of the model in a corporate work environment. Data from the measurements of work affect,…

  15. Generation of Digital Surface Models from satellite photogrammetry: the DSM-OPT service of the ESA Geohazards Exploitation Platform (GEP)

    NASA Astrophysics Data System (ADS)

    Stumpf, André; Michéa, David; Malet, Jean-Philippe

    2017-04-01

The continuously increasing fleet of agile stereo-capable very-high resolution (VHR) optical satellites has facilitated the acquisition of multi-view images of the earth surface. Theoretical revisit times have been reduced to less than one day, and the highest spatial resolution that is commercially available now amounts to 30 cm/pixel. Digital Surface Models (DSM) and point clouds computed from such satellite stereo-acquisitions can provide valuable input for studies in geomorphology, tectonics, glaciology, hydrology and urban remote sensing. The photogrammetric processing, however, still requires significant expertise, computational resources and costly commercial software. To enable a large Earth Science community (researchers and end-users) to process VHR multi-view images easily and rapidly, this work targets the implementation of a fully automatic satellite-photogrammetry pipeline (i.e. DSM-OPT) on the ESA Geohazards Exploitation Platform (GEP). The implemented pipeline is based on the open-source photogrammetry library MicMac [1] and is designed for distributed processing on a cloud-based infrastructure. The service can be employed in pre-defined processing modes (i.e. urban, plain, hilly, and mountainous environments) or in an advanced processing mode (in which expert users have the possibility to adapt the processing parameters to their specific applications). Four representative use cases are presented to illustrate the accuracy of the resulting surface models and ortho-images as well as the overall processing time. These use cases consisted of the construction of surface models from series of Pléiades images for four applications: urban analysis (Strasbourg, France), landslide detection in mountainous environments (South French Alps), co-seismic deformation in mountain environments (Central Italy earthquake sequence of 2016) and fault recognition for paleo-tectonic analysis (North-East India). 
Comparisons of the satellite-derived topography to airborne LiDAR topography are discussed. [1] Rupnik, E., Pierrot Deseilligny, M., Delorme, A., and Klinger, Y.: Refined satellite image orientation in the free open-source photogrammetric tools APERO/MICMAC, ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-1, 83-90, doi:10.5194/isprs-annals-III-1-83-2016, 2016.

  16. Development of the GPM Observatory Thermal Vacuum Test Model

    NASA Technical Reports Server (NTRS)

    Yang, Kan; Peabody, Hume

    2012-01-01

A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can simulate fairly accurately the conditions on-orbit. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
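Step (2) above, finding an equivalent sink temperature, is commonly done by balancing absorbed environmental heat against radiative emission, so that eps * sigma * A * Tsink^4 equals the absorbed flux. The sketch below illustrates that textbook balance; the numeric inputs are illustrative placeholders, not GPM flight values, and the real test model would account for view factors and multiple surfaces.

```python
# Hedged sketch of a radiative equivalent sink temperature: the single
# temperature at which a panel reproduces the absorbed environmental heat.
# Input values are illustrative, not GPM flight numbers.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equivalent_sink_temperature(q_absorbed_w, area_m2, emissivity):
    """Sink temperature (K) satisfying eps * sigma * A * T^4 = Q_absorbed."""
    return (q_absorbed_w / (emissivity * SIGMA * area_m2)) ** 0.25

t_sink = equivalent_sink_temperature(q_absorbed_w=300.0,
                                     area_m2=1.0,
                                     emissivity=0.85)
print(round(t_sink, 1))  # roughly 281 K for these placeholder inputs
```

The test model then iterates panel settings (step 4) until zone heat flows and temperatures computed with these sinks match the flight model's predictions.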

  17. Forest Canopy Processes in a Regional Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Makar, Paul; Staebler, Ralf; Akingunola, Ayodeji; Zhang, Junhua; McLinden, Chris; Kharol, Shailesh; Moran, Michael; Robichaud, Alain; Zhang, Leiming; Stroud, Craig; Pabla, Balbir; Cheung, Philip

    2016-04-01

Forest canopies have typically been absent or highly parameterized in regional chemical transport models. Some forest-related processes are often considered - for example, biogenic emissions from the forests are included as a flux lower boundary condition on vertical diffusion, as is deposition to vegetation. However, real forest canopies comprise a much more complicated set of processes, at scales below the "transport model-resolved scale" of vertical levels usually employed in regional transport models. Advective and diffusive transport within the forest canopy typically scale with the height of the canopy, and the former process tends to dominate over the latter. Emissions of biogenic hydrocarbons arise from the foliage, which may be located tens of metres above the surface, while emissions of biogenic nitric oxide from decaying plant matter are located at the surface - in contrast to the surface flux boundary condition usually employed in chemical transport models. Deposition, similarly, is usually parameterized as a flux boundary condition, but may be differentiated between fluxes to vegetation and fluxes to the surface when the canopy scale is considered. The chemical environment also changes within forest canopies: shading, temperature, and relative humidity changes with height within the canopy may influence chemical reaction rates. These processes have been observed in a host of measurement studies, and have been simulated using site-specific one-dimensional forest canopy models. Their influence on regional scale chemistry has been unknown, until now. In this work, we describe the results of the first attempt to include complex canopy processes within a regional chemical transport model (GEM-MACH). The original model core was subdivided into "canopy" and "non-canopy" subdomains. 
In the former, three additional near-surface layers based on spatially and seasonally varying satellite-derived canopy height and leaf area index were added to the original model structure. Process methodology for deposition, biogenic emissions, shading, vertical diffusion, advection, chemical reactive environment and particle microphysics were modified to account for expected conditions within the forest canopy and the additional layers. The revised and original models were compared for a 10km resolution domain covering North America, for a one-month duration simulation. The canopy processes were found to have a very significant impact on model results. We will present a comparison to network observations which suggests that forest canopy processes may account for previously unexplained local and regional biases in model ozone predictions noted in GEM-MACH and other models. The impact of the canopy processes on NO2, PM2.5, and SO2 performance will also be presented and discussed.
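    The layer-insertion step described above can be sketched in a few lines. This is only an illustration of the idea, not GEM-MACH code; the function name, layer placement rule, and all numbers are our own assumptions.

    ```python
    # Illustrative sketch: insert extra near-surface layers below a
    # satellite-derived canopy height into a model's vertical grid.

    def add_canopy_layers(level_heights, canopy_height, n_extra=3):
        """Return an ascending grid with n_extra layers spread through the canopy.

        level_heights: ascending heights (m) of the original model levels.
        canopy_height: local canopy height (m), e.g. from satellite data.
        """
        # Place the extra layers at equal fractions of the canopy depth,
        # e.g. 0.25h, 0.5h, 0.75h for n_extra = 3.
        extra = [canopy_height * (i + 1) / (n_extra + 1) for i in range(n_extra)]
        return sorted(set(level_heights) | set(extra))

    levels = add_canopy_layers([20.0, 50.0, 100.0, 250.0], canopy_height=24.0)
    print(levels)  # original levels plus canopy layers at 6.0, 12.0 and 18.0 m
    ```

    Because canopy height varies spatially and seasonally, such a grid would be recomputed per column and per season.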

  18. Open source integrated modeling environment Delta Shell

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.; Jagers, B.; van Putten, H.

    2012-04-01

    In the last decade, integrated modelling has become a very popular topic in environmental modelling, since it helps solve problems that are difficult to model with a single model. However, managing the complexity of integrated models and minimizing the time required for their setup remain challenging tasks. The integrated modelling environment Delta Shell simplifies these tasks. The software components of Delta Shell are easy to reuse, either separately or as part of an integrated environment that can run in a command-line or a graphical user interface mode. Most components of Delta Shell are developed using the C# programming language and include libraries used to define, save and visualize various scientific data structures, as well as coupled model configurations. Here we present two examples showing how Delta Shell simplifies the process of setting up integrated models, from the end-user and developer perspectives. The first example shows the coupling of rainfall-runoff, river-flow and run-time control models. The second example shows how a coastal morphological database integrates with a coastal morphological model (XBeach) and a custom nourishment designer. Delta Shell is available as open-source software released under the LGPL license and accessible via http://oss.deltares.nl.

  19. Models of Cultural Niche Construction with Selection and Assortative Mating

    PubMed Central

    Feldman, Marcus W.

    2012-01-01

    Niche construction is a process through which organisms modify their environment and, as a result, alter the selection pressures on themselves and other species. In cultural niche construction, one or more cultural traits can influence the evolution of other cultural or biological traits by affecting the social environment in which the latter traits may evolve. Cultural niche construction may include either gene-culture or culture-culture interactions. Here we develop a model of this process and suggest some applications of this model. We examine the interactions between cultural transmission, selection, and assorting, paying particular attention to the complexities that arise when selection and assorting are both present, in which case stable polymorphisms of all cultural phenotypes are possible. We compare our model to a recent model for the joint evolution of religion and fertility and discuss other potential applications of cultural niche construction theory, including the evolution and maintenance of large-scale human conflict and the relationship between sex ratio bias and marriage customs. The evolutionary framework we introduce begins to address complexities that arise in the quantitative analysis of multiple interacting cultural traits. PMID:22905167

  20. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    PubMed

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data, based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules such as FSL.FEAT and NPAIRS.CVA were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.

  1. Liberty High School Transition Project: Model Process for Assimilating School, Community, Business, Government and Service Groups of the Least Restrictive Environment for Nondisabled and Disabled.

    ERIC Educational Resources Information Center

    Grimes, Michael K.

    The panel presentation traces the development of and describes the operation of a Brentwood (California) project to prepare approximately 75 severely disabled individuals, ages 12-22, to function in the least restrictive recreation/leisure, vocational, and general community environments. Transition Steering Committee developed such project…

  2. Evaluation of Keyphrase Extraction Algorithm and Tiling Process for a Document/Resource Recommender within E-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, Eleni; Kilbride, John

    2008-01-01

    The research presented in this paper is an examination of the applicability of IUI techniques in an online e-learning environment. In particular we make use of user modeling techniques, information retrieval and extraction mechanisms and collaborative filtering methods. The domains of e-learning, web-based training and instruction and intelligent…

  3. Effects of Face-to-Face versus Chat Communication on Performance in a Collaborative Inquiry Modeling Task

    ERIC Educational Resources Information Center

    Sins, Patrick H. M.; Savelsbergh, Elwin R.; van Joolingen, Wouter R.; van Hout-Wolters, Bernadette H. A. M.

    2011-01-01

    In many contemporary collaborative inquiry learning environments, chat is being used as a means for communication. Still, it remains an open issue whether chat communication is an appropriate means to support the deep reasoning process students need to perform in such environments. Purpose of the present study was to compare the impact of chat…

  4. Keystroke-Level Analysis to Estimate Time to Process Pages in Online Learning Environments

    ERIC Educational Resources Information Center

    Bälter, Olle; Zimmaro, Dawn

    2018-01-01

    It is challenging for students to plan their work sessions in online environments, as it is very difficult to make estimates on how much material there is to cover. In order to simplify this estimation, we have extended the Keystroke-level analysis model with individual reading speed of text, figures, and questions. This was used to estimate how…

  5. A Learning Model for Enhancing the Student's Control in Educational Process Using Web 2.0 Personal Learning Environments

    ERIC Educational Resources Information Center

    Rahimi, Ebrahim; van den Berg, Jan; Veen, Wim

    2015-01-01

    In recent educational literature, it has been observed that improving student's control has the potential of increasing his or her feeling of ownership, personal agency and activeness as means to maximize his or her educational achievement. While the main conceived goal for personal learning environments (PLEs) is to increase student's control by…

  6. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
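    The combination the paper describes can be caricatured in a few lines: a discrete-event loop consumes units of work while a continuous system-dynamics equation evolves workforce productivity between events. All rates and numbers below are hypothetical, and this is our sketch of the coupling idea, not the authors' model.

    ```python
    # Minimal hybrid simulation sketch: discrete-event task processing coupled
    # to a continuously evolving "system dynamics" productivity level.

    def simulate(tasks, dt=0.1, base_rate=1.0, fatigue=0.02, recovery=0.05):
        """Process task sizes in sequence; return total simulated time.

        Productivity relaxes toward an equilibrium set by the fatigue and
        recovery flows, so later tasks take longer than the ideal estimate.
        """
        t, productivity = 0.0, 1.0
        for size in tasks:
            remaining = size
            while remaining > 0:
                # Discrete-event view: work is consumed at the current rate.
                remaining -= base_rate * productivity * dt
                # System-dynamics view: stock "productivity" updated by flows.
                productivity += (recovery * (1.0 - productivity)
                                 - fatigue * productivity) * dt
                t += dt
        return t

    t = simulate([5.0, 5.0])
    print(round(t, 1))  # somewhat more than the 10.0 ideal-productivity time
    ```

    A real combined model would also feed schedule pressure from the discrete-event side back into the system-dynamics equations.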

  7. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

    Recently, manufacturing industries have faced various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate the facility control programs for the equipment, while mixing and synchronizing real equipment and virtual factory models on computers before the manufacturing systems are implemented, have not been developed. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has hindered precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE enables making and evaluating facility control programs using virtual factory models on computers before the manufacturing systems are implemented.

  8. The `TTIME' Package: Performance Evaluation in a Cluster Computing Environment

    NASA Astrophysics Data System (ADS)

    Howe, Marico; Berleant, Daniel; Everett, Albert

    2011-06-01

    The objective of translating developmental event times across mammalian species is to gain an understanding of the timing of human developmental events based on the known times of those events in animals. The potential benefits include improvements to diagnostic and intervention capabilities. The CRAN `ttime' package provides the functionality to infer unknown event timings and to investigate phylogenetic proximity using hierarchical clustering of both known and predicted event timings. The original generic mammalian model included nine eutherian mammals: Felis domestica (cat), Mustela putorius furo (ferret), Mesocricetus auratus (hamster), Macaca mulatta (monkey), Homo sapiens (human), Mus musculus (mouse), Oryctolagus cuniculus (rabbit), Rattus norvegicus (rat), and Acomys cahirinus (spiny mouse). However, the data for this model are expected to grow as more developmental events are identified and incorporated into the analysis. Evaluating the performance of the `ttime' package in a cluster computing environment against a comparative analysis in a serial computing environment provides an important computational assessment. A theoretical analysis is the first stage of this process; the second stage, if justified by the first, is to investigate an actual implementation of the `ttime' package in a cluster computing environment and to understand the parallelization process that underlies that implementation.

  9. Sensor/Response Coordination In A Tactical Self-Protection System

    NASA Astrophysics Data System (ADS)

    Steinberg, Alan N.

    1988-08-01

    This paper describes a model for integrating information acquisition functions into a response planner within a tactical self-defense system. This model may be used in defining requirements in such applications for sensor systems and for associated processing and control functions. The goal of information acquisition in a self-defense system is generally not to achieve the best possible estimate of the threat environment, but rather to resolve that environment sufficiently to support response decisions. We model the information acquisition problem as that of achieving a partition among possible world states such that the final partition maps into the system's repertoire of possible responses.
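    The partition idea can be rendered as a toy algorithm: refine the set of possible world states with successive sensor tests, stopping as soon as every cell of the partition supports a single response. This is our illustration of the stated principle, not the paper's algorithm; the threat states, tests and responses are invented.

    ```python
    # Toy sketch: acquire information only until the partition of world
    # states is fine enough that each cell maps to exactly one response.

    def refine_until_actionable(states, response_of, tests):
        partition = [set(states)]
        n_used = 0
        for test in tests:
            if all(len({response_of[s] for s in cell}) == 1 for cell in partition):
                break  # resolution already sufficient for a decision
            new_partition = []
            for cell in partition:
                groups = {}
                for s in cell:
                    groups.setdefault(test(s), set()).add(s)
                new_partition.extend(groups.values())
            partition, n_used = new_partition, n_used + 1
        return partition, n_used

    # Hypothetical threats: (emitter kind, band); the response depends only
    # on the kind, so resolving the band would be wasted sensor effort.
    states = [("radar", 1), ("radar", 2), ("ir", 1)]
    response = {("radar", 1): "jam", ("radar", 2): "jam", ("ir", 1): "flare"}
    tests = [lambda s: s[0], lambda s: s[1]]  # available sensor observables
    cells, n_tests = refine_until_actionable(states, response, tests)
    print(n_tests)  # only the first test is needed
    ```

    The stopping rule is the point: the planner never pays for resolution beyond what the response repertoire can use.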

  10. Use of Laboratory Data to Model Interstellar Chemistry

    NASA Technical Reports Server (NTRS)

    Vidali, Gianfranco; Roser, J. E.; Manico, G.; Pirronello, V.

    2006-01-01

    Our laboratory research program concerns the formation of molecules on dust grain analogues under conditions mimicking interstellar medium environments. Using surface science techniques, over the last ten years we have investigated the formation of molecular hydrogen and other molecules on different types of dust grain analogues. We analyzed the results to extract quantitative information on the processes of molecule formation on, and ejection from, dust grain analogues. The usefulness of these data lies in the fact that they have been employed by theoreticians in models of the chemical evolution of ISM environments.

  11. Application of the Modular Command and Control Evaluation Structure to a Strategic Defense Initiative Command and Control System.

    DTIC Science & Technology

    1987-03-01

    environment. Actions within the process loop are initiated by a perceived divergence between a desired state and the sensed environmental state. Definitions of the...initiated environmental effect. An action by our own forces, as well as by the enemy forces, can create an alteration to the overall environment. The DESIRED...Additionally, the model would accommodate the entire C2 system, including physical entities, structure, and its environment. The objective was to

  12. Building a competent health manager at district level: a grounded theory study from Eastern Uganda.

    PubMed

    Tetui, Moses; Hurtig, Anna-Karin; Ekirpa-Kiracho, Elizabeth; Kiwanuka, Suzanne N; Coe, Anna-Britt

    2016-11-21

    Health systems in low-income countries are often characterized by poor health outcomes. While many reasons have been advanced to explain the persistently poor outcomes, management of the system has been found to play a key role. According to a WHO framework, the management of health systems is central to their ability to deliver needed health services. In this study, we examined how district managers in a rural setting in Uganda perceived existing approaches to strengthening management, so as to provide a pragmatic and synergistic model for improving management capacity building. Twenty-two interviews were conducted with district-level administrative and political managers, district-level health managers and health facility managers to understand their perceptions and definitions of management and capacity building. Kathy Charmaz's constructivist approach to grounded theory informed the data analysis process. An iterative, dynamic and complex model of building a competent health manager, comprising three sub-processes, was developed. A competent manager was understood as one who knew his/her roles, was well informed and was empowered to execute management functions. The sub-processes were professionalizing health managers (viewed as the foundation of the model), using engaging learning approaches (its inside contents) and having a supportive work environment (its frame). The sub-processes were interconnected, although the respondents agreed that creating a supportive work environment was more time- and effort-intensive than the other two sub-processes. The model developed in our study makes four central contributions that enhance the WHO framework and the existing literature. First, it emphasizes management capacity building as an iterative, dynamic and complex process rather than a set of characteristics of competent managers. Second, our model suggests the need for professionalization of health managers at different levels of the health system. Third, our model underscores the benefits that could be accrued from the use of engaging learning approaches through prolonged and sustained processes that act in synergy. Lastly, our model postulates that different resource investments and a varied range of stakeholders could be required at each of the sub-processes.

  13. MULTI: a shared memory approach to cooperative molecular modeling.

    PubMed

    Darden, T; Johnson, P; Smith, H

    1991-03-01

    A general purpose molecular modeling system, MULTI, based on the UNIX shared memory and semaphore facilities for interprocess communication is described. In addition to the normal querying or monitoring of geometric data, MULTI also provides processes for manipulating conformations, and for displaying peptide or nucleic acid ribbons, Connolly surfaces, close nonbonded contacts, crystal-symmetry related images, least-squares superpositions, and so forth. This paper outlines the basic techniques used in MULTI to ensure cooperation among these specialized processes, and then describes how they can work together to provide a flexible modeling environment.
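    MULTI itself is built on UNIX System V shared memory and semaphores; as a loose modern analogue (our sketch, not the paper's implementation), Python's `multiprocessing` can share a coordinate array between cooperating processes, with a lock playing the semaphore's role.

    ```python
    # Rough analogue of shared-memory cooperation among modeling processes:
    # two workers shift shared atomic coordinates under a lock.
    from multiprocessing import Array, Lock, Process

    def translate(coords, lock, dx):
        # One "specialized process" shifts every x-coordinate under the lock,
        # standing in for a conformation-manipulating module.
        with lock:
            for i in range(0, len(coords), 3):
                coords[i] += dx

    if __name__ == "__main__":
        lock = Lock()
        coords = Array("d", [0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # two atoms, xyz
        workers = [Process(target=translate, args=(coords, lock, 0.5))
                   for _ in range(2)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print(list(coords))  # each x-coordinate shifted by 1.0 in total
    ```

    The lock is what makes the cooperation safe: without it, concurrent read-modify-write on the shared array could lose updates, which is exactly the coordination problem MULTI's semaphores address.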

  14. COSMIC DUST AGGREGATION WITH STOCHASTIC CHARGING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Lorin S.; Hyde, Truell W.; Shotorban, Babak, E-mail: Lorin_Matthews@baylor.edu

    2013-10-20

    The coagulation of cosmic dust grains is a fundamental process which takes place in astrophysical environments, such as presolar nebulae and circumstellar and protoplanetary disks. Cosmic dust grains can become charged through interaction with their plasma environment or other processes, and the resultant electrostatic force between dust grains can strongly affect their coagulation rate. Since ions and electrons are collected on the surface of the dust grain at random time intervals, the electrical charge of a dust grain experiences stochastic fluctuations. In this study, a set of stochastic differential equations is developed to model these fluctuations over the surface of an irregularly shaped aggregate. Then, employing the data produced, the influence of the charge fluctuations on the coagulation process and the physical characteristics of the aggregates formed is examined. It is shown that dust with small charges (due to the small size of the dust grains or a tenuous plasma environment) is affected most strongly.
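    The stochastic-charging idea can be illustrated with a one-grain toy: treat the charge as an Ornstein-Uhlenbeck-type process relaxing toward an equilibrium value, driven by Gaussian collection noise, and integrate it with the Euler-Maruyama scheme. This is our simplification; the paper develops SDEs over the surface of an irregular aggregate, and every parameter value below is invented.

    ```python
    # Euler-Maruyama integration of a toy charge-fluctuation SDE:
    #   dQ = (Q_eq - Q)/tau dt + sigma dW
    import random

    def simulate_charge(q0, q_eq, tau, sigma, dt, steps, seed=0):
        """Relaxation toward q_eq at rate 1/tau plus stochastic collection noise."""
        rng = random.Random(seed)
        q = q0
        path = [q]
        for _ in range(steps):
            dW = rng.gauss(0.0, dt ** 0.5)  # Wiener increment, std = sqrt(dt)
            q += (q_eq - q) / tau * dt + sigma * dW
            path.append(q)
        return path

    path = simulate_charge(q0=0.0, q_eq=-100.0, tau=1.0, sigma=5.0,
                           dt=0.01, steps=2000)
    print(round(path[-1]))  # settles near -100 (elementary charges), with noise
    ```

    For small grains, Q_eq itself is small, so the relative size of the fluctuations grows, which is the regime the paper finds most strongly affected.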

  15. POLUTE. Forest Air Pollutant Uptake Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, C.E. Jr.; Sinclair, T.R.

    1992-02-13

    POLUTE is a computer model designed to estimate the uptake of air pollutants by forests. The model utilizes submodels to describe atmospheric diffusion immediately above and within the canopy, and into the sink areas within or on the trees. The program implementing the model is general and can be used, with only minor changes, for any gaseous pollutant. The model provides an estimate describing the response of the vegetation-atmosphere system to the environment as related to three types of processes: atmospheric diffusion, diffusion near and inside the absorbing plant, and the physical and chemical processes at the sink on or within the plant.

  16. Job shop scheduling model for non-identic machine with fixed delivery time to minimize tardiness

    NASA Astrophysics Data System (ADS)

    Kusuma, K. K.; Maruf, A.

    2016-02-01

    Scheduling problems with non-identical machines, low utilization and fixed delivery times are frequent in the manufacturing industry. This paper proposes a mathematical model to minimize total tardiness for non-identical machines in a job shop environment. The model is categorized as an integer linear programming model and uses a branch and bound algorithm as the solution method. Fixed delivery times are the main constraint, and each job has a different processing time. The results of the proposed model show that the utilization of the production machines can be increased with minimal tardiness when fixed delivery times are used as a constraint.
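    As a toy stand-in for the paper's branch-and-bound integer program (the single-machine simplification and all numbers are ours), one can enumerate job sequences and score each against fixed delivery (due) times, keeping the sequence with minimum total tardiness:

    ```python
    # Brute-force tardiness minimization on one machine with fixed delivery times.
    from itertools import permutations

    def total_tardiness(order, proc, due):
        t, tard = 0, 0
        for j in order:
            t += proc[j]                # job j finishes at time t
            tard += max(0, t - due[j])  # lateness beyond its delivery time
        return tard

    def best_schedule(proc, due):
        jobs = range(len(proc))
        return min(permutations(jobs), key=lambda o: total_tardiness(o, proc, due))

    proc = [4, 2, 3]   # hypothetical processing times
    due = [5, 4, 10]   # hypothetical fixed delivery times
    order = best_schedule(proc, due)
    print(order, total_tardiness(order, proc, due))  # (1, 0, 2) 1
    ```

    Enumeration is exponential in the number of jobs, which is why the paper's branch and bound (pruning sequences whose partial tardiness already exceeds the best found) is the practical solution method for the full job shop model.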

  17. Modeling the Dynamics of Task Allocation and Specialization in Honeybee Societies

    NASA Astrophysics Data System (ADS)

    Hoogendoorn, Mark; Schut, Martijn C.; Treur, Jan

    The concept of organization has been studied in sciences such as social science and economics, but recently also in artificial intelligence [Furtado 2005, Giorgini 2004, and McCallum 2005]. With the desire to analyze and design more complex systems consisting of larger numbers of agents (e.g., in nature, society, or software), the need arises for a concept at a higher level of abstraction than the agent. To this end, organizational modeling is becoming a practiced stage in the analysis and design of multi-agent systems, thereby taking into consideration the environment of the organization. An environment can have a high degree of variability, which might require organizations to adapt to the environment's dynamics to ensure the continued proper functioning of the organization. Hence, such change processes are a crucial function of the organization and should be part of the organizational model.

  18. Neuromechanics of crawling in D. melanogaster larvae

    NASA Astrophysics Data System (ADS)

    Pehlevan, Cengiz; Paoletti, Paolo; Mahadevan, L.

    2015-03-01

    Nervous system, body and environment interact in non-trivial ways to generate locomotion and thence behavior in an organism. Here we present a minimal integrative mathematical model to describe the simple behavior of forward crawling in Drosophila larvae. Our model couples the excitation-inhibition circuits in the nervous system to force production in the muscles and body movement in a frictional environment, which in turn leads to a proprioceptive signal that feeds back to the nervous system. Our results explain the basic observed phenomenology of crawling with or without proprioception, and elucidate the stabilizing role of proprioception in crawling with respect to external and internal perturbations. Our integrated approach allows us to make testable predictions on the effect of changing body-environment interactions on crawling, and serves as a substrate for the development of hierarchical models linking cellular processes to behavior.

  19. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform

    PubMed Central

    Poucke, Sven Van; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; Deyne, Cathy De

    2016-01-01

    With the accumulation of large amounts of health-related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized Medicine (PPPM), ultimately affecting both the cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting-edge predictive methods and data manipulation requires substantial programming skills, limiting their direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As a use case, the correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for the automatic building, parameter optimization and evaluation of various predictive models under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research. PMID:26731286

  1. Allosteric Learning Model in English Lesson: Teachers' Views, the Instructions of Curriculum and Course Book, a Sample of Daily Lesson Plan

    ERIC Educational Resources Information Center

    Berkant, Hasan Güner; Baysal, Seda

    2017-01-01

    The changes which occur during the learning process have been explained by many teaching-learning models and theories. One of these models is allosteric learning model (ALM) which was developed by André Giordan in 1989. This model was derived from a biological metaphor related to proteins. The interaction between individual and environment in a…

  2. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.
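    The diagnostic behind the waiting-time analysis can be sketched simply: a Poisson process has exponential waiting times, whose coefficient of variation (CV) is 1, while clustered ("memoryful") event series show CV > 1. Below, Pareto-distributed waits with an infinite-variance tail stand in for a Lévy-like heavy tail; this surrogate and all parameters are our choices, not the paper's fit.

    ```python
    # Compare waiting-time dispersion for memoryless vs heavy-tailed processes.
    import random

    def coeff_var(samples):
        """Sample coefficient of variation: std / mean."""
        n = len(samples)
        mean = sum(samples) / n
        var = sum((x - mean) ** 2 for x in samples) / n
        return var ** 0.5 / mean

    rng = random.Random(42)
    exp_waits = [rng.expovariate(1.0) for _ in range(20000)]      # Poisson process
    pareto_waits = [rng.paretovariate(1.5) for _ in range(20000)]  # heavy tail

    print(round(coeff_var(exp_waits), 2))     # close to 1: memoryless
    print(round(coeff_var(pareto_waits), 2))  # well above 1: clustered events
    ```

    A time-dependent Poisson process, the other alternative the paper considers, would also inflate the CV, so in practice the two hypotheses are separated by fitting the full waiting-time distribution rather than by the CV alone.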

  3. Understanding How College Students Think.

    ERIC Educational Resources Information Center

    Cosgrove, Thomas J.

    1987-01-01

    With a knowledge of students' thinking processes, activities advisers and leaders can design environments for maximum learning and development. An interpretation of Perry's model of intellectual and ethical development is provided. (MLW)

  4. Laboratory Investigation of Space and Planetary Dust Grains

    NASA Technical Reports Server (NTRS)

    Spann, James

    2005-01-01

    Dust in space is ubiquitous and impacts diverse observed phenomena in various ways. Understanding the dominant mechanisms that control dust grain properties and their impact on surrounding environments is basic to improving our understanding of observed processes at work in space. There is a substantial body of work on the theory and modeling of dust in space and dusty plasmas. To substantiate and validate theory and models, laboratory investigations and space-borne observations have been conducted. Laboratory investigations are largely confined to an assembly of dust grains immersed in a plasma environment. Frequently, the behaviors of these complex dusty plasmas in the laboratory have raised more questions than they have verified theories. Space-borne observations have helped us characterize planetary environments. The complex behavior of dust grains in space indicates the need to understand the microphysics of individual grains immersed in a plasma or space environment.

  5. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    ERIC Educational Resources Information Center

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  6. Origin of the main r-process elements

    NASA Astrophysics Data System (ADS)

    Otsuki, K.; Truran, J.; Wiescher, M.; Gorres, J.; Mathews, G.; Frekers, D.; Mengoni, A.; Bartlett, A.; Tostevin, J.

    2006-07-01

    The r-process is thought to be a primary process that assembles heavy nuclei from a photo-dissociated nucleon gas. Hence, the reaction flow through light elements can be an important constraint on the conditions for the r-process. We have studied the impact of di-neutron capture and the neutron capture of light (Z<10) elements on r-process nucleosynthesis in three different environments: neutrino-driven winds in Type II supernovae; the prompt explosion of low-mass supernovae; and neutron star mergers. Although the effect of di-neutron capture is not significant for the neutrino-driven wind model or low-mass supernovae, it becomes significant in the neutron-star merger model. The neutron capture of light elements, which has been studied extensively for neutrino-driven wind models, also impacts the other two models. We show that it may be possible to identify the astrophysical site of the main r-process if the nuclear physics uncertainties in current r-process calculations can be reduced.

  7. Model-based reasoning for system and software engineering: The Knowledge From Pictures (KFP) environment

    NASA Technical Reports Server (NTRS)

    Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper gives an overview of the graphical modeling objectives of the work, describes the three tools that now populate the KFP environment, briefly discusses related work in the field, and indicates future directions for the KFP environment.

  8. The development of ecological environment in China based on the system dynamics method from the society, economy and environment perspective.

    PubMed

    Guang, Yang; Ge, Song; Han, Liu

    2016-01-01

    Harmonious development of society, economy and environment is crucial to sustained regional prosperity. However, the three are not independent: they interact in complex ways, mutually promoting or constraining one another throughout the long-term development process. The present study investigates the relationship and interaction of society, economy and environment in China based on data from 2004 to 2013. A principal component analysis (PCA) model was employed to identify the main factors affecting the society, economy and environment subsystems, and a system dynamics (SD) method was used to carry out a dynamic assessment of the future state of sustainability from the society, economy and environment perspective using projected indicator values. Sustainable development in China over 2004 to 2013 was divided into three phases based on the comparative values of these three subsystems. According to the results of the PCA model, China is in the third phase: economic growth is outpacing environmental development, while social development has maintained steady and rapid growth, implying that the next step for sustainable development in China should focus on environmental development in particular.
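
    The PCA step described above (identifying the main factors driving a subsystem) can be sketched as follows. The data are synthetic, not the study's 2004-2013 Chinese statistics: principal components are extracted from the covariance matrix of standardized indicators, and the leading component's variance ratio and loadings point to the dominant factors.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic standardized indicators for one subsystem (10 years x 4 indicators);
    # the first two indicators share a common driving trend, the rest are noise.
    trend = np.linspace(-1, 1, 10)
    X = np.column_stack([
        trend + 0.1 * rng.standard_normal(10),
        trend + 0.1 * rng.standard_normal(10),
        rng.standard_normal(10),
        rng.standard_normal(10),
    ])
    X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each indicator

    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]
    explained = eigvals[order] / eigvals.sum()  # variance ratio per component
    loadings = eigvecs[:, order]                # column k = loadings of component k

    # The first component is expected to capture the shared trend of the
    # two correlated indicators, i.e. the subsystem's "main factor".
    print(np.round(explained, 2))
    ```

    In the study's setting, the indicators with the largest absolute loadings on the leading components would be retained as the main factors feeding the SD model.
    
    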

  9. A model of adaptation of overseas nurses: exploring the experiences of Japanese nurses working in Australia.

    PubMed

    Kishi, Yuka; Inoue, Kumiyo; Crookes, Patrick; Shorten, Allison

    2014-04-01

    The purpose of the study was to investigate the experiences of Japanese nurses and their adaptation to their work environment in Australia. Using a qualitative research method and semistructured interviews, the study aimed to discover, describe, and analyze the experiences of 14 Japanese nurses participating in the study. A qualitative study. Fourteen Japanese registered nurses working in Australian hospitals participated in the study. Individual semistructured interviews were conducted from April to June in 2008. Thematic analysis was used to identify themes within the data. Analysis of qualitative open-ended questions revealed the participants' adaptation process. It consists of three themes or phases: seeking (S), acclimatizing (A), and settling (S), subsequently named the S.A.S. model. The conceptual model of the adaptation processes of 14 Japanese nurses working in Australia includes the seeking, acclimatizing, and settling phases. Although these phases are not mutually exclusive and the process is not necessarily uniformly linear, all participants in this study passed through this S.A.S. model in order to adapt to their new environment. The S.A.S. model of adaptation helps to describe the experiences of Japanese overseas qualified nurses working in Australian hospitals. Future research is needed to examine whether this model can be applied to nurses from other countries and in other settings outside Australia.

  10. The use of deep and surface learning strategies among students learning English as a foreign language in an Internet environment.

    PubMed

    Aharony, Noa

    2006-12-01

    The learning context is learning English in an Internet environment. The examination of this learning process was based on Biggs and Moore's teaching-learning model (Biggs & Moore, 1993). The research aims to explore the use of deep and surface strategies in an Internet environment among EFL students from different socio-economic backgrounds. The results may add a further level to the understanding of students' functioning in the Internet environment. One hundred forty-eight Israeli junior and high school students participated in this research. The methodology was based on special computer software, Screen Cam, which recorded the students' learning process. In addition, expert judges completed a questionnaire which examined and categorized the students' learning strategies. The findings show a clear preference among participants from all socio-economic backgrounds for the surface learning strategy. They also show that students from medium to high socio-economic backgrounds used both learning strategies more frequently than students from low socio-economic backgrounds. The results reflect the habits that students acquire during their adjustment process throughout their educational careers. A brief encounter with the Internet learning environment apparently cannot change norms or habits acquired in the non-Internet learning environment.

  11. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists

    PubMed Central

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people. PMID:29682348

  12. The Development of a Design and Construction Process Protocol to Support the Home Modification Process Delivered by Occupational Therapists.

    PubMed

    Russell, Rachel; Ormerod, Marcus; Newton, Rita

    2018-01-01

    Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people.

  13. FEST-C 1.3 & 2.0 for CMAQ Bi-directional NH3, Crop Production, and SWAT Modeling

    EPA Science Inventory

    The Fertilizer Emission Scenario Tool for CMAQ (FEST-C) is developed in a Linux environment, with a festc JAVA interface that integrates 14 tools and scenario management options, facilitating land use/crop data processing for the Community Multiscale Air Quality (CMAQ) modeling system ...

  14. Response of a One-Biosphere Nutrient Modeling System to Regional Land Use and Management Change

    EPA Science Inventory

    A multi-media system of nitrogen and co-pollutant models describing critical physical and chemical processes that cascade synergistically and competitively through the environment, the economy and society has been developed at the USEPA Office of Research and Development (see fig...

  15. Student Learning Theory Goes (Back) to (High) School

    ERIC Educational Resources Information Center

    Ginns, Paul; Martin, Andrew J.; Papworth, Brad

    2014-01-01

    Biggs' 3P (Presage-Process-Product) model, a key framework in Student Learning Theory, provides a powerful means of understanding relations between students' perceptions of the teaching and learning environment, learning strategies, and learning outcomes. While influential in higher education, fewer tests of the model in secondary education…

  16. Evaluation of distributed hydrologic impacts of temperature-index and energy-based snow models

    USDA-ARS?s Scientific Manuscript database

    Proper characterizations of snow melt and accumulation processes in the snow-dominated mountain environment are needed to understand and predict spatiotemporal distribution of water cycle components. Two commonly used strategies in modeling of snow accumulation and melt are the full energy based and...

  17. MODELING MICROBIAL TRANSPORT IN SOIL AND GROUNDWATER: MICROBIOLOGISTS CAN ASSIST IN THE DEVELOPMENT OF MODELS OF CONTAMINANT TRANSPORT

    EPA Science Inventory

    A large body of literature describes the processes affecting the fate of microorganisms in the subsurface environment (i.e., soil and groundwater). The fate of microorganisms depends on two main components: survival and transport. other components must be considered when determin...

  18. Do College Faculty Embrace Web 2.0 Technology?

    ERIC Educational Resources Information Center

    Siha, Samia M.; Bell, Reginald Lamar; Roebuck, Deborah

    2016-01-01

    The authors sought to determine if Rogers's Innovation Decision Process model could analyze Web 2.0 usage within the collegiate environment. The key independent variables studied in relationship to this model were gender, faculty rank, course content delivery method, and age. Chi-square nonparametric tests on the independent variables across…

  19. Use of a Behavioral Contract with Resident Assistants: Prelude to a Health Fair.

    ERIC Educational Resources Information Center

    Morgan, John D.; Hyner, Gerald C.

    1984-01-01

    Presents a conceptual model which focuses on goals of student government in residence halls. The model has two fundamental processes. One focuses on short term goals and activities catering to creation of environment. The second has long term effects referred to as lifelong personal development. (JAC)

  20. Linking Agricultural Crop Management and Air Quality Models for Regional to National-Scale Nitrogen Assessments

    EPA Science Inventory

    While nitrogen (N) is an essential element for life, human population growth and demands for energy, transportation and food can lead to excess nitrogen in the environment. A modeling framework is described and implemented to promote a more integrated, process-based and system le...

  1. A UML approach to process modelling of clinical practice guidelines for enactment.

    PubMed

    Knape, T; Hederman, L; Wade, V P; Gargan, M; Harris, C; Rahman, Y

    2003-01-01

    Although clinical practice guidelines (CPGs) have been suggested as a means of encapsulating best practice in evidence-based medical treatment, their usage in clinical environments has been disappointing. Criticisms of guideline representations have been that they are predominantly narrative and are difficult to incorporate into clinical information systems. This paper analyses the use of UML process modelling techniques for guideline representation and proposes the automated generation of executable guidelines using XMI. This hybrid UML-XMI approach provides flexible authoring of guideline decision and control structures whilst integrating appropriate data flow. It also uses an open XMI standard interface to allow the use of authoring tools and process control systems from multiple vendors. The paper first surveys CPG modelling formalisms, followed by a brief introduction to process modelling in UML. The modelling of CPGs in UML is then presented, leading to a case study of encoding a diabetes mellitus CPG using UML.

  2. A new approach in the design of an interactive environment for teaching Hamiltonian digraphs

    NASA Astrophysics Data System (ADS)

    Iordan, A. E.; Panoiu, M.

    2014-03-01

    In this article the authors present the steps in the object-oriented design of an interactive environment dedicated to the assimilation of knowledge in the domain of Hamiltonian graph theory, especially the simulation of algorithms that determine Hamiltonian trails and circuits. The modelling of the interactive environment is achieved through specific UML diagrams representing the analysis, design and implementation steps. This interactive environment is very useful for both students and professors, because the computer programming domain, and digraph theory in particular, is difficult for students to comprehend and assimilate.

  3. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework.

    PubMed

    Figueredo, Grazziela P; Joshi, Tanvi V; Osborne, James M; Byrne, Helen M; Owen, Markus R

    2013-04-06

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice; reaction-diffusion equations for nutrients and growth factors; and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multi-scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment's functionality is verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude by summarizing future steps for the expansion of the current system.
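
    The on-lattice agent-based component described above can be illustrated with a toy sketch (this is not Chaste's API; the lattice size, division probability and update rule are illustrative choices): cells occupy lattice sites, take random-walk steps into empty neighbouring sites, and divide into free space with a fixed probability, so crowded cells are contact-inhibited.

    ```python
    import random

    random.seed(42)
    SIZE = 21
    occupied = {(SIZE // 2, SIZE // 2)}  # start from a single cell at the centre
    P_DIVIDE = 0.3  # per-step division probability (illustrative value)

    def empty_neighbours(site):
        """Von Neumann neighbours of `site` that are on-lattice and unoccupied."""
        x, y = site
        nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        return [(a, b) for a, b in nbrs
                if 0 <= a < SIZE and 0 <= b < SIZE and (a, b) not in occupied]

    for step in range(60):
        for cell in list(occupied):       # snapshot: each cell acts once per step
            free = empty_neighbours(cell)
            if not free:
                continue                  # contact inhibition: no space, no action
            target = random.choice(free)
            if random.random() < P_DIVIDE:
                occupied.add(target)      # divide: daughter fills the free site
            else:
                occupied.remove(cell)     # random-walk step into the free site
                occupied.add(target)

    print(len(occupied))  # population after 60 steps
    ```

    Real frameworks layer nutrient reaction-diffusion fields and subcellular ODEs on top of exactly this kind of lattice update loop.
    
    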

  4. Shared decision making in designing new healthcare environments-time to begin improving quality.

    PubMed

    Elf, Marie; Fröst, Peter; Lindahl, Göran; Wijk, Helle

    2015-03-21

    Successful implementation of new methods and models of healthcare to achieve better patient outcomes and safe, person-centered care depends on the physical environment of the healthcare architecture in which the care is provided. Decisions concerning healthcare architecture are therefore critical: they affect people and work processes for many years and require a long-term financial commitment from society. In this paper, we describe and suggest several strategies (critical factors) to promote shared decision making when planning and designing new healthcare environments. The paper discusses challenges and hindrances observed in the literature and drawn from the authors' extensive experience in planning and designing healthcare environments. An overview is presented of the challenges and of new approaches for a process that involves the mutual exchange of knowledge among various stakeholders. Additionally, we discuss design approaches that should be encouraged: those that balance the influence of specific, local requirements with general knowledge and evidence. We suggest a shared decision-making and collaborative planning and design process between representatives from healthcare, the construction sector and architecture, based on evidence and end-users' perspectives. If carefully and systematically applied, this approach will support and develop a framework for creating high-quality healthcare environments.

  5. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.

    1988-01-01

    Research directed at developing a graph theoretical model for describing data and control flow associated with the execution of large grained algorithms in a special distributed computer environment is presented. This model is identified by the acronym ATAMM which represents Algorithms To Architecture Mapping Model. The purpose of such a model is to provide a basis for establishing rules for relating an algorithm to its execution in a multiprocessor environment. Specifications derived from the model lead directly to the description of a data flow architecture which is a consequence of the inherent behavior of the data and control flow described by the model. The purpose of the ATAMM based architecture is to provide an analytical basis for performance evaluation. The ATAMM model and architecture specifications are demonstrated on a prototype system for concept validation.

  6. Understanding patients' behavioral intentions: evidence from Iran's private hospitals industry.

    PubMed

    Zarei, Ehsan; Arab, Mohammad; Tabatabaei, Seyed Mahmoud Ghazi; Rashidian, Arash; Forushani, Abbas Rahimi; Khabiri, Roghayeh

    2014-01-01

    In the ever-increasing competitive market of the private hospital industry, creating a strong relationship with customers that shapes patient loyalty is considered a key factor in gaining market share. The purpose of this paper is to test a model of customer loyalty among patients of private hospitals in Iran. This cross-sectional study was carried out in Tehran, the capital of the Islamic Republic of Iran, in 2010. The study sample comprised 969 patients consecutively selected from eight private hospitals. The survey instrument was designed based on a review of the related literature and included 36 items. Data analysis was performed using structural equation modeling. For the service quality construct, three dimensions were extracted: process, interaction, and environment. Both process and interaction quality had significant effects on perceived value. Perceived value, together with process and interaction quality, constituted the most important antecedents of patient overall satisfaction. The direct effect of process and interaction quality on behavioral intentions was insignificant. Perceived value and patient overall satisfaction were the direct antecedents of patient behavioral intentions and the mediators between service quality and behavioral intentions. Environment quality of service delivery had no significant effect on perceived value, overall satisfaction, or behavioral intentions. Contrary to previous similar studies, the role of service quality was investigated not in a general sense, but in the form of three types of quality: quality of environment, quality of process, and quality of interaction.

  7. Mechanistic modeling of pesticide exposure: The missing keystone of honey bee toxicology.

    PubMed

    Sponsler, Douglas B; Johnson, Reed M

    2017-04-01

    The role of pesticides in recent honey bee losses is controversial, partly because field studies often fail to detect effects predicted by laboratory studies. This dissonance highlights a critical gap in the field of honey bee toxicology: there exists little mechanistic understanding of the patterns and processes of exposure that link honey bees to pesticides in their environment. The authors submit that 2 key processes underlie honey bee pesticide exposure: 1) the acquisition of pesticide by foraging bees, and 2) the in-hive distribution of pesticide returned by foragers. The acquisition of pesticide by foraging bees must be understood as the spatiotemporal intersection between environmental contamination and honey bee foraging activity. This implies that exposure is distributional, not discrete, and that a subset of foragers may acquire harmful doses of pesticide while the mean colony exposure would appear safe. The in-hive distribution of pesticide is a complex process driven principally by food transfer interactions between colony members, and this process differs importantly between pollen and nectar. High priority should be placed on applying the extensive literature on honey bee biology to the development of more rigorously mechanistic models of honey bee pesticide exposure. In combination with mechanistic effects modeling, mechanistic exposure modeling has the potential to integrate the field of honey bee toxicology, advancing both risk assessment and basic research. Environ Toxicol Chem 2017;36:871-881. © 2016 SETAC.

  8. Developing cloud-based Business Process Management (BPM): a survey

    NASA Astrophysics Data System (ADS)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises face difficulties in cutting unnecessary costs, eliminating waste and delivering substantial benefits for the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, this article applies cloud-based Business Process Management (BPM), which enables organizations to focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically scalable resources over the internet as an IT service. Cloud-based BPM addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  9. The Sense of Confidence during Probabilistic Learning: A Normative Account.

    PubMed

    Meyniel, Florent; Schlunegger, Daniel; Dehaene, Stanislas

    2015-06-01

    Learning in a stochastic environment consists of estimating a model from a limited amount of noisy data, and is therefore inherently uncertain. However, many classical models reduce the learning process to the updating of parameter estimates and neglect the fact that learning is also frequently accompanied by a variable "feeling of knowing" or confidence. The characteristics and the origin of these subjective confidence estimates thus remain largely unknown. Here we investigate whether, during learning, humans not only infer a model of their environment, but also derive an accurate sense of confidence from their inferences. In our experiment, humans estimated the transition probabilities between two visual or auditory stimuli in a changing environment, and reported their mean estimate and their confidence in this report. To formalize the link between both kinds of estimate and assess their accuracy in comparison to a normative reference, we derive the optimal inference strategy for our task. Our results indicate that subjects accurately track the likelihood that their inferences are correct. Learning and estimating confidence in what has been learned appear to be two intimately related abilities, suggesting that they arise from a single inference process. We show that human performance matches several properties of the optimal probabilistic inference. In particular, subjective confidence is impacted by environmental uncertainty, both at the first level (uncertainty in stimulus occurrence given the inferred stochastic characteristics) and at the second level (uncertainty due to unexpected changes in these stochastic characteristics). Confidence also increases appropriately with the number of observations within stable periods. Our results support the idea that humans possess a quantitative sense of confidence in their inferences about abstract non-sensory parameters of the environment. 
This ability cannot be reduced to simple heuristics; rather, it appears to be a core property of the learning process.
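
    The kind of normative inference described above can be sketched with a Beta-Bernoulli model. This is an illustrative reduction, not the paper's full hierarchical model of a changing environment: the posterior mean gives the probability estimate, and the posterior precision gives a quantitative confidence that grows appropriately with the number of observations.

    ```python
    import random

    random.seed(0)
    TRUE_P = 0.7  # hidden probability that stimulus A is followed by stimulus B

    def posterior(observations, a0=1.0, b0=1.0):
        """Beta posterior after Bernoulli observations (flat Beta(1,1) prior).
        Returns (mean estimate, confidence = 1 / posterior variance)."""
        a = a0 + sum(observations)
        b = b0 + len(observations) - sum(observations)
        mean = a / (a + b)
        var = a * b / ((a + b) ** 2 * (a + b + 1))
        return mean, 1.0 / var

    # Simulated stream of observed transitions in a stable environment.
    obs = [1 if random.random() < TRUE_P else 0 for _ in range(500)]

    mean_20, conf_20 = posterior(obs[:20])
    mean_500, conf_500 = posterior(obs)

    # Confidence (posterior precision) increases with the number of observations,
    # while the mean estimate converges towards the hidden probability.
    print(round(mean_500, 2), conf_500 > conf_20)
    ```

    The paper's changing environment would additionally discount old observations when an unexpected change is inferred, which is what makes confidence sensitive to second-level uncertainty.
    
    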

  10. The Sense of Confidence during Probabilistic Learning: A Normative Account

    PubMed Central

    Meyniel, Florent; Schlunegger, Daniel; Dehaene, Stanislas

    2015-01-01

    Learning in a stochastic environment consists of estimating a model from a limited amount of noisy data, and is therefore inherently uncertain. However, many classical models reduce the learning process to the updating of parameter estimates and neglect the fact that learning is also frequently accompanied by a variable “feeling of knowing” or confidence. The characteristics and the origin of these subjective confidence estimates thus remain largely unknown. Here we investigate whether, during learning, humans not only infer a model of their environment, but also derive an accurate sense of confidence from their inferences. In our experiment, humans estimated the transition probabilities between two visual or auditory stimuli in a changing environment, and reported their mean estimate and their confidence in this report. To formalize the link between both kinds of estimate and assess their accuracy in comparison to a normative reference, we derive the optimal inference strategy for our task. Our results indicate that subjects accurately track the likelihood that their inferences are correct. Learning and estimating confidence in what has been learned appear to be two intimately related abilities, suggesting that they arise from a single inference process. We show that human performance matches several properties of the optimal probabilistic inference. In particular, subjective confidence is impacted by environmental uncertainty, both at the first level (uncertainty in stimulus occurrence given the inferred stochastic characteristics) and at the second level (uncertainty due to unexpected changes in these stochastic characteristics). Confidence also increases appropriately with the number of observations within stable periods. Our results support the idea that humans possess a quantitative sense of confidence in their inferences about abstract non-sensory parameters of the environment. 
This ability cannot be reduced to simple heuristics; rather, it appears to be a core property of the learning process. PMID:26076466

  11. Barotropic Tidal Predictions and Validation in a Relocatable Modeling Environment. Revised

    NASA Technical Reports Server (NTRS)

    Mehra, Avichal; Passi, Ranjit; Kantha, Lakshmi; Payne, Steven; Brahmachari, Shuvobroto

    1998-01-01

    Under funding from the Office of Naval Research (ONR) and the Naval Oceanographic Office (NAVOCEANO), the Mississippi State University Center for Air Sea Technology (CAST) has been working on developing a Relocatable Modeling Environment (RME) to provide a uniform and unbiased infrastructure for efficiently configuring numerical models in any geographic/oceanic region. Under NAVOCEANO funding, the model was implemented and tested for NAVOCEANO use. With our current emphasis on ocean tidal modeling, CAST has adopted Colorado University's numerical ocean model, known as the CURReNTSS (Colorado University Rapidly Relocatable Nestable Storm Surge) Model, as the model of choice. During the RME development process, CURReNTSS has been relocated to several coastal oceanic regions, providing excellent results that demonstrate its suitability. This report documents the model validation results and provides a brief description of the Graphical User Interface (GUI).

  12. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a friendly environment in which engineers can design, maintain, and verify their models, and to automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program, Propulsion System Automatic Modeling (PSAM), was designed around the Space Shuttle Main Engine simulation mathematical model. PSAM provides a friendly, well-organized environment in which engineers build a knowledge base of base equations and general information. It contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire through which the engineer specifies a particular model by answering a set of questions. PSAM is then able to generate the model and the FORTRAN code automatically. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.
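    The record does not describe PSAM's knowledge-base format or code generator in detail. Purely as an illustration of the model-to-FORTRAN idea, a sketch like the following could turn an ordered list of equations into a fixed-form Fortran subroutine (the function, argument names, and the engine-model fragment are all hypothetical):

```python
def emit_fortran(name, inputs, equations):
    """Emit FORTRAN 77-style source from an ordered list of
    (variable, expression) pairs; `inputs` are arguments supplied by
    the caller, and each equation defines one output variable."""
    outputs = [v for v, _ in equations]
    args = inputs + outputs
    lines = [f"      SUBROUTINE {name}({', '.join(args)})",
             f"      REAL {', '.join(args)}"]
    for v, expr in equations:
        # fixed-form Fortran: statements start in column 7
        lines.append(f"      {v} = {expr}")
    lines.append("      RETURN")
    lines.append("      END")
    return "\n".join(lines)

# hypothetical fragment of an engine model: mass flow, then thrust
src = emit_fortran("THRUST",
                   ["PC", "AT", "CSTAR", "ISP", "G0"],
                   [("MDOT", "PC * AT / CSTAR"),
                    ("F", "MDOT * ISP * G0")])
```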

  13. Genomics and Metagenomics of Extreme Acidophiles in Biomining Environments

    NASA Astrophysics Data System (ADS)

    Holmes, D. S.

    2015-12-01

    Over 160 draft or complete genomes of extreme acidophiles (pH < 3) have been published, many of which are from bioleaching and other biomining environments or are closely related to such microorganisms. In addition, there are over 20 metagenomic studies of such environments. This provides a rich source of latent data that can be exploited for understanding the biology of biomining environments and for advancing biotechnological applications. Genomic and metagenomic data are already yielding valuable insights into cellular processes, including carbon and nitrogen management, heavy metal and acid resistance, and iron and sulfur oxido-reduction, linking biogeochemical processes to organismal physiology. The data also allow the construction of useful models of the ecophysiology of biomining environments and provide insight into the gene and genome evolution of extreme acidophiles. Additionally, since most of these acidophiles are also chemolithoautotrophs that use minerals as energy sources or electron sinks, their genomes can be plundered for clues about the evolution of cellular metabolism and bioenergetic pathways during the Archaean abiotic/biotic transition on early Earth. Acknowledgements: Fondecyt 1130683.

  14. Procuring interoperability at the expense of usability: a case study of UK National Programme for IT assurance process.

    PubMed

    Krause, Paul; de Lusignan, Simon

    2010-01-01

    The allure of interoperable systems is that they should improve patient safety and make health services more efficient. The UK's National Programme for IT has made great strides in achieving interoperability through linkage to a national electronic spine. However, there has been criticism of the usability of the applications in the clinical environment. We analyse the procurement and assurance processes to explore whether they predetermine usability. These processes separate developers from users and test products against theoretical assurance models of use rather than simulating or piloting them in a clinical environment. The current process appears to be effective for back-office systems and high-risk applications, but it is too inflexible for developing applications for the clinical setting; for clinical applications, agile techniques are more appropriate. Usability testing should become an integrated part of the contractual process and be introduced earlier in the development process.

  15. Deschutes estuary feasibility study: hydrodynamics and sediment transport modeling

    USGS Publications Warehouse

    George, Douglas A.; Gelfenbaum, Guy; Lesser, Giles; Stevens, Andrew W.

    2006-01-01

    - Provide the completed study to the CLAMP Steering Committee so that a recommendation about a long-term aquatic environment of the basin can be made. The hydrodynamic and sediment transport modeling task developed a number of model simulations using a process-based morphological model, Delft3D, to help address these goals. Modeling results provide a qualitative assessment of estuarine behavior both prior to dam construction and after various post-dam-removal scenarios. Quantitative data from the model are used in the companion biological assessment and engineering design components of the overall study. Overall, the modeling study found that after dam removal, tidal and estuarine processes are immediately restored, with marine water from Budd Inlet carried into North and Middle Basin on each rising tide and mud flats exposed with each falling tide. Within the first year after dam removal, tidal processes, along with occasional river floods, act to modify the estuary bed by redistributing sediment through erosion and deposition. The morphological response of the bed is rapid during the first couple of years, then slows as a dynamic equilibrium is reached within three to five years. By ten years after dam removal, the overall hydrodynamic and morphologic behavior of the estuary is similar to that of the pre-dam estuary, with the exception of South Basin, which has been permanently modified by human activities. In addition to a qualitative assessment of estuarine behavior, process-based modeling provides the ability to address specific questions that inform decision-making. Because predicting future conditions of a complex estuarine environment is fraught with uncertainties, quantitative results in this report are often expressed as ranges of possible outcomes.

  16. Wireless Sensor Network-Based Greenhouse Environment Monitoring and Automatic Control System for Dew Condensation Prevention

    PubMed Central

    Park, Dae-Heon; Park, Jang-Woo

    2011-01-01

    Dew condensation on the leaf surface of greenhouse crops can promote diseases caused by fungi and bacteria, affecting the growth of the crops. In this paper, we present a WSN (Wireless Sensor Network)-based automatic monitoring system to prevent dew condensation in a greenhouse environment. The system is composed of sensor nodes for collecting data, base nodes for processing collected data, relay nodes for driving devices that adjust the environment inside the greenhouse, and an environment server for data storage and processing. Using the Barenbrug formula to calculate the dew point on the leaves, the system prevents dew condensation on the crop’s surface, an important element in preventing disease infection. We also constructed a physical model resembling a typical greenhouse in order to verify the performance of our system with regard to dew condensation control. PMID:22163813
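    The dew-point step can be sketched with the Magnus-Tetens approximation, the closed form commonly associated with Barenbrug's psychrometric tables; the constants below are the widely used values and may not match the paper's exactly, and `condensation_risk` is an illustrative control check, not the paper's actuation logic:

```python
import math

def dew_point_c(air_temp_c, rel_humidity_pct):
    """Magnus-Tetens dew-point approximation (deg C), valid roughly
    for 0-60 C and RH 1-100%. Constants are the commonly used
    values, not necessarily those of the cited Barenbrug tables."""
    a, b = 17.27, 237.7
    gamma = a * air_temp_c / (b + air_temp_c) \
        + math.log(rel_humidity_pct / 100.0)
    return b * gamma / (a - gamma)

def condensation_risk(leaf_temp_c, air_temp_c, rel_humidity_pct):
    """Dew forms when the leaf surface is at or below the dew point."""
    return leaf_temp_c <= dew_point_c(air_temp_c, rel_humidity_pct)
```

    At 100% relative humidity the dew point equals the air temperature, which is a convenient sanity check for the formula.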

  17. Geochemistry, geomorphology, and soil petrology of the Mars-like soils from Pampas de la Joya hyper-arid desert

    NASA Astrophysics Data System (ADS)

    Valdivia-Silva, Julio E.; Ortega-Gutierrez, Fernando; Bonaccorsi, Rosalba

    2016-07-01

    Mars-like environments on Earth are used as a model to guide the investigation of possibly habitable Martian environments. In this work we evaluate and analyze the geology, geomorphology, and soil petrology of the Pampas de La Joya Desert in southern Peru, in order to understand the processes that transformed the region into a Mars-like environment. Using a multidisciplinary approach, we analyze the different soils that compose the floor of the desert, and describe and interpret the post-Oligocene landscape, emphasizing some Mars-like features with respect to its acting geologic processes, its habitability potential under very low levels of nutrients and water, and its suitability to sustain microorganisms or their remains. Importantly, this work is part of a larger project that uses Mars-like soils to search for new crops capable of growing in extreme environments.

  18. Wireless sensor network-based greenhouse environment monitoring and automatic control system for dew condensation prevention.

    PubMed

    Park, Dae-Heon; Park, Jang-Woo

    2011-01-01

    Dew condensation on the leaf surface of greenhouse crops can promote diseases caused by fungi and bacteria, affecting the growth of the crops. In this paper, we present a WSN (Wireless Sensor Network)-based automatic monitoring system to prevent dew condensation in a greenhouse environment. The system is composed of sensor nodes for collecting data, base nodes for processing collected data, relay nodes for driving devices that adjust the environment inside the greenhouse, and an environment server for data storage and processing. Using the Barenbrug formula to calculate the dew point on the leaves, the system prevents dew condensation on the crop's surface, an important element in preventing disease infection. We also constructed a physical model resembling a typical greenhouse in order to verify the performance of our system with regard to dew condensation control.

  19. Red mud flocculation process in alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Firsov, A. Yu

    2018-05-01

    The process of thickening and washing red mud is a bottleneck of alumina production. Existing automated control systems for the thickening process stabilize the parameters of the primary technological circuits of the thickener. A current direction of research is the creation and improvement of models and of model-based control systems for the thickening process. However, the known models do not fully account for perturbing effects, in particular the particle size distribution in the feed and the size distribution of floccules after aggregation in the feed barrel. The article covers the basic concepts and terms used in formulating the population balance algorithm. The population balance model is implemented in the MatLab environment; the result of the simulation is the particle size distribution after flocculation. The model makes it possible to predict the size distribution of floccules after aggregation of red mud in the feed barrel. Red mud from Jamaican bauxite served as the industrial sample, and a Cytec Industries HX-3000 series flocculant at a concentration of 0.5% was used. The simulation used model constants obtained in a tubular tank in the CSIRO laboratories (Australia).
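    A minimal sketch of the population-balance approach is a discrete Smoluchowski aggregation equation; this toy version uses a constant aggregation kernel and omits breakage and the CSIRO-fitted constants, so it only illustrates the bookkeeping of birth and death terms, not the paper's model:

```python
def step_population(n, beta=1e-3, dt=0.1):
    """One explicit-Euler step of a discrete Smoluchowski population
    balance with a constant aggregation kernel beta.

    n[i] = number concentration of aggregates containing i+1 primary
    particles (classes above len(n) are truncated).
    """
    m = len(n)
    total = sum(n)
    new = list(n)
    for i in range(m):
        # birth: a j-class and a k-class aggregate with j+k+2 = i+1
        # primaries collide to form one i-class aggregate
        birth = 0.5 * sum(n[j] * n[i - 1 - j] for j in range(i))
        # death: an i-class aggregate collides with anything
        death = n[i] * total
        new[i] += dt * beta * (birth - death)
    return new

n = [1000.0] + [0.0] * 9   # start from primary particles only
for _ in range(100):
    n = step_population(n)  # mass migrates into larger classes
```

    Because aggregation only merges particles, the total number concentration decreases monotonically, which is a useful sanity check on any population-balance implementation.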

  20. A user-system interface agent

    NASA Technical Reports Server (NTRS)

    Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua

    1995-01-01

    Agent-based technologies answer several challenges posed by the additional information processing requirements of today's computing environments. In particular, (1) users desire interaction with computing devices in a mode similar to that used between people; (2) the efficient and successful completion of information processing tasks often requires a high level of expertise in complex and multiple domains; and (3) information processing tasks often require handling large volumes of data and, therefore, continuous and open-ended processing activity. The concept of an agent is an attempt to address these challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and can therefore be trusted to perform tasks independently of the human user, and (3) an agent is an entity that is continuously active, performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an ongoing effort to build a model for developing agent-based information systems. Such systems are highly applicable to environments that require a high degree of automation, such as flight control operations, and to the processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary scientific data systems. The concept of an agent as an information processing entity is fully described, with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are also discussed in this paper.
    Based on this definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous. The USIA architecture comprises two main components: (1) the user interface, which is concerned with issues such as user dialog and interaction, user modeling, and adaptation to the user profile; and (2) the system interface, which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.
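    The paper's defining property of an agent, continuous activity on tasks that are either delegated or self-imposed, can be caricatured in a few lines; all names below are illustrative and do not reflect USIA's actual design:

```python
import queue

class InterfaceAgent:
    """Toy agent: works on delegated tasks when they exist, and
    falls back to self-imposed background work when idle (an
    illustration of the paper's agent definition, not USIA's API)."""

    def __init__(self):
        self.tasks = queue.Queue()

    def delegate(self, task):
        """Accept a task (a callable) delegated by the user."""
        self.tasks.put(task)

    def self_imposed_task(self):
        """Background work the agent chooses for itself."""
        return lambda: "monitoring IPE resources"

    def run_once(self):
        """One cycle of the agent's continuous activity."""
        try:
            task = self.tasks.get_nowait()
        except queue.Empty:
            task = self.self_imposed_task()  # never idle
        return task()
```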
