Sample records for process model describing

  1. Managing the travel model process : small and medium-sized MPOs. Instructor guide.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  2. Managing the travel model process : small and medium-sized MPOs. Participant handbook.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  3. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed by basic models of geographic physical processes, which is shown by means of an example.
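The partial difference equations mentioned in this abstract, the discrete counterpart of partial differential equations, can be illustrated with a short sketch. The snippet below steps a 1-D diffusion process as a partial difference equation; the grid spacing, time step, and diffusivity are illustrative values, not from the paper.

```python
import numpy as np

# 1-D diffusion written as a partial *difference* equation:
#   u_i^{t+1} = u_i^t + r * (u_{i+1}^t - 2*u_i^t + u_{i-1}^t)
# Stability of this explicit update requires r = D*dt/dx**2 <= 0.5.
D, dx, dt = 0.1, 1.0, 1.0
r = D * dt / dx**2

u = np.zeros(21)
u[10] = 1.0                         # initial concentration spike

for _ in range(50):
    # RHS is evaluated before assignment, so this is a proper explicit step
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])

print(round(float(u[10]), 4))       # the spike spreads out and decays
```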

  4. Modeling of Electrochemical Process for the Treatment of Wastewater Containing Organic Pollutants

    NASA Astrophysics Data System (ADS)

    Rodrigo, Manuel A.; Cañizares, Pablo; Lobato, Justo; Sáez, Cristina

Electrocoagulation and electrooxidation are promising electrochemical technologies that can be used to remove organic pollutants contained in wastewaters. To make these technologies competitive with the conventional technologies that are in use today, a better understanding of the processes involved must be achieved. In this context, the development of mathematical models that are consistent with the processes occurring in a physical system is a relevant advance, because such models can help to understand what is happening in the treatment process. In turn, a more detailed knowledge of the physical system can be obtained, and tools for a proper design of the processes, or for the analysis of operating problems, are attained. The modeling of these technologies can be carried out using single-variable or multivariable models. Likewise, the position dependence of the model species can be described with different approaches. In this work, a review of the basics of the modeling of these processes and a description of several representative models for electrochemical oxidation and coagulation are carried out. Regarding electrooxidation, two models are described: one that lumps the pollution of a wastewater into a single model species and uses a macroscopic approach to formulate the mass balances, and another that considers a more detailed concentration profile to describe the time course of pollutants and intermediates through a mixed maximum-gradient/macroscopic approach. On the topic of electrochemical coagulation, two different approaches are also described in this work: one that considers the hydrodynamic conditions as the main factor responsible for the electrochemical coagulation processes, and another that considers the chemical interaction of the reagents and the pollutants as the more significant process in the description of the electrochemical coagulation of organic compounds. In addition, this work also describes a multivariable model for the electrodissolution of anodes (the first stage in electrocoagulation processes). This latter model uses a mixed macroscopic/maximum-gradient approach to describe the chemical and electrochemical processes; it also assumes that the rates of all processes are very high, so that they can be successfully modeled using pseudoequilibrium approaches.
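The simplest electrooxidation model family sketched above, which lumps all pollution into one model species under a macroscopic mass balance, reduces to first-order decay when the anode is mass-transfer limited. A minimal sketch; the mass-transfer coefficient, electrode area, reactor volume, and initial COD below are illustrative values, not the paper's:

```python
import math

# All pollution lumped into a single species (COD), removed at a
# mass-transfer-limited anode: dC/dt = -(k_m*A/V)*C.
# k_m, A, V, C0 are illustrative, not taken from the paper.
k_m, A, V = 1e-5, 0.01, 1e-3        # m/s, m^2, m^3
C0 = 1000.0                          # initial COD, mg/L

def cod(t_seconds):
    """Closed-form solution C(t) = C0 * exp(-(k_m*A/V) * t)."""
    return C0 * math.exp(-(k_m * A / V) * t_seconds)

print(round(cod(0), 1), round(cod(5 * 3600), 1))   # COD at 0 h and 5 h
```

With these values the rate constant is 1e-4 s⁻¹, so COD falls to roughly one sixth of its initial value over five hours.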

  5. A REVIEW AND COMPARISON OF MODELS FOR PREDICTING DYNAMIC CHEMICAL BIOCONCENTRATION IN FISH

    EPA Science Inventory

    Over the past 20 years, a variety of models have been developed to simulate the bioconcentration of hydrophobic organic chemicals by fish. These models differ not only in the processes they address but also in the way a given process is described. Processes described by these m...

  6. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  7. Petri net based model of the body iron homeostasis.

    PubMed

    Formanowicz, Dorota; Sackmann, Andrea; Formanowicz, Piotr; Błazewicz, Jacek

    2007-10-01

Body iron homeostasis is a complex process that is not yet fully understood. Although some components of this process have been described in the literature, no complete model of the whole process has been proposed. In this paper a Petri net based model of body iron homeostasis is presented. Recently, Petri nets have been used for describing and analyzing various biological processes, since they allow the system under consideration to be modeled very precisely. The main result of the paper is twofold: an informal description of the main part of the iron homeostasis process is given, and this description is then formulated in the formal language of Petri net theory. This model allows for simulation of the process, since Petri net theory provides many established analysis techniques.
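A Petri net of the kind used here can be executed in a few lines of code. The toy net below, with two transitions moving iron tokens from diet to plasma to ferritin stores, is a hypothetical illustration of token-firing semantics only, not the paper's homeostasis model.

```python
# Each transition maps to (pre-places, post-places) with token weights.
# The two-transition net (uptake, storage) is a toy illustration.
net = {
    "uptake":  ({"dietary_iron": 1}, {"plasma_iron": 1}),
    "storage": ({"plasma_iron": 1},  {"ferritin_iron": 1}),
}
marking = {"dietary_iron": 2, "plasma_iron": 0, "ferritin_iron": 0}

def enabled(t):
    pre, _ = net[t]
    return all(marking[p] >= n for p, n in pre.items())

def fire(t):
    pre, post = net[t]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

# Simulate: fire any enabled transition until the net is dead.
while any(enabled(t) for t in net):
    fire(next(t for t in net if enabled(t)))

print(marking)  # all tokens end up in ferritin_iron
```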

  8. Validation of a multi-phase plant-wide model for the description of the aeration process in a WWTP.

    PubMed

    Lizarralde, I; Fernández-Arévalo, T; Beltrán, S; Ayesa, E; Grau, P

    2018-02-01

    This paper introduces a new mathematical model built under the PC-PWM methodology to describe the aeration process in a full-scale WWTP. This methodology enables a systematic and rigorous incorporation of chemical and physico-chemical transformations into biochemical process models, particularly for the description of liquid-gas transfer to describe the aeration process. The mathematical model constructed is able to reproduce biological COD and nitrogen removal, liquid-gas transfer and chemical reactions. The capability of the model to describe the liquid-gas mass transfer has been tested by comparing simulated and experimental results in a full-scale WWTP. Finally, an exploration by simulation has been undertaken to show the potential of the mathematical model. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Information Processing and Risk Perception: An Adaptation of the Heuristic-Systematic Model.

    ERIC Educational Resources Information Center

    Trumbo, Craig W.

    2002-01-01

Describes the heuristic-systematic information-processing model and risk perception, the two major conceptual areas of the analysis. Discusses the proposed model, describing the context of the data collections (public health communication involving cancer epidemiology) and providing the results of a set of three replications using the proposed model.…

  10. Comparison of Centralized-Manual, Centralized-Computerized, and Decentralized-Computerized Order and Management Information Models for the Turkish Air Force Logistics System.

    DTIC Science & Technology

    1986-09-01

differentiation between the systems. This study will investigate an appropriate Order Processing and Management Information System (OP&MIS) to link base-level...methodology: 1. Reviewed the current order processing and information model of the TUAF Logistics System. (centralized-manual model) 2. Described the...RDS program's order processing and information system. (centralized-computerized model) 3. Described the order processing and information system of

  11. Numerical model describing optimization of fibres winding process on open and closed frame

    NASA Astrophysics Data System (ADS)

    Petrů, M.; Mlýnek, J.; Martinec, T.

    2016-08-01

This article discusses a numerical model describing optimization of the fibre winding process on open and closed frames. The production quality of this type of composite frame depends primarily on the correct winding of fibres onto a polyurethane core: it is especially important to ensure the correct fibre winding angles and the homogeneity of the individual winding layers. The article describes a mathematical model for the use of an industrial robot in filament winding and shows how to calculate the trajectory of the robot. The polyurethane core is fastened to the robot end-effector, which guides it through a fibre-processing head during the winding process along a suitably determined trajectory. We use the described numerical model and matrix calculus to compute the robot end-effector trajectory that produces the desired passage of the frame through the fibre-processing head. The trajectory calculation was programmed in the Delphi development environment. The relations of the numerical model are important for solving the real passage of a polyurethane core through the fibre-processing head.

  12. The Policy-Making Process of the State University System of Florida.

    ERIC Educational Resources Information Center

    Sullivan, Sandra M.

The policy-making process of the State University System of Florida is described using David Easton's model of a political system as the conceptual framework. Two models describing the policy-making process were developed from personal interviews with the primary participants in the governance structure and from three case studies of policy…

  13. Application of dynamic flux balance analysis to an industrial Escherichia coli fermentation.

    PubMed

    Meadows, Adam L; Karnik, Rahi; Lam, Harry; Forestell, Sean; Snedecor, Brad

    2010-03-01

We have developed a reactor-scale model of Escherichia coli metabolism and growth in a 1000 L process for the production of a recombinant therapeutic protein. The model consists of two distinct parts: (1) a dynamic, process-specific portion that describes the time evolution of 37 process variables of relevance and (2) a flux-balance-based, 123-reaction metabolic model of E. coli metabolism. This model combines several previously reported modeling approaches, including a growth-rate-dependent biomass composition, a maximum-growth-rate objective function, and dynamic flux balancing. In addition, we introduce concentration-dependent boundary conditions on transport fluxes, dynamic maintenance demands, and a state-dependent cellular objective. This formulation was able to describe specific runs with high fidelity over process conditions including rich media, simultaneous acetate and glucose consumption, glucose minimal media, and phosphate-depleted media. Furthermore, the model accurately describes the effect of process perturbations, such as glucose overbatching and insufficient aeration, on growth, metabolism, and titer. (c) 2009 Elsevier Inc. All rights reserved.
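The dynamic flux balance idea combined in this abstract can be sketched in miniature: an inner optimization chooses the growth flux permitted by a concentration-dependent uptake bound, and an outer loop integrates biomass and substrate. The toy inner problem below has a single limiting constraint, so its optimum is closed-form rather than a full LP; all constants are illustrative, not the paper's.

```python
# Toy dynamic flux balance loop. Vmax/Km give the concentration-dependent
# uptake bound, Y the biomass yield; all values are illustrative.
Vmax, Km, Y = 10.0, 0.5, 0.1   # max uptake (mmol/gDW/h), affinity, gDW/mmol
X, S, dt = 0.05, 20.0, 0.01    # biomass g/L, glucose mmol/L, step h

for _ in range(int(8 / dt)):           # simulate 8 hours
    v_up = Vmax * S / (Km + S)         # concentration-dependent flux bound
    mu = Y * v_up                      # "inner problem": maximize growth
    X, S = X + mu * X * dt, max(S - v_up * X * dt, 0.0)

print(round(X, 3), round(S, 3))        # growth stops once glucose runs out
```

By mass balance the final biomass approaches the initial value plus yield times the glucose consumed, about 2.05 g/L here.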

  14. The Role of Independent V&V in Upstream Software Development Processes

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve

    1996-01-01

    This paper describes the role of Verification and Validation (V&V) during the requirements and high level design processes, and in particular the role of Independent V&V (IV&V). The job of IV&V during these phases is to ensure that the requirements are complete, consistent and valid, and to ensure that the high level design meets the requirements. This contrasts with the role of Quality Assurance (QA), which ensures that appropriate standards and process models are defined and applied. This paper describes the current state of practice for IV&V, concentrating on the process model used in NASA projects. We describe a case study, showing the processes by which problem reporting and tracking takes place, and how IV&V feeds into decision making by the development team. We then describe the problems faced in implementing IV&V. We conclude that despite a well defined process model, and tools to support it, IV&V is still beset by communication and coordination problems.

  15. Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn

    This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewart or Deming Cycle, a method that aids in continuous analysis and improvement through a…

  16. Psychological Dynamics of Adolescent Satanism.

    ERIC Educational Resources Information Center

    Moriarty, Anthony R.; Story, Donald W.

    1990-01-01

    Attempts to describe the psychological processes that predispose an individual to adopt a Satanic belief system. Describes processes in terms of child-parent relationships and the developmental tasks of adolescence. Proposes a model called the web of psychic tension to represent the process of Satanic cult adoption. Describes techniques for…

  17. Kinetic model of continuous ethanol fermentation in closed-circulating process with pervaporation membrane bioreactor by Saccharomyces cerevisiae.

    PubMed

    Fan, Senqing; Chen, Shiping; Tang, Xiaoyu; Xiao, Zeyi; Deng, Qing; Yao, Peina; Sun, Zhaopeng; Zhang, Yan; Chen, Chunyan

    2015-02-01

Unstructured kinetic models were proposed to describe the principal kinetics involved in ethanol fermentation in a continuous and closed-circulating fermentation (CCCF) process with a pervaporation membrane bioreactor. After ethanol was removed in situ from the broth by membrane pervaporation, the secondary metabolites accumulated in the broth became inhibitors of cell growth. The cell death rate related to the deterioration of the culture environment was described as a function of the cell concentration and fermentation time. In the CCCF process, ethanol productions of 609.8 g L(-1) and 750.1 g L(-1) were obtained in the first and second runs, respectively. The modified Gompertz model, correlating the ethanol production with the fermentation period, could be used to describe the ethanol production during the CCCF process. The fitting results of the models showed good agreement with the experimental data. These models could be employed for CCCF process technology development for ethanol fermentation. Copyright © 2014 Elsevier Ltd. All rights reserved.
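The modified Gompertz model named in this abstract is a standard three-parameter form for cumulative product curves: asymptote Pmax, maximum production rate rm, and lag time lam. A minimal sketch with illustrative parameter values, not the paper's fitted ones:

```python
import math

# Modified Gompertz form for cumulative product P(t):
#   P(t) = Pmax * exp(-exp(rm*e/Pmax * (lam - t) + 1))
# Pmax, rm, lam below are illustrative, not fitted values from the paper.
def gompertz(t, Pmax=750.0, rm=15.0, lam=10.0):
    return Pmax * math.exp(-math.exp(rm * math.e / Pmax * (lam - t) + 1.0))

print(round(gompertz(0), 2), round(gompertz(200), 2))  # near 0, near Pmax
```

The curve is sigmoidal: production stays near zero during the lag phase, rises at rate rm, and saturates at Pmax.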

  18. Approach of describing dynamic production of volatile fatty acids from sludge alkaline fermentation.

    PubMed

    Wang, Dongbo; Liu, Yiwen; Ngo, Huu Hao; Zhang, Chang; Yang, Qi; Peng, Lai; He, Dandan; Zeng, Guangming; Li, Xiaoming; Ni, Bing-Jie

    2017-08-01

In this work, a mathematical model was developed to describe the dynamics of fermentation products in sludge alkaline fermentation systems for the first time. In this model, the impacts of alkaline fermentation on the sludge disintegration, hydrolysis, acidogenesis, acetogenesis, and methanogenesis processes are specifically considered in order to describe the high-level formation of fermentation products. The proposed model successfully reproduced the experimental data obtained from five independent sludge alkaline fermentation studies. The modeling results showed that alkaline fermentation largely facilitated the disintegration, acidogenesis, and acetogenesis processes and severely inhibited the methanogenesis process. With the pH increase from 7.0 to 10.0, the disintegration, acidogenesis, and acetogenesis processes increased by 53%, 1030%, and 30%, respectively, while methane production decreased by 3800%. However, no substantial effect on the hydrolysis process was found. The model also indicated that the pathway of acetoclastic methanogenesis was more severely inhibited by alkaline conditions than that of hydrogenotrophic methanogenesis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Process-Response Modeling and the Scientific Process.

    ERIC Educational Resources Information Center

    Fichter, Lynn S.

    1988-01-01

    Discusses the process-response model (PRM) in its theoretical and practical forms. Describes how geologists attempt to reconstruct the process from the response (the geologic phenomenon) being studied. (TW)

  20. A MATHEMATICAL MODEL OF ELECTROSTATIC PRECIPITATION. (REVISION 1): VOLUME I. MODELING AND PROGRAMMING

    EPA Science Inventory

    The report briefly describes the fundamental mechanisms and limiting factors involved in the electrostatic precipitation process. It discusses theories and procedures used in the computer model to describe the physical mechanisms, and generally describes the major operations perf...

  1. Modeling biological gradient formation: combining partial differential equations and Petri nets.

    PubMed

    Bertens, Laura M F; Kleijn, Jetty; Hille, Sander C; Heiner, Monika; Koutny, Maciej; Verbeek, Fons J

    2016-01-01

Both Petri nets and differential equations are important modeling tools for biological processes. In this paper we demonstrate how these two modeling techniques can be combined to describe biological gradient formation. Parameters derived from the partial differential equation describing the process of gradient formation are incorporated into an abstract Petri net model. The quantitative aspects of the resulting model are validated through a case study of gradient formation in the fruit fly.
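The kind of PDE-derived parameter passed to a Petri net here can be illustrated with the steady state of a diffusion-degradation equation, D u'' = k u, whose decay length sqrt(D/k) fully characterizes the gradient. D, k, and the source strength below are illustrative values, not taken from the paper.

```python
import math

# Steady state of the diffusion-degradation PDE D*u'' = k*u on a
# half-line with a source at x = 0: u(x) = C0 * exp(-x / sqrt(D/k)).
# D, k, C0 are illustrative values.
D, k, C0 = 1.0, 0.25, 100.0
decay_length = math.sqrt(D / k)       # the PDE-derived parameter

def gradient(x):
    return C0 * math.exp(-x / decay_length)

print(round(gradient(0), 1), round(gradient(2 * decay_length), 1))
```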

  2. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  3. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    ERIC Educational Resources Information Center

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  4. Modeling process of embolization arteriovenous malformation on the basis of two-phase filtration model

    NASA Astrophysics Data System (ADS)

    Cherevko, A. A.; Gologush, T. S.; Ostapenko, V. V.; Petrenko, I. A.; Chupakhin, A. P.

    2016-06-01

Arteriovenous malformation is a chaotic, disordered interlacement of vessels of very small diameter that shunts blood from the artery into the vein. In this regard it can be adequately modeled as a porous medium. In this model, the process of embolization is described as the penetration of the non-adhesive embolic agent ONYX into the porous medium filled with blood; the two fluids do not mix with each other. In a one-dimensional approximation such processes are well described by the Buckley-Leverett equation. In this paper the Buckley-Leverett equation is solved numerically using a new modification of the Cabaret scheme. The results of numerically modeling the process of embolization of an AVM are shown.
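The Buckley-Leverett equation mentioned above can be integrated with a much simpler scheme than the Cabaret modification the paper proposes. The sketch below uses first-order upwind finite volumes with the classic fractional-flow function; the viscosity ratio and grid are illustrative, and this is not the paper's scheme.

```python
import numpy as np

# Buckley-Leverett s_t + f(s)_x = 0 with fractional flow
# f(s) = s^2 / (s^2 + M*(1-s)^2); first-order upwind in space
# (flow is rightward), explicit Euler in time. M and the grid are
# illustrative; the CFL condition holds for these values.
M = 2.0
nx = 200
dx, dt = 1.0 / nx, 0.002

def f(s):
    return s**2 / (s**2 + M * (1.0 - s)**2)

s = np.zeros(nx)
s[0] = 1.0                          # injected phase enters at the left

for _ in range(100):
    flux = f(s)
    s[1:] -= dt / dx * (flux[1:] - flux[:-1])   # upwind update

print(round(float(s.sum() * dx), 4))            # injected volume so far
```

The solution develops the characteristic Buckley-Leverett shock front, smeared here by the first-order scheme's numerical diffusion.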

  5. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

This paper describes the core framework used to implement a Goal-Function Tree (GFT) based systems engineering process using the Systems Modeling Language (SysML). It defines a set of principles that build upon the theoretical approach described in the InfoTech 2013 ISHM paper titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management" presented by Dr. Stephen B. Johnson. Using the SysML language, the principles in this paper describe extensions of the SysML baseline in order to: hierarchically describe a system, describe that system functionally within success space, and allocate detection mechanisms to success functions for system protection.

  6. A Neural Dynamic Model Generates Descriptions of Object-Oriented Actions.

    PubMed

    Richter, Mathis; Lins, Jonas; Schöner, Gregor

    2017-01-01

Describing actions entails that relations between objects are discovered. A pervasively neural account of this process requires that fundamental problems are solved: the neural pointer problem, the binding problem, and the problem of generating discrete processing steps from time-continuous neural processes. We present a prototypical solution to these problems in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language-like representations. Making the connection between these two types of representations enables the model to describe actions as well as to perceptually ground movement phrases, all based on real visual input. We demonstrate how the dynamic neural processes autonomously generate the processing steps required to describe or ground object-oriented actions. By solving the fundamental problems of neural pointing, binding, and emergent discrete processing, the model may be a first but critical step toward a systematic neural processing account of higher cognition. Copyright © 2017 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.

  7. The technique for Simulation of Transient Combustion Processes in the Rocket Engine Operating with Gaseous Fuel “Hydrogen and Oxygen”

    NASA Astrophysics Data System (ADS)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.

    2017-01-01

The article describes a method for simulating transient combustion processes in a rocket engine operating on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail; the mechanisms were taken from several sources and verified. The method for converting ozone properties from the Shomate equation to the NASA-polynomial format is described in detail. A way to obtain quick CFD results with intermediate combustion components using an EDM model was found. Difficulties with the Finite Rate Chemistry combustion model, associated with a large scatter in the reference data, were identified and described. The generation of the Flamelet library with CFX-RIF is described. The reaction mechanisms verified at steady state were also tested in transient simulation, and the Flamelet combustion model was found adequate for the transient mode; the variation of the integral parameters agrees with the values obtained in stationary simulation. A cyclic irregularity of the temperature field, caused by precession of the vortex core, was detected in the chamber with the proposed simulation technique. Unsteady processes of rocket engines, including ignition, are proposed as the area of application of the described simulation technique.

  8. Strategic Project Management at the NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Lavelle, Jerome P.

    2000-01-01

    This paper describes Project Management at NASA's Kennedy Space Center (KSC) from a strategic perspective. It develops the historical context of the agency and center's strategic planning process and illustrates how now is the time for KSC to become a center which has excellence in project management. The author describes project management activities at the center and details observations on those efforts. Finally the author describes the Strategic Project Management Process Model as a conceptual model which could assist KSC in defining an appropriate project management process system at the center.

  9. Branching processes in disease epidemics

    NASA Astrophysics Data System (ADS)

    Singh, Sarabjeet

    Branching processes have served as a model for chemical reactions, biological growth processes and contagion (of disease, information or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this thesis, we focus on branching processes as a model for infectious diseases spreading between individuals belonging to different populations. The distinction between populations can arise from species separation (as in the case of diseases which jump across species) or spatial separation (as in the case of disease spreading between farms, cities, urban centers, etc). A prominent example of the former is zoonoses -- infectious diseases that spill from animals to humans -- whose specific examples include Nipah virus, monkeypox, HIV and avian influenza. A prominent example of the latter is infectious diseases of animals such as foot and mouth disease and bovine tuberculosis that spread between farms or cattle herds. Another example of the latter is infectious diseases of humans such as H1N1 that spread from one city to another through migration of infectious hosts. This thesis consists of three main chapters, an introduction and an appendix. The introduction gives a brief history of mathematics in modeling the spread of infectious diseases along with a detailed description of the most commonly used disease model -- the Susceptible-Infectious-Recovered (SIR) model. The introduction also describes how the stochastic formulation of the model reduces to a branching process in the limit of large population which is analyzed in detail. The second chapter describes a two species model of zoonoses with coupled SIR processes and proceeds into the calculation of statistics pertinent to cross species infection using multitype branching processes. The third chapter describes an SIR process driven by a Poisson process of infection spillovers. 
This is posed as a model of infectious diseases where a `reservoir' of infection exists that infects a susceptible host population at a constant rate. The final chapter of the thesis describes a general framework of modeling infectious diseases in a network of populations using multitype branching processes.
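The branching-process reduction described in the introduction above can be sketched directly: each infectious case produces Poisson(R0) secondary cases, and even a supercritical outbreak goes extinct with probability q solving q = exp(R0*(q - 1)). The R0 value, trial count, and caps below are illustrative, not from the thesis.

```python
import random

# Galton-Watson branching process: each case infects Poisson(R0) others.
# For R0 = 1.5 an outbreak still dies out with probability q solving
# q = exp(R0*(q - 1)), about 0.417. Caps and trial count are illustrative.
random.seed(1)
R0 = 1.5

def poisson(lam):
    # Knuth's algorithm; fine for small lam.
    L, k, p = pow(2.718281828459045, -lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def outbreak_dies(max_gen=50, cap=1000):
    cases = 1
    for _ in range(max_gen):
        if cases == 0:
            return True
        if cases > cap:                  # escaped into a large outbreak
            return False
        cases = sum(poisson(R0) for _ in range(cases))
    return cases == 0

est = sum(outbreak_dies() for _ in range(500)) / 500
print(round(est, 3))                     # close to the analytic 0.417
```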

  10. Modeling nuclear processes by Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my

    2015-04-29

Modelling and simulation are essential parts of the study of dynamic system behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted, or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes, including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
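The "describing equations" starting point mentioned in this abstract can be illustrated outside Simulink as well. The sketch below integrates the radionuclide decay equation dN/dt = -λN with forward Euler, the discrete stepping a Simulink integrator block performs; the half-life is an illustrative value (iodine-131).

```python
import math

# dN/dt = -lambda*N integrated with forward Euler, the stepping an
# integrator block performs. Half-life of I-131 used as an illustrative
# value.
half_life = 8.02 * 24 * 3600          # seconds
lam = math.log(2) / half_life

N0 = 1.0e20
N, dt = N0, 600.0                     # atoms, 10-minute step
for _ in range(int(half_life / dt)):  # integrate over one half-life
    N += -lam * N * dt

print(round(N / N0, 4))               # close to 0.5 after one half-life
```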

  11. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
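The semi-Markov point in this abstract, that holding times need not be exponential, can be shown in a few lines: the sketch below gives the operational state a Weibull holding time (increasing hazard, which a Markov chain cannot represent) and estimates the long-run uptime fraction. All distributions and parameters are illustrative, not the measured IBM 3081 values.

```python
import random

# Two-state semi-Markov process: "operational" holds for a Weibull time
# (shape 2, i.e. non-exponential, aging), "error" recovers after an
# exponential time. Scale/rate parameters are illustrative.
random.seed(7)

def holding_time(state):
    if state == "operational":
        return random.weibullvariate(100.0, 2.0)   # scale 100, shape 2
    return random.expovariate(1.0 / 5.0)           # mean recovery 5

def uptime_fraction(cycles=5000):
    up = down = 0.0
    for _ in range(cycles):
        up += holding_time("operational")
        down += holding_time("error")
    return up / (up + down)

print(round(uptime_fraction(), 3))  # near 88.6/(88.6+5), about 0.947
```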

  12. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    NASA Technical Reports Server (NTRS)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined; in particular, the image processing hardware and software used to extract features at low levels of sensory processing are described, for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.

  13. Tree-Structured Digital Organisms Model

    NASA Astrophysics Data System (ADS)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

    Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence, however, is not the only way to describe a digital organism, though it is very simple for a computer-based model. We therefore propose a new digital organism model based on a tree structure, similar to genetic programming. In our model, a life process is a combination of various functions, much as life in the real world is. This means our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through the mutual interaction of functions. We verified by simulation that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.
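    A toy illustration of the tree idea (an invented representation, not the authors' encoding): an organism is a nested tuple of functions, and its life process is the recursive evaluation of that composition.

```python
def evaluate(node, x):
    """Recursively evaluate a tree-structured program at input x."""
    op = node[0]
    if op == "x":          # terminal: the input itself
        return x
    if op == "const":      # terminal: a constant value
        return node[1]
    left = evaluate(node[1], x)
    right = evaluate(node[2], x)
    return left + right if op == "add" else left * right

# A tiny "organism" computing x*x + 1 as a composition of functions;
# mutation could swap any subtree for another, as in genetic programming.
tree = ("add", ("mul", ("x",), ("x",)), ("const", 1.0))
```

    The hierarchical structure mentioned in the abstract falls out naturally: any subtree is itself a complete organism fragment.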

  14. Mathematical model and software for investigation of internal ballistic processes in high-speed projectile installations

    NASA Astrophysics Data System (ADS)

    Diachkovskii, A. S.; Zykova, A. I.; Ishchenko, A. N.; Kasimov, V. Z.; Rogaev, K. S.; Sidorov, A. D.

    2017-11-01

    This paper describes a software package for exploring the interior ballistics processes occurring in a shot scheme with bulk charges using pasty propellant substances under various loading schemes. The mathematical model is a quasi-one-dimensional model of a polydisperse mixture of non-deformable particles and a carrier gas phase. The formulation of the model equations allows it to describe a broad class of interior ballistics processes. Features of the approach are illustrated by calculating the ignition period for a charge of tubular propellant.

  15. Use of transport models for wildfire behavior simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linn, R.R.; Harlow, F.H.

    1998-01-01

    Investigators have attempted to describe the behavior of wildfires for over fifty years. Current models for numerical description are mainly algebraic and based on statistical or empirical ideas. The authors have developed a transport model called FIRETEC. The use of transport formulations connects the propagation rates to the full conservation equations for energy, momentum, species concentrations, mass, and turbulence. In this paper, highlights of the model formulation and results are described. The goal of the FIRETEC model is to describe the most probable average behavior of wildfires in a wide variety of conditions. FIRETEC represents the essence of the combination of many small-scale processes without resolving each process in complete detail.

  16. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.
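    The ingredients named above, an analytic growth law plus a rate function for intravasation events, can be sketched as a nonhomogeneous Poisson process simulated by thinning. All names and parameter values here are invented for illustration:

```python
import math
import random

def simulate_metastases(t_end=5.0, growth=1.0, seed_rate=0.5, rng_seed=1):
    """Count seeding events when the event rate is proportional to an
    exponentially growing primary tumour size (Lewis thinning)."""
    rng = random.Random(rng_seed)
    rate_max = seed_rate * math.exp(growth * t_end)  # upper bound on the rate
    t, n_mets = 0.0, 0
    while True:
        t += rng.expovariate(rate_max)      # candidate event time
        if t > t_end:
            return n_mets
        rate = seed_rate * math.exp(growth * t)
        if rng.random() < rate / rate_max:  # accept with prob rate/rate_max
            n_mets += 1
```

    In the paper each accepted event would then spawn a new growth law for the resulting metastasis; only the event-counting step is shown here.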

  17. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

    The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF, with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  18. A Model for Describing, Analysing and Investigating Cultural Understanding in EFL Reading Settings

    ERIC Educational Resources Information Center

    Porto, Melina

    2013-01-01

    This article describes a model used to explore cultural understanding in English as a foreign language reading in a developing country, namely Argentina. The model is designed to investigate, analyse and describe EFL readers' processes of cultural understanding in a specific context. Cultural understanding in reading is typically investigated…

  19. Pain management: a review of organisation models with integrated processes for the management of pain in adult cancer patients.

    PubMed

    Brink-Huis, Anita; van Achterberg, Theo; Schoonhoven, Lisette

    2008-08-01

    This paper reports a review of the literature conducted to identify organisation models in cancer pain management that contain integrated care processes, and to describe their effectiveness. Pain is experienced by 30-50% of cancer patients receiving treatment and by 70-90% of those with advanced disease. Efforts to improve pain management have been made through the development and dissemination of clinical guidelines. Early improvements in pain management focussed on just one or two single processes, such as pain assessment and patient education. Little is known about organisational models with multiple integrated processes covering the course of the disease trajectory and all stages of the care process. Systematic review. The review involved a systematic search of the literature published between 1986 and 2006. Subject-specific keywords relevant for this review, describing patients, disease, pain management interventions and integrated care processes, were selected using the thesaurus of the databases. Institutional models, clinical pathways and consultation services are three alternative models for the integration of care processes in cancer pain management. A clinical pathway is a comprehensive institutionalisation model, whereas a pain consultation service is a 'stand-alone' model that can be integrated into a clinical pathway. Positive patient and process outcomes have been described for all three models, although the level of evidence is generally low. Evaluation of the quality of pain management must involve standardised measurements of both patient and process outcomes. We recommend the development of policies for referrals to a pain consultation service. These policies can be integrated within a clinical pathway. To evaluate the effectiveness of pain management models, standardised outcome measures are needed.

  20. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM), since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines for evaluating and comparing languages to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  1. Kinetic Theory and Simulation of Single-Channel Water Transport

    NASA Astrophysics Data System (ADS)

    Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus

    Water translocation between the various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in computer simulation of molecular processes at an atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model to describe trans-channel translocation of water has turned out to be a nontrivial task.

  2. Modelling and simulating decision processes of linked lives: An approach based on concurrent processes and stochastic race.

    PubMed

    Warnke, Tom; Reinhardt, Oliver; Klabunde, Anna; Willekens, Frans; Uhrmacher, Adelinde M

    2017-10-01

    Individuals' decision processes play a central role in understanding modern migration phenomena and other demographic processes. Their integration into agent-based computational demography depends largely on suitable support by a modelling language. We are developing the Modelling Language for Linked Lives (ML3) to describe the diverse decision processes of linked lives succinctly in continuous time. The context of individuals is modelled by networks the individual is part of, such as family ties and other social networks. Central concepts, such as behaviour conditional on agent attributes, age-dependent behaviour, and stochastic waiting times, are tightly integrated in the language. Thereby, alternative decisions are modelled by concurrent processes that compete by stochastic race. Using a migration model, we demonstrate how this allows for compact description of complex decisions, here based on the Theory of Planned Behaviour. We describe the challenges for the simulation algorithm posed by stochastic race between multiple concurrent complex decisions.

  3. Toward a General Research Process for Using Dubin's Theory Building Model

    ERIC Educational Resources Information Center

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  4. BioMOL: a computer-assisted biological modeling tool for complex chemical mixtures and biological processes at the molecular level.

    PubMed Central

    Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J

    2002-01-01

    A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134

  5. Multiobjective optimization and multivariable control of the beer fermentation process with the use of evolutionary algorithms.

    PubMed

    Andrés-Toro, B; Girón-Sierra, J M; Fernández-Blanco, P; López-Orozco, J A; Besada-Portas, E

    2004-04-01

    This paper describes empirical research on the modelling, optimization and supervisory control of beer fermentation. Conditions in the laboratory were made as similar as possible to brewery industry conditions. Since mathematical models that consider realistic industrial conditions were not available, a new mathematical model involving industrial conditions was first developed. Batch fermentations are multiobjective dynamic processes that must be guided along optimal paths to obtain good results. The paper describes a direct way to apply a Pareto-set approach with multiobjective evolutionary algorithms (MOEAs). Optimal ways to drive these processes were successfully found. Once obtained, the mathematical fermentation model was used to optimize the fermentation process by means of an intelligent control based on rules.
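    The Pareto-set approach rests on a dominance test: one solution dominates another when it is at least as good in every objective and strictly better in at least one. A generic minimisation sketch, not the authors' MOEA implementation:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

    An MOEA maintains an archive of such non-dominated points and drives the population toward, and along, that front.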

  6. Model for amorphous aggregation processes

    NASA Astrophysics Data System (ADS)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatinlike protein, and α -lactalbumin. The model is very robust and describes the experimental amorphous-aggregation data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  7. Automatization of hydrodynamic modelling in a Floreon+ system

    NASA Astrophysics Data System (ADS)

    Ronovsky, Ales; Kuchar, Stepan; Podhoranyi, Michal; Vojtek, David

    2017-07-01

    The paper describes fully automatized hydrodynamic modelling as part of the Floreon+ system. The main purpose of hydrodynamic modelling in disaster management is to provide an accurate overview of the hydrological situation in a given river catchment. Automatization of the process as a web service can provide immediate data based on extreme weather conditions, such as heavy rainfall, without the intervention of an expert. Such a service can be used by non-scientific users such as fire-fighter operators or representatives of a military service organizing evacuation during floods or river dam breaks. The paper describes the whole process, beginning with the definition of a schematization necessary for the hydrodynamic model, the gathering of the necessary data and its processing for a simulation, the model itself, and the post-processing and visualization of results on a web service. The process is demonstrated on real data collected during the 2010 floods in the Moravian-Silesian region.

  8. Physical and mathematical modeling of antimicrobial photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Bürgermeister, Lisa; López, Fernando Romero; Schulz, Wolfgang

    2014-07-01

    Antimicrobial photodynamic therapy (aPDT) is a promising method to treat local bacterial infections. The therapy is painless and does not cause bacterial resistance. However, there are gaps in understanding the dynamics of the processes, especially in periodontal treatment. This work describes advances in the fundamental physical and mathematical modeling of aPDT used for the interpretation of experimental evidence. The result is a two-dimensional model of aPDT in a dental pocket phantom model. In this model, the propagation of laser light and the kinetics of the chemical reactions are described as coupled processes. The laser light induces the chemical processes depending on its intensity. As a consequence of the chemical processes, the local optical properties and distribution of laser light change, as do the reaction rates. The mathematical description of these coupled processes will help to develop treatment protocols and is the first step toward an inline feedback system for aPDT users.
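    The coupling described above can be sketched in one dimension: light is attenuated by the photosensitizer (Beer-Lambert), while the local photosensitizer concentration is bleached at a rate proportional to the local intensity. All parameter values are illustrative:

```python
import math

def apdt_step(conc, i0=1.0, eps=2.0, dz=0.01, k_bleach=0.5, dt=0.1):
    """Advance a photosensitizer concentration profile by one time step."""
    intensity = i0
    new_conc = []
    for c in conc:
        intensity *= math.exp(-eps * c * dz)                       # attenuation
        new_conc.append(c * math.exp(-k_bleach * intensity * dt))  # bleaching
    return new_conc
```

    Because the surface sees the highest intensity it bleaches fastest; as it clears, light penetrates deeper, which is exactly the feedback between optical properties and reaction rates that the abstract describes.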

  9. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    PubMed

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of the process of how nurses respond to patients and further develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions with implications for nursing care and patient outcomes.

  10. Modeling microbial diversity in anaerobic digestion through an extended ADM1 model.

    PubMed

    Ramirez, Ivan; Volcke, Eveline I P; Rajinikanth, Rajagopal; Steyer, Jean-Philippe

    2009-06-01

    The anaerobic digestion process comprises a whole network of sequential and parallel reactions, of both biochemical and physicochemical nature. Mathematical models, aiming at understanding and optimization of the anaerobic digestion process, describe these reactions in a structured way, the IWA Anaerobic Digestion Model No. 1 (ADM1) being the most well established example. While these models distinguish between different microorganisms involved in different reactions, to our knowledge they all neglect species diversity between organisms with the same function, i.e. performing the same reaction. Nevertheless, available experimental evidence suggests that the structure and properties of a microbial community may be influenced by process operation and in turn also determine the reactor functioning. In order to adequately describe these phenomena, mathematical models need to consider the underlying microbial diversity. This is demonstrated in this contribution by extending the ADM1 to describe microbial diversity between organisms of the same functional group. The resulting model has been compared with the traditional ADM1 in describing experimental data of a pilot-scale hybrid Upflow Anaerobic Sludge Filter Bed (UASFB) reactor, as well as in a more detailed simulation study. The presented model is further shown useful in assessing the relationship between reactor performance and microbial community structure in mesophilic CSTRs treating slaughterhouse wastewater when facing increasing levels of ammonia.
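    The paper's core idea, two populations performing the same function but with different kinetics, can be sketched with Monod growth in a chemostat-like reactor (yield set to 1; all parameters are illustrative, not ADM1 values):

```python
def compete(mu=(0.4, 0.3), ks=(0.5, 0.05), d=0.1, s_in=5.0,
            t_end=200.0, dt=0.01):
    """Two species consuming one substrate under dilution rate d.
    Returns the final substrate level and biomass of each species."""
    s, x = s_in, [0.1, 0.1]
    for _ in range(int(t_end / dt)):
        growth = [m * s / (k + s) for m, k in zip(mu, ks)]  # Monod rates
        ds = d * (s_in - s) - sum(g * xi for g, xi in zip(growth, x))
        x = [xi + (g - d) * xi * dt for xi, g in zip(x, growth)]
        s += ds * dt
    return s, x
```

    Although species 1 grows faster at high substrate, species 2 has the lower break-even concentration d*Ks/(mu - d) and excludes it at steady state; which variant survives thus depends on operating conditions, the kind of community-structure effect the extended ADM1 is built to capture.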

  11. The structure of the casein micelle of milk and its changes during processing.

    PubMed

    Dalgleish, Douglas G; Corredig, Milena

    2012-01-01

    The majority of the protein in cow's milk is contained in the particles known as casein micelles. This review describes the main structural features of these particles and the different models that have been used to define the interior structures. The reactions of the micelles during processing operations are described in terms of the structural models.

  12. Stochastic GARCH dynamics describing correlations between stocks

    NASA Astrophysics Data System (ADS)

    Prat-Ortega, G.; Savel'ev, S. E.

    2014-09-01

    The ARCH and GARCH processes have been successfully used for modelling price dynamics such as stock returns or foreign exchange rates. Analysing the long-range correlations between stocks, we propose a model, based on the GARCH process, which is able to describe the main characteristics of stock price correlations, including the mean, variance, probability density distribution and the noise spectrum.
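    A minimal GARCH(1,1) sketch of the building block referred to above (parameters illustrative): today's conditional variance depends on yesterday's squared return and yesterday's variance.

```python
import random
import statistics

def simulate_garch(n=5000, omega=1e-5, alpha=0.1, beta=0.85, seed=7):
    """Simulate GARCH(1,1): var_t = omega + alpha*r_{t-1}^2 + beta*var_{t-1}."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)   # unconditional variance
    returns = []
    for _ in range(n):
        r = rng.gauss(0.0, var ** 0.5)   # return drawn at current volatility
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns
```

    Modelling correlations between stocks, as in the abstract, would couple several such processes; the single-asset recursion is the ingredient being generalised.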

  13. Hierarchical mark-recapture models: a framework for inference about demographic processes

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2004-01-01

    The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities w (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly described in the model.
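    The hierarchical idea can be sketched generatively: yearly survival probabilities are not fixed unknowns but draws from a hyperdistribution, and inference targets the hyperparameters. A logit-normal example with invented values:

```python
import math
import random

def simulate_survival(n_years=10, n_marked=500, mu=1.0, sigma=0.3, seed=3):
    """Draw yearly survival probabilities from a logit-normal
    hyperdistribution and simulate binomial fates of marked animals."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_years):
        logit = rng.gauss(mu, sigma)             # hyperdistribution draw
        phi = 1.0 / (1.0 + math.exp(-logit))     # that year's survival prob
        survivors = sum(rng.random() < phi for _ in range(n_marked))
        data.append((phi, survivors))
    return data
```

    A Bayesian fit runs this generative story in reverse, estimating mu and sigma (the demographic signal) rather than ten unrelated survival rates.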

  14. Improving models for describing phosphorus cycling in agricultural soils

    USDA-ARS?s Scientific Manuscript database

    The mobility of phosphorus in the environment is controlled to a large extent by its sorption to soil. Therefore, an important component of all P loss models is how the model describes the biogeochemical processes governing P sorption and desorption to soils. The most common approach to modeling P c...

  15. Comprehensive Career Guidance. Postsecondary & Adult. Programs and Model.

    ERIC Educational Resources Information Center

    Moore, Earl J.; Miller, Thomas B.

    Divided into four parts, this document describes a comprehensive career guidance model for postsecondary and adult programs. In part 1, the rationale for extending career guidance and counseling into the lifelong learning perspective is explained, the Georgia Life Career Development Model is described, and the components of a process model for…

  16. Observations and Modeling of the Green Ocean Amazon 2014/15. CHUVA Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Machado, L. A. T.

    2016-03-01

    The physical processes inside clouds are one of the most unknown components of weather and climate systems. The description of cloud processes through standard meteorological parameters in numerical models has to be strongly improved to accurately describe the characteristics of hydrometeors, latent heating profiles, radiative balance, air entrainment, and cloud updrafts and downdrafts. Numerical models have been improved to run at higher spatial resolutions, where it is necessary to explicitly describe these cloud processes. For instance, to analyze the effects of global warming in a given region it is necessary to perform simulations taking into account all of the cloud processes described above. Another important application that requires this knowledge is satellite precipitation estimation. The analysis will be performed focusing on the microphysical evolution and cloud life cycle, different precipitation estimation algorithms, the development of thunderstorms and lightning formation, processes in the boundary layer, and cloud microphysical modeling. This project intends to extend the knowledge of these cloud processes to reduce the uncertainties in precipitation estimation, mainly from warm clouds, and, consequently, improve knowledge of the water and energy budget and cloud microphysics.

  17. Process-driven selection of information systems for healthcare

    NASA Astrophysics Data System (ADS)

    Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.

    1995-05-01

    Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise- level model for the Pediatric ICU is also described.

  18. Translational research: understanding the continuum from bench to bedside.

    PubMed

    Drolet, Brian C; Lorenzi, Nancy M

    2011-01-01

    The process of translating basic scientific discoveries to clinical applications, and ultimately to public health improvements, has emerged as an important, but difficult, objective in biomedical research. The process is best described as a "translation continuum" because various resources and actions are involved in this progression of knowledge, which advances discoveries from the bench to the bedside. The current model of this continuum focuses primarily on translational research, which is merely one component of the overall translation process. This approach is ineffective. A revised model addressing the entire continuum would provide a methodology to identify and describe all translational activities (e.g., implementation, adoption, translational research) as well as their place within the continuum. This manuscript reviews and synthesizes the literature to provide an overview of the current terminology and model for translation. A modification of the existing model is proposed to create a framework called the Biomedical Research Translation Continuum, which defines the translation process and describes the progression of knowledge from laboratory to health gains. This framework clarifies translation for readers who have not followed the evolving and complicated models currently described. Authors and researchers may use the continuum to understand and describe their research better, as well as the translational activities within a conceptual framework. Additionally, the framework may increase the advancement of knowledge by refining discussions of translation and allowing more precise identification of barriers to progress. Copyright © 2011 Mosby, Inc. All rights reserved.

  19. Damage modeling and statistical analysis of optics damage performance in MJ-class laser systems.

    PubMed

    Liao, Zhi M; Raymond, B; Gaylord, J; Fallejo, R; Bude, J; Wegner, P

    2014-11-17

    A model of the lifetime of a fused silica optic is described for a multiple-beam, MJ-class laser system. This entails combining optic-processing data with laser shot data to account for the complete history of optic processing and shot exposure. Integrating with online inspection data allows for the construction of a performance metric describing how an optic performs with respect to the model. This methodology helps to validate the damage model and allows strategic planning and the identification of potential hidden parameters that affect the optic's performance.

  20. Information processing psychology: A promising paradigm for research in science teaching

    NASA Astrophysics Data System (ADS)

    Stewart, James H.; Atkin, Julia A.

    Three research paradigms, those of Ausubel, Gagné and Piaget, have received a great deal of attention in the literature of science education. In this article a fourth paradigm is presented - an information processing psychology paradigm. The article is composed of two sections. The first section describes a model of memory developed by information processing psychologists. The second section describes how such a model could be used to guide science education research on learning and problem solving. Received: 19 October 1981

  1. The Spiral-Interactive Program Evaluation Model.

    ERIC Educational Resources Information Center

    Khaleel, Ibrahim Adamu

    1988-01-01

    Describes the spiral interactive program evaluation model, which is designed to evaluate vocational-technical education programs in secondary schools in Nigeria. Program evaluation is defined; utility oriented and process oriented models for evaluation are described; and internal and external evaluative factors and variables that define each…

  2. Self-Referenced Processing, Neurodevelopment and Joint Attention in Autism

    ERIC Educational Resources Information Center

    Mundy, Peter; Gwaltney, Mary; Henderson, Heather

    2010-01-01

    This article describes a parallel and distributed processing model (PDPM) of joint attention, self-referenced processing and autism. According to this model, autism involves early impairments in the capacity for rapid, integrated processing of self-referenced (proprioceptive and interoceptive) and other-referenced (exteroceptive) information.…

  3. Modeling of the flow stress for AISI H13 Tool Steel during Hard Machining Processes

    NASA Astrophysics Data System (ADS)

    Umbrello, Domenico; Rizzuti, Stefania; Outeiro, José C.; Shivpuri, Rajiv

    2007-04-01

    In general, the flow stress models used in computer simulation of machining processes are a function of the effective strain, effective strain rate and temperature developed during the cutting process. However, these models do not adequately describe the material behavior in hard machining, where material hardness in the range of 45-60 HRC is encountered. Thus, depending on the specific material hardness, different material models must be used in modeling the cutting process. This paper describes the development of hardness-based flow stress and fracture models for AISI H13 tool steel, which can be applied over the range of material hardness mentioned above. These models were implemented in a non-isothermal viscoplastic numerical model to simulate the machining process for AISI H13 with various hardness values and different cutting regime parameters. Predicted results are validated by comparing them with experimental results found in the literature and are found to predict reasonably well the cutting forces as well as the change in chip morphology from continuous to segmented chip as the material hardness changes.

  4. Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.

    PubMed

    von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui

    2016-05-01

    Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for manufacturing of biopharmaceutical products with consistently high quality. A typical approach to developing a process understanding is applying a combination of design of experiments with statistical data analysis. Hybrid semi-parametric modeling is investigated as an alternative method to pure statistical data analysis. The hybrid model framework provides flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as a function of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond to physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.

  5. Mental health courts and their selection processes: modeling variation for consistency.

    PubMed

    Wolff, Nancy; Fabrikant, Nicole; Belenko, Steven

    2011-10-01

    Admission into mental health courts is based on a complicated and often variable decision-making process that involves multiple parties representing different expertise and interests. To the extent that eligibility criteria of mental health courts are more suggestive than deterministic, selection bias can be expected. Very little research has focused on the selection processes underpinning problem-solving courts even though such processes may dominate the performance of these interventions. This article describes a qualitative study designed to deconstruct the selection and admission processes of mental health courts. In this article, we describe a multi-stage, complex process for screening and admitting clients into mental health courts. The selection filtering model that is described has three eligibility screening stages: initial, assessment, and evaluation. The results of this study suggest that clients selected by mental health courts are shaped by the formal and informal selection criteria, as well as by the local treatment system.

  6. Carotene Degradation and Isomerization during Thermal Processing: A Review on the Kinetic Aspects.

    PubMed

    Colle, Ines J P; Lemmens, Lien; Knockaert, Griet; Van Loey, Ann; Hendrickx, Marc

    2016-08-17

    Kinetic models are important tools for process design and optimization to balance desired and undesired reactions taking place in complex food systems during food processing and preservation. This review covers the state of the art on kinetic models available to describe heat-induced conversion of carotenoids, in particular lycopene and β-carotene. First, relevant properties of these carotenoids are discussed. Second, some general aspects of kinetic modeling are introduced, including both empirical single-response modeling and mechanism-based multi-response modeling. The merits of multi-response modeling to simultaneously describe carotene degradation and isomerization are demonstrated. The future challenge in this research field lies in the extension of the current multi-response models to better approach the real reaction pathway and in the integration of kinetic models with mass transfer models in case of reaction in multi-phase food systems.
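
The single-response empirical modeling mentioned in this review is often a first-order rate law with Arrhenius temperature dependence. A minimal sketch, with hypothetical parameter values that are not taken from the review:

```python
import math

def first_order_concentration(c0, k, t):
    """Concentration after time t under first-order degradation dC/dt = -k*C."""
    return c0 * math.exp(-k * t)

def arrhenius_rate(k_ref, ea, t_ref, t_abs, r_gas=8.314):
    """Rate constant at temperature t_abs given k_ref at t_ref (Arrhenius law)."""
    return k_ref * math.exp(-(ea / r_gas) * (1.0 / t_abs - 1.0 / t_ref))

# Hypothetical parameters for illustration only:
c0 = 10.0    # initial carotenoid concentration, mg/100 g
k90 = 0.05   # degradation rate constant at 90 degrees C (363.15 K), 1/min
ea = 100e3   # activation energy, J/mol
k110 = arrhenius_rate(k90, ea, 363.15, 383.15)  # rate constant at 110 degrees C
print(first_order_concentration(c0, k90, 30))   # concentration left after 30 min
print(k110 > k90)                               # degradation speeds up with heating
```

Multi-response modeling extends this idea by fitting coupled rate equations (e.g., degradation and isomerization) to several measured species simultaneously.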

  7. A Bayesian network approach to knowledge integration and representation of farm irrigation: 1. Model development

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Haines, C. L.

    2009-02-01

    Irrigation is important to many agricultural businesses but also has implications for catchment health. A considerable body of knowledge exists on how irrigation management affects farm business and catchment health. However, this knowledge is fragmentary; is available in many forms such as qualitative and quantitative; is dispersed in scientific literature, technical reports, and the minds of individuals; and is of varying degrees of certainty. Bayesian networks allow the integration of dispersed knowledge into quantitative systems models. This study describes the development, validation, and application of a Bayesian network model of farm irrigation in the Shepparton Irrigation Region of northern Victoria, Australia. In this first paper we describe the process used to integrate a range of sources of knowledge to develop a model of farm irrigation. We describe the principal model components and summarize the reaction to the model and its development process by local stakeholders. Subsequent papers in this series describe model validation and the application of the model to assess the regional impact of historical and future management intervention.

  8. A review of physically based models for soil erosion by water

    NASA Astrophysics Data System (ADS)

    Le, Minh-Hoang; Cerdan, Olivier; Sochala, Pierre; Cheviron, Bruno; Brivois, Olivier; Cordier, Stéphane

    2010-05-01

    Physically-based models rely on fundamental physical equations describing stream flow and sediment and associated nutrient generation in a catchment. This paper reviews several existing erosion and sediment transport approaches. The processes of erosion include soil detachment, transport and deposition; we present the various forms of equations and empirical formulas used when modelling and quantifying each of these processes. In particular, we detail models describing rainfall and infiltration effects and the system of equations used to describe the overland flow and the evolution of the topography. We also present formulas for the flow transport capacity and the erodibility functions. Finally, we present some recent numerical schemes for approximating the shallow water equations and their coupling with infiltration and erosion source terms.

  9. Forecasting Instability Indicators in the Horn of Africa

    DTIC Science & Technology

    2008-03-01

    further than 2 (Makridakis et al., 1983, 359). Autoregressive Integrated Moving Average (ARIMA) Model. Similar to the ARMA model except for … stationary process. ARIMA models are described as ARIMA(p,d,q), where p is the order of the autoregressive process, d is the degree of differencing, and q is the order of the moving average process. The ARMA(1,1) model shown above is equivalent to an ARIMA(1,0,1) model. An ARIMA …
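
As a concrete illustration of the ARIMA(p,d,q) notation in this excerpt, the sketch below simulates an ARMA(1,1) process, which is the same as an ARIMA(1,0,1) since d = 0, and shows that each differencing pass (the d in ARIMA) shortens the series by one observation. Parameter values are arbitrary:

```python
import random

def simulate_arma11(phi, theta, n, seed=0):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1} with Gaussian noise.
    An ARMA(1,1) model is an ARIMA(1,0,1): no differencing is applied."""
    rng = random.Random(seed)
    x, x_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0.0, 1.0)
        x_t = phi * x_prev + e + theta * e_prev
        x.append(x_t)
        x_prev, e_prev = x_t, e
    return x

def difference(series, d=1):
    """Apply d-th order differencing, reducing ARIMA(p,d,q) to ARMA(p,q)."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

series = simulate_arma11(phi=0.5, theta=0.3, n=200)
print(len(difference(series, d=1)))  # 199: one observation lost per pass
```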

  10. Speech Perception as a Cognitive Process: The Interactive Activation Model.

    ERIC Educational Resources Information Center

    Elman, Jeffrey L.; McClelland, James L.

    Research efforts to model speech perception in terms of a processing system in which knowledge and processing are distributed over large numbers of highly interactive--but computationally primitive--elements are described in this report. After discussing the properties of speech that demand a parallel interactive processing system, the report…

  11. Energy Models and the Policy Process.

    ERIC Educational Resources Information Center

    De Man, Reinier

    1983-01-01

    Describes the function of econometric and technological models in the policy process, and shows how different positions in the Dutch energy discussion are reflected by the application of different model methodologies. Discussion includes the energy policy context, a conceptual framework for using energy models, and energy scenarios in policy…

  12. Edgar Schein's Process versus Content Consultation Models.

    ERIC Educational Resources Information Center

    Rockwood, Gary F.

    1993-01-01

    Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…

  13. Comparative Analysis on Nonlinear Models for Ron Gasoline Blending Using Neural Networks

    NASA Astrophysics Data System (ADS)

    Aguilera, R. Carreño; Yu, Wen; Rodríguez, J. C. Tovar; Mosqueda, M. Elena Acevedo; Ortiz, M. Patiño; Juarez, J. J. Medel; Bautista, D. Pacheco

    The blending process, being nonlinear, is difficult to model, since it may change significantly depending on the components and the process variables of each refinery. Different components can be blended depending on the existing stock, and the chemical characteristics of each component change dynamically; all are blended until the specification for the different properties required by the customer is reached. One of the most relevant properties is the octane number, which is difficult to control in line (without component storage). Since each refinery process is quite different, a generic gasoline blending model is not useful when in-line blending is to be performed in a specific process. This paper presents a mathematical gasoline blending model for a given process, described in state space as a basic description of the gasoline blending process. The objective is to adjust the parameters so that the blending model tracks a signal along its trajectory, representing the model both with the extreme learning machine method in neural networks and with the nonlinear autoregressive-moving average (NARMA) neural network method, so that a comparative study can be developed.
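
The extreme learning machine method mentioned here trains only the output layer of a single-hidden-layer network; the input weights stay random. A minimal sketch fitting a stand-in nonlinear target (the data and target function are hypothetical, not refinery octane measurements):

```python
import numpy as np

def train_elm(x, y, hidden=50, seed=0):
    """Extreme learning machine: random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(x.shape[1], hidden))  # fixed random input weights
    b = rng.normal(size=hidden)                # fixed random biases
    h = np.tanh(x @ w + b)                     # hidden-layer activations
    beta, *_ = np.linalg.lstsq(h, y, rcond=None)  # only this layer is "trained"
    return w, b, beta

def predict_elm(x, w, b, beta):
    return np.tanh(x @ w + b) @ beta

# Hypothetical smooth nonlinear curve standing in for a blending response:
x = np.linspace(0, 1, 100).reshape(-1, 1)
y = np.sin(2 * np.pi * x[:, 0])
w, b, beta = train_elm(x, y)
err = np.max(np.abs(predict_elm(x, w, b, beta) - y))
print(err < 0.1)  # the random-feature fit tracks the target closely
```

The appeal for in-line control is speed: solving one least-squares problem replaces iterative backpropagation.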

  14. PREDICTING SUBSURFACE CONTAMINANT TRANSPORT AND TRANSFORMATION: CONSIDERATIONS FOR MODEL SELECTION AND FIELD VALIDATION

    EPA Science Inventory

    Predicting subsurface contaminant transport and transformation requires mathematical models based on a variety of physical, chemical, and biological processes. The mathematical model is an attempt to quantitatively describe observed processes in order to permit systematic forecas...

  15. A model framework to describe growth-linked biodegradation of trace-level pollutants in the presence of coincidental carbon substrates and microbes.

    PubMed

    Liu, Li; Helbling, Damian E; Kohler, Hans-Peter E; Smets, Barth F

    2014-11-18

    Pollutants such as pesticides and their degradation products occur ubiquitously in natural aquatic environments at trace concentrations (μg L(-1) and lower). Microbial biodegradation processes have long been known to contribute to the attenuation of pesticides in contaminated environments. However, challenges remain in developing engineered remediation strategies for pesticide-contaminated environments because the fundamental processes that regulate growth-linked biodegradation of pesticides in natural environments remain poorly understood. In this research, we developed a model framework to describe growth-linked biodegradation of pesticides at trace concentrations. We used experimental data reported in the literature or novel simulations to explore three fundamental kinetic processes in isolation. We then combine these kinetic processes into a unified model framework. The three kinetic processes described were: the growth-linked biodegradation of micropollutant at environmentally relevant concentrations; the effect of coincidental assimilable organic carbon substrates; and the effect of coincidental microbes that compete for assimilable organic carbon substrates. We used Monod kinetic models to describe substrate utilization and microbial growth rates for specific pesticide and degrader pairs. We then extended the model to include terms for utilization of assimilable organic carbon substrates by the specific degrader and coincidental microbes, growth on assimilable organic carbon substrates by the specific degrader and coincidental microbes, and endogenous metabolism. The proposed model framework enables interpretation and description of a range of experimental observations on micropollutant biodegradation. The model provides a useful tool to identify environmental conditions with respect to the occurrence of assimilable organic carbon and coincidental microbes that may result in enhanced or reduced micropollutant biodegradation.
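
The Monod kinetics at the core of this framework can be sketched as a two-state ODE for substrate and degrader biomass, here integrated with a simple Euler scheme; the parameter values are illustrative, not those fitted in the paper, and the coincidental-substrate and coincidental-microbe terms are omitted:

```python
def monod_biodegradation(s0, x0, mu_max, ks, yield_coeff, b, dt=0.01, t_end=100.0):
    """Euler integration of growth-linked Monod degradation:
       dS/dt = -(mu/Y) * X,  dX/dt = (mu - b) * X,  mu = mu_max * S / (Ks + S)."""
    s, x = s0, x0
    for _ in range(int(t_end / dt)):
        mu = mu_max * s / (ks + s)      # specific growth rate
        ds = -(mu / yield_coeff) * x    # substrate (micropollutant) utilization
        dx = (mu - b) * x               # growth minus endogenous decay
        s = max(s + ds * dt, 0.0)
        x = max(x + dx * dt, 0.0)
    return s, x

# Hypothetical parameters for illustration:
s_final, x_final = monod_biodegradation(
    s0=100.0, x0=1.0, mu_max=0.2, ks=10.0, yield_coeff=0.5, b=0.01)
print(s_final < 1.0)  # the trace substrate is driven down as the degrader grows
```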

  16. Mathematical Model of Nonstationary Separation Processes Proceeding in the Cascade of Gas Centrifuges in the Process of Separation of Multicomponent Isotope Mixtures

    NASA Astrophysics Data System (ADS)

    Orlov, A. A.; Ushakov, A. A.; Sovach, V. P.

    2017-03-01

    We have developed and implemented in software a mathematical model of the nonstationary separation processes proceeding in cascades of gas centrifuges during the separation of multicomponent isotope mixtures. With the use of this model, the parameters of the separation process for germanium isotopes have been calculated. It has been shown that the model adequately describes the nonstationary processes in the cascade and is suitable for calculating their parameters during the separation of multicomponent isotope mixtures.

  17. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    NASA Astrophysics Data System (ADS)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  18. Development of Semantic Description for Multiscale Models of Thermo-Mechanical Treatment of Metal Alloys

    NASA Astrophysics Data System (ADS)

    Macioł, Piotr; Regulski, Krzysztof

    2016-08-01

    We present a process of semantic meta-model development for data management in an adaptable multiscale modeling framework. The main problems in ontology design are discussed, and a solution achieved as a result of the research is presented. The main concepts concerning the application and data management background for multiscale modeling were derived from the AM3 approach—object-oriented Agile multiscale modeling methodology. The ontological description of multiscale models enables validation of semantic correctness of data interchange between submodels. We also present a possibility of using the ontological model as a supervisor in conjunction with a multiscale model controller and a knowledge base system. Multiscale modeling formal ontology (MMFO), designed for describing multiscale models' data and structures, is presented. A need for applying meta-ontology in the MMFO development process is discussed. Examples of MMFO application in describing thermo-mechanical treatment of metal alloys are discussed. Present and future applications of MMFO are described.

  19. Modeling dyadic processes using Hidden Markov Models: A time series approach to mother-infant interactions during infant immunization.

    PubMed

    Stifter, Cynthia A; Rovine, Michael

    2015-01-01

    The focus of the present longitudinal study, to examine mother-infant interaction during the administration of immunizations at two and six months of age, used hidden Markov modeling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a 4-state model for the dyadic responses to a two-month inoculation whereas a 6-state model best described the dyadic process at six months. Two of the states at two months and three of the states at six months suggested a progression from high intensity crying to no crying with parents using vestibular and auditory soothing methods. The use of feeding and/or pacifying to soothe the infant characterized one two-month state and two six-month states. These data indicate that with maturation and experience, the mother-infant dyad is becoming more organized around the soothing interaction. Using hidden Markov modeling to describe individual differences, as well as normative processes, is also presented and discussed.

  20. Modeling dyadic processes using Hidden Markov Models: A time series approach to mother-infant interactions during infant immunization

    PubMed Central

    Stifter, Cynthia A.; Rovine, Michael

    2016-01-01

    The focus of the present longitudinal study, to examine mother-infant interaction during the administration of immunizations at two and six months of age, used hidden Markov modeling, a time series approach that produces latent states to describe how mothers and infants work together to bring the infant to a soothed state. Results revealed a 4-state model for the dyadic responses to a two-month inoculation whereas a 6-state model best described the dyadic process at six months. Two of the states at two months and three of the states at six months suggested a progression from high intensity crying to no crying with parents using vestibular and auditory soothing methods. The use of feeding and/or pacifying to soothe the infant characterized one two-month state and two six-month states. These data indicate that with maturation and experience, the mother-infant dyad is becoming more organized around the soothing interaction. Using hidden Markov modeling to describe individual differences, as well as normative processes, is also presented and discussed. PMID:27284272
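
The likelihood computation underlying hidden Markov modeling is the forward algorithm. The toy two-state sketch below is illustrative only; the states and probabilities are invented, not the 4- and 6-state models fitted in these studies:

```python
import numpy as np

def hmm_forward(pi, a, b, observations):
    """Forward algorithm: likelihood of an observation sequence under an HMM.
    pi: initial state probabilities, a: transition matrix, b: emission matrix."""
    alpha = pi * b[:, observations[0]]
    for obs in observations[1:]:
        alpha = (alpha @ a) * b[:, obs]
    return alpha.sum()

# Hypothetical 2-state dyad model: state 0 = "infant crying", state 1 = "soothed";
# observations: 0 = cry heard, 1 = calm.
pi = np.array([0.9, 0.1])
a = np.array([[0.7, 0.3],    # crying tends to persist but can resolve
              [0.1, 0.9]])   # soothed states are sticky
b = np.array([[0.9, 0.1],    # crying state mostly emits "cry"
              [0.2, 0.8]])   # soothed state mostly emits "calm"
likelihood = hmm_forward(pi, a, b, [0, 0, 1, 1, 1])
print(0.0 < likelihood < 1.0)
```

Fitting (e.g., by Baum-Welch) then chooses the transition and emission probabilities, and the number of latent states, that best explain the observed dyadic time series.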

  1. Enhanced Self Tuning On-Board Real-Time Model (eSTORM) for Aircraft Engine Performance Health Tracking

    NASA Technical Reports Server (NTRS)

    Volponi, Al; Simon, Donald L. (Technical Monitor)

    2008-01-01

    A key technological concept for producing reliable engine diagnostics and prognostics exploits the benefits of fusing sensor data, information, and/or processing algorithms. This report describes the development of a hybrid engine model for a propulsion gas turbine engine, which is the result of fusing two diverse modeling methodologies: a physics-based model approach and an empirical model approach. The report describes the process and methods involved in deriving and implementing a hybrid model configuration for a commercial turbofan engine. Among the intended uses for such a model is to enable real-time, on-board tracking of engine module performance changes and engine parameter synthesis for fault detection and accommodation.

  2. The dual process model of coping with bereavement: a decade on.

    PubMed

    Stroebe, Margaret; Schut, Henk

    2010-01-01

    The Dual Process Model of Coping with Bereavement (DPM; Stroebe & Schut, 1999) is described in this article. The rationale is given as to why this model was deemed necessary and how it was designed to overcome limitations of earlier models of adaptive coping with loss. Although building on earlier theoretical formulations, it contrasts with other models along a number of dimensions which are outlined. In addition to describing the basic parameters of the DPM, theoretical and empirical developments that have taken place since the original publication of the model are summarized. Guidelines for future research are given focusing on principles that should be followed to put the model to stringent empirical test.

  3. Hydrologic controls on equilibrium soil depths

    NASA Astrophysics Data System (ADS)

    Nicótina, L.; Tarboton, D. G.; Tesfa, T. K.; Rinaldo, A.

    2011-04-01

    This paper deals with modeling the mutual feedbacks between runoff production and geomorphological processes and attributes that lead to patterns of equilibrium soil depth. Our primary goal is an attempt to describe spatial patterns of soil depth resulting from long-term interactions between hydrologic forcings and soil production, erosion, and sediment transport processes under the framework of landscape dynamic equilibrium. Another goal is to set the premises for exploiting the role of soil depths in shaping the hydrologic response of a catchment. The relevance of the study stems from the massive improvement in hydrologic predictions for ungauged basins that would be achieved by using directly soil depths derived from geomorphic features remotely measured and objectively manipulated. Hydrological processes are here described by explicitly accounting for local soil depths and detailed catchment topography. Geomorphological processes are described by means of well-studied geomorphic transport laws. The modeling approach is applied to the semiarid Dry Creek Experimental Watershed, located near Boise, Idaho. Modeled soil depths are compared with field data obtained from an extensive survey of the catchment. Our results show the ability of the model to describe properly the mean soil depth and the broad features of the distribution of measured data. However, local comparisons show significant scatter whose origins are discussed.
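
A common geomorphic transport law of the kind used in such studies is an exponentially declining soil production function; at dynamic equilibrium, production balances erosion, which gives a closed-form depth. A sketch with hypothetical rates, not the Dry Creek calibration:

```python
import math

def equilibrium_soil_depth(p0, h0, erosion_rate):
    """Equilibrium depth where exponential soil production balances erosion:
       P0 * exp(-h/h0) = E  =>  h_eq = h0 * ln(P0 / E)."""
    if erosion_rate >= p0:
        return 0.0  # erosion outpaces production even on bare bedrock
    return h0 * math.log(p0 / erosion_rate)

# Hypothetical rates for illustration (production/erosion in mm/yr, depths in mm):
print(round(equilibrium_soil_depth(p0=0.1, h0=500.0, erosion_rate=0.05), 1))
```

Spatial patterns of equilibrium depth then emerge because the erosion rate varies across the catchment with topography and runoff.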

  4. Dual-Process Theory and Signal-Detection Theory of Recognition Memory

    ERIC Educational Resources Information Center

    Wixted, John T.

    2007-01-01

    Two influential models of recognition memory, the unequal-variance signal-detection model and a dual-process threshold/detection model, accurately describe the receiver operating characteristic, but only the latter model can provide estimates of recollection and familiarity. Such estimates often accord with those provided by the remember-know…

  5. Development of a model for whole brain learning of physiology.

    PubMed

    Eagleton, Saramarie; Muller, Anton

    2011-12-01

    In this report, a model was developed for whole brain learning based on Curry's onion model. Curry described the effect of personality traits as the inner layer of learning, information-processing styles as the middle layer of learning, and environmental and instructional preferences as the outer layer of learning. The model that was developed elaborates on these layers by relating the personality traits central to learning to the different quadrants of brain preference, as described by Neethling's brain profile, as the inner layer of the onion. This layer is encircled by the learning styles that describe different information-processing preferences for each brain quadrant. For the middle layer, the different stages of Kolb's learning cycle are classified into the four brain quadrants associated with the different brain processing strategies within the information processing circle. Each of the stages of Kolb's learning cycle is also associated with a specific cognitive learning strategy. These two inner circles are enclosed by the circle representing the role of the environment and instruction on learning. It relates environmental factors that affect learning and distinguishes between face-to-face and technology-assisted learning. This model informs on the design of instructional interventions for physiology to encourage whole brain learning.

  6. Operator Performance Measures for Assessing Voice Communication Effectiveness

    DTIC Science & Technology

    1989-07-01

    performance and workload assessment techniques have been based. Broadbent (1958) described a limited capacity filter model of human information… Contents: Auditory Information Processing (Auditory Attention; Auditory Memory); Models of Information Processing (Capacity Theories)… Learning; Attention; Language Specialization; Decision Making; Problem Solving; Auditory Information Processing; Models of Processing; Operator…

  7. Mathematical Modeling of Nitrous Oxide Production during Denitrifying Phosphorus Removal Process.

    PubMed

    Liu, Yiwen; Peng, Lai; Chen, Xueming; Ni, Bing-Jie

    2015-07-21

    A denitrifying phosphorus removal process undergoes frequent alternating anaerobic/anoxic conditions to achieve phosphate release and uptake, during which microbial internal storage polymers (e.g., Polyhydroxyalkanoate (PHA)) could be produced and consumed dynamically. The PHA turnovers play important roles in nitrous oxide (N2O) accumulation during the denitrifying phosphorus removal process. In this work, a mathematical model is developed to describe N2O dynamics and the key role of PHA consumption on N2O accumulation during the denitrifying phosphorus removal process for the first time. In this model, the four-step anoxic storage of polyphosphate and four-step anoxic growth on PHA using nitrate, nitrite, nitric oxide (NO), and N2O consecutively by denitrifying polyphosphate accumulating organisms (DPAOs) are taken into account for describing all potential N2O accumulation steps in the denitrifying phosphorus removal process. The developed model is successfully applied to reproduce experimental data on N2O production obtained from four independent denitrifying phosphorus removal study reports with different experimental conditions. The model satisfactorily describes the N2O accumulation, nitrogen reduction, phosphate release and uptake, and PHA dynamics for all systems, suggesting the validity and applicability of the model. The results indicated a substantial role of PHA consumption in N2O accumulation due to the relatively low N2O reduction rate by using PHA during denitrifying phosphorus removal.

  8. An engineering approach to modelling, decision support and control for sustainable systems.

    PubMed

    Day, W; Audsley, E; Frost, A R

    2008-02-12

    Engineering research and development contributes to the advance of sustainable agriculture both through innovative methods to manage and control processes, and through quantitative understanding of the operation of practical agricultural systems using decision models. This paper describes how an engineering approach, drawing on mathematical models of systems and processes, contributes new methods that support decision making at all levels from strategy and planning to tactics and real-time control. The ability to describe the system or process by a simple and robust mathematical model is critical, and the outputs range from guidance to policy makers on strategic decisions relating to land use, through intelligent decision support to farmers and on to real-time engineering control of specific processes. Precision in decision making leads to decreased use of inputs, less environmental emissions and enhanced profitability-all essential to sustainable systems.

  9. Study of stability of the difference scheme for the model problem of the gaslift process

    NASA Astrophysics Data System (ADS)

    Temirbekov, Nurlan; Turarov, Amankeldy

    2017-09-01

    The paper studies a model of the gaslift process in which the motion in a gas-lift well is described by partial differential equations. The system describing the studied process consists of equations of motion and continuity, an equation of thermodynamic state, and a hydraulic resistance law. A two-layer finite-difference Lax-Wendroff scheme is constructed for the numerical solution of the problem. The stability of the difference scheme for the model problem is investigated using the method of a priori estimates, the order of approximation is examined, an algorithm for the numerical implementation of the gaslift process model is given, and graphs are presented. The development and investigation of difference schemes for the numerical solution of systems of gas dynamics equations makes it possible to obtain solutions that are simultaneously accurate and monotonic.
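
For reference, the two-layer Lax-Wendroff scheme applied to the simplest model equation, linear advection on a periodic grid, can be sketched as follows. This is the textbook scheme, not the paper's gaslift system:

```python
import numpy as np

def lax_wendroff_step(u, c, dx, dt):
    """One Lax-Wendroff step for u_t + c*u_x = 0 on a periodic grid.
    Second-order accurate; stable for CFL = c*dt/dx <= 1."""
    cfl = c * dt / dx
    u_plus = np.roll(u, -1)   # u[i+1]
    u_minus = np.roll(u, 1)   # u[i-1]
    return (u - 0.5 * cfl * (u_plus - u_minus)
              + 0.5 * cfl**2 * (u_plus - 2 * u + u_minus))

# Advect a smooth pulse once around a periodic domain.
n, c = 200, 1.0
x = np.linspace(0, 1, n, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx / c            # CFL = 0.5, within the stability limit
u = np.exp(-100 * (x - 0.5) ** 2)
u0 = u.copy()
for _ in range(int(round(1.0 / (c * dt)))):  # one full period
    u = lax_wendroff_step(u, c, dx, dt)
print(np.max(np.abs(u - u0)) < 0.05)  # pulse returns close to its initial shape
```

The scheme is exactly conservative on the periodic grid (the flux differences telescope), which is one reason this family of schemes suits gas dynamics systems.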

  10. Black-Scholes model under subordination

    NASA Astrophysics Data System (ADS)

    Stanislavsky, A. A.

    2003-02-01

    In this paper, we consider a new mathematical extension of the Black-Scholes (BS) model in which the stochastic time and stock share price evolution is described by two independent random processes. The parent process is Brownian, and the directing process is inverse to the totally skewed, strictly α-stable process. The subordinated process represents the Brownian motion indexed by an independent, continuous and increasing process. This allows us to introduce the long-term memory effects in the classical BS model.
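Such a subordinated process can be simulated directly. The sketch below is an assumed discretization for the special case α = 1/2, where the totally skewed stable subordinator has Lévy-distributed increments expressible through a standard normal (general α would need the Chambers-Mallows-Stuck method); it indexes a Brownian parent process by the inverse subordinator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Parent process: Brownian motion B(tau) on an operational-time grid.
# Directing process: E(t) = inf{tau : S(tau) >= t}, the inverse of a
# totally skewed 1/2-stable subordinator S; for alpha = 1/2 the
# increments can be drawn as dS = dtau**2 / (2 Z**2) with Z ~ N(0, 1).
n, dtau = 20000, 0.01
z = rng.standard_normal(n)
S = np.concatenate([[0.0], np.cumsum(dtau**2 / (2.0 * z**2))])
B = np.concatenate([[0.0], np.cumsum(np.sqrt(dtau) * rng.standard_normal(n))])

t_grid = np.linspace(0.0, 10.0, 500)
idx = np.minimum(np.searchsorted(S, t_grid), len(B) - 1)  # guard short paths
E = idx * dtau                     # inverse subordinator E(t)
X = B[idx]                         # subordinated process X(t) = B(E(t))
```

The flat stretches of E(t) across the jumps of S are what introduce the long-term memory (trapping) effects into the price evolution.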

  11. Jet Fuel Exacerbated Noise-Induced Hearing Loss: Focus on Prediction of Central Auditory Processing Dysfunction

    DTIC Science & Technology

    2017-09-01

    The objective of this project was to develop a multi-scale model, together with relevant supporting experimental data, to describe jet fuel-exacerbated noise-induced hearing loss (NIHL).

  12. Evaluation of nursing practice: process and critique.

    PubMed

    Braunstein, M S

    1998-01-01

    This article describes the difficulties in conducting clinical trials to evaluate nursing practice models. Suggestions are offered for strengthening the process. A clinical trial of a nursing practice model based on a synthesis of Aristotelian theory with Rogers' science is described. The rationale for decisions regarding the research procedures used is presented. Methodological limitations of the study design and the specifications of the practice model are examined. It is concluded that clear specification of theoretical relationships within a practice model and clear identification of key intervening variables will enable researchers to better connect the treatment with the outcome.

  13. POLUTE. Forest Air Pollutant Uptake Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, C.E. Jr.; Sinclair, T.R.

    1992-02-13

    POLUTE is a computer model designed to estimate the uptake of air pollutants by forests. The model utilizes submodels to describe atmospheric diffusion immediately above and within the canopy, and into the sink areas within or on the trees. The program implementing the model is general and can be used, with only minor changes, for any gaseous pollutant. The model provides an estimate describing the response of the vegetation-atmosphere system to the environment as related to three types of processes: atmospheric diffusion, diffusion near and inside the absorbing plant, and the physical and chemical processes at the sink on or within the plant.

  14. A Process Model of Principal Selection.

    ERIC Educational Resources Information Center

    Flanigan, J. L.; And Others

    A process model to assist school district superintendents in the selection of principals is presented in this paper. Components of the process are described, which include developing an action plan, formulating an explicit job description, advertising, assessing candidates' philosophy, conducting interview analyses, evaluating response to stress,…

  15. Managing Risk in Mobile Applications with Formal Security Policies

    DTIC Science & Technology

    2013-04-01

    Alternatively, Breaux and Powers (2009) found the Business Process Modeling Notation (BPMN), a declarative language for describing business processes, to be... the Business Process Execution Language (BPEL), preferred as the candidate formal semantics for BPMN, only works for limited classes of BPMN models

  16. Modeling transport kinetics in clinoptilolite-phosphate rock systems

    NASA Technical Reports Server (NTRS)

    Allen, E. R.; Ming, D. W.; Hossner, L. R.; Henninger, D. L.

    1995-01-01

    Nutrient release in clinoptilolite-phosphate rock (Cp-PR) systems occurs through dissolution and cation-exchange reactions. Investigating the kinetics of these reactions expands our understanding of nutrient release processes. Research was conducted to model transport kinetics of nutrient release in Cp-PR systems. The objectives were to identify empirical models that best describe NH4, K, and P release and define diffusion-controlling processes. Materials included a Texas clinoptilolite (Cp) and North Carolina phosphate rock (PR). A continuous-flow thin-disk technique was used. Models evaluated included zero order, first order, second order, parabolic diffusion, simplified Elovich, Elovich, and power function. The power-function, Elovich, and parabolic-diffusion models adequately described NH4, K, and P release. The power-function model was preferred because of its simplicity. Models indicated nutrient release was diffusion controlled. Primary transport processes controlling nutrient release for the time span observed were probably the result of a combination of several interacting transport mechanisms.
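For instance, the power-function model q = a·t^b can be fitted by a simple log-log linear regression; the sketch below uses synthetic data with assumed parameter values, not the paper's measurements:

```python
import numpy as np

# Synthetic cumulative-release data following q = a * t**b
a_true, b_true = 2.0, 0.4
t = np.linspace(1.0, 100.0, 50)       # time (hypothetical units)
q = a_true * t**b_true                # released nutrient

# Linearize: ln q = ln a + b ln t, then fit by ordinary least squares.
# (The Elovich model is likewise linear in ln t, which is one reason
# both families can describe the same release data similarly well.)
b_fit, ln_a = np.polyfit(np.log(t), np.log(q), 1)
a_fit = np.exp(ln_a)
```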

  17. Comparative Protein Structure Modeling Using MODELLER

    PubMed Central

    Webb, Benjamin; Sali, Andrej

    2016-01-01

    Comparative protein structure modeling predicts the three-dimensional structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and how to use the ModBase database of such models, and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:27322406

  18. Modelling atmospheric transport of α-hexachlorocyclohexane in the Northern Hemisphere with a 3-D dynamical model: DEHM-POP

    NASA Astrophysics Data System (ADS)

    Hansen, K. M.; Christensen, J. H.; Brandt, J.; Frohn, L. M.; Geels, C.

    2004-07-01

    The Danish Eulerian Hemispheric Model (DEHM) is a 3-D dynamical atmospheric transport model originally developed to describe the atmospheric transport of sulphur into the Arctic. A new version of the model, DEHM-POP, developed to study the atmospheric transport and environmental fate of persistent organic pollutants (POPs) is presented. During environmental cycling, POPs can be deposited and re-emitted several times before reaching a final destination. A description of the exchange processes between the land/ocean surfaces and the atmosphere is included in the model to account for this multi-hop transport. The α-isomer of the pesticide hexachlorocyclohexane (α-HCH) is used as tracer in the model development. The structure of the model and processes included are described in detail. The results from a model simulation showing the atmospheric transport for the years 1991 to 1998 are presented and evaluated against measurements. The annual averaged atmospheric concentration of α-HCH for the 1990s is well described by the model; however, the shorter-term average concentration for most of the stations is not well captured. This indicates that the present simple surface description needs to be refined to get a better description of the air-surface exchange processes of POPs.

  19. The drift diffusion model as the choice rule in reinforcement learning.

    PubMed

    Pedersen, Mads Lund; Frank, Michael J; Biele, Guido

    2017-08-01

    Current reinforcement-learning models often assume simplified decision processes that do not fully reflect the dynamic complexities of choice processes. Conversely, sequential-sampling models of decision making account for both choice accuracy and response time, but assume that decisions are based on static decision values. To combine these two computational models of decision making and learning, we implemented reinforcement-learning models in which the drift diffusion model describes the choice process, thereby capturing both within- and across-trial dynamics. To exemplify the utility of this approach, we quantitatively fit data from a common reinforcement-learning paradigm using hierarchical Bayesian parameter estimation, and compared model variants to determine whether they could capture the effects of stimulant medication in adult patients with attention-deficit hyperactivity disorder (ADHD). The model with the best relative fit provided a good description of the learning process, choices, and response times. A parameter recovery experiment showed that the hierarchical Bayesian modeling approach enabled accurate estimation of the model parameters. The model approach described here, using simultaneous estimation of reinforcement-learning and drift diffusion model parameters, shows promise for revealing new insights into the cognitive and neural mechanisms of learning and decision making, as well as the alteration of such processes in clinical groups.
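The combination can be sketched in a few lines: a drift-diffusion process generates each choice and response time, with the drift rate scaled by the current Q-value difference. The scaling and parameter values below are illustrative assumptions, not the fitted hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(1)

def ddm_choice(drift, threshold=1.0, dt=0.005, noise=1.0):
    """One drift-diffusion decision: returns (choice, response time)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x > 0 else 0), t

q = np.zeros(2)                  # action values
alpha, scale = 0.1, 2.0          # learning rate, drift scaling (assumed)
p_reward = np.array([0.2, 0.8])  # arm reward probabilities (assumed task)
for _ in range(200):
    choice, rt = ddm_choice(scale * (q[1] - q[0]))
    reward = float(rng.random() < p_reward[choice])
    q[choice] += alpha * (reward - q[choice])   # delta-rule update
```

Each trial thus yields a choice, a response time, and a learning update, which is what lets the joint model be fitted to accuracy and RT data simultaneously.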

  20. The drift diffusion model as the choice rule in reinforcement learning

    PubMed Central

    Frank, Michael J.

    2017-01-01

    Current reinforcement-learning models often assume simplified decision processes that do not fully reflect the dynamic complexities of choice processes. Conversely, sequential-sampling models of decision making account for both choice accuracy and response time, but assume that decisions are based on static decision values. To combine these two computational models of decision making and learning, we implemented reinforcement-learning models in which the drift diffusion model describes the choice process, thereby capturing both within- and across-trial dynamics. To exemplify the utility of this approach, we quantitatively fit data from a common reinforcement-learning paradigm using hierarchical Bayesian parameter estimation, and compared model variants to determine whether they could capture the effects of stimulant medication in adult patients with attention-deficit hyperactivity disorder (ADHD). The model with the best relative fit provided a good description of the learning process, choices, and response times. A parameter recovery experiment showed that the hierarchical Bayesian modeling approach enabled accurate estimation of the model parameters. The model approach described here, using simultaneous estimation of reinforcement-learning and drift diffusion model parameters, shows promise for revealing new insights into the cognitive and neural mechanisms of learning and decision making, as well as the alteration of such processes in clinical groups. PMID:27966103

  1. Consulting Basics for the Teacher-Turned-Technology Consultant.

    ERIC Educational Resources Information Center

    Stager, Sue; Green, Kathy

    1988-01-01

    Discusses the role of educational technology consultants who may be classroom teachers with no formal training in consulting. Consulting models are described, including content-oriented and process-oriented approaches; Schein's process facilitator model is examined; and Kurpius' consulting model is explained and expanded. (LRW)

  2. Current and Future Flight Operating Systems

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan

    2007-01-01

    This viewgraph presentation reviews the real-time operating system (RTOS) types in use in current flight systems. A new RTOS model, the process model, is described. Included is a review of the challenges of migrating from a classic RTOS to the process model.

  3. Lecturing and Loving It: Applying the Information-Processing Model.

    ERIC Educational Resources Information Center

    Parker, Jonathan K.

    1993-01-01

    Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)

  4. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  5. Unimolecular decomposition reactions at low-pressure: A comparison of competitive methods

    NASA Technical Reports Server (NTRS)

    Adams, G. F.

    1980-01-01

    The lack of a simple rate-coefficient expression describing the pressure and temperature dependence hampers chemical modeling of flame systems. Recently developed simplified models of unimolecular processes allow the calculation of rate constants for thermal unimolecular reactions and recombinations at the low-pressure limit, at the high-pressure limit, and in the intermediate fall-off region. A comparison between two different applications of Troe's simplified model, and a comparison between the simplified model and the classic RRKM theory, are described.
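A sketch of the kind of expression involved: the Lindemann fall-off form with a simplified Troe broadening factor. The parameter values below are illustrative, and the full Troe model adds further fitted terms beyond this one-parameter center-broadening factor:

```python
import numpy as np

def falloff_rate(k0, kinf, M, Fc=0.6):
    """Pressure-dependent rate constant in a simplified Troe form.

    k0: low-pressure-limit rate coefficient (per unit [M]);
    kinf: high-pressure limit; M: bath-gas concentration;
    Fc: broadening factor (Fc = 1 recovers the Lindemann expression).
    """
    pr = k0 * M / kinf                                  # reduced pressure
    log_f = np.log10(Fc) / (1.0 + np.log10(pr) ** 2)    # center broadening
    return kinf * (pr / (1.0 + pr)) * 10.0 ** log_f
```

The expression reduces to k0·[M] at low pressure and to kinf at high pressure, with the broadening factor depressing the rate in the intermediate fall-off region.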

  6. MULTI: a shared memory approach to cooperative molecular modeling.

    PubMed

    Darden, T; Johnson, P; Smith, H

    1991-03-01

    A general purpose molecular modeling system, MULTI, based on the UNIX shared memory and semaphore facilities for interprocess communication is described. In addition to the normal querying or monitoring of geometric data, MULTI also provides processes for manipulating conformations, and for displaying peptide or nucleic acid ribbons, Connolly surfaces, close nonbonded contacts, crystal-symmetry related images, least-squares superpositions, and so forth. This paper outlines the basic techniques used in MULTI to ensure cooperation among these specialized processes, and then describes how they can work together to provide a flexible modeling environment.

  7. Phylogenetic mixtures and linear invariants for equal input models.

    PubMed

    Casanellas, Marta; Steel, Mike

    2017-04-01

    The reconstruction of phylogenetic trees from molecular sequence data relies on modelling site substitutions by a Markov process, or a mixture of such processes. In general, allowing mixed processes can result in different tree topologies becoming indistinguishable from the data, even for infinitely long sequences. However, when the underlying Markov process supports linear phylogenetic invariants, then provided these are sufficiently informative, the identifiability of the tree topology can be restored. In this paper, we investigate a class of processes that support linear invariants once the stationary distribution is fixed, the 'equal input model'. This model generalizes the 'Felsenstein 1981' model (and thereby the Jukes-Cantor model) from four states to an arbitrary number of states (finite or infinite), and it can also be described by a 'random cluster' process. We describe the structure and dimension of the vector spaces of phylogenetic mixtures and of linear invariants for any fixed phylogenetic tree (and for all trees-the so called 'model invariants'), on any number n of leaves. We also provide a precise description of the space of mixtures and linear invariants for the special case of [Formula: see text] leaves. By combining techniques from discrete random processes and (multi-) linear algebra, our results build on a classic result that was first established by James Lake (Mol Biol Evol 4:167-191, 1987).
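Concretely, because each substitution event in the equal input model draws the new state from the stationary distribution π regardless of the current state, the transition matrix has a simple closed form. The sketch below assumes time scaled in expected substitution events, which is an assumed normalization:

```python
import numpy as np

def equal_input_P(pi, t):
    """Transition matrix of the equal input model after time t.

    Every event resamples the state from pi, so
    P(t) = exp(-t) * I + (1 - exp(-t)) * (each row equal to pi);
    with pi uniform over 4 states this is the Jukes-Cantor model,
    and for general pi over 4 states the Felsenstein 1981 model.
    """
    pi = np.asarray(pi, dtype=float)
    k = len(pi)
    return np.exp(-t) * np.eye(k) + (1.0 - np.exp(-t)) * np.tile(pi, (k, 1))
```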

  8. Teaching Mathematical Modelling: Demonstrating Enrichment and Elaboration

    ERIC Educational Resources Information Center

    Warwick, Jon

    2015-01-01

    This paper uses a series of models to illustrate one of the fundamental processes of model building--that of enrichment and elaboration. The paper describes how a problem context is given which allows a series of models to be developed from a simple initial model using a queuing theory framework. The process encourages students to think about the…

  9. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, and also for education, training and communication between experts from different domains, modeling business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are also considered.

  10. The Local Brewery: A Project for Use in Differential Equations Courses

    ERIC Educational Resources Information Center

    Starling, James K.; Povich, Timothy J.; Findlay, Michael

    2016-01-01

    We describe a modeling project designed for an ordinary differential equations (ODEs) course using first-order and systems of first-order differential equations to model the fermentation process in beer. The project aims to expose the students to the modeling process by creating and solving a mathematical model and effectively communicating their…

  11. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  12. A Synthetic Model of Mass Persuasion.

    ERIC Educational Resources Information Center

    Kneupper, Charles W.; Underwood, Willard A.

    Mass persuasion involves a message production process which significantly alters or reinforces an attitude, belief, or action of the members of a large, heterogeneous audience. A synthetic communication model for mass persuasion has been constructed which incorporates aspects of several models created to describe the process of effective…

  13. Aligning grammatical theories and language processing models.

    PubMed

    Lewis, Shevaun; Phillips, Colin

    2015-02-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second, how should we relate grammatical theories and language processing models to each other?

  14. Evolution of quantum-like modeling in decision making processes

    NASA Astrophysics Data System (ADS)

    Khrennikova, Polina

    2012-12-01

    The application of the mathematical formalism of quantum mechanics to model behavioral patterns in social science and economics is a novel and constantly emerging field. The aim of the so-called 'quantum-like' models is to model the decision making processes in a macroscopic setting, capturing the particular 'context' in which the decisions are taken. Several subsequent empirical findings proved that when making a decision people tend to violate the axioms of expected utility theory and Savage's Sure Thing principle, thus violating the law of total probability. A quantum probability formula was devised to describe more accurately the decision making processes. A next step in the development of QL-modeling in decision making was the application of the Schrödinger equation to describe the evolution of people's mental states. A shortcoming of the Schrödinger equation is its inability to capture the dynamics of an open system; the brain of the decision maker can be regarded as such, actively interacting with the external environment. Recently the master equation, by which quantum physics describes the process of decoherence as the result of interaction of the mental state with the environmental 'bath', was introduced for modeling human decision making. The external environment and memory can be referred to as a complex 'context' influencing the final decision outcomes. The master equation can be considered as a pioneering and promising apparatus for modeling the dynamics of decision making in different contexts.

  15. Atmospheric transport of persistent organic pollutants - development of a 3-d dynamical transport model covering the northern hemisphere

    NASA Astrophysics Data System (ADS)

    Hansen, K. M.; Christensen, J. H.; Geels, C.; Frohn, L. M.; Brandt, J.

    2003-04-01

    The Danish Eulerian Hemispheric Model (DEHM) is a 3-D dynamical atmospheric transport model originally developed to describe the atmospheric transport of sulphur, lead, and mercury to the Arctic. The model has been validated carefully for these compounds. A new version of DEHM is currently being developed to describe the atmospheric transport of persistent organic pollutants (POPs), which are toxic, lipophilic and bio-accumulating compounds showing great persistence in the environment. The model has a horizontal resolution of 150 km x 150 km and 18 vertical layers, and it is driven by meteorological data from the numerical weather prediction model MM5V2. During environmental cycling, POPs can be deposited and re-emitted several times before reaching a final destination. A description of the exchange processes between the land/ocean surfaces and the atmosphere is included in the model to account for this multi-hop transport. The present model version describes the atmospheric transport of the pesticide alpha-hexachlorocyclohexane (alpha-HCH). Other POPs may be included when proper data on emissions and physical-chemical parameters become available. The model processes and the first model results are presented. The atmospheric transport of alpha-HCH for the 1990s is well described by the model.

  16. Mechanistic modelling of fluidized bed drying processes of wet porous granules: a review.

    PubMed

    Mortier, Séverine Thérèse F C; De Beer, Thomas; Gernaey, Krist V; Remon, Jean Paul; Vervaet, Chris; Nopens, Ingmar

    2011-10-01

    Fluidized bed dryers are frequently used in industrial applications, including the pharmaceutical industry. The general incentives to develop mechanistic models for pharmaceutical processes are listed, and our vision of how this can be done for fluidized bed drying processes of wet granules is given. This review provides a basis for future mechanistic model development for the drying process of wet granules in pharmaceutical processes. It is intended for a broad audience with a varying level of knowledge of pharmaceutical processes and mathematical modelling. Mathematical models are powerful tools to gain process insight and eventually develop well-controlled processes. The level of detail embedded in such a model depends on the goal of the model. Several models have therefore been proposed in the literature and are reviewed here. The drying behaviour of a single granule, a porous particle, can be described using the continuum approach, the pore-network modelling method, or the shrinking wet-core approach. As granules dry at a rate dependent on the gas temperature, gas velocity, porosity, etc., the moisture content of a batch of granules will reside in a certain interval. Population Balance Modelling (PBM) offers a tool to describe the distribution of particle properties, which can be of interest for the application. PBM formulation and solution methods are therefore reviewed. In a fluidized bed, the granules show a fluidization pattern depending on the geometry of the gas inlet, the gas velocity, characteristics of the particles, the dryer design, etc. Computational Fluid Dynamics (CFD) makes it possible to model this behaviour. Moreover, turbulence can be modelled using several approaches: Reynolds-averaged Navier-Stokes equations (RANS) or Large Eddy Simulation (LES). Another important aspect of CFD is the choice between the Eulerian-Lagrangian and the Eulerian-Eulerian approach.
Finally, the PBM and CFD frameworks can be integrated, to describe the evolution of the moisture content of granules during fluidized bed drying. Copyright © 2011 Elsevier B.V. All rights reserved.
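As the simplest possible illustration of why batch moisture ends up distributed over an interval rather than at a single value, the toy model below assumes first-order drying kinetics with granule-to-granule rate variation; it is far below the mechanistic detail reviewed here and all values are assumed:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each granule i dries as dX_i/dt = -k_i * X_i; the rate k_i varies
# across the batch (e.g. with size and position in the bed), so the
# moisture content X spreads into a distribution over time.
n = 1000
k = rng.uniform(0.05, 0.15, n)    # per-granule drying rates (1/s), assumed
X = np.full(n, 0.3)               # initial moisture content (kg/kg), assumed
dt, steps = 1.0, 60
for _ in range(steps):
    X = X - dt * k * X            # explicit Euler step (dt*k < 1, stable)
```

A population balance model tracks exactly this kind of evolving distribution of a particle property, here the moisture content.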

  17. An Approach to the Evaluation of Hypermedia.

    ERIC Educational Resources Information Center

    Knussen, Christina; And Others

    1991-01-01

    Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…

  18. Molecular Modeling of Environmentally Important Processes: Reduction Potentials

    ERIC Educational Resources Information Center

    Lewis, Anne; Bumpus, John A.; Truhlar, Donald G.; Cramer, Christopher J.

    2004-01-01

    The increasing use of computational quantum chemistry in the modeling of environmentally important processes is described. The employment of computational quantum mechanics for the prediction of oxidation-reduction potential for solutes in an aqueous medium is discussed.

  19. Extending SME to Handle Large-Scale Cognitive Modeling.

    PubMed

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison must be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) Greedy merging rapidly constructs one or more best interpretations of a match in polynomial time: O(n² log n); (b) Incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) Ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) Structural evaluation of analogical inferences models aspects of plausibility judgments; (e) Match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these enable it to capture a broader range of psychological phenomena than before. Copyright © 2016 Cognitive Science Society, Inc.

  20. Designing Class Methods from Dataflow Diagrams

    NASA Astrophysics Data System (ADS)

    Shoval, Peretz; Kabeli-Shani, Judith

    A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of method design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.

  1. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    PubMed

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra, a technique from computer science, allows us to describe a system in terms of the stochastic behaviour of individuals. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing-scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
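For transmission of infection, the population-level equations that such an analysis derives from individual interaction rules are the familiar mean-field SIR system. A sketch of integrating those mean equations, with illustrative parameter values (the process-algebra derivation itself is not reproduced here):

```python
# Mean-field SIR equations derived from individual-level contact rules:
# dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I
def sir(beta, gamma, S0, I0, T, dt=0.01):
    """Explicit-Euler integration of the mean-field SIR system."""
    N = S0 + I0
    S, I, R = float(S0), float(I0), 0.0
    for _ in range(int(T / dt)):
        new_inf = beta * S * I / N * dt     # infections this step
        new_rec = gamma * I * dt            # recoveries this step
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
    return S, I, R

S, I, R = sir(beta=0.5, gamma=0.1, S0=990, I0=10, T=100)
```

The point of the scale change is that these deterministic trajectories approximate the mean of the underlying stochastic individual-based process in large populations.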

  2. A Model of the Creative Process Based on Quantum Physics and Vedic Science.

    ERIC Educational Resources Information Center

    Rose, Laura Hall

    1988-01-01

    Using tenets from Vedic science and quantum physics, this model of the creative process suggests that the unified field of creation is pure consciousness, and that the development of the creative process within individuals mirrors the creative process within the universe. Rational and supra-rational creative thinking techniques are also described.…

  3. Using transfer functions to quantify El Niño Southern Oscillation dynamics in data and models.

    PubMed

    MacMartin, Douglas G; Tziperman, Eli

    2014-09-08

    Transfer function tools commonly used in engineering control analysis can be used to better understand the dynamics of El Niño Southern Oscillation (ENSO), compare data with models and identify systematic model errors. The transfer function describes the frequency-dependent input-output relationship between any pair of causally related variables, and can be estimated from time series. This can be used first to assess whether the underlying relationship is or is not frequency dependent, and if so, to diagnose the underlying differential equations that relate the variables, and hence describe the dynamics of individual subsystem processes relevant to ENSO. Estimating process parameters allows the identification of compensating model errors that may lead to a seemingly realistic simulation in spite of incorrect model physics. This tool is applied here to the TAO array ocean data, the GFDL-CM2.1 and CCSM4 general circulation models, and to the Cane-Zebiak ENSO model. The delayed oscillator description is used to motivate a few relevant processes involved in the dynamics, although any other ENSO mechanism could be used instead. We identify several differences in the processes between the models and data that may be useful for model improvement. The transfer function methodology is also useful in understanding the dynamics and evaluating models of other climate processes.
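
    As a rough illustration of the estimation step, the frequency-dependent input-output relationship of a known first-order system can be recovered from a pair of synthetic time series (the system and parameters below are hypothetical; the paper applies the method to ENSO variables, not to this toy example):

```python
import cmath, math, random

def transfer_estimate(x, y, n_freq=8):
    """Raw transfer function estimate at the first n_freq DFT frequencies:
    H_k = S_xy(k) / S_xx(k) = Y_k * conj(X_k) / |X_k|^2."""
    n = len(x)
    H = []
    for k in range(1, n_freq + 1):
        Xk = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        Yk = sum(y[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        H.append(Yk * Xk.conjugate() / (Xk * Xk.conjugate()))
    return H

# Synthetic causally related pair: y obeys dy/dt = -a*y + x, discretised.
rng = random.Random(1)
n, a, dt = 512, 0.5, 0.1
x = [rng.gauss(0.0, 1.0) for _ in range(n)]
y = [0.0]
for t in range(1, n):
    y.append(y[-1] + dt * (-a * y[-1] + x[t - 1]))

H = transfer_estimate(x, y)
low_gain = sum(abs(h) for h in H[:3]) / 3    # gain at the lowest frequencies
high_gain = sum(abs(h) for h in H[5:]) / 3   # gain a little higher up
```

    The recovered low-frequency gain approaches 1/a, and the roll-off with frequency diagnoses the first-order differential equation relating the two variables, which is the kind of inference the paper applies to subsystem processes of ENSO.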

  4. Kinetic Modeling of Microbiological Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chongxuan; Fang, Yilin

    Kinetic description of microbiological processes is vital for the design and control of microbe-based biotechnologies such as wastewater treatment, petroleum oil recovery, and contaminant attenuation and remediation. Various models have been proposed to describe microbiological processes. This editorial article discusses the advantages and limitations of these modeling approaches, including traditional Monod-type models and their derivatives, and recently developed constraint-based approaches. The article also outlines future directions for modeling research best suited to petroleum and environmental biotechnologies.
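
    A minimal sketch of the traditional Monod-type description mentioned here, for a batch culture with a single limiting substrate (parameter values are illustrative assumptions):

```python
def monod_rates(x, s, mu_max, K_s, Y):
    """Monod kinetics: specific growth rate mu = mu_max * s / (K_s + s).
    Returns (dx/dt, ds/dt) for biomass x and substrate s with yield Y."""
    mu = mu_max * s / (K_s + s)
    return mu * x, -mu * x / Y

def batch_culture(x0, s0, mu_max=0.5, K_s=0.2, Y=0.4, dt=0.01, t_max=40.0):
    """Euler integration of a batch culture until t_max."""
    x, s, t = x0, s0, 0.0
    while t < t_max:
        dx, ds = monod_rates(x, s, mu_max, K_s, Y)
        x, s = x + dx * dt, max(s + ds * dt, 0.0)
        t += dt
    return x, s

x_end, s_end = batch_culture(0.05, 5.0)
```

    Biomass grows until the substrate is exhausted, with the final biomass fixed by the yield coefficient: x_end is approximately x0 + Y*(s0 - s_end).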

  5. Modelling the morphology of migrating bacterial colonies

    NASA Astrophysics Data System (ADS)

    Nishiyama, A.; Tokihiro, T.; Badoual, M.; Grammaticos, B.

    2010-08-01

    We present a model which aims at describing the morphology of colonies of Proteus mirabilis and Bacillus subtilis. Our model is based on a cellular automaton which is obtained by the adequate discretisation of a diffusion-like equation, describing the migration of the bacteria, to which we have added rules simulating the consolidation process. Our basic assumption, following the findings of the group of Chuo University, is that the migration and consolidation processes are controlled by the local density of the bacteria. We show that it is possible within our model to reproduce the morphological diagrams of both bacteria species. Moreover, we model some detailed experiments done by the Chuo University group, obtaining good agreement.
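
    A much-simplified one-dimensional sketch of the density-controlled migration/consolidation idea (the published model is a two-dimensional cellular automaton derived from a discretised diffusion equation; the lattice, threshold rule and parameters below are illustrative assumptions):

```python
import random

def step(active, frozen, threshold, rng):
    """One update of a 1-D lattice: each active bacterium hops to a random
    neighbour (discretised diffusion); sites whose local density (active +
    consolidated over a 3-cell neighbourhood) falls below `threshold`
    consolidate, i.e. their bacteria stop migrating."""
    n = len(active)
    new_active = [0] * n
    for i in range(n):
        for _ in range(active[i]):
            j = (i + rng.choice((-1, 0, 1))) % n
            new_active[j] += 1
    for i in range(n):
        local = sum(new_active[(i + d) % n] + frozen[(i + d) % n]
                    for d in (-1, 0, 1))
        if new_active[i] > 0 and local < threshold:
            frozen[i] += new_active[i]   # consolidation at low density
            new_active[i] = 0
    return new_active, frozen

rng = random.Random(42)
active = [0] * 41
active[20] = 200                 # inoculum at the centre
frozen = [0] * 41
for _ in range(30):
    active, frozen = step(active, frozen, 5, rng)
total = sum(active) + sum(frozen)
```

    The migrating front thins as it spreads, so consolidation occurs at the low-density fringe, which is the qualitative mechanism behind the concentric-ring and terrace morphologies the model reproduces.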

  6. POLUTE; forest air pollutant uptake model. [IBM360,370; CSMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, C.E.

    POLUTE is a computer model designed to estimate the uptake of air pollutants by forests. The model utilizes submodels to describe atmospheric diffusion immediately above and within the canopy, and into the sink areas within or on the trees. The program implementing the model is general and can be used, with only minor changes, for any gaseous pollutant. The model provides an estimate describing the response of the vegetation-atmosphere system to the environment as related to three types of processes: atmospheric diffusion, diffusion near and inside the absorbing plant, and the physical and chemical processes at the sink on or within the plant. IBM360,370; CSMP; OS/370.

  7. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part I. Model validation

    USDA-ARS?s Scientific Manuscript database

    Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...

  8. A perspective on modeling the multiscale response of energetic materials

    NASA Astrophysics Data System (ADS)

    Rice, Betsy M.

    2017-01-01

    The response of an energetic material to insult is perhaps one of the most difficult processes to model due to concurrent chemical and physical phenomena occurring over scales ranging from atomistic to continuum. Unraveling the interdependencies of these complex processes across the scales through modeling can only be done within a multiscale framework. In this paper, I will describe progress in the development of a predictive, experimentally validated multiscale reactive modeling capability for energetic materials at the Army Research Laboratory. I will also describe new challenges and research opportunities that have arisen in the course of our development which should be pursued in the future.

  9. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
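
    The marked point process idea can be sketched as follows: fracture centres are drawn from a homogeneous Poisson point process over the region, and each centre carries marks such as length and orientation drawn from their respective probability distributions (the distributions and parameters below are illustrative assumptions, not those of the package; window sampling is shown as one of the virtual sampling tools):

```python
import math, random

def poisson_sample(lam, rng):
    """Poisson(lam) count via Knuth's multiplicative method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_fractures(intensity, region, rng):
    """Homogeneous Poisson process for fracture centres in a rectangle,
    with exponential lengths and von Mises orientations as marks."""
    (x0, x1), (y0, y1) = region
    n = poisson_sample(intensity * (x1 - x0) * (y1 - y0), rng)
    fractures = []
    for _ in range(n):
        cx = rng.uniform(x0, x1)
        cy = rng.uniform(y0, y1)
        length = rng.expovariate(1.0 / 2.0)       # mean length 2
        theta = rng.vonmisesvariate(0.0, 2.0)     # preferred orientation
        fractures.append((cx, cy, length, theta))
    return fractures

rng = random.Random(7)
fracs = simulate_fractures(0.5, ((0.0, 20.0), (0.0, 20.0)), rng)
# Virtual window sampling: fractures whose centres fall in a 10 x 10 window
in_window = [f for f in fracs if 5.0 <= f[0] <= 15.0 and 5.0 <= f[1] <= 15.0]
```

    Non-homogeneous, cluster or Cox processes replace the constant intensity with a spatially varying or random one; the marks are then analysed with the statistical tools (histograms, rose diagrams) the paper describes.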

  10. An Index and Test of Linear Moderated Mediation.

    PubMed

    Hayes, Andrew F

    2015-01-01

    I describe a test of linear moderated mediation in path analysis based on an interval estimate of the parameter of a function linking the indirect effect to values of a moderator, a parameter that I call the index of moderated mediation. This test can be used for models that integrate moderation and mediation in which the relationship between the indirect effect and the moderator is estimated as linear, including many of the models described by Edwards and Lambert (2007) and Preacher, Rucker, and Hayes (2007) as well as extensions of these models to processes involving multiple mediators operating in parallel or in serial. Generalization of the method to latent variable models is straightforward. Three empirical examples describe the computation of the index and the test, and its implementation is illustrated using Mplus and the PROCESS macro for SPSS and SAS.
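
    A sketch of how the index is computed for a first-stage moderated mediation model (simulated data and coefficient values are hypothetical; in practice one would use Mplus or PROCESS, with a bootstrap interval for inference): fit the mediator model M = a0 + a1*X + a2*W + a3*XW and the outcome model Y = b0 + c'*X + b*M; the indirect effect is (a1 + a3*W)*b, linear in W, and the index of moderated mediation is its slope a3*b.

```python
import random

def ols(X, y):
    """Ordinary least squares via normal equations (Gauss-Jordan)."""
    m, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(m)) for q in range(k)]
         + [sum(X[i][p] * y[i] for i in range(m))] for p in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [A[r][j] - f * A[c][j] for j in range(k + 1)]
    return [A[i][k] / A[i][i] for i in range(k)]

# Simulated data with true a3 = 0.3 and true b = 0.5 (index = 0.15)
rng = random.Random(3)
data = []
for _ in range(2000):
    x, w = rng.gauss(0, 1), rng.gauss(0, 1)
    mv = 0.4 * x + 0.2 * w + 0.3 * x * w + rng.gauss(0, 1)
    yv = 0.5 * mv + 0.1 * x + rng.gauss(0, 1)
    data.append((x, w, mv, yv))

a = ols([[1.0, d[0], d[1], d[0] * d[1]] for d in data], [d[2] for d in data])
b = ols([[1.0, d[0], d[2]] for d in data], [d[3] for d in data])
index = a[3] * b[2]   # index of moderated mediation = a3 * b
```

    The test then asks whether a confidence interval for this product excludes zero, rather than probing conditional indirect effects at arbitrary values of W.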

  11. A Control Theory Model of Smoking

    PubMed Central

    Bobashev, Georgiy; Holloway, John; Solano, Eric; Gutkin, Boris

    2017-01-01

    We present a heuristic control theory model that describes smoking under restricted and unrestricted access to cigarettes. The model is based on the allostasis theory and uses a formal representation of a multiscale opponent process. The model simulates smoking behavior of an individual and produces both short-term (“loading up” after not smoking for a while) and long-term smoking patterns (e.g., gradual transition from a few cigarettes to one pack a day). By introducing a formal representation of withdrawal- and craving-like processes, the model produces gradual increases over time in withdrawal- and craving-like signals associated with abstinence and shows that after 3 months of abstinence, craving disappears. The model was programmed as a computer application allowing users to select simulation scenarios. The application links images of brain regions that are activated during the binge/intoxication, withdrawal, or craving with corresponding simulated states. The model was calibrated to represent smoking patterns described in peer-reviewed literature; however, it is generic enough to be adapted to other drugs, including cocaine and opioids. Although the model does not mechanistically describe specific neurobiological processes, it can be useful in prevention and treatment practices as an illustration of drug-using behaviors and expected dynamics of withdrawal and craving during abstinence. PMID:28868531
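
    The short- and long-term patterns described can be caricatured by a generic two-timescale opponent process (a textbook opponent-process sketch with made-up constants, not the authors' allostasis model): a fast primary response to the drug drives a slow opposing process, and the net signal is their difference.

```python
def opponent_process(doses, dt=1.0, tau_a=2.0, tau_b=20.0):
    """Generic opponent process: fast primary response `a` to the stimulus,
    slow opposing process `b` driven by `a`; net affective signal = a - b.
    `doses` is the stimulus input per time step."""
    a = b = 0.0
    net = []
    for u in doses:
        a += dt * (-a / tau_a + u)
        b += dt * (-b + a) / tau_b
        net.append(a - b)
    return net

# 200 steps of regular dosing followed by 400 steps of abstinence
signal = opponent_process([1.0] * 200 + [0.0] * 400)
```

    Early in dosing the net signal is positive ("loading up"); under sustained use it habituates toward zero; on abstinence it dips negative (a withdrawal/craving-like signal) and then slowly recovers toward baseline, qualitatively matching the dynamics the model is built to display.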

  12. Modeling Narrative Discourse

    ERIC Educational Resources Information Center

    Elson, David K.

    2012-01-01

    This thesis describes new approaches to the formal modeling of narrative discourse. Although narratives of all kinds are ubiquitous in daily life, contemporary text processing techniques typically do not leverage the aspects that separate narrative from expository discourse. We describe two approaches to the problem. The first approach considers…

  13. Modeling the Fluid Withdraw and Injection Induced Earthquakes

    NASA Astrophysics Data System (ADS)

    Meng, C.

    2016-12-01

    We present an open source numerical code, Defmod, that allows one to model induced seismicity in an efficient and standalone manner. Fluid withdrawal and injection induced earthquakes have been a great concern to industries including oil/gas, wastewater disposal and CO2 sequestration, and the ability to numerically model the induced seismicity is long desired. To do that, one has to consider at least two processes: a steady process that describes the inducing and aseismic stages before and in between the seismic events, and an abrupt process that describes the dynamic fault rupture accompanied by seismic energy radiation during the events. The steady process can be adequately modeled by a quasi-static model, while the abrupt process has to be modeled by a dynamic model. In most published modeling works, only one of these processes is considered. Geomechanicists and reservoir engineers focus more on quasi-static modeling, whereas geophysicists and seismologists focus more on dynamic modeling. The finite element code Defmod combines these two models into a hybrid model that uses the failure criterion and frictional laws to adaptively switch between the (quasi-)static and dynamic states. The code is capable of modeling episodic fault rupture driven by quasi-static loading, e.g. due to reservoir fluid withdrawal and/or injection, and by dynamic loading, e.g. due to preceding earthquakes. We demonstrate a case study of the 2013 Azle earthquake.

  14. A Taoist Paradigm of EAP Consultation.

    ERIC Educational Resources Information Center

    Gerstein, Lawrence H.; Sturmer, Paul

    1993-01-01

    Describes new Taoist model as alternative approach to conceptualizing consultation process and to formulating successful, isomorphic interventions constructed to facilitate four change processes. Presents model stressing importance of interrelationships between individuals and groups; integrating repulsion and assimilation forces; balancing human…

  15. Three dimensional geometric modeling of processing-tomatoes

    USDA-ARS?s Scientific Manuscript database

    Characterizing tomato geometries with different shapes and sizes would facilitate the design of tomato processing equipments and promote computer-based engineering simulations. This research sought to develop a three-dimensional geometric model that can describe the morphological attributes of proce...

  16. Constructing an Integrated and Evidenced-Based Model for Residential Services

    ERIC Educational Resources Information Center

    Metzger, Jed

    2006-01-01

    There is a paucity, in both the literature and in practice, of integrated, evidence-based models of residential care for youth. This article describes the assessment and the process that led to the redesign of services at a residential center. The article describes how evidence-based models for each of the four major disciplines (residential…

  17. Toward Multiscale Models of Cyanobacterial Growth: A Modular Approach

    PubMed Central

    Westermark, Stefanie; Steuer, Ralf

    2016-01-01

    Oxygenic photosynthesis has dominated global primary productivity ever since its evolution more than three billion years ago. While many aspects of phototrophic growth are well understood, it remains a considerable challenge to elucidate the manifold dependencies and interconnections between the diverse cellular processes that together facilitate the synthesis of new cells. Phototrophic growth involves the coordinated action of several layers of cellular functioning, ranging from the photosynthetic light reactions and the electron transport chain, to carbon-concentrating mechanisms and the assimilation of inorganic carbon. It requires the synthesis of new building blocks by cellular metabolism, protection against excessive light, as well as diurnal regulation by a circadian clock and the orchestration of gene expression and cell division. Computational modeling allows us to quantitatively describe these cellular functions and processes relevant for phototrophic growth. As yet, however, computational models are mostly confined to the inner workings of individual cellular processes, rather than describing the manifold interactions between them in the context of a living cell. Using cyanobacteria as model organisms, this contribution seeks to summarize existing computational models relevant for describing phototrophic growth and to outline their interactions and dependencies. Our ultimate aim is to understand cellular functioning and growth as the outcome of a coordinated operation of diverse yet interconnected cellular processes. PMID:28083530

  18. A pivotal-based approach for enterprise business process and IS integration

    NASA Astrophysics Data System (ADS)

    Ulmer, Jean-Stéphane; Belaud, Jean-Pierre; Le Lann, Jean-Marc

    2013-02-01

    A company must be able to describe and react against any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless a BPM approach highlights complex relations between business and IT domains. A non-alignment is exposed between heterogeneous models: this is the 'business-IT gap' as described in the literature. Through concepts from business engineering and information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantic of models. Allowing the full return of a transformed model in the sense of reverse engineering, our platform enables synchronisation between analysis model and implementation model.

  19. A Cognitive Processing Account of Individual Differences in Novice Logo Programmers' Conceptualisation and Use of Recursion.

    ERIC Educational Resources Information Center

    Gibbons, Pamela

    1995-01-01

    Describes a study that investigated individual differences in the construction of mental models of recursion in LOGO programming. The learning process was investigated from the perspective of Norman's mental models theory and employed diSessa's ontology regarding distributed, functional, and surrogate mental models, and the Luria model of brain…

  20. Service-based analysis of biological pathways

    PubMed Central

    Zheng, George; Bouguettaya, Athman

    2009-01-01

    Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying users' hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable the user to formulate hypotheses, which can be tested using our simulation algorithm, also described in this paper. PMID:19796403

  1. Rethinking Design Process: Using 3D Digital Models as an Interface in Collaborative Session

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes a pilot study for an alternative design process by integrating a designer-user collaborative session with digital models. The collaborative session took place in a 3D AutoCAD class for a real world project. The 3D models served as an interface for designer-user collaboration during the design process. Students not only learned…

  2. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    NASA Technical Reports Server (NTRS)

    Patterson, Jonathan D.; Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and their linkage to design. As a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  3. Tools for evaluating Veterinary Services: an external auditing model for the quality assurance process.

    PubMed

    Melo, E Correa

    2003-08-01

    The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.

  4. Two-part models with stochastic processes for modelling longitudinal semicontinuous data: Computationally efficient inference and modelling the overall marginal mean.

    PubMed

    Yiu, Sean; Tom, Brian Dm

    2017-01-01

    Several researchers have described two-part models with patient-specific stochastic processes for analysing longitudinal semicontinuous data. In theory, such models can offer greater flexibility than the standard two-part model with patient-specific random effects. However, in practice, the high dimensional integrations involved in the marginal likelihood (i.e. integrated over the stochastic processes) significantly complicates model fitting. Thus, non-standard computationally intensive procedures based on simulating the marginal likelihood have so far only been proposed. In this paper, we describe an efficient method of implementation by demonstrating how the high dimensional integrations involved in the marginal likelihood can be computed efficiently. Specifically, by using a property of the multivariate normal distribution and the standard marginal cumulative distribution function identity, we transform the marginal likelihood so that the high dimensional integrations are contained in the cumulative distribution function of a multivariate normal distribution, which can then be efficiently evaluated. Hence, maximum likelihood estimation can be used to obtain parameter estimates and asymptotic standard errors (from the observed information matrix) of model parameters. We describe our proposed efficient implementation procedure for the standard two-part model parameterisation and when it is of interest to directly model the overall marginal mean. The methodology is applied on a psoriatic arthritis data set concerning functional disability.
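
    The key computational idea, containing the high dimensional integration in a multivariate normal cumulative distribution function that can be evaluated efficiently, can be illustrated in the bivariate case by the standard conditioning identity (a generic numerical sketch, not the authors' implementation):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bvn_cdf(u1, u2, rho, n=4000, lo=-8.0):
    """P(Z1 <= u1, Z2 <= u2) for a standard bivariate normal with
    correlation rho, by conditioning on Z1:
    integral of phi(z) * Phi((u2 - rho*z)/sqrt(1-rho^2)) dz over (-inf, u1],
    evaluated with the trapezoidal rule."""
    if abs(rho) >= 1.0:
        raise ValueError("need |rho| < 1")
    s = math.sqrt(1.0 - rho * rho)
    h = (u1 - lo) / n
    total = 0.0
    for i in range(n + 1):
        z = lo + i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoidal end weights
        total += w * norm_pdf(z) * norm_cdf((u2 - rho * z) / s)
    return total * h
```

    A convenient check is the closed form P(Z1 <= 0, Z2 <= 0) = 1/4 + arcsin(rho)/(2*pi); in higher dimensions the same reduction is what allows the marginal likelihood to be maximised directly rather than simulated.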

  5. Gyrofluid Modeling of Turbulent, Kinetic Physics

    NASA Astrophysics Data System (ADS)

    Despain, Kate Marie

    2011-12-01

    Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetics models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E x B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.

  6. Modeling games from the 20th century

    PubMed Central

    Killeen, P.R.

    2008-01-01

    A scientific framework is described in which scientists are cast as problem-solvers, and problems as solved when data are mapped to models. This endeavor is limited by finite attentional capacity which keeps depth of understanding complementary to breadth of vision; and which distinguishes the process of science from its products, scientists from scholars. All four aspects of explanation described by Aristotle trigger, function, substrate, and model are required for comprehension. Various modeling languages are described, ranging from set theory to calculus of variations, along with exemplary applications in behavior analysis. PMID:11369459

  7. The Process of Change in Higher Education Institutions. AAHE-ERIC/Higher Education Research Report, No. 7, 1982.

    ERIC Educational Resources Information Center

    Nordvall, Robert C.

    Conditions that inhibit change in higher education institutions and various models of the change process are described. Attention is also directed to: organizational character, structural features, planning procedures, key individuals in the change process, and practical advice about change. The major change models for higher education…

  8. Beyond ROC Curvature: Strength Effects and Response Time Data Support Continuous-Evidence Models of Recognition Memory

    ERIC Educational Resources Information Center

    Dube, Chad; Starns, Jeffrey J.; Rotello, Caren M.; Ratcliff, Roger

    2012-01-01

    A classic question in the recognition memory literature is whether retrieval is best described as a continuous-evidence process consistent with signal detection theory (SDT), or a threshold process consistent with many multinomial processing tree (MPT) models. Because receiver operating characteristics (ROCs) based on confidence ratings are…

  9. Parallel Mechanisms of Sentence Processing: Assigning Roles to Constituents of Sentences.

    ERIC Educational Resources Information Center

    McClelland, James L.; Kawamoto, Alan H.

    This paper describes and illustrates a simulation model for the processing of grammatical elements in a sentence, focusing on one aspect of sentence comprehension: the assignment of the constituent elements of a sentence to the correct thematic case roles. The model addresses questions about sentence processing from a perspective very different…

  10. Technology for Transient Simulation of Vibration during Combustion Process in Rocket Thruster

    NASA Astrophysics Data System (ADS)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.

    2018-01-01

    The article describes a technology for the simulation of transient combustion processes in a rocket thruster, used to determine the vibration frequencies that occur during combustion. The engine operates on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found, and the generation of the Flamelet library with CFX-RIF is described. A technique for modeling transient combustion processes in the rocket thruster is proposed based on the Flamelet library. A cyclic irregularity of the temperature field, resembling vortex core precession, was detected in the chamber. The frequency of flame precession was obtained with the proposed simulation technique.

  11. Simple model of inhibition of chain-branching combustion processes

    NASA Astrophysics Data System (ADS)

    Babushok, Valeri I.; Gubernov, Vladimir V.; Minaev, Sergei S.; Miroshnichenko, Taisia P.

    2017-11-01

    A simple kinetic model has been suggested to describe the inhibition and extinction of flame propagation in reaction systems with chain-branching reactions typical for hydrocarbon systems. The model is based on the generalised model of the combustion process with chain-branching reaction combined with the one-stage reaction describing the thermal mode of flame propagation with the addition of inhibition reaction steps. Inhibitor addition suppresses the radical overshoot in flame and leads to the change of reaction mode from the chain-branching reaction to a thermal mode of flame propagation. With the increase of inhibitor the transition of chain-branching mode of reaction to the reaction with straight-chains (non-branching chain reaction) is observed. The inhibition part of the model includes a block of three reactions to describe the influence of the inhibitor. The heat losses are incorporated into the model via Newton cooling. The flame extinction is the result of the decreased heat release of inhibited reaction processes and the suppression of radical overshoot with the further decrease of the reaction rate due to the temperature decrease and mixture dilution. A comparison of the results of modelling laminar premixed methane/air flames inhibited by potassium bicarbonate (gas phase model, detailed kinetic model) with the results obtained using the suggested simple model is presented. The calculations with the detailed kinetic model demonstrate the following modes of combustion process: (1) flame propagation with chain-branching reaction (with radical overshoot, inhibitor addition decreases the radical overshoot down to the equilibrium level); (2) saturation of chemical influence of inhibitor, and (3) transition to thermal mode of flame propagation (non-branching chain mode of reaction). The suggested simple kinetic model qualitatively reproduces the modes of flame propagation with the addition of the inhibitor observed using detailed kinetic models.
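
    The transition from branching-chain to straight-chain behaviour with increasing inhibitor loading can be caricatured by a single rate balance for the radical pool (rate constants below are made up; the paper's model has separate chain-branching, thermal and three-reaction inhibition blocks plus Newton cooling): the radical pool grows only while branching outruns termination plus inhibition, so beyond the critical loading (k_branch - k_term)/k_inhib the radical overshoot is suppressed entirely.

```python
def simulate_radicals(inhibitor, k_branch=5.0, k_term=1.0, k_inhib=2.0,
                      init=1e-3, dt=1e-3, t_max=15.0):
    """Radical pool r under first-order branching, termination and
    inhibition, with a (1 - r) saturation factor standing in for fuel
    depletion. Returns the peak radical level reached."""
    r, t, peak = init, 0.0, init
    while t < t_max:
        growth = (k_branch * (1.0 - r) - k_term - k_inhib * inhibitor) * r
        r = max(r + growth * dt, 0.0)
        peak = max(peak, r)
        t += dt
    return peak

p_free = simulate_radicals(0.0)   # branching mode: large radical pool
p_part = simulate_radicals(1.5)   # partial inhibition: reduced pool
p_full = simulate_radicals(2.5)   # beyond critical loading: decay only
```

    The three regimes mirror the modes listed above: a radical overshoot without inhibitor, a reduced pool as inhibitor is added, and no growth at all once the chain-branching contribution is fully balanced.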

  12. Ecology of Yersinia pestis and the Epidemiology of Plague.

    PubMed

    Dubyanskiy, Vladimir M; Yeszhanov, Aidyn B

    2016-01-01

    This chapter summarizes information about the natural foci of plague in the world. We describe the location, main hosts, and vectors of Yersinia pestis. The ecological features of the hosts and vectors of plague are listed, including predators (birds and mammals) and their role in epizootics. The epizootic process in plague and the factors affecting the dynamics of epizootic activity of natural foci of Y. pestis are described in detail. Mathematical models of the epizootic process in plague and predictive models are briefly described. The most comprehensive list of the hosts and vectors of Y. pestis in the world is presented as well.

  13. Software algorithms for false alarm reduction in LWIR hyperspectral chemical agent detection

    NASA Astrophysics Data System (ADS)

    Manolakis, D.; Model, J.; Rossacci, M.; Zhang, D.; Ontiveros, E.; Pieper, M.; Seeley, J.; Weitz, D.

    2008-04-01

    The long-wave infrared (LWIR) hyperspectral sensing modality is often used for the detection and identification of chemical warfare agents (CWAs), a problem that arises in both military and civilian situations. The inherent nature and complexity of background clutter dictate a need for sophisticated and robust statistical models, which are then used in the design of optimum signal processing algorithms that provide the best exploitation of hyperspectral data to ultimately make decisions on the absence or presence of potentially harmful CWAs. This paper describes the basic elements of an automated signal processing pipeline developed at MIT Lincoln Laboratory. In addition to describing this signal processing architecture in detail, we briefly describe the key signal models that form the foundation of these algorithms as well as some spatial processing techniques used for false alarm mitigation. Finally, we apply this processing pipeline to real data measured by the Telops FIRST hyperspectral sensor to demonstrate its practical utility for the user community.

  14. The coalescent process in models with selection and recombination.

    PubMed

    Hudson, R R; Kaplan, N L

    1988-11-01

    The statistical properties of the process describing the genealogical history of a random sample of genes at a selectively neutral locus which is linked to a locus at which natural selection operates are investigated. It is found that the equations describing this process are simple modifications of the equations describing the process assuming that the two loci are completely linked. Thus, the statistical properties of the genealogical process for a random sample at a neutral locus linked to a locus with selection follow from the results obtained for the selected locus. Sequence data from the alcohol dehydrogenase (Adh) region of Drosophila melanogaster are examined and compared to predictions based on the theory. It is found that the spatial distribution of nucleotide differences between Fast and Slow alleles of Adh is very similar to the spatial distribution predicted if balancing selection operates to maintain the allozyme variation at the Adh locus. The spatial distribution of nucleotide differences between different Slow alleles of Adh does not match the predictions of this simple model very well.
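    The genealogical (coalescent) process underlying these results is easy to simulate in its neutral form: with k lineages present, the waiting time to the next coalescence is exponential with rate k(k-1)/2 in coalescent time units. A minimal sketch, without the selection or recombination modifications developed in the paper:

    ```python
    import random

    def tmrca(n, rng):
        """Time to most recent common ancestor of an n-sample (coalescent units)."""
        t, k = 0.0, n
        while k > 1:
            # with k lineages, the coalescence rate is k choose 2
            t += rng.expovariate(k * (k - 1) / 2)
            k -= 1
        return t

    rng = random.Random(42)
    n = 10
    sims = [tmrca(n, rng) for _ in range(20000)]
    mean = sum(sims) / len(sims)
    print(mean)  # theory: E[T_MRCA] = 2 * (1 - 1/n)
    ```

    The simulated mean should approach 2(1 - 1/n); selection and recombination modify the rates and allow lineages to move between allelic classes, but the event-driven structure of the simulation is the same.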

  15. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model, including assembly resources, assembly inspection, time, structure and flexible parts, is established; this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, through the local, down to the details, and subnet models at the different levels are established as object-oriented Petri Nets. The communication problem between Petri subnets is solved by using a message database, which effectively reduces the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
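    The core Petri net semantics underlying such models is the token-firing rule: a transition is enabled when its input places hold enough tokens, and firing consumes tokens from the input places and produces tokens in the output places. A minimal sketch of this rule (ignoring the timed, hierarchical and object-oriented extensions of the paper; the place names are purely illustrative):

    ```python
    def enabled(marking, pre):
        """A transition is enabled if each input place holds enough tokens."""
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        """Fire a transition: consume pre-tokens, produce post-tokens."""
        if not enabled(marking, pre):
            raise ValueError("transition not enabled")
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return m

    # Toy assembly step: a part and a fixture are consumed, an assembly produced
    marking = {"part": 1, "fixture": 1, "assembly": 0}
    pre = {"part": 1, "fixture": 1}
    post = {"assembly": 1}
    print(fire(marking, pre, post))
    ```

    Hierarchical models replace a single transition with a whole subnet, and timed variants attach firing-time intervals to transitions, but both reduce to this basic rule at the lowest level.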

  16. Assessment of hemoglobin responsiveness to epoetin alfa in patients on hemodialysis using a population pharmacokinetic pharmacodynamic model.

    PubMed

    Wu, Liviawati; Mould, Diane R; Perez Ruixo, Juan Jose; Doshi, Sameer

    2015-10-01

    A population pharmacokinetic pharmacodynamic (PK/PD) model describing the effect of epoetin alfa on hemoglobin (Hb) response in hemodialysis patients was developed. Epoetin alfa pharmacokinetics was described using a linear 2-compartment model. PK parameter estimates were similar to previously reported values. A maturation-structured cytokinetic model consisting of 5 compartments linked in a catenary fashion by first-order cell transfer rates following a zero-order input process described the Hb time course. The PD model described 2 subpopulations, one whose Hb response reflected epoetin alfa dosing and a second whose response was unrelated to epoetin alfa dosing. Parameter estimates from the PK/PD model were physiologically reasonable and consistent with published reports. Numerical and visual predictive checks using data from 2 studies were performed. The PK and PD of epoetin alfa were well described by the model. © 2015, The American College of Clinical Pharmacology.
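    A linear two-compartment model of the kind used for the PK component can be written as a pair of ODEs for the drug amounts in the central and peripheral compartments. The sketch below uses illustrative rate constants and an IV bolus, not the estimates from this study:

    ```python
    import numpy as np

    # Hypothetical micro-rate constants (1/h), central volume (L), IV bolus dose
    k10, k12, k21 = 0.2, 0.5, 0.3
    V1, dose = 5.0, 100.0

    dt, T = 0.01, 48.0
    n = int(T / dt)
    A1, A2 = dose, 0.0          # amount in central / peripheral compartment
    conc = np.empty(n)
    for i in range(n):
        conc[i] = A1 / V1       # central (plasma) concentration
        dA1 = (-(k10 + k12) * A1 + k21 * A2) * dt
        dA2 = (k12 * A1 - k21 * A2) * dt
        A1 += dA1
        A2 += dA2
    print(conc[0], conc[-1])
    ```

    The resulting concentration curve is the familiar biexponential decline; the published model drives a catenary chain of cell compartments with this PK profile to produce the hemoglobin response.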

  17. Learning-Testing Process in Classroom: An Empirical Simulation Model

    ERIC Educational Resources Information Center

    Buda, Rodolphe

    2009-01-01

    This paper presents an empirical micro-simulation model of the teaching and the testing process in the classroom (Programs and sample data are available--the actual names of pupils have been hidden). It is a non-econometric micro-simulation model describing informational behaviors of the pupils, based on the observation of the pupils'…

  18. Simulation of Plant Physiological Process Using Fuzzy Variables

    Treesearch

    Daniel L. Schmoldt

    1991-01-01

    Qualitative modelling can help us understand and project effects of multiple stresses on trees. It is not practical to collect and correlate empirical data for all combinations of plant/environments and human/climate stresses, especially for mature trees in natural settings. Therefore, a mechanistic model was developed to describe ecophysiological processes. This model...

  19. Simulation modeling of forest landscape disturbances: An overview

    Treesearch

    Ajith H. Perera; Brian R. Sturtevant; Lisa J. Buse

    2015-01-01

    Quantification of ecological processes and formulation of the mathematical expressions that describe those processes in computer models has been a cornerstone of landscape ecology research and its application. Consequently, the body of publications on simulation models in landscape ecology has grown rapidly in recent decades. This trend is also evident in the subfield...

  20. Foreign Language Methods and an Information Processing Model of Memory.

    ERIC Educational Resources Information Center

    Willebrand, Julia

    The major approaches to language teaching (audiolingual method, generative grammar, Community Language Learning and Silent Way) are investigated to discover whether or not they are compatible in structure with an information-processing model of memory (IPM). The model of memory used was described by Roberta Klatzky in "Human Memory:…

  1. A Model of the Base Civil Engineering Work Request/Work Order Processing System.

    DTIC Science & Technology

    1979-09-01

    changes to the work order processing system. This research identifies the variables that significantly affect the accomplishment time and proposes a... order processing system and its behavior with respect to work order processing time. A conceptual model was developed to describe the work request...work order processing system as a stochastic queueing system in which the processing times and the various distributions are treated as random variables

  2. Ground control station software design for micro aerial vehicles

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly accessible commercial construction. The article contains the characteristics of the research object and the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used for building a model of the station. A further part of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the UAV's actions and its communication and control processes. The process of creating the software and the field tests of the station model are also presented in the article.

  3. TKKMOD: A computer simulation program for an integrated wind diesel system. Version 1.0: Document and user guide

    NASA Astrophysics Data System (ADS)

    Manninen, L. M.

    1993-12-01

    The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included in the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of the different logistic simulation models developed by the project participants; TKKMOD cannot be run without this shell. The report describes only the simulation principles and model-specific parameters of TKKMOD and gives model-specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of the long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.

  4. Motoneuron membrane potentials follow a time inhomogeneous jump diffusion process.

    PubMed

    Jahn, Patrick; Berg, Rune W; Hounsgaard, Jørn; Ditlevsen, Susanne

    2011-11-01

    Stochastic leaky integrate-and-fire models are popular due to their simplicity and statistical tractability. They have been widely applied to gain understanding of the underlying mechanisms for spike timing in neurons, and have served as building blocks for more elaborate models. The Ornstein-Uhlenbeck process in particular is popular for describing the stochastic fluctuations in the membrane potential of a neuron, but other models, such as the square-root model or models with a non-linear drift, are also sometimes applied. Data that can be described by such models have to be stationary, and thus the simple models can only be applied over short time windows. However, experimental data show varying time constants, state-dependent noise, a graded firing threshold and time-inhomogeneous input. In the present study we build a jump diffusion model that incorporates these features, and introduce a firing mechanism with a state-dependent intensity. In addition, we suggest statistical methods to estimate all unknown quantities and apply these to analyze turtle motoneuron membrane potentials. Finally, simulated and real data are compared and discussed. We find that a square-root diffusion describes the data much better than an Ornstein-Uhlenbeck process with constant diffusion coefficient. Further, the membrane time constant decreases with increasing depolarization, as expected from the increase in synaptic conductance. The network activity to which the neuron is exposed can be reasonably estimated as a thresholded version of the nerve output from the network. Moreover, the spiking characteristics are well described by a Poisson spike train with an intensity depending exponentially on the membrane potential.
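    The contrast between the two diffusion models comes down to the diffusion coefficient: constant for the Ornstein-Uhlenbeck process, growing with the square root of the distance above a lower level for the square-root model. An Euler-Maruyama sketch with illustrative parameters (not the estimates from the turtle data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    dt, n = 0.002, 100_000
    theta, mu, V0 = 5.0, -60.0, -70.0   # illustrative: rate, rest level, noise floor (mV)
    sigma_ou, sigma_sr = 4.0, 0.5

    dW = rng.normal(0.0, np.sqrt(dt), n)
    x_ou = np.empty(n)
    x_sr = np.empty(n)
    x_ou[0] = x_sr[0] = mu
    for i in range(n - 1):
        # Ornstein-Uhlenbeck: additive, state-independent noise
        x_ou[i + 1] = x_ou[i] + theta * (mu - x_ou[i]) * dt + sigma_ou * dW[i]
        # Square-root diffusion: noise amplitude grows with depolarization above V0
        x_sr[i + 1] = x_sr[i] + theta * (mu - x_sr[i]) * dt \
            + sigma_sr * np.sqrt(max(x_sr[i] - V0, 0.0)) * dW[i]

    print(x_ou.std(), x_sr.std())  # OU theory: sigma_ou / sqrt(2 * theta)
    ```

    In the square-root model the fluctuations shrink as the potential approaches V0 and grow with depolarization, which is the state-dependent noise the paper observes in the recordings.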

  5. Applying Multiagent Simulation to Planetary Surface Operations

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Sims, Michael H.; Clancey, William J.; Lee, Pascal; Swanson, Keith (Technical Monitor)

    2000-01-01

    This paper describes a multiagent modeling and simulation approach for designing cooperative systems. Issues addressed include the use of multiagent modeling and simulation for the design of human and robotic operations, as a theory for human/robot cooperation on planetary surface missions. We describe a design process for cooperative systems centered around the Brahms modeling and simulation environment being developed at NASA Ames.

  6. Generalizing a model beyond the inherence heuristic and applying it to beliefs about objective value.

    PubMed

    Wood, Graham

    2014-10-01

    The inherence heuristic is characterized as part of an instantiation of a more general model that describes the interaction between undeveloped intuitions, produced by System 1 heuristics, and developed beliefs, constructed by System 2 reasoning. The general model is described and illustrated by examining another instantiation of the process that constructs belief in objective moral value.

  7. The spatiotemporal MEG covariance matrix modeled as a sum of Kronecker products.

    PubMed

    Bijma, Fetsje; de Munck, Jan C; Heethaar, Rob M

    2005-08-15

    The single Kronecker product (KP) model for the spatiotemporal covariance of MEG residuals is extended to a sum of Kronecker products. This sum of KPs is estimated such that it approximates the spatiotemporal sample covariance best in matrix norm. Contrary to the single KP, this extension allows for describing multiple, independent phenomena in the ongoing background activity. Whereas the single KP model can be interpreted by assuming that background activity is generated by randomly distributed dipoles with certain spatial and temporal characteristics, the sum model can be physiologically interpreted by assuming a composite of such processes. Taking enough terms into account, the spatiotemporal sample covariance matrix can be described exactly by this extended model. In the estimation of the sum of KP model, it appears that the sum of the first two KPs describes between 67% and 93% of the sample covariance in matrix norm. Moreover, these first two terms describe two physiological processes in the background activity: focal, frequency-specific alpha activity, and more widespread non-frequency-specific activity. Furthermore, temporal nonstationarities due to trial-to-trial variations are not clearly visible in the first two terms and, hence, play only a minor role in the sample covariance matrix in terms of matrix power. Considering dipole localization, the single KP model appears to describe around 80% of the noise and therefore seems adequate. The emphasis of further improvement of localization accuracy should be on improving the source model rather than the covariance model.
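    The structure of the model is easy to state in code: the covariance is a sum of terms, each the Kronecker product of a spatial and a temporal covariance factor. A small numpy sketch with toy dimensions (real MEG covariances are far larger), showing how the fraction of the total matrix norm carried by one term can be computed:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def rand_cov(k, rng):
        """Random symmetric positive definite matrix (toy covariance factor)."""
        A = rng.normal(size=(k, k))
        return A @ A.T + k * np.eye(k)

    # Two independent "background processes", each with its own spatial (S)
    # and temporal (T) covariance factor; sizes are illustrative only
    S1, T1 = rand_cov(6, rng), rand_cov(4, rng)
    S2, T2 = rand_cov(6, rng), rand_cov(4, rng)
    C = np.kron(S1, T1) + 0.3 * np.kron(S2, T2)

    # Fraction of the total covariance (Frobenius norm) carried by term 1
    frac = np.linalg.norm(np.kron(S1, T1)) / np.linalg.norm(C)
    print(C.shape, frac)
    ```

    Each Kronecker term factorizes the spatiotemporal structure of one process; with enough terms any sample covariance can be matched exactly, mirroring the exactness result in the abstract.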

  8. FIRETEC: A transport description of wildfire behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linn, R.R.; Harlow, F.H.

    1997-12-01

    Wildfires are a threat to human life and property, yet they are an unavoidable part of nature and in some instances they are necessary for the natural maintenance and evolution of forests. Investigators have attempted to describe the behavior (speed, direction, modes of spread) of wildfires for over fifty years. Current models for numerical description are mainly algebraic and based on statistical or empirical ideas. The authors describe, in contrast, a transport model called FIRETEC, which is a self-determining fire behavior model. The use of transport formulations connects the propagation rates to the full conservation equations for energy, momentum, species concentrations, mass, and turbulence. In this text, highlights of the model formulation and results are described. The goal of the FIRETEC model is to describe average behavior of the gases and fuels. It represents the essence of the combination of many small-scale processes without resolving each process in complete detail. The FIRETEC model is implemented into a computer code that examines line-fire propagation in a vertical spatial cut parallel to the direction of advancement. With this code the authors are able to examine wind effects, slope effects, and the effects of nonhomogeneous fuel distribution.

  9. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of an optimal mathematical model for a business process related to cost estimate documentation compiled during construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and deterioration of business strategy. Business process management is presented as business process modeling aimed at the improvement of the studied business process, namely the main criteria of optimization and recommendations for the improvement of the above-mentioned business model.

  10. Participatory Model of Mental Health Programming: Lessons Learned from Work in a Developing Country.

    ERIC Educational Resources Information Center

    Nastasi, Bonnie K.; Varjas, Kristen; Sarkar, Sreeroopa; Jayasena, Asoka

    1998-01-01

    Describes application of participatory model for creating school-based mental health services in a developing country. Describes process of identifying individual and cultural factors relevant to mental health. Discusses importance of formative research and collaboration with stakeholders to ensure cultural specificity of interventions, and the…

  11. Business process modeling in healthcare.

    PubMed

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  12. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    PubMed

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  13. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    PubMed Central

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  14. Time-lapse microscopy and image processing for stem cell research: modeling cell migration

    NASA Astrophysics Data System (ADS)

    Gustavsson, Tomas; Althoff, Karin; Degerman, Johan; Olsson, Torsten; Thoreson, Ann-Catrin; Thorlin, Thorleif; Eriksson, Peter

    2003-05-01

    This paper presents hardware and software procedures for automated cell tracking and migration modeling. A time-lapse microscopy system equipped with a computer-controllable motorized stage was developed. The performance of this stage was improved by incorporating software algorithms for stage motion displacement compensation and auto focus. The microscope is suitable for in vitro stem cell studies and allows for multiple cell culture image sequence acquisition. This enables comparative studies concerning the rate of cell splits, average cell motion velocity, cell motion as a function of cell sample density, and more. Several cell segmentation procedures are described, as well as a cell tracking algorithm. Statistical methods for describing cell migration patterns are presented. In particular, the Hidden Markov Model (HMM) was investigated. Results indicate that if the cell motion can be described as a non-stationary stochastic process, then the HMM can adequately model aspects of its dynamic behavior.
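    A simple discrete-state version of such a migration model is a hidden Markov chain whose states (for example "resting" versus "migrating") set the step-length distribution. The generative side of such a model can be sketched as follows, with hypothetical parameters rather than values fitted to the stem cell data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical 2-state migration model: state 0 "resting", state 1 "migrating"
    P = np.array([[0.95, 0.05],      # transition matrix between hidden states
                  [0.10, 0.90]])
    step_mu = np.array([0.2, 2.0])   # mean step length per state (um/frame)
    step_sd = np.array([0.1, 0.5])

    n = 10_000
    states = np.empty(n, dtype=int)
    steps = np.empty(n)
    states[0] = 0
    for t in range(n):
        if t > 0:
            states[t] = rng.choice(2, p=P[states[t - 1]])
        steps[t] = rng.normal(step_mu[states[t]], step_sd[states[t]])

    # Stationary distribution of the hidden chain: pi = pi P
    pi = np.array([P[1, 0], P[0, 1]]) / (P[0, 1] + P[1, 0])
    print(pi, steps.mean())
    ```

    Fitting the hidden states back from observed step lengths is the usual forward-backward/Baum-Welch problem; the point here is only the model structure.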

  15. Resolution-Adapted All-Atomic and Coarse-Grained Model for Biomolecular Simulations.

    PubMed

    Shen, Lin; Hu, Hao

    2014-06-10

    We develop here an adaptive multiresolution method for the simulation of complex heterogeneous systems such as the protein molecules. The target molecular system is described with the atomistic structure while maintaining concurrently a mapping to the coarse-grained models. The theoretical model, or force field, used to describe the interactions between two sites is automatically adjusted in the simulation processes according to the interaction distance/strength. Therefore, all-atomic, coarse-grained, or mixed all-atomic and coarse-grained models would be used together to describe the interactions between a group of atoms and its surroundings. Because the choice of theory is made on the force field level while the sampling is always carried out in the atomic space, the new adaptive method preserves naturally the atomic structure and thermodynamic properties of the entire system throughout the simulation processes. The new method will be very useful in many biomolecular simulations where atomistic details are critically needed.

  16. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. 
regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
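    The control and self-description functions described above can be made concrete with a toy process model. The sketch below follows the spirit of CSDMS's Basic Model Interface (initialize/update/finalize plus metadata queries); the model itself, a draining linear reservoir, and its variable name and units are invented for illustration:

    ```python
    class LinearReservoir:
        """Toy process model: storage S drains at rate k * S per time step."""

        # --- model control functions ---
        def initialize(self, k=0.1, storage=100.0, dt=1.0):
            self.k, self.S, self.dt, self.t = k, storage, dt, 0.0

        def update(self):
            self.S -= self.k * self.S * self.dt
            self.t += self.dt

        def finalize(self):
            pass  # nothing to clean up in this toy model

        # --- model description functions (standardized metadata) ---
        def get_output_var_names(self):
            return ("water_storage",)

        def get_var_units(self, name):
            return {"water_storage": "m3"}[name]

        def get_value(self, name):
            return {"water_storage": self.S}[name]

        def get_current_time(self):
            return self.t


    # A "framework" can drive any model exposing this interface without
    # knowing anything about its internals
    model = LinearReservoir()
    model.initialize()
    for _ in range(10):
        model.update()
    print(model.get_value("water_storage"), model.get_var_units("water_storage"))
    model.finalize()
    ```

    Because the caller learns variable names and units from the model itself, a framework can match outputs of one component to inputs of another and insert unit converters or regridders automatically, which is exactly the mediation role described above.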

  17. The Legacy of Hobbs and Gray: Research on the Development and Prevention of Conduct Problems.

    ERIC Educational Resources Information Center

    Dodge, Kenneth A.

    1996-01-01

    Describes research on the development of chronic conduct problems in childhood and adolescence, examining a multiple risk-factor model that includes biological predispositions, ecological context, family processes, peer influences, academic performance, and social information processing as factors leading to conduct problems. The paper describes a…

  18. Associating versus Proposing or Associating What We Propose: Comment on Gawronski and Bodenhausen

    ERIC Educational Resources Information Center

    Albarracin, Dolores; Hart, William; McCulloch, Kathleen C.

    2006-01-01

    This commentary on the article by B. Gawronski and G. V. Bodenhausen (see record 2006-10465-003) highlights the strengths of the associative-propositional evaluation model. It then describes problems in proposing a qualitative separation between propositional and associative processes. Propositional processes are instead described as associative.…

  19. Feminist Relational Advocacy: Processes and Outcomes from the Perspective of Low-Income Women with Depression

    ERIC Educational Resources Information Center

    Goodman, Lisa A.; Glenn, Catherine; Bohlig, Amanda; Banyard, Victoria; Borges, Angela

    2009-01-01

    This article describes a qualitative study of how low-income women who are struggling with symptoms of depression experience feminist relational advocacy, a new model that is informed by feminist, multicultural, and community psychology theories. Using qualitative content analysis of participant interviews, the authors describe the processes and…

  20. Method for modeling social care processes for national information exchange.

    PubMed

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  1. FLBEIA : A simulation model to conduct Bio-Economic evaluation of fisheries management strategies

    NASA Astrophysics Data System (ADS)

    Garcia, Dorleta; Sánchez, Sonia; Prellezo, Raúl; Urtizberea, Agurtzane; Andrés, Marga

    Fishery systems are complex systems that need to be managed in order to ensure a sustainable and efficient exploitation of marine resources. Traditionally, fisheries management has relied on biological models. However, in recent years the focus on mathematical models which incorporate economic and social aspects has increased. Here, we present FLBEIA, a flexible software tool for conducting bio-economic evaluation of fisheries management strategies. The model is multi-stock, multi-fleet, stochastic and seasonal. The fishery system is described as a sum of processes, which are internally assembled in a predetermined way. There are several functions available to describe the dynamics of each process, and new functions can be added to satisfy specific requirements.

  2. Application of State Analysis and Goal-based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, John Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the system behavior in terms of state variables and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper first describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  3. Application of State Analysis and Goal-Based Operations to a MER Mission Scenario

    NASA Technical Reports Server (NTRS)

    Morris, J. Richard; Ingham, Michel D.; Mishkin, Andrew H.; Rasmussen, Robert D.; Starbird, Thomas W.

    2006-01-01

    State Analysis is a model-based systems engineering methodology employing a rigorous discovery process which articulates operations concepts and operability needs as an integrated part of system design. The process produces requirements on system and software design in the form of explicit models which describe the behavior of states and the relationships among them. By applying State Analysis to an actual MER flight mission scenario, this study addresses the specific real world challenges of complex space operations and explores technologies that can be brought to bear on future missions. The paper describes the tools currently used on a daily basis for MER operations planning and provides an in-depth description of the planning process, in the context of a Martian day's worth of rover engineering activities, resource modeling, flight rules, science observations, and more. It then describes how State Analysis allows for the specification of a corresponding goal-based sequence that accomplishes the same objectives, with several important additional benefits.

  4. ASSIST user manual

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.; Boerschlein, David P.

    1995-01-01

    Semi-Markov models can be used to analyze the reliability of virtually any fault-tolerant system. However, the process of delineating all the states and transitions in a complex system model can be devastatingly tedious and error prone. The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) computer program allows the user to describe the semi-Markov model in a high-level language. Instead of listing the individual model states, the user specifies the rules governing the behavior of the system, and these are used to generate the model automatically. A few statements in the abstract language can describe a very large, complex model. Because no assumptions are made about the system being modeled, ASSIST can be used to generate models describing the behavior of any system. The ASSIST program and its input language are described and illustrated by examples.
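The idea of specifying transition rules instead of enumerating states, as ASSIST does, can be sketched with a minimal example. This is not ASSIST's actual input language; the rule encoding and the 1-out-of-3 repairable system are hypothetical illustrations of rule-driven state-space generation.

```python
from collections import deque

def generate_states(initial, rules):
    """Breadth-first generation of all reachable states and transitions.

    `rules` is a list of (guard, successor, label) tuples: whenever
    guard(state) holds, there is a transition state -> successor(state).
    """
    states, transitions = {initial}, []
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        for guard, succ, label in rules:
            if guard(s):
                t = succ(s)
                transitions.append((s, t, label))
                if t not in states:
                    states.add(t)
                    frontier.append(t)
    return states, transitions

# Hypothetical system: three redundant units, one repair facility.
# A state is simply the number of working units (3 down to 0).
rules = [
    (lambda n: n > 0, lambda n: n - 1, "fail"),    # a working unit fails
    (lambda n: n < 3, lambda n: n + 1, "repair"),  # repair restores one unit
]
states, transitions = generate_states(3, rules)
```

Two short rules expand into the full four-state, six-transition model; the same mechanism scales to models far too large to list by hand.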

  5. Examining the Process of Responding to Circumplex Scales of Interpersonal Values Items: Should Ideal Point Scoring Methods Be Considered?

    PubMed

    Ling, Ying; Zhang, Minqiang; Locke, Kenneth D; Li, Guangming; Li, Zonglong

    2016-01-01

    The Circumplex Scales of Interpersonal Values (CSIV) is a 64-item self-report measure of goals from each octant of the interpersonal circumplex. We used item response theory methods to compare whether dominance models or ideal point models best described how people respond to CSIV items. Specifically, we fit a polytomous dominance model called the generalized partial credit model and an ideal point model of similar complexity called the generalized graded unfolding model to the responses of 1,893 college students. The results of both graphical comparisons of item characteristic curves and statistical comparisons of model fit suggested that an ideal point model best describes the process of responding to CSIV items. The different models produced different rank orderings of high-scoring respondents, but overall the models did not differ in their prediction of criterion variables (agentic and communal interpersonal traits and implicit motives).
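The contrast between dominance and ideal-point responding can be sketched with simplified dichotomous response functions (not the polytomous GPCM/GGUM actually fitted in the study; parameters here are illustrative).

```python
import math

def dominance_p(theta, b, a=1.0):
    """2PL dominance model: endorsement probability rises
    monotonically with the trait level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def ideal_point_p(theta, b, a=1.0):
    """Simple unfolding (ideal-point) model: endorsement peaks when
    theta is closest to the item location b and falls off on both sides."""
    return math.exp(-a * (theta - b) ** 2)
```

Under the dominance model a respondent far above the item location endorses it almost surely; under the ideal-point model the same respondent endorses it *less*, which is what distinguishes the two response processes empirically.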

  6. Delivery of asteroids and meteorites to the inner solar system

    NASA Technical Reports Server (NTRS)

    Greenberg, Richard; Nolan, Michael C.

    1989-01-01

    Two major models of asteroid/meteorite demographics are described and reviewed critically: the collisional model of Greenberg and Chapman (1983), and the orbital evolution model of Wetherill (1985). It is shown that each of the two models tends to gloss over the central processes in the other and tends to ignore (and to some degree violate) the key observables that constrain the other model. The uncertainties that prevent definite acceptance of any particular model for the delivery of asteroidal material to the earth are described.

  7. Evaluating the Facilities Planning, Design, and Construction Department: The Capital Programs Management Audit.

    ERIC Educational Resources Information Center

    Kaiser, Harvey H.; Kirkwood, Dennis M.

    2000-01-01

    Presents a diagnostic model for assessing the state of an institution's capital programs management (CPM) by delineating "work processes" which comprise that function. What capital programs management is, its resources, and its phases and work processes are described, followed by case studies of the CPM Process Model as an assessment tool. (GR)

  8. Using Economic Impact Models as an Educational Tool in Community Economic Development Programming: Lessons from Pennsylvania and Wisconsin.

    ERIC Educational Resources Information Center

    Shields, Martin; Deller, Steven C.

    2003-01-01

    Outlines an educational process designed to help provide communities with economic, social, and political information using community economic impact modeling. Describes the process of community meetings using economic impact, community demographics, and fiscal impact modules and the local preconditions that help make the process successful. (SK)

  9. Estimation and Optimization of the Parameters Preserving the Lustre of the Fabrics

    NASA Astrophysics Data System (ADS)

    Prodanova, Krasimira

    2009-11-01

The paper discusses the optimization of the duration of the damp-heating process of a steaming iron press machine while preserving the lustre of the fabrics. To obtain high-quality damp-heating processing, it is necessary to monitor parameters such as temperature, damp, and pressure during the process. The purpose of the present paper is to construct a mathematical model that adequately describes the technological process using multivariate data analysis. It was established that a full factorial design of type 2³ is not adequate, so the research proceeded with a central rotatable design of experiment. The obtained model adequately describes the technological process of damp-heating treatment in the defined factor space. The present investigation is helpful for technological improvement and modernization in sewing companies.
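A full factorial design of type 2³ and its first-order fit can be sketched as follows; the three coded factors and the linear response function are hypothetical stand-ins, not the study's data.

```python
from itertools import product

# Full factorial 2^3 design in coded (-1/+1) units for three factors,
# e.g. temperature, damp and pressure.
design = list(product((-1, 1), repeat=3))

def response(x1, x2, x3):
    # Hypothetical linear response, used only to illustrate the fit.
    return 10.0 + 2.0 * x1 - 1.5 * x2 + 0.5 * x3

y = [response(*run) for run in design]
n = len(design)

# For an orthogonal +/-1 design, each coefficient reduces to a contrast:
b0 = sum(y) / n
b = [sum(run[j] * yi for run, yi in zip(design, y)) / n for j in range(3)]
```

When such a first-order model fails a lack-of-fit test (as it did in the paper), the design is augmented with axial and center points to form a central rotatable (central composite) design that can carry quadratic terms.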

  10. Youth experiences of family violence and teen dating violence perpetration: cognitive and emotional mediators.

    PubMed

    Jouriles, Ernest N; McDonald, Renee; Mueller, Victoria; Grych, John H

    2012-03-01

    This article describes a conceptual model of cognitive and emotional processes proposed to mediate the relation between youth exposure to family violence and teen dating violence perpetration. Explicit beliefs about violence, internal knowledge structures, and executive functioning are hypothesized as cognitive mediators, and their potential influences upon one another are described. Theory and research on the role of emotions and emotional processes in the relation between youths' exposure to family violence and teen dating violence perpetration are also reviewed. We present an integrated model that highlights how emotions and emotional processes work in tandem with hypothesized cognitive mediators to predict teen dating violence.

  11. Behavioral Signal Processing: Deriving Human Behavioral Informatics From Speech and Language

    PubMed Central

    Narayanan, Shrikanth; Georgiou, Panayiotis G.

    2013-01-01

The expression and experience of human behavior are complex and multimodal and characterized by individual and contextual heterogeneity and variability. Speech and spoken language communication cues offer an important means for measuring and modeling human behavior. Observational research and practice across a variety of domains from commerce to healthcare rely on speech- and language-based informatics for crucial assessment and diagnostic information and for planning and tracking response to an intervention. In this paper, we describe some of the opportunities as well as emerging methodologies and applications of human behavioral signal processing (BSP) technology and algorithms for quantitatively understanding and modeling typical, atypical, and distressed human behavior with a specific focus on speech- and language-based communicative, affective, and social behavior. We describe the three important BSP components of acquiring behavioral data in an ecologically valid manner across laboratory to real-world settings, extracting and analyzing behavioral cues from measured data, and developing models offering predictive and decision-making support. We highlight both the foundational speech and language processing building blocks as well as the novel processing and modeling opportunities. Using examples drawn from specific real-world applications ranging from literacy assessment and autism diagnostics to psychotherapy for addiction and marital well-being, we illustrate behavioral informatics applications of these signal processing techniques that contribute to quantifying higher level, often subjectively described, human behavior in a domain-sensitive fashion. PMID:24039277

  12. Kinetic models for nitrogen inhibition in ANAMMOX and nitrification process on deammonification system at room temperature.

    PubMed

    De Prá, Marina C; Kunz, Airton; Bortoli, Marcelo; Scussiato, Lucas A; Coldebella, Arlei; Vanotti, Matias; Soares, Hugo M

    2016-02-01

In this study, the best kinetic models for nitrogen removal inhibition by ammonium and/or nitrite were fitted for three different nitrogen removal systems operated at 25 °C: a nitrifying system (NF) containing only ammonia-oxidizing bacteria (AOB), an ANAMMOX system (AMX) containing only ANAMMOX bacteria, and a deammonification system (DMX) containing both AOB and ANAMMOX bacteria. The NF system showed inhibition by ammonium and was best described by the Andrews model. The AMX system showed strong inhibition by nitrite, and the Edwards model gave the best representation of the system. For the DMX system, the increased substrate concentrations tested (up to 1060 mg NH3-N/L) were not limiting for the ammonia consumption rate, and the Monod model best described this process. The AOB and ANAMMOX sludges combined in the DMX system displayed better activity, substrate affinity, and substrate tolerance than in the separate nitrifying and ANAMMOX processes. Copyright © 2015 Elsevier Ltd. All rights reserved.
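The three kinetic forms named in this abstract differ in how they treat high substrate concentrations, which a minimal sketch makes concrete; the parameter values below are hypothetical, not the fitted constants from this study.

```python
import math

def monod(s, mu_max=1.0, ks=50.0):
    """Monod: no inhibition, the rate saturates with substrate s."""
    return mu_max * s / (ks + s)

def andrews(s, mu_max=1.0, ks=50.0, ki=300.0):
    """Andrews (Haldane): substrate inhibition, rate peaks then declines."""
    return mu_max * s / (ks + s + s * s / ki)

def edwards(s, mu_max=1.0, ks=50.0, ki=300.0):
    """Edwards: exponential form with strong inhibition at high s."""
    return mu_max * (math.exp(-s / ki) - math.exp(-s / ks))
```

Monod increases monotonically (consistent with the uninhibited DMX system), while Andrews and Edwards both pass through a maximum and decline, with Edwards decaying more sharply, matching the strong nitrite inhibition seen in the AMX system.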

  13. Elementary model of severe plastic deformation by KoBo process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gusak, A.; Storozhuk, N.; Danielewski, M., E-mail: daniel@agh.edu.pl

    2014-01-21

A self-consistent model of generation, interaction, and annihilation of point defects in the gradient of oscillating stresses is presented. This model describes the recently suggested method of severe plastic deformation by a combination of pressure and oscillating rotations of the die along the billet axis (the KoBo process). The model predicts the existence of a distinct zone of reduced viscosity with a sharply increased concentration of point defects; this zone enables the high extrusion velocity. The presented model confirms that severe plastic deformation (SPD) in KoBo may be treated as a non-equilibrium phase transition with an abrupt drop of viscosity in a rather well-defined spatial zone. In this very zone, an intensive lateral rotational movement proceeds together with the generation of point defects, which in a self-organized manner make rotation possible by decreasing the viscosity. The special properties of material under the KoBo version of SPD can be described without using the concepts of non-equilibrium grain boundaries, ballistic jumps, and amorphization. The model can be extended to include different SPD processes.

  14. Use of a Computer-Mediated Delphi Process to Validate a Mass Casualty Conceptual Model

    PubMed Central

    CULLEY, JOAN M.

    2012-01-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters. PMID:21076283

  15. Use of a computer-mediated Delphi process to validate a mass casualty conceptual model.

    PubMed

    Culley, Joan M

    2011-05-01

    Since the original work on the Delphi technique, multiple versions have been developed and used in research and industry; however, very little empirical research has been conducted that evaluates the efficacy of using online computer, Internet, and e-mail applications to facilitate a Delphi method that can be used to validate theoretical models. The purpose of this research was to develop computer, Internet, and e-mail applications to facilitate a modified Delphi technique through which experts provide validation for a proposed conceptual model that describes the information needs for a mass-casualty continuum of care. Extant literature and existing theoretical models provided the basis for model development. Two rounds of the Delphi process were needed to satisfy the criteria for consensus and/or stability related to the constructs, relationships, and indicators in the model. The majority of experts rated the online processes favorably (mean of 6.1 on a seven-point scale). Using online Internet and computer applications to facilitate a modified Delphi process offers much promise for future research involving model building or validation. The online Delphi process provided an effective methodology for identifying and describing the complex series of events and contextual factors that influence the way we respond to disasters.
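The consensus criterion used across Delphi rounds is not specified in this abstract; a common convention is to require a small interquartile range (IQR) of the experts' ratings per item, sketched here as an assumed example (the threshold and panel data are hypothetical).

```python
def iqr(ratings):
    """Interquartile range via linear interpolation of sorted ratings."""
    s = sorted(ratings)
    n = len(s)
    def q(p):
        idx = p * (n - 1)
        lo = int(idx)
        frac = idx - lo
        return s[lo] if frac == 0 else s[lo] + frac * (s[lo + 1] - s[lo])
    return q(0.75) - q(0.25)

def consensus_reached(ratings, threshold=1.0):
    """One common Delphi rule: consensus when the IQR of the
    experts' 7-point ratings does not exceed the threshold."""
    return iqr(ratings) <= threshold

# Hypothetical second-round ratings for one model construct.
panel_round2 = [6, 6, 7, 5, 6, 7, 6]
```

Items that fail the rule are fed back to the panel with summary statistics for another round, which is how the study's two-round stopping point would be reached.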

  16. Modeling and Validation of a Three-Stage Solidification Model for Sprays

    NASA Astrophysics Data System (ADS)

    Tanner, Franz X.; Feigl, Kathleen; Windhab, Erich J.

    2010-09-01

    A three-stage freezing model and its validation are presented. In the first stage, the cooling of the droplet down to the freezing temperature is described as a convective heat transfer process in turbulent flow. In the second stage, when the droplet has reached the freezing temperature, the solidification process is initiated via nucleation and crystal growth. The latent heat release is related to the amount of heat convected away from the droplet and the rate of solidification is expressed with a freezing progress variable. After completion of the solidification process, in stage three, the cooling of the solidified droplet (particle) is described again by a convective heat transfer process until the particle approaches the temperature of the gaseous environment. The model has been validated by experimental data of a single cocoa butter droplet suspended in air. The subsequent spray validations have been performed with data obtained from a cocoa butter melt in an experimental spray tower using the open-source computational fluid dynamics code KIVA-3.
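The first stage, cooling of the liquid droplet toward the freezing temperature by convection, can be sketched with a lumped Newtonian cooling balance and explicit Euler stepping; the droplet properties below are hypothetical placeholders, not the cocoa-butter data of the study.

```python
def cool_droplet(t_drop, t_gas, h, area, mass, cp, dt, steps):
    """Explicit-Euler integration of lumped convective cooling:
    m*cp*dT/dt = -h*A*(T - T_gas)."""
    temps = [t_drop]
    for _ in range(steps):
        t_drop += -h * area * (t_drop - t_gas) / (mass * cp) * dt
        temps.append(t_drop)
    return temps

# Hypothetical values for a small droplet falling through cool air.
temps = cool_droplet(t_drop=45.0, t_gas=10.0, h=100.0,
                     area=1e-6, mass=5e-7, cp=2000.0,
                     dt=0.01, steps=1000)
```

In the full three-stage model this integration would stop once the freezing temperature is reached, hand over to the nucleation/crystal-growth stage with its latent heat release, and then resume the same convective law for the solidified particle.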

  17. Models of policy-making and their relevance for drug research.

    PubMed

    Ritter, Alison; Bammer, Gabriele

    2010-07-01

    Researchers are often frustrated by their inability to influence policy. We describe models of policy-making to provide new insights and a more realistic assessment of research impacts on policy. We describe five prominent models of policy-making and illustrate them with examples from the alcohol and drugs field, before drawing lessons for researchers. Policy-making is a complex and messy process, with different models describing different elements. We start with the incrementalist model, which highlights small amendments to policy, as occurs in school-based drug education. A technical/rational approach then outlines the key steps in a policy process from identification of problems and their causes, through to examination and choice of response options, and subsequent implementation and evaluation. There is a clear role for research, as we illustrate with the introduction of new medications, but this model largely ignores the dominant political aspects of policy-making. Such political aspects include the influence of interest groups, and we describe models about power and pressure groups, as well as advocacy coalitions, and the challenges they pose for researchers. These are illustrated with reference to the alcohol industry, and interest group conflicts in establishing a Medically Supervised Injecting Centre. Finally, we describe the multiple streams framework, which alerts researchers to 'windows of opportunity', and we show how these were effectively exploited in policy for cannabis law reform in Western Australia. Understanding models of policy-making can help researchers maximise the uptake of their work and advance evidence-informed policy.

  18. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking and command (TT&C), data acquisition, routing, and data processing services for a varied fleet of satellites to support weather prediction, modeling, and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper focuses on the methodology for the systems engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large system of systems (SoS), such as the JPSS GS, with an equally complex modeling tool are described.

  19. A 1D thermomechanical network transition constitutive model coupled with multiple structural relaxation for shape memory polymers

    NASA Astrophysics Data System (ADS)

    Zeng, Hao; Xie, Zhimin; Gu, Jianping; Sun, Huiyu

    2018-03-01

A new thermomechanical network transition constitutive model is proposed in this study to describe the viscoelastic behavior of shape memory polymers (SMPs). Based on the microstructure of semi-crystalline SMPs, a new simplified transformation equation is proposed to describe the transformation of transient networks, and the generalized fractional Maxwell model is introduced to estimate the temperature-dependent storage modulus. In addition, a neo-KAHR theory with multiple discrete relaxation processes is put forward to study the structural relaxation of the nonlinear thermal strain in cooling/heating processes. The evolution equations of the time- and temperature-dependent stress and strain response are developed. In the model, the thermodynamic and mechanical characteristics of SMPs in the typical thermomechanical cycle are described clearly, and the irreversible deformation is studied in detail. Finally, typical thermomechanical cycles are simulated using the present constitutive model, and the simulation results agree well with the experimental results.
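How a multi-branch Maxwell model produces a temperature- or frequency-dependent storage modulus can be illustrated with the standard integer-order generalized Maxwell form (the paper's fractional variant generalizes the exponents); branch moduli and relaxation times below are hypothetical.

```python
def storage_modulus(omega, e_inf, branches):
    """Storage modulus of a generalized Maxwell model:
    E'(w) = E_inf + sum_i E_i * (w*tau_i)^2 / (1 + (w*tau_i)^2)."""
    return e_inf + sum(e * (omega * tau) ** 2 / (1.0 + (omega * tau) ** 2)
                       for e, tau in branches)

# Hypothetical branches: (modulus in MPa, relaxation time in s).
branches = [(800.0, 10.0), (400.0, 0.1), (200.0, 0.001)]
low = storage_modulus(1e-4, e_inf=5.0, branches=branches)   # rubbery plateau
high = storage_modulus(1e4, e_inf=5.0, branches=branches)   # glassy plateau
```

At low frequency (or high temperature, via time-temperature equivalence) all branches relax and only the rubbery plateau E_inf remains; at high frequency every branch contributes, giving the glassy modulus, which is the transition the constitutive model exploits for shape memory.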

  20. A model for the distributed storage and processing of large arrays

    NASA Technical Reports Server (NTRS)

    Mehrota, P.; Pratt, T. W.

    1983-01-01

    A conceptual model for parallel computations on large arrays is developed. The model provides a set of language concepts appropriate for processing arrays which are generally too large to fit in the primary memories of a multiprocessor system. The semantic model is used to represent arrays on a concurrent architecture in such a way that the performance realities inherent in the distributed storage and processing can be adequately represented. An implementation of the large array concept as an Ada package is also described.
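The core bookkeeping behind such a distributed-array abstraction, partitioning a global index space across processor memories and mapping global to local indices, can be sketched as follows (in Python rather than the Ada of the paper; the function names are invented for illustration).

```python
def block_layout(n, nprocs):
    """Contiguous block distribution of n array elements over nprocs
    memories: returns the (start, end) global slice owned by each processor."""
    base, extra = divmod(n, nprocs)
    layout, start = [], 0
    for p in range(nprocs):
        size = base + (1 if p < extra else 0)
        layout.append((start, start + size))
        start += size
    return layout

def owner(i, layout):
    """Map a global index to (processor, local index)."""
    for p, (lo, hi) in enumerate(layout):
        if lo <= i < hi:
            return p, i - lo
    raise IndexError(i)

layout = block_layout(10, 3)
```

Making this layout explicit in the language model is what lets the programmer reason about the performance cost of operations that cross block boundaries.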

  1. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second volume was created to facilitate reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large, and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an information system development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  2. Combining phase-field crystal methods with a Cahn-Hilliard model for binary alloys

    NASA Astrophysics Data System (ADS)

    Balakrishna, Ananya Renuka; Carter, W. Craig

    2018-04-01

    Diffusion-induced phase transitions typically change the lattice symmetry of the host material. In battery electrodes, for example, Li ions (diffusing species) are inserted between layers in a crystalline electrode material (host). This diffusion induces lattice distortions and defect formations in the electrode. The structural changes to the lattice symmetry affect the host material's properties. Here, we propose a 2D theoretical framework that couples a Cahn-Hilliard (CH) model, which describes the composition field of a diffusing species, with a phase-field crystal (PFC) model, which describes the host-material lattice symmetry. We couple the two continuum models via coordinate transformation coefficients. We introduce the transformation coefficients in the PFC method to describe affine lattice deformations. These transformation coefficients are modeled as functions of the composition field. Using this coupled approach, we explore the effects of coarse-grained lattice symmetry and distortions on a diffusion-induced phase transition process. In this paper, we demonstrate the working of the CH-PFC model through three representative examples: First, we describe base cases with hexagonal and square symmetries for two composition fields. Next, we illustrate how the CH-PFC method interpolates lattice symmetry across a diffuse phase boundary. Finally, we compute a Cahn-Hilliard type of diffusion and model the accompanying changes to lattice symmetry during a phase transition process.
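The Cahn-Hilliard half of the coupled model can be sketched in its simplest 1D form, with the usual double-well derivative f'(c) = c³ - c, periodic boundaries, and explicit time stepping; this is a minimal illustration, not the paper's 2D coupled CH-PFC scheme.

```python
def cahn_hilliard_step(c, dt=0.01, dx=1.0, mobility=1.0, kappa=1.0):
    """One explicit finite-difference step of the 1D Cahn-Hilliard
    equation c_t = M * lap( f'(c) - kappa * lap(c) ),
    with f'(c) = c**3 - c and periodic boundaries."""
    n = len(c)
    def lap(u):
        return [(u[(i - 1) % n] - 2.0 * u[i] + u[(i + 1) % n]) / dx ** 2
                for i in range(n)]
    mu = [ci ** 3 - ci - kappa * li for ci, li in zip(c, lap(c))]
    return [ci + dt * mobility * li for ci, li in zip(c, lap(mu))]

# Small perturbation around c = 0 inside the spinodal region.
c = [0.1 if i % 2 else -0.1 for i in range(32)]
for _ in range(100):
    c = cahn_hilliard_step(c)
```

Because the flux is the gradient of a chemical potential, the total amount of the diffusing species is conserved at every step, which is exactly the property that makes CH (rather than Allen-Cahn) the right description for a conserved composition field.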

  3. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  4. Aircraft Flight Modeling During the Optimization of Gas Turbine Engine Working Process

    NASA Astrophysics Data System (ADS)

    Tkachenko, A. Yu; Kuz'michev, V. S.; Krupenich, I. N.

    2018-01-01

The article describes a method for simulating the flight of an aircraft along a predetermined path, establishing a functional connection between the parameters of the working process of a gas turbine engine and the efficiency criteria of the aircraft. This connection is necessary for solving the optimization tasks of the conceptual design stage of the engine according to the systems approach. The engine thrust level, in turn, influences the operation of the aircraft, thus making accurate simulation of the aircraft behavior during flight necessary for obtaining the correct solution. The described mathematical model of aircraft flight provides the functional connection between the airframe characteristics, the working process of the gas turbine engines (propulsion system), ambient and flight conditions, and flight profile features. This model provides accurate results of flight simulation and the resulting aircraft efficiency criteria required for optimization of the working process and control function of a gas turbine engine.

  5. Gaia DR2 documentation Chapter 3: Astrometry

    NASA Astrophysics Data System (ADS)

    Hobbs, D.; Lindegren, L.; Bastian, U.; Klioner, S.; Butkevich, A.; Stephenson, C.; Hernandez, J.; Lammers, U.; Bombrun, A.; Mignard, F.; Altmann, M.; Davidson, M.; de Bruijne, J. H. J.; Fernández-Hernández, J.; Siddiqui, H.; Utrilla Molina, E.

    2018-04-01

This chapter of the Gaia DR2 documentation describes the models and processing steps used for the astrometric core solution, namely the Astrometric Global Iterative Solution (AGIS). The inputs to this solution rely heavily on the basic observables (or astrometric elementaries), which have been pre-processed and discussed in Chapter 2, the results of which were published in Fabricius et al. (2016). The models consist of reference systems and time scales; assumed linear stellar motion and relativistic light deflection; and fundamental constants and the transformation of coordinate systems. Higher-level inputs such as planetary and solar system ephemerides, Gaia tracking and orbit information, initial quasar catalogues, and BAM data are all needed for the processing described here. The astrometric calibration models are outlined, followed by details of the processing steps that give AGIS its name. We also present a basic quality assessment and validation of the scientific results (for details, see Lindegren et al. 2018).

  6. A participatory evaluation model for Healthier Communities: developing indicators for New Mexico.

    PubMed Central

    Wallerstein, N

    2000-01-01

    Participatory evaluation models that invite community coalitions to take an active role in developing evaluations of their programs are a natural fit with Healthy Communities initiatives. The author describes the development of a participatory evaluation model for New Mexico's Healthier Communities program. She describes evaluation principles, research questions, and baseline findings. The evaluation model shows the links between process, community-level system impacts, and population health changes. PMID:10968754

  7. Splitting algorithm for numerical simulation of Li-ion battery electrochemical processes

    NASA Astrophysics Data System (ADS)

    Iliev, Oleg; Nikiforova, Marina A.; Semenov, Yuri V.; Zakharov, Petr E.

    2017-11-01

In this paper we present a splitting algorithm for the numerical simulation of Li-ion battery electrochemical processes. A Li-ion battery consists of three domains: anode, cathode, and electrolyte. The mathematical model of the electrochemical processes is described on a microscopic scale and contains nonlinear equations for concentration and potential in each domain. On the interface of the electrodes and electrolyte there are lithium-ion intercalation and deintercalation processes, which are described by the nonlinear Butler-Volmer equation. To approximate in spatial coordinates we use finite element methods with discontinuous Galerkin elements. To simplify numerical simulations we develop a splitting algorithm, which splits the original problem into three independent subproblems. We investigate the numerical convergence of the algorithm on a 2D model problem.
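The Butler-Volmer relation that couples the domains at the electrode/electrolyte interface can be written out directly; the exchange current density and transfer coefficients below are illustrative, not the paper's values.

```python
import math

F = 96485.0    # Faraday constant, C/mol
R = 8.314      # universal gas constant, J/(mol*K)

def butler_volmer(eta, i0, alpha_a=0.5, alpha_c=0.5, temp=298.15):
    """Butler-Volmer interfacial current density as a function of the
    overpotential eta (V): anodic branch minus cathodic branch,
    i = i0 * (exp(alpha_a*F*eta/(R*T)) - exp(-alpha_c*F*eta/(R*T)))."""
    f = F / (R * temp)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))
```

The strong exponential nonlinearity in the overpotential is exactly what makes the interface conditions the hard part of the coupled problem, and what the splitting into per-domain subproblems is designed to isolate.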

  8. Modelling and simulation of a moving interface problem: freeze drying of black tea extract

    NASA Astrophysics Data System (ADS)

    Aydin, Ebubekir Sıddık; Yucel, Ozgun; Sadikoglu, Hasan

    2017-06-01

The moving interface separates the material subjected to the freeze-drying process into dried and frozen regions. Accurate modeling of the moving interface therefore reduces process time and energy consumption by improving the heat and mass transfer predictions during the process. To describe the dynamic behavior of the drying stages of freeze-drying, a case study of brewed black tea extract in storage trays, including the moving interface, was modeled; the heat and mass transfer equations were solved using an orthogonal collocation method based on Jacobi polynomial approximation. Transport parameters and physical properties describing the freeze-drying of black tea extract were evaluated by fitting the experimental data using the Levenberg-Marquardt algorithm. Experimental results showed good agreement with the theoretical predictions.
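The Levenberg-Marquardt parameter-fitting step can be sketched for a single rate constant; the exponential "drying curve" y(t) = 1 - exp(-k*t) is a hypothetical surrogate model used only to show the damped Gauss-Newton iteration, not the paper's PDE model.

```python
import math

def lm_fit_rate(times, ys, k=0.1, lam=1e-3, iters=50):
    """Levenberg-Marquardt fit of the single rate constant k in the
    surrogate curve y(t) = 1 - exp(-k*t)."""
    def sse(kk):
        return sum((y - (1.0 - math.exp(-kk * t))) ** 2
                   for t, y in zip(times, ys))
    for _ in range(iters):
        r = [y - (1.0 - math.exp(-k * t)) for t, y in zip(times, ys)]
        j = [t * math.exp(-k * t) for t in times]          # dy/dk
        jtj = sum(x * x for x in j)
        jtr = sum(x * e for x, e in zip(j, r))
        new_k = k + jtr / (jtj + lam)                      # damped GN step
        # Accept the step only if it reduces the residual; otherwise damp more.
        if sse(new_k) < sse(k):
            k, lam = new_k, lam * 0.5
        else:
            lam *= 10.0
    return k

times = [i * 0.5 for i in range(1, 21)]
ys = [1.0 - math.exp(-0.35 * t) for t in times]   # synthetic data, k = 0.35
k_hat = lm_fit_rate(times, ys)
```

The damping parameter interpolates between gradient descent (large lam, safe far from the optimum) and Gauss-Newton (small lam, fast near it), which is why LM is the standard choice for fitting transport parameters to experimental drying data.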

  9. Assessing Performance Tradeoffs in Undersea Distributed Sensor Networks

    DTIC Science & Technology

    2006-09-01

time. We refer to this process as track-before-detect (see [5] for a description), since the final determination of a target presence is not made until...expressions for probability of successful search and probability of false search for modeling the track-before-detect process. We then describe a numerical...random manner (randomly sampled from a uniform distribution). II. SENSOR NETWORK PERFORMANCE MODELS We model the process of track-before-detect by

  10. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the processing of the results of analyzing users' information needs and the rationale for the use of classifiers.

  11. End of Life in a Haitian American, Faith-Based Community: Caring for Family and Communal Unity.

    PubMed

    Ladd, Susan Charlotte; Gordon, Shirley C

    This article presents two models resulting from a grounded theory study of the end-of-life decision-making process for Haitian Americans. Successful access to this vulnerable population was achieved through the faith-based community. The first model describes this faith-based community of Haitian Americans. The second model describes the process used by families in this community who must make end-of-life healthcare decisions. Implications for nursing practice and caring science include a need to improve the congruence between the nursing care provided at this vulnerable time and the cultural values of a population.

  12. Self-optimisation and model-based design of experiments for developing a C-H activation flow process.

    PubMed

    Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei

    2017-01-01

A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.

  13. Modeling Patient-Specific Deformable Mitral Valves.

    PubMed

    Ginty, Olivia; Moore, John; Peters, Terry; Bainbridge, Daniel

    2018-06-01

    Medical imaging has advanced enormously over the last few decades, revolutionizing patient diagnostics and care. At the same time, additive manufacturing has emerged as a means of reproducing physical shapes and models previously not possible. In combination, they have given rise to 3-dimensional (3D) modeling, an entirely new technology for physicians. In an era in which 3D imaging has become a standard for aiding in the diagnosis and treatment of cardiac disease, this visualization now can be taken further by bringing the patient's anatomy into physical reality as a model. The authors describe the generalized process of creating a model of cardiac anatomy from patient images and their experience creating patient-specific dynamic mitral valve models. This involves a combination of image processing software and 3D printing technology. In this article, the complexity of 3D modeling is described and the decision-making process for cardiac anesthesiologists is summarized. The management of cardiac disease has been altered with the emergence of 3D echocardiography, and 3D modeling represents the next paradigm shift. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Development of a Model for Some Aspects of University Policy. Technical Report.

    ERIC Educational Resources Information Center

    Goossens, J. L. M.; And Others

    A method to calculate the need for academic staff per faculty is described, based on quantitative relations among educational programs, student enrollment, and total budget. The model is described schematically and presented in a mathematical form adapted to computer processing. Its application…

  15. Everyday Miracles: Supporting Parents of Infants in Foster Care

    ERIC Educational Resources Information Center

    Wotherspoon, Evelyn; McInnis, Jan

    2013-01-01

    This article describes a model for supporting parents and their infants during separations due to temporary foster care. Using a case example, the authors describe a model for visit coaching, including their process for assessment and strategies used for intervention. The lessons learned are: (a) that individual parents can present very…

  16. The Emerging and Employed Worker: Planning for the Strategic Imperative.

    ERIC Educational Resources Information Center

    Geroy, Gary D.

    This paper describes a series of four models around which plans can be developed to determine human development needs. It presents needs assessment models describing the process and participant interaction by which information is gathered to be used in education, training, funding, and/or other human resource development interventions to increase…

  17. The Design and Evaluation of Teaching Experiments in Computer Science.

    ERIC Educational Resources Information Center

    Forcheri, Paola; Molfino, Maria Teresa

    1992-01-01

    Describes a relational model that was developed to provide a framework for the design and evaluation of teaching experiments for the introduction of computer science in secondary schools in Italy. Teacher training is discussed, instructional materials are considered, and use of the model for the evaluation process is described. (eight references)…

  18. Curriculum-Integrated Information Literacy (CIIL) in a Community College Nursing Program: A Practical Model

    ERIC Educational Resources Information Center

    Argüelles, Carlos

    2016-01-01

    This article describes a strategy to integrate information literacy into the curriculum of a nursing program in a community college. The model is articulated in four explained phases: preparatory, planning, implementation, and evaluation. It describes a collaborative process encouraging librarians to work with nursing faculty, driving students to…

  19. Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model

    NASA Astrophysics Data System (ADS)

    Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi

    Scientific Concept Vocabulary (SCV) has been developed to actualize the Cognitive methodology based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to substantial data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV can describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV can describe more complex scientific perceptions.

  20. A mathematical model for soil solute transfer into surface runoff as influenced by rainfall detachment.

    PubMed

    Yang, Ting; Wang, Quanjiu; Wu, Laosheng; Zhao, Guangxu; Liu, Yanli; Zhang, Pengyu

    2016-07-01

    Nutrient transport is a major source of water pollution. Several models describing the transport of soil nutrients such as potassium, phosphate and nitrate in runoff water have been developed. The objectives of this research were to describe the nutrient transport process by considering the effect of rainfall detachment, and to evaluate the factors that most strongly influence nutrient transport into runoff. In this study, an existing mass-conservation equation and a rainfall detachment process were combined and augmented to predict nutrient runoff in surface water in a Loess Plateau soil in Yangling, northwestern China. The mixing depth is treated as a function of time as a result of rainfall impact, not as a constant as in previous models. The new model was tested using two different sub-models, complete-mixing and incomplete-mixing. The complete-mixing model is more popular because of its simplicity; it captured the runoff trends of the highly adsorbed nutrients and of nutrient transport along steep slopes, while the incomplete-mixing model predicted the highest observed concentrations of the test nutrients well. Parameters inversely estimated by the models were applied to simulate nutrient transport, and the results suggested that both models can be adopted to describe nutrient transport in runoff under the impact of rainfall.
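    As background for the complete-mixing sub-model mentioned above, a textbook complete-mixing mass balance treats the solute in a mixing layer of depth d as fully mixed and flushed by runoff at rate r, giving exponential washout. This is a generic baseline sketch, not the paper's augmented rainfall-detachment model; all parameter values are illustrative assumptions.

```python
import math

def concentration(c0, r, d, t):
    """Solute concentration C(t) = C0 * exp(-r * t / d) in a fully mixed
    surface layer of depth d (m) flushed by runoff at rate r (m/h)."""
    return c0 * math.exp(-r * t / d)

# Illustrative numbers: 50 mg/L initial concentration, 0.02 m/h runoff,
# 0.01 m mixing depth, evaluated after 0.5 h of runoff.
c_30min = concentration(c0=50.0, r=0.02, d=0.01, t=0.5)
```

Making the mixing depth d a function of time, as the paper does, would replace the constant exponent with an integral of r/d(t).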

  1. Conditions and limitations on learning in the adaptive management of mallard harvests

    USGS Publications Warehouse

    Johnson, F.A.; Kendall, W.L.; Dubovsky, J.A.

    2002-01-01

    In 1995, the United States Fish and Wildlife Service adopted a protocol for the adaptive management of waterfowl hunting regulations (AHM) to help reduce uncertainty about the magnitude of sustainable harvests. To date, the AHM process has focused principally on the midcontinent population of mallards (Anas platyrhynchos), whose dynamics are described by 4 alternative models. Collectively, these models express uncertainty (or disagreement) about whether harvest is an additive or a compensatory form of mortality and whether the reproductive process is weakly or strongly density-dependent. Each model is associated with a probability or 'weight,' which describes its relative ability to predict changes in population size. These Bayesian probabilities are updated annually using a comparison of population size predicted under each model with that observed by a monitoring program. The current AHM process is passively adaptive, in the sense that there is no a priori consideration of how harvest decisions might affect discrimination among models. We contrast this approach with an actively adaptive approach, in which harvest decisions are used in part to produce the learning needed to increase long-term management performance. Our investigation suggests that the passive approach is expected to perform nearly as well as an optimal actively adaptive approach, particularly considering the nature of the model set, management objectives and constraints, and current regulatory alternatives. We offer some comments about the nature of the biological hypotheses being tested and describe some of the inherent limitations on learning in the AHM process.
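    The annual Bayesian weight update described above can be sketched as follows. Each model's weight is multiplied by the likelihood of the observed population size under that model's prediction, then renormalized. The predictions, observation, and observation standard deviation below are invented illustrative numbers, not AHM data, and the Gaussian likelihood is an assumed form.

```python
import numpy as np

def update_weights(weights, predictions, observed, sigma):
    """Update model weights from how well each model predicted the
    observed population size (Gaussian likelihood, then renormalize)."""
    likelihoods = np.exp(-0.5 * ((observed - predictions) / sigma) ** 2)
    posterior = weights * likelihoods
    return posterior / posterior.sum()

weights = np.array([0.25, 0.25, 0.25, 0.25])  # 4 alternative models, equal prior
predictions = np.array([7.9, 8.4, 8.8, 9.3])  # predicted population (millions)
new_w = update_weights(weights, predictions, observed=8.5, sigma=0.5)
```

Repeating this update each year concentrates weight on the model that best tracks the monitored population, which is the passive learning the abstract contrasts with an actively adaptive design.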

  2. The materials processing research base of the Materials Processing Center. Report for FY 1982

    NASA Technical Reports Server (NTRS)

    Flemings, M. C.

    1983-01-01

    The work described, while involving research in the broad field of materials processing, has two common features: the problems are closely related to space processing of materials, and they have both practical and fundamental significance. An interesting and important feature of many of the projects is that the interdisciplinary nature of the problems mandates complementary analytical modeling and experimental approaches. Another important aspect of many of the projects is the increasing use of mathematical modeling techniques as a research tool. The predictive capability of these models, when tested against measurements, plays a very important role both in planning experimental programs and in the rational interpretation of results. Many of the projects described have a space experiment as their ultimate objective. Mathematical models are proving extremely valuable in projecting the findings of ground-based experiments to microgravity conditions.

  3. Roles of Diffusion Dynamics in Stem Cell Signaling and Three-Dimensional Tissue Development.

    PubMed

    McMurtrey, Richard J

    2017-09-15

    Recent advancements in the ability to construct three-dimensional (3D) tissues and organoids from stem cells and biomaterials have not only opened abundant new research avenues in disease modeling and regenerative medicine but also have ignited investigation into important aspects of molecular diffusion in 3D cellular architectures. This article describes fundamental mechanics of diffusion with equations for modeling these dynamic processes under a variety of scenarios in 3D cellular tissue constructs. The effects of these diffusion processes and resultant concentration gradients are described in the context of the major molecular signaling pathways in stem cells that both mediate and are influenced by gas and nutrient concentrations, including how diffusion phenomena can affect stem cell state, cell differentiation, and metabolic states of the cell. The application of these diffusion models and pathways is of vital importance for future studies of developmental processes, disease modeling, and tissue regeneration.
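    The fundamental mechanics the article refers to reduce, in the simplest one-dimensional case, to Fick's second law, dC/dt = D * d2C/dx2. A minimal explicit finite-difference sketch is shown below; the diffusivity, grid, and fixed-boundary setup are illustrative assumptions, not parameters from the article.

```python
import numpy as np

def diffuse(c, D, dx, dt, steps):
    """Advance concentration profile c by Fick's second law using an
    explicit scheme (stable while D*dt/dx**2 <= 0.5)."""
    c = c.copy()
    for _ in range(steps):
        lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        lap[0] = lap[-1] = 0.0        # hold both boundary values fixed
        c += D * dt * lap
    return c

c0 = np.zeros(50)
c0[0] = 1.0                            # constant nutrient source at one face
profile = diffuse(c0, D=1e-3, dx=0.1, dt=1.0, steps=500)
```

The resulting monotone gradient is the kind of profile that, in a 3D construct, determines which cells see enough oxygen or nutrient to maintain a given signaling state.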

  4. Modeling a Material's Instantaneous Velocity during Acceleration Driven by a Detonation's Gas-Push Process

    NASA Astrophysics Data System (ADS)

    Backofen, Joseph E.

    2005-07-01

    This paper describes both the scientific findings and the model developed in order to quantify a material's instantaneous velocity versus position, time, or the expansion ratio of an explosive's gaseous products while the gas pressure is accelerating the material. The formula derived to represent this gas-push process for the second stage of the BRIGS Two-Step Detonation Propulsion Model was found to fit the published experimental data for twenty explosives very well. When the formula's two key parameters (the ratio Vinitial/Vfinal and ExpansionRatioFinal) were adjusted slightly from the average values that describe many explosives closely to values representing measured data for a particular explosive, the formula's representation of that explosive's gas-push process improved. The time derivative of the velocity formula, representing acceleration and/or pressure, compares favorably with Jones-Wilkins-Lee equation-of-state calculations performed using published JWL parameters.
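    For reference, the Jones-Wilkins-Lee equation of state against which the abstract's formula is compared gives product-gas pressure as a function of relative volume V: p(V) = A(1 - w/(R1 V))e^(-R1 V) + B(1 - w/(R2 V))e^(-R2 V) + wE0/V. The sketch below uses commonly quoted handbook parameter values for TNT as illustration; they are not taken from this paper.

```python
import math

def jwl_pressure(v, a=371.2, b=3.231, r1=4.15, r2=0.95, w=0.30, e0=7.0):
    """Product-gas pressure (GPa) at relative volume v, JWL form.
    Defaults are commonly quoted TNT values (A, B, E0 in GPa)."""
    return (a * (1 - w / (r1 * v)) * math.exp(-r1 * v)
            + b * (1 - w / (r2 * v)) * math.exp(-r2 * v)
            + w * e0 / v)

p1 = jwl_pressure(1.0)   # pressure at the initial (unexpanded) volume
p5 = jwl_pressure(5.0)   # pressure after five-fold expansion
```

The pressure falls steeply with expansion, which is why most of the gas-push acceleration occurs at small expansion ratios.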

  5. Improvement Guides for I.A. Curriculum

    ERIC Educational Resources Information Center

    Ritz, John M.; Wright, Lawrence S.

    1977-01-01

    Describes a project to revise "The Wisconsin Guide to Local Curriculum Improvement in Industrial Education, K-12", originally prepared in 1973. Four figures from the guide are included: (1) model of a field objective, (2) curriculum planning model, (3) instructional development process, and (4) process for developing objectives. (MF)

  6. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work System analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work systems design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual method application project: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  7. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    PubMed

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model, called neuropercolation, has distinct advantages over traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness. Monte Carlo simulations obtained by parallel computing point to the importance of computer implementations on very large-scale integration (VLSI) and analog platforms.
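    A toy sketch in the spirit of the random cellular automata described above: each binary site adopts the majority state of its local neighborhood but flips with a small noise probability eps, the control parameter whose value drives percolation-style phase transitions. The lattice size, neighborhood, and eps value are illustrative assumptions, not the neuropercolation model itself.

```python
import random

def step(grid, eps, rng):
    """One synchronous update: majority rule over the von Neumann
    neighborhood (plus self), flipped with probability eps."""
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            neigh = (grid[(i - 1) % n][j] + grid[(i + 1) % n][j]
                     + grid[i][(j - 1) % n] + grid[i][(j + 1) % n]
                     + grid[i][j])
            majority = 1 if neigh >= 3 else 0
            new[i][j] = majority ^ (rng.random() < eps)  # noisy flip
    return new

rng = random.Random(0)
grid = [[rng.randint(0, 1) for _ in range(20)] for _ in range(20)]
for _ in range(50):
    grid = step(grid, eps=0.05, rng=rng)
density = sum(map(sum, grid)) / 400.0   # fraction of active sites
```

Sweeping eps and watching the activity density jump between ordered and disordered regimes is a minimal analogue of the phase transitions the abstract highlights.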

  8. How to build a course in mathematical-biological modeling: content and processes for knowledge and skill.

    PubMed

    Hoskinson, Anne-Marie

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments.

  9. How to Build a Course in Mathematical–Biological Modeling: Content and Processes for Knowledge and Skill

    PubMed Central

    2010-01-01

    Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical–biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments. PMID:20810966

  10. Aviation System Analysis Capability Executive Assistant Design

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Osman, Mohammed; Godso, David; King, Brent; Ricciardi, Michael

    1998-01-01

    In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models within the ASAC system, and describe the design process and the results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices and one attachment.

  11. Aviation System Analysis Capability Executive Assistant Development

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul

    1999-01-01

    In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.

  12. The Role of Abduction in Proving Processes

    ERIC Educational Resources Information Center

    Pedemonte, Bettina; Reid, David

    2011-01-01

    This paper offers a typology of forms and uses of abduction that can be exploited to better analyze abduction in proving processes. Based on the work of Peirce and Eco, we describe different kinds of abductions that occur in students' mathematical activity and extend Toulmin's model of an argument as a methodological tool to describe students'…

  13. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY

    PubMed Central

    Somogyi, Endre; Hagar, Amit; Glazier, James A.

    2017-01-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: (1) dynamic sets of objects participate simultaneously in multiple processes; (2) processes may be either continuous or discrete, and their activity may be conditional; (3) objects and processes form complex, heterogeneous relationships and structures; (4) objects and processes may be hierarchically composed; (5) processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379

  14. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY.

    PubMed

    Somogyi, Endre; Hagar, Amit; Glazier, James A

    2016-12-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: (1) dynamic sets of objects participate simultaneously in multiple processes; (2) processes may be either continuous or discrete, and their activity may be conditional; (3) objects and processes form complex, heterogeneous relationships and structures; (4) objects and processes may be hierarchically composed; (5) processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models.

  15. Emissions model of waste treatment operations at the Idaho Chemical Processing Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schindler, R.E.

    1995-03-01

    An integrated model of the waste treatment systems at the Idaho Chemical Processing Plant (ICPP) was developed using commercially available process simulation software (ASPEN Plus) to calculate atmospheric emissions of hazardous chemicals for use in an application for an environmental permit to operate (PTO). The processes covered by the model are the Process Equipment Waste evaporator, High Level Liquid Waste evaporator, New Waste Calcining Facility, and Liquid Effluent Treatment and Disposal facility. The processes are described along with the model and its assumptions. The model calculates emissions of NOx, CO, volatile acids, hazardous metals, and organic chemicals. Some calculated relative emissions are summarized and insights on building simulations are discussed.

  16. A standard satellite control reference model

    NASA Technical Reports Server (NTRS)

    Golden, Constance

    1994-01-01

    This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.

  17. Development of a cervical cancer educational program for Chinese women using intervention mapping.

    PubMed

    Hou, Su-I; Fernandez, Maria E; Parcel, Guy S

    2004-01-01

    This article describes the development of a program to increase Pap screening behavior among women in Taiwan. Intervention mapping, an innovative process of intervention design, guided the development of this program. The development process included a needs assessment identifying factors influencing Pap screening behavior relevant to Chinese women. The program used methods such as information transmission, modeling, persuasion, and facilitation. Strategies included direct mail communication, role-model stories and testimonials, and a telephone-counseling component. Specific plans for implementation and evaluation are also described.

  18. History of research on modelling gypsy moth population ecology

    Treesearch

    J. J. Colbert

    1991-01-01

    History of research to develop models of gypsy moth population dynamics and some related studies are described. Empirical regression-based models are reviewed, and then the more comprehensive process models are discussed. Current model-related research efforts are introduced.

  19. Assessing Students' Abilities in Processes of Scientific Inquiry in Biology Using a Paper-and-Pencil Test

    ERIC Educational Resources Information Center

    Nowak, Kathrin Helena; Nehring, Andreas; Tiemann, Rüdiger; Upmeier zu Belzen, Annette

    2013-01-01

    The aim of the study was to describe, categorise and analyse students' (aged 14-16) processes of scientific inquiry in biology and chemistry education. Therefore, a theoretical structure for scientific inquiry for both biology and chemistry, the VerE model, was developed. This model consists of nine epistemological acts, which combine processes of…

  20. Rotorcraft system identification techniques for handling qualities and stability and control evaluation

    NASA Technical Reports Server (NTRS)

    Hall, W. E., Jr.; Gupta, N. K.; Hansen, R. S.

    1978-01-01

    An integrated approach to rotorcraft system identification is described. This approach consists of the sequential application of (1) data filtering to estimate the states of the system and sensor errors, (2) model structure estimation to isolate significant model effects, and (3) parameter identification to quantify the coefficients of the model. An input design algorithm is described which can be used to design control inputs that maximize parameter estimation accuracy. Details of each aspect of the rotorcraft identification approach are given. Examples of both simulated and actual flight data processing are given to illustrate each phase. The procedure is shown to provide a means of calibrating sensor errors in flight data, quantifying high-order state variable models from the flight data, and consequently computing related stability and control design models.

  1. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  2. On the Modeling of Vacuum Arc Remelting Process in Titanium Alloys

    NASA Astrophysics Data System (ADS)

    Patel, Ashish; Fiore, Daniel

    2016-07-01

    Mathematical modeling is routinely used in the process development and production of advanced aerospace alloys to gain greater insight into the effect of process parameters on final properties. This article describes the application of a 2-D mathematical VAR model presented at previous LMPC meetings. The impact of process parameters on melt pool geometry, solidification behavior, fluid flow, and chemistry in a Ti-6Al-4V ingot is discussed. Model predictions are validated against published data from an industrial-size ingot, and the results of a parametric study on particle dissolution are also discussed.

  3. Implementing Information and Communication Technology to Support Community Aged Care Service Integration: Lessons from an Australian Aged Care Provider.

    PubMed

    Douglas, Heather E; Georgiou, Andrew; Tariq, Amina; Prgomet, Mirela; Warland, Andrew; Armour, Pauline; Westbrook, Johanna I

    2017-04-10

    There is limited evidence of the benefits of information and communication technology (ICT) to support integrated aged care services. We undertook a case study to describe carelink+, a centralised client service management ICT system implemented by a large aged and community care service provider, Uniting. We sought to explicate the care-related information exchange processes associated with carelink+ and identify lessons for organisations attempting to use ICT to support service integration. Our case study included seventeen interviews and eleven observation sessions with a purposive sample of staff within the organisation. Inductive analysis was used to develop a model of ICT-supported information exchange. Management staff described the integrated care model designed to underpin carelink+. Frontline staff described complex information exchange processes supporting coordination of client services. Mismatches between the data quality and the functions carelink+ was designed to support necessitated the evolution of new work processes associated with the system. There is value in explicitly modelling the work processes that emerge as a consequence of ICT. Continuous evaluation of the match between ICT and work processes will help aged care organisations to achieve higher levels of ICT maturity that support their efforts to provide integrated care to clients.

  4. Implementing Information and Communication Technology to Support Community Aged Care Service Integration: Lessons from an Australian Aged Care Provider

    PubMed Central

    Georgiou, Andrew; Tariq, Amina; Prgomet, Mirela; Warland, Andrew; Armour, Pauline; Westbrook, Johanna I

    2017-01-01

    Introduction: There is limited evidence of the benefits of information and communication technology (ICT) to support integrated aged care services. Objectives: We undertook a case study to describe carelink+, a centralised client service management ICT system implemented by a large aged and community care service provider, Uniting. We sought to explicate the care-related information exchange processes associated with carelink+ and identify lessons for organisations attempting to use ICT to support service integration. Methods: Our case study included seventeen interviews and eleven observation sessions with a purposive sample of staff within the organisation. Inductive analysis was used to develop a model of ICT-supported information exchange. Results: Management staff described the integrated care model designed to underpin carelink+. Frontline staff described complex information exchange processes supporting coordination of client services. Mismatches between the data quality and the functions carelink+ was designed to support necessitated the evolution of new work processes associated with the system. Conclusions: There is value in explicitly modelling the work processes that emerge as a consequence of ICT. Continuous evaluation of the match between ICT and work processes will help aged care organisations to achieve higher levels of ICT maturity that support their efforts to provide integrated care to clients. PMID:29042851

  5. Model-Data Fusion to Test Hypothesized Drivers of Lake Carbon Cycling Reveals Importance of Physical Controls

    NASA Astrophysics Data System (ADS)

    Hararuk, Oleksandra; Zwart, Jacob A.; Jones, Stuart E.; Prairie, Yves; Solomon, Christopher T.

    2018-03-01

    Formal integration of models and data to test hypotheses about the processes controlling carbon dynamics in lakes is rare, despite the importance of lakes in the carbon cycle. We built a suite of models (n = 102) representing different hypotheses about lake carbon processing, fit these models to data from a north-temperate lake using data assimilation, and identified which processes were essential for adequately describing the observations. The hypotheses that we tested concerned organic matter lability and its variability through time, temperature dependence of biological decay, photooxidation, microbial dynamics, and vertical transport of water via hypolimnetic entrainment and inflowing density currents. The data included epilimnetic and hypolimnetic CO2 and dissolved organic carbon, hydrologic fluxes, carbon loads, gross primary production, temperature, and light conditions at high frequency for one calibration and one validation year. The best models explained 76-81% and 64-67% of the variability in observed epilimnetic CO2 and dissolved organic carbon content in the validation data. Accurately describing C dynamics required accounting for hypolimnetic entrainment and inflowing density currents, in addition to accounting for biological transformations. In contrast, neither photooxidation nor variable organic matter lability improved model performance. The temperature dependence of biological decay (Q10) was estimated at 1.45, significantly lower than the commonly assumed Q10 of 2. By confronting multiple models of lake C dynamics with observations, we identified processes essential for describing C dynamics in a temperate lake at daily to annual scales, while also providing a methodological roadmap for using data assimilation to further improve understanding of lake C cycling.
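
    The fitted Q10 temperature dependence reported above reduces to a one-line scaling rule; the reference rate and temperature in this sketch are illustrative choices, not values from the study:

```python
def decay_rate(k_ref, temp_c, ref_temp_c=20.0, q10=1.45):
    """Q10 scaling of a biological decay rate: the rate is multiplied by
    q10 for every 10 degree C rise above the reference temperature."""
    return k_ref * q10 ** ((temp_c - ref_temp_c) / 10.0)
```

    With the commonly assumed Q10 = 2 the rate doubles for every 10 °C of warming; the fitted Q10 = 1.45 implies a markedly weaker temperature sensitivity of decay.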

  6. Numerical studies from quantum to macroscopic scales of carbon nanoparticules in hydrogen plasma

    NASA Astrophysics Data System (ADS)

    Lombardi, Guillaume; Ngandjong, Alain; Mezei, Zsolt; Mougenot, Jonathan; Michau, Armelle; Hassouni, Khaled; Seydou, Mahamadou; Maurel, François

    2016-09-01

    Dusty plasmas play a role in scientific domains ranging from Universe science to nanomaterial synthesis processes. The dust is often generated by growth from molecular precursors: this growth leads to the formation of larger clusters, which induce the nucleation of solid germs. The particles formed are described by an aerosol dynamics that takes coagulation, molecular deposition and transport processes into account. These processes are controlled by the elementary particles, so there is a strong coupling between particle dynamics and the plasma discharge equilibrium. This study focuses on the development of a multiscale physical and numerical model of hydrogen plasmas and carbon particles along three essential coupled axes describing the various physical phenomena: (i) macro/mesoscopic fluid modeling describing, in a self-consistent way, the characteristics of the plasma, molecular clusters and aerosol behavior; (ii) classical molecular dynamics offering a molecular-scale description of the chains of chemical reactions and the aggregation phenomena; (iii) quantum chemistry to establish the activation barriers of the different processes driving nanoparticle formation.

  7. Conceptual hierarchical modeling to describe wetland plant community organization

    USGS Publications Warehouse

    Little, A.M.; Guntenspergen, G.R.; Allen, T.F.H.

    2010-01-01

    Using multivariate analysis, we created a hierarchical modeling process that describes how differently-scaled environmental factors interact to affect wetland-scale plant community organization in a system of small, isolated wetlands on Mount Desert Island, Maine. We followed a four-step procedure: 1) delineate wetland groups using cluster analysis, 2) identify differently scaled environmental gradients using non-metric multidimensional scaling, 3) order gradient hierarchical levels according to spatiotemporal scale of fluctuation, and 4) assemble the hierarchical model using group relationships with ordination axes and post-hoc tests of environmental differences. Using this process, we determined 1) large wetland size and poor surface water chemistry led to the development of shrub fen wetland vegetation, 2) Sphagnum and water chemistry differences affected fen vs. marsh/sedge meadow status within small wetlands, and 3) small-scale hydrologic differences explained transitions between forested vs. non-forested and marsh vs. sedge meadow vegetation. This hierarchical modeling process can help explain how upper-level contextual processes constrain biotic community response to lower-level environmental changes. It creates models with more nuanced spatiotemporal complexity than classification and regression tree procedures. Using this process, wetland scientists will be able to generate more generalizable theories of plant community organization, and useful management models. © Society of Wetland Scientists 2009.

  8. Simulation model for plant growth in controlled environment systems

    NASA Technical Reports Server (NTRS)

    Raper, C. D., Jr.; Wann, M.

    1986-01-01

    The role of the mathematical model is to relate the individual processes to environmental conditions and the behavior of the whole plant. Using the controlled-environment facilities of the phytotron at North Carolina State University for experimentation at the whole-plant level and methods for handling complex models, researchers developed a plant growth model to describe the relationships between hierarchical levels of the crop production system. The fundamental processes that are considered are: (1) interception of photosynthetically active radiation by leaves, (2) absorption of photosynthetically active radiation, (3) photosynthetic transformation of absorbed radiation into chemical energy of carbon bonding in soluble carbohydrates in the leaves, (4) translocation between carbohydrate pools in leaves, stems, and roots, (5) flow of energy from carbohydrate pools for respiration, (6) flow from carbohydrate pools for growth, and (7) aging of tissues. These processes are described at the level of organ structure and of elementary function processes. The driving variables of incident photosynthetically active radiation and ambient temperature as inputs pertain to characterization at the whole-plant level. The output of the model is accumulated dry matter partitioned among leaves, stems, and roots; thus, the elementary processes clearly operate under the constraints of the plant structure, which is itself the output of the model.
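
    The flow of carbon between the pools described above can be caricatured by a three-pool balance; the rate constants and the Euler step in this sketch are hypothetical, chosen only to illustrate the structure, not taken from the model:

```python
def step_pools(pools, photosynthate, dt=0.1, k_trans=0.2, k_resp=0.05, k_growth=0.1):
    """One Euler step of a toy leaf/stem/root carbohydrate-pool model.
    Photosynthate enters the leaf pool, carbon is translocated
    leaf -> stem -> root, and every pool loses carbon to respiration and
    to structural growth.  Returns the updated pools and the dry-matter
    increment produced during the step."""
    leaf, stem, root = pools
    d_leaf = photosynthate - (k_trans + k_resp + k_growth) * leaf
    d_stem = k_trans * leaf - (k_trans + k_resp + k_growth) * stem
    d_root = k_trans * stem - (k_resp + k_growth) * root
    growth = k_growth * (leaf + stem + root) * dt
    return (leaf + d_leaf * dt, stem + d_stem * dt, root + d_root * dt), growth
```

    Accumulating the growth increments over many steps gives the partitioned dry matter that is the model's output.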

  9. IDC Re-Engineering Phase 2 Iteration E2 Use Case Realizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, James M.; Burns, John F.; Hamlet, Benjamin R.

    2016-06-01

    This architecturally significant use case describes how the System acquires meteorological data to build atmospheric models used in automatic and interactive processing of infrasound data. The System requests the latest available high-resolution global meteorological data from external data centers and puts it into the correct formats for generation of infrasound propagation models. The System moves the meteorological data from the Data Acquisition Partition to the Data Processing Partition and stores the meteorological data. The System builds a new atmospheric model based on the meteorological data. This use case is architecturally significant because it describes acquiring meteorological data from various sources and creating a dynamic atmospheric transmission model to support the prediction of infrasonic signal detection.

  10. Experiences Using Formal Methods for Requirements Modeling

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David

    1996-01-01

    This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.

  11. Stochastic dynamics of melt ponds and sea ice-albedo climate feedback

    NASA Astrophysics Data System (ADS)

    Sudakov, Ivan

    Evolution of melt ponds on the Arctic sea surface is a complicated stochastic process. We suggest a low-order model with ice-albedo feedback which describes the stochastic dynamics of the geometrical characteristics of melt ponds. The model is a stochastic dynamical system model of energy balance in the climate system. We describe the equilibria of this model and conclude that the transition in the fractal dimension of melt ponds affects the shape of the sea ice albedo curve.
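
    A minimal version of such a stochastic energy-balance system can be sketched with an Euler-Maruyama integration; the Budyko-type linearized outgoing radiation and every parameter value below are illustrative assumptions, not the paper's model:

```python
import math
import random

def albedo(T, a_ice=0.62, a_water=0.32, T_melt=0.0, width=0.5):
    """Smoothed ice-albedo feedback: surface albedo falls from its ice
    value toward its melt-pond/open-water value around the melting point."""
    return a_water + (a_ice - a_water) / (1.0 + math.exp((T - T_melt) / width))

def simulate(T0=-2.0, dt=0.01, steps=2000, S=300.0, A=200.0, B=2.0,
             C=50.0, sigma=1.0, seed=7):
    """Euler-Maruyama integration of a toy stochastic energy balance:
    C dT = [S (1 - albedo(T)) - (A + B T)] dt + sigma dW."""
    rng = random.Random(seed)
    T, path = T0, [T0]
    for _ in range(steps):
        drift = (S * (1.0 - albedo(T)) - (A + B * T)) / C
        T += drift * dt + (sigma / C) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(T)
    return path
```

    Sweeping the albedo parameters in such a sketch is one way to see how the shape of the albedo curve controls the number and stability of climate equilibria.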

  12. Diagnostic Perspectives on the Family: Process, Structural and Historical Contextual Models.

    ERIC Educational Resources Information Center

    Levant, Ronald F.

    1983-01-01

    Describes diagnostic perspectives for viewing dysfunctional families. Presents three general types of models (process, structural, and historical) and organizes them along a continuum from most descriptive to most inferential. Presented at the 39th Annual Conference of the American Association for Marriage and Family Therapy, October-November…

  13. Facial Affect Processing and Depression Susceptibility: Cognitive Biases and Cognitive Neuroscience

    ERIC Educational Resources Information Center

    Bistricky, Steven L.; Ingram, Rick E.; Atchley, Ruth Ann

    2011-01-01

    Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal…

  14. Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program.

    DOT National Transportation Integrated Search

    2005-01-01

    This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...

  15. Modelling the Air–Surface Exchange of Ammonia from the Field to Global Scale

    EPA Science Inventory

    The Working Group addressed the current understanding and uncertainties in the processes controlling ammonia (NH3) bi-directional exchange, and in the application of numerical models to describe these processes. As a starting point for the discussion, the Working Group drew on th...

  16. Aligning Grammatical Theories and Language Processing Models

    ERIC Educational Resources Information Center

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  17. Electrospinning of polyaniline/poly(lactic acid) ultrathin fibers: process and statistical modeling using a non-Gaussian approach

    USDA-ARS?s Scientific Manuscript database

    Cover: The electrospinning technique was employed to obtain conducting nanofibers based on polyaniline and poly(lactic acid). A statistical model was employed to describe how the process factors (solution concentration, applied voltage, and flow rate) govern the fiber dimensions. Nanofibers down to ...

  18. How Students Learn: Information Processing, Intellectual Development and Confrontation

    ERIC Educational Resources Information Center

    Entwistle, Noel

    1975-01-01

    A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…

  19. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
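
    The control and description functions that a BMI-style interface standardizes can be sketched in a few lines. The toy linear-decay store and its variable name below are hypothetical stand-ins, and the real CSDMS BMI specifies a larger set of functions; the point is that the caller needs only the standardized interface, never the model's internals:

```python
class MinimalBMI:
    """A stripped-down sketch of a Basic Model Interface (BMI) component:
    control functions (initialize/update/finalize) plus description
    functions that make the component self-describing."""

    def __init__(self):
        self._time = 0.0
        self._dt = 1.0
        self._storage = 0.0

    # --- control functions ---
    def initialize(self, config):
        self._storage = config.get("initial_storage", 100.0)
        self._dt = config.get("dt", 1.0)

    def update(self):
        self._storage -= 0.1 * self._storage * self._dt  # toy linear decay
        self._time += self._dt

    def finalize(self):
        pass

    # --- description functions ---
    def get_output_var_names(self):
        return ("soil_water__storage",)  # CSDMS-style standard name

    def get_current_time(self):
        return self._time

    def get_value(self, name):
        assert name == "soil_water__storage"
        return self._storage


def run(component, config, n_steps):
    """A framework-style caller that drives any component exposing the
    standardized interface, without knowing what the model computes."""
    component.initialize(config)
    for _ in range(n_steps):
        component.update()
    component.finalize()
    return component.get_value(component.get_output_var_names()[0])
```

    A coupling framework would interleave `update()` calls on several such components, using the standard names to match one component's outputs to another's inputs.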

  20. The stochastic dance of early HIV infection

    NASA Astrophysics Data System (ADS)

    Merrill, Stephen J.

    2005-12-01

    The stochastic nature of early HIV infection is described in a series of models, each of which captures aspects of the dance of HIV during the early stages of infection. It is to this highly variable target that the immune response must respond. The adaptability of the various components of the immune response is an important aspect of the system's operation, as the nature of the pathogens that the response will be required to respond to, and the order in which those responses must be made, cannot be known beforehand. As HIV infection has direct influence over cells responsible for the immune response, the dance predicts that the immune response will also be in a variable state of readiness and capability for this task of adaptation. The description of the stochastic dance of HIV here uses the tools of stochastic models and, for the most part, simulation. The justification for this approach is that the early stages and the development of HIV diversity require that the model be able to describe both individual sample-path and patient-to-patient variability. In addition, as early viral dynamics are best described using branching processes, the explosive growth of these models predicts both high variability and a rapid response of HIV to changes in system parameters. In this paper, a basic viral growth model based on a time-dependent continuous-time branching process is used to describe the growth of HIV-infected cells in the macrophage and lymphocyte populations. Immigration from the reservoir population is added to the basic model to describe the incubation time distribution. This distribution is deduced directly from the modeling assumptions and the model of viral growth. A system of two branching processes, one in the infected macrophage population and one in the infected lymphocyte population, is used to describe the relationship between the development of HIV diversity as it relates to tropism (host cell preference).
The role of the immune response to HIV and HIV infected cells is used to describe the movement of the infection from a few infected macrophages to a disease of infected CD4+ T lymphocytes.
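
    The sample-path variability emphasized above appears in even the simplest continuous-time branching process. The sketch below simulates a linear birth-death process with the Gillespie algorithm; the rates are illustrative, not fitted HIV parameters:

```python
import random

def birth_death_path(birth=1.2, death=1.0, n0=1, t_end=5.0, seed=1):
    """One Gillespie sample path of a linear birth-death process: each of
    the n infected cells independently divides at rate `birth` and dies
    at rate `death`.  Returns the list of (time, population) jump points."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    path = [(t, n)]
    while n > 0:
        t += rng.expovariate((birth + death) * n)
        if t > t_end:
            break
        n += 1 if rng.random() < birth / (birth + death) else -1
        path.append((t, n))
    return path
```

    Running this with different seeds shows some paths going extinct and others growing explosively, the kind of patient-to-patient variability the abstract describes.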

  1. TSPA 1991: An initial total-system performance assessment for Yucca Mountain; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnard, R.W.; Wilson, M.L.; Dockery, H.A.

    1992-07-01

    This report describes an assessment of the long-term performance of a repository system that contains deeply buried highly radioactive waste; the system is assumed to be located at the potential site at Yucca Mountain, Nevada. The study includes an identification of features, events, and processes that might affect the potential repository, a construction of scenarios based on this identification, a selection of models describing these scenarios (including abstraction of appropriate models from detailed models), a selection of probability distributions for the parameters in the models, a stochastic calculation of radionuclide releases for the scenarios, and a derivation of complementary cumulative distribution functions (CCDFs) for the releases. Releases and CCDFs are calculated for four categories of scenarios: aqueous flow (modeling primarily the existing conditions at the site, with allowances for climate change), gaseous flow, basaltic igneous activity, and human intrusion. The study shows that models of complex processes can be abstracted into more simplified representations that preserve the understanding of the processes and produce results consistent with those of more complex models.

  2. Modified parton branching model for multi-particle production in hadronic collisions: Application to SUSY particle branching

    NASA Astrophysics Data System (ADS)

    Yuanyuan, Zhang

    The stochastic branching model of multi-particle production in high-energy collisions has a theoretical basis in perturbative QCD and successfully describes the experimental data over a wide energy range. Over the years, however, little attention has been paid to branching models for supersymmetric (SUSY) particles. In this thesis, a stochastic branching model is built to describe the evolution of pure supersymmetric particle jets. This model is a modified two-phase stochastic branching process, or more precisely a two-phase Simple Birth Process plus Poisson Process. The general case, in which the jets contain both ordinary and supersymmetric particle jets, is also investigated. We obtain the multiplicity distribution for the general case, whose expression contains a hypergeometric function. We apply this new multiplicity distribution to current experimental data for pp collisions at center-of-mass energies √s = 0.9, 2.36 and 7 TeV. The fits show that supersymmetric particles have not participated in branching at current collision energies.

  3. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
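
    The first framework's random emission times can be sketched directly: a homogeneous Poisson process is simulated by accumulating exponential inter-arrival times. The rate and horizon below are arbitrary illustration values:

```python
import random

def emission_times(rate=0.5, horizon=10.0, seed=3):
    """Emission times of a homogeneous Poisson process on [0, horizon]:
    successive gaps are independent Exponential(rate) draws."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)
```

    Secondary emission, where each metastasis seeds its own Poisson process, can be modeled by recursing on each emitted time.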

  4. Concentration-driven models revisited: towards a unified framework to model settling tanks in water resource recovery facilities.

    PubMed

    Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar

    2017-02-01

    A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
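
    Within such a PDE framework the convective term is driven by a concentration-dependent settling flux. A common hindered-settling closure is Vesilind's exponential, shown here with arbitrary parameter values and not necessarily the closure adopted by the authors:

```python
import math

def vesilind_velocity(conc, v0=10.0, r_h=0.4):
    """Hindered settling velocity (e.g. m/h) as a function of solids
    concentration (kg/m^3); v0 and r_h are illustrative values."""
    return v0 * math.exp(-r_h * conc)

def batch_flux(conc, **kw):
    """Gravity flux J = v_s(X) * X that feeds the PDE's convective term."""
    return conc * vesilind_velocity(conc, **kw)
```

    The flux is zero at zero concentration, rises to a maximum, and falls again at high concentration; it is this non-monotone shape that makes the settler PDEs nontrivial to solve. A multi-class model would carry one such flux per particle class.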

  5. Modeling and simulation of the debonding process of composite solid propellants

    NASA Astrophysics Data System (ADS)

    Feng, Tao; Xu, Jin-sheng; Han, Long; Chen, Xiong

    2017-07-01

    In order to study the damage evolution law of composite solid propellants, the molecular dynamics particle filled algorithm was used to establish the mesoscopic structure model of HTPB(Hydroxyl-terminated polybutadiene) propellants. The cohesive element method was employed for the adhesion interface between AP(Ammonium perchlorate) particle and HTPB matrix and the bilinear cohesive zone model was used to describe the mechanical response of the interface elements. The inversion analysis method based on Hooke-Jeeves optimization algorithm was employed to identify the parameters of cohesive zone model(CZM) of the particle/binder interface. Then, the optimized parameters were applied to the commercial finite element software ABAQUS to simulate the damage evolution process for AP particle and HTPB matrix, including the initiation, development, gathering and macroscopic crack. Finally, the stress-strain simulation curve was compared with the experiment curves. The result shows that the bilinear cohesive zone model can accurately describe the debonding and fracture process between the AP particles and HTPB matrix under the uniaxial tension loading.
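
    The bilinear cohesive zone law named above is a simple traction-separation rule; the strength and separation values in this sketch are illustrative, not the identified parameters:

```python
def bilinear_traction(delta, delta_0=0.01, delta_f=0.05, t_max=1.0):
    """Bilinear cohesive zone law: linear elastic loading up to the
    cohesive strength t_max at separation delta_0, then linear softening
    to complete debonding at delta_f."""
    if delta <= 0.0 or delta >= delta_f:
        return 0.0
    if delta <= delta_0:
        return t_max * delta / delta_0
    return t_max * (delta_f - delta) / (delta_f - delta_0)
```

    The area under the curve, 0.5 * t_max * delta_f, is the fracture energy of the interface; an inverse identification such as the Hooke-Jeeves search in the paper tunes these parameters to match measured stress-strain curves.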

  6. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships are exhibited, with counteracting trends for different handling qualities criteria and different flight speeds, and with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low-fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed back handling qualities results to a conceptual design process are proposed for future work.

  7. Data processing and optimization system to study prospective interstate power interconnections

    NASA Astrophysics Data System (ADS)

    Podkovalnikov, Sergei; Trofimov, Ivan; Trofimov, Leonid

    2018-01-01

    The paper presents a Data processing and optimization system for studying and making rational decisions on the formation of interstate electric power interconnections, with the aim of increasing the effectiveness of their operation and expansion. The technologies for building and integrating the system, including an object-oriented database and ORIRES, a predictive mathematical model for optimizing the expansion of electric power systems, are described. The collection and pre-processing of unstructured data gathered from various sources, its loading into the object-oriented database, and the processing and presentation of information in a GIS system are also described. One approach to graphical visualization of the optimization model's results is illustrated by the calculation of an expansion option for the South Korean electric power grid.

  8. MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes

    USGS Publications Warehouse

    Williams, B.K.

    1988-01-01

    Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
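
    The value-improvement technique mentioned above can be sketched as standard value iteration over a finite discounted MDP; the transition/reward encoding below is a hypothetical toy, not the described package's data format:

```python
def value_iteration(P, R, gamma=0.95, tol=1e-8, max_iter=10_000):
    """P[s][a] is a list of (prob, next_state) pairs; R[s][a] is the
    expected immediate reward.  Iterates the Bellman optimality update
    until successive value functions differ by less than tol, then
    returns the value function and a greedy optimal policy."""
    n = len(P)
    V = [0.0] * n
    for _ in range(max_iter):
        V_new = [
            max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                for a in range(len(P[s])))
            for s in range(n)
        ]
        done = max(abs(a - b) for a, b in zip(V, V_new)) < tol
        V = V_new
        if done:
            break
    policy = [
        max(range(len(P[s])),
            key=lambda a: R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a]))
        for s in range(n)
    ]
    return V, policy
```

    In a harvest-management setting the states would be population levels and the actions harvest rates; policy-improvement variants converge in fewer, more expensive iterations.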

  9. A Buffer Model of Memory Encoding and Temporal Correlations in Retrieval

    ERIC Educational Resources Information Center

    Lehman, Melissa; Malmberg, Kenneth J.

    2013-01-01

    Atkinson and Shiffrin's (1968) dual-store model of memory includes structural aspects of memory along with control processes. The rehearsal buffer is a process by which items are kept in mind and long-term episodic traces are formed. The model has been both influential and controversial. Here, we describe a novel variant of Atkinson and Shiffrin's…

  10. Wheat stress indicator model, Crop Condition Assessment Division (CCAD) data base interface driver, user's manual

    NASA Technical Reports Server (NTRS)

    Hansen, R. F. (Principal Investigator)

    1981-01-01

    The use of the wheat stress indicator model CCAD data base interface driver is described. The purpose of this system is to interface the wheat stress indicator model with the CCAD operational data base. The interface driver routine decides which meteorological stations should be processed and calls the proper subroutines to process them.

  11. Selective Processing Techniques for Electronics and Opto-Electronic Applications: Quantum-Well Devices and Integrated Optic Circuits

    DTIC Science & Technology

    1993-02-10

    [OCR-garbled abstract; the recoverable fragments mention waveguide modulators and list the report's research categories: A. Integrated Optical Devices and Technology; B. Integrated Optical Device and Circuit Modeling; C. Cryogenic Etching for Low]

  12. Mathematical modeling of high-pH chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.; Lake, L.W.; Pope, G.A.

    1990-05-01

    This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases ranging from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.

  13. Chip level modeling of LSI devices

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1984-01-01

    The advent of Very Large Scale Integration (VLSI) technology has rendered the gate level model impractical for many simulation activities critical to the design automation process. As an alternative, an approach to the modeling of VLSI devices at the chip level is described, including the specification of modeling language constructs important to the modeling process. A model structure is presented in which models of the LSI devices are constructed as single entities. The modeling structure is two-layered. The functional layer in this structure is used to model the input/output response of the LSI chip. A second layer, the fault mapping layer, is added, if fault simulations are required, in order to map the effects of hardware faults onto the functional layer. Modeling examples for each layer are presented. Fault modeling at the chip level is described. Approaches to realistic functional fault selection and defining fault coverage for functional faults are given. Application of the modeling techniques to single-chip and bit-slice microprocessors is discussed.

  14. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  15. TREAT Modeling and Simulation Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark David

    2015-09-01

    This report summarizes a four-phase process used to describe the strategy in developing modeling and simulation software for the Transient Reactor Test Facility. The four phases of this research and development task are identified as (1) full core transient calculations with feedback, (2) experiment modeling, (3) full core plus experiment simulation and (4) quality assurance. The document describes the four phases, the relationship between these research phases, and anticipated needs within each phase.

  16. Fast simulation of reconstructed phylogenies under global time-dependent birth-death processes.

    PubMed

    Höhna, Sebastian

    2013-06-01

    Diversification rates and patterns may be inferred from reconstructed phylogenies. Both the time-dependent and the diversity-dependent birth-death process can produce the same observed patterns of diversity over time. To develop and test new models describing the macro-evolutionary process of diversification, generic and fast algorithms to simulate under these models are necessary. Simulations are not only important for testing and developing models but play an influential role in the assessment of model fit. In the present article, I consider as the model a global time-dependent birth-death process where each species has the same rates but rates may vary over time. For this model, I derive the likelihood of the speciation times from a reconstructed phylogenetic tree and show that each speciation event is independent and identically distributed. This fact can be used to efficiently simulate reconstructed phylogenetic trees when conditioning on the number of species, the time of the process, or both. I illustrate the use of the simulation by approximating the posterior predictive distribution of a birth-death process with decreasing diversification rates applied to a published bird phylogeny (family Cettiidae). The methods described in this manuscript are implemented in the R package TESS, available from the repository CRAN (http://cran.r-project.org/web/packages/TESS/). Supplementary data are available at Bioinformatics online.
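    The conditioning described above can also be reached by brute force. The sketch below forward-simulates lineage counts of a constant-rate birth-death process and rejection-conditions on the number of surviving species; parameter values and the function name are illustrative, and TESS itself uses the much faster i.i.d. sampling scheme derived in the paper.

```python
import random

def simulate_lineage_counts(lam, mu, t_max, n_target, max_tries=10000, seed=1):
    """Forward-simulate a birth-death process (speciation rate lam,
    extinction rate mu); rejection-condition on n_target surviving
    lineages at t_max. Returns the speciation times of one accepted run."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        n, t, events = 1, 0.0, []
        while t < t_max and 0 < n < 4 * n_target:
            t += rng.expovariate(n * (lam + mu))   # waiting time to next event
            if t >= t_max:
                break
            if rng.random() < lam / (lam + mu):
                n += 1
                events.append(t)                   # speciation
            else:
                n -= 1                             # extinction
        if n == n_target:
            return events
    raise RuntimeError("no accepted run")

times = simulate_lineage_counts(1.0, 0.2, 3.0, 10)
```

Rejection sampling wastes most runs when the conditioning event is rare, which is exactly why the i.i.d. speciation-time result matters in practice.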

  17. Do's and Don'ts of Computer Models for Planning

    ERIC Educational Resources Information Center

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  18. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  19. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
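    The abstract does not specify the information-sharing protocol itself, but the model it describes rests on distributing processed telemetry to many consumers. A minimal, hypothetical in-process sketch of such a topic-based publish/subscribe layer (names and API are invented for illustration):

```python
from collections import defaultdict

class TelemetryRouter:
    """Minimal topic-based publish/subscribe router: a generic sketch of
    the kind of information-sharing layer the paper describes (the actual
    NASA protocol is not specified in the abstract)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, value):
        for cb in self._subs[topic]:
            cb(topic, value)

router = TelemetryRouter()
seen = []
router.subscribe("cabin_pressure", lambda t, v: seen.append((t, v)))
router.publish("cabin_pressure", 101.3)
```

A real implementation would put the router behind sockets so that client-server and peer-to-peer consumers, real-time displays, playback tools, and trainers, could all attach to the same stream.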

  20. Modeling nutrient retention at the watershed scale: Does small stream research apply to the whole river network?

    NASA Astrophysics Data System (ADS)

    Aguilera, Rosana; Marcé, Rafael; Sabater, Sergi

    2013-06-01

    Nutrients are conveyed from terrestrial and upstream sources through drainage networks. Streams and rivers help regulate the material exported downstream by means of transformation, storage, and removal of nutrients. It has recently been suggested that the efficiency of process rates relative to available nutrient concentration in streams eventually declines, following efficiency-loss (EL) dynamics. However, most of these predictions are based on reach-scale studies in pristine streams and fail to describe the role of entire river networks. Models provide the means to study nutrient cycling from the stream-network perspective by upscaling to the watershed the key mechanisms occurring at the reach scale. We applied a hybrid process-based and statistical model (SPARROW, Spatially Referenced Regression on Watershed Attributes) as a heuristic approach to describe in-stream nutrient processes in a highly impaired, high-stream-order watershed (the Llobregat River Basin, NE Spain). The in-stream decay specifications of the model were modified to include a partial saturation effect in uptake efficiency (expressed as a power law) and better capture biological nutrient retention in river systems under high anthropogenic stress. The stream decay coefficients were statistically significant in both the nitrate and phosphate models, indicating the potential role of in-stream processing in limiting nutrient export. However, the EL concept did not reliably describe the patterns of nutrient uptake efficiency across the concentration and streamflow ranges found in the Llobregat River basin, casting doubt on its applicability to nutrient retention processes in stream networks comprising highly impaired rivers.
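    The efficiency-loss idea can be sketched as first-order in-stream decay whose rate follows a partial-saturation power law in concentration. The coefficients below are invented for illustration; they are not the fitted SPARROW values from the study.

```python
import math

def delivered_fraction(conc, tau, k0=0.5, b=0.6):
    """Fraction of a nutrient load delivered downstream after in-stream
    travel time tau, with decay rate k = k0 * C**(b-1). With b < 1 the
    rate falls as concentration rises (efficiency loss): impaired,
    nutrient-rich reaches remove proportionally less. Parameters are
    hypothetical, not fitted values."""
    k = k0 * conc ** (b - 1)
    return math.exp(-k * tau)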

  1. Behavioral Signal Processing: Deriving Human Behavioral Informatics From Speech and Language: Computational techniques are presented to analyze and model expressed and perceived human behavior-variedly characterized as typical, atypical, distressed, and disordered-from speech and language cues and their applications in health, commerce, education, and beyond.

    PubMed

    Narayanan, Shrikanth; Georgiou, Panayiotis G

    2013-02-07

    The expression and experience of human behavior are complex and multimodal and characterized by individual and contextual heterogeneity and variability. Speech and spoken language communication cues offer an important means for measuring and modeling human behavior. Observational research and practice across a variety of domains from commerce to healthcare rely on speech- and language-based informatics for crucial assessment and diagnostic information and for planning and tracking response to an intervention. In this paper, we describe some of the opportunities as well as emerging methodologies and applications of human behavioral signal processing (BSP) technology and algorithms for quantitatively understanding and modeling typical, atypical, and distressed human behavior with a specific focus on speech- and language-based communicative, affective, and social behavior. We describe the three important BSP components of acquiring behavioral data in an ecologically valid manner across laboratory to real-world settings, extracting and analyzing behavioral cues from measured data, and developing models offering predictive and decision-making support. We highlight both the foundational speech and language processing building blocks as well as the novel processing and modeling opportunities. Using examples drawn from specific real-world applications ranging from literacy assessment and autism diagnostics to psychotherapy for addiction and marital well being, we illustrate behavioral informatics applications of these signal processing techniques that contribute to quantifying higher level, often subjectively described, human behavior in a domain-sensitive fashion.

  2. Engaging Students In Modeling Instruction for Introductory Physics

    NASA Astrophysics Data System (ADS)

    Brewe, Eric

    2016-05-01

    Teaching introductory physics is arguably one of the most important things that a physics department does. It is the primary way that students from other science disciplines engage with physics and it is the introduction to physics for majors. Modeling instruction is an active learning strategy for introductory physics built on the premise that science proceeds through the iterative process of model construction, development, deployment, and revision. We describe the role that participating in authentic modeling has in learning and then explore how students engage in this process in the classroom. In this presentation, we provide a theoretical background on models and modeling and describe how these theoretical elements are enacted in the introductory university physics classroom. We provide both quantitative and video data to link the development of a conceptual model to the design of the learning environment and to student outcomes. This work is supported in part by DUE #1140706.

  3. Description of the Process Model for the Technoeconomic Evaluation of MEA versus Mixed Amines for Carbon Dioxide Removal from Stack Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Dale A.

    This model description is supplemental to the Lawrence Livermore National Laboratory (LLNL) report LLNL-TR-642494, Technoeconomic Evaluation of MEA versus Mixed Amines for CO2 Removal at Near-Commercial Scale at Duke Energy Gibson 3 Plant. We describe the assumptions and methodology used in the Laboratory’s simulation of its understanding of Huaneng’s novel amine solvent for CO2 capture with 35% mixed amine. The results of that simulation have been described in LLNL-TR-642494. The simulation was performed using ASPEN 7.0. The composition of Huaneng’s novel amine solvent was estimated based on information gleaned from Huaneng patents. The chemistry of the process was described using nine equations, representing reactions within the absorber and stripper columns using the ELECTNRTL property method. As a rate-based ASPEN simulation model was not available to Lawrence Livermore at the time of writing, the height of a theoretical plate was estimated using open literature for similar processes. Composition of the flue gas was estimated based on information supplied by Duke Energy for Unit 3 of the Gibson plant. The simulation was scaled at one million short tons of CO2 absorbed per year. To aid stability of the model, convergence of the main solvent recycle loop was implemented manually, as described in the Blocks section below. Automatic convergence of this loop led to instability during the model iterations. Manual convergence of the loop enabled accurate representation and maintenance of model stability.

  4. Simulations of ecosystem hydrological processes using a unified multi-scale model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Liu, Chongxuan; Fang, Yilin

    2015-01-01

    This paper presents a unified multi-scale model (UMSM) that we developed to simulate hydrological processes in an ecosystem containing both surface water and groundwater. The UMSM approach modifies the Navier–Stokes equation by adding a Darcy force term to formulate a single set of equations to describe fluid momentum and uses a generalized equation to describe fluid mass balance. The advantage of the approach is that the single set of equations can describe hydrological processes in both surface water and groundwater, where different models are traditionally required to simulate fluid flow. This feature of the UMSM significantly facilitates modelling of hydrological processes in ecosystems, especially at locations where soil/sediment may be frequently inundated and drained in response to precipitation, regional hydrological and climate changes. In this paper, the UMSM was benchmarked using WASH123D, a model commonly used for simulating coupled surface water and groundwater flow. The Disney Wilderness Preserve (DWP) site near Kissimmee, Florida, where active field monitoring and measurements are ongoing to understand hydrological and biogeochemical processes, was then used as an example to illustrate the UMSM modelling approach. The simulation results demonstrated that the DWP site is subject to frequent changes in soil saturation, the geometry and volume of surface water bodies, and groundwater and surface water exchange. All the hydrological phenomena in surface water and groundwater components, including inundation and draining, river bank flow, groundwater table change, soil saturation, hydrological interactions between groundwater and surface water, and the migration of surface water and groundwater interfaces, can be simultaneously simulated using the UMSM. Overall, the UMSM offers a cross-scale approach that is particularly suitable for simulating coupled surface and ground water flow in ecosystems with strong surface water and groundwater interactions.
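    The modified momentum balance described above (Navier–Stokes with an added Darcy force term) can be sketched as follows; this is a generic Darcy–Brinkman-type form reconstructed from the abstract, not the paper's exact equation:

```latex
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p
  + \nu\nabla^{2}\mathbf{u}
  - \frac{\nu}{\kappa}\,\mathbf{u}
  + \mathbf{g}
```

In open surface water the permeability \(\kappa \to \infty\), the Darcy drag term vanishes, and the Navier–Stokes equation is recovered; in soil/sediment the drag term dominates and the balance reduces to Darcy-type flow, so one equation set spans both domains.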

  5. Using Quality Management Methods in Knowledge-Based Organizations. An Approach to the Application of the Taguchi Method to the Process of Pressing Tappets into Anchors

    NASA Astrophysics Data System (ADS)

    Ţîţu, M. A.; Pop, A. B.; Ţîţu, Ș

    2017-06-01

    This paper presents a study on the modelling and optimization of certain variables by using the Taguchi Method with a view to modelling and optimizing the process of pressing tappets into anchors, process conducted in an organization that promotes knowledge-based management. The paper promotes practical concepts of the Taguchi Method and describes the way in which the objective functions are obtained and used during the modelling and optimization of the process of pressing tappets into the anchors.
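    The objective functions used in the Taguchi Method are signal-to-noise (S/N) ratios. The three standard forms are shown below as a generic sketch; they are not tied to the tappet-pressing data, which the paper does not reproduce.

```python
import math

def sn_smaller_is_better(y):
    """S/N = -10 log10(mean(y^2)): for responses to be minimized."""
    return -10 * math.log10(sum(v * v for v in y) / len(y))

def sn_larger_is_better(y):
    """S/N = -10 log10(mean(1/y^2)): for responses to be maximized."""
    return -10 * math.log10(sum(1 / (v * v) for v in y) / len(y))

def sn_nominal_is_best(y):
    """S/N = 10 log10(mean^2 / variance): for responses with a target,
    e.g. a specified press-fit force."""
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y) / (n - 1)
    return 10 * math.log10(mean * mean / var)
```

In a Taguchi study, each row of the orthogonal array yields one S/N value, and factor levels are chosen to maximize it.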

  6. Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.

    PubMed

    Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence

    2012-08-29

    Mathematical modeling is used as a Systems Biology tool to answer biological questions, and more precisely, to validate a network that describes biological observations and predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing various chemical species concentrations by real numbers, mainly based on differential equations and chemical kinetics formalism; (2) and qualitative modeling, representing chemical species concentrations or activities by a finite set of discrete values. Both approaches answer particular (and often different) biological questions. Qualitative modeling approach permits a simple and less detailed description of the biological systems, efficiently describes stable state identification but remains inconvenient in describing the transient kinetics leading to these states. In this context, time is represented by discrete steps. Quantitative modeling, on the other hand, can describe more accurately the dynamical behavior of biological processes as it follows the evolution of concentration or activities of chemical species as a function of time, but requires an important amount of information on the parameters difficult to find in the literature. Here, we propose a modeling framework based on a qualitative approach that is intrinsically continuous in time. The algorithm presented in this article fills the gap between qualitative and quantitative modeling. It is based on continuous time Markov process applied on a Boolean state space. In order to describe the temporal evolution of the biological process we wish to model, we explicitly specify the transition rates for each node. For that purpose, we built a language that can be seen as a generalization of Boolean equations. 
Mathematically, this approach can be translated into a set of ordinary differential equations on probability distributions. We developed a C++ software, MaBoSS, that is able to simulate such a system by applying Kinetic Monte-Carlo (or the Gillespie algorithm) on the Boolean state space. This software, parallelized and optimized, computes the temporal evolution of probability distributions and estimates stationary distributions. Applications of the Boolean Kinetic Monte-Carlo are demonstrated for three qualitative models: a toy model, a published model of p53/Mdm2 interaction and a published model of the mammalian cell cycle. Our approach makes it possible to describe kinetic phenomena that were difficult to handle in the original models. In particular, transient effects are represented by time-dependent probability distributions, interpretable in terms of cell populations.
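    The core algorithm, Gillespie's method applied to node-flip rates on a Boolean state space, fits in a few lines. The toy two-node network below is invented for illustration; it is not one of the paper's models, and MaBoSS specifies rates in its own dedicated language.

```python
import random

def gillespie_boolean(rate_fns, state, t_max, seed=0):
    """Continuous-time Boolean dynamics (Kinetic Monte-Carlo).
    rate_fns[i](state) gives the flip rate of node i in the current state.
    Returns the trajectory as (time, state) pairs."""
    rng = random.Random(seed)
    traj = [(0.0, tuple(state))]
    state, t = list(state), 0.0
    while True:
        rates = [f(state) for f in rate_fns]
        total = sum(rates)
        if total == 0:
            break                       # absorbing (stable) state reached
        t += rng.expovariate(total)     # exponential waiting time
        if t > t_max:
            break
        r = rng.random() * total        # choose node proportionally to rate
        for i, ri in enumerate(rates):
            r -= ri
            if r < 0:
                state[i] = 1 - state[i]
                break
        traj.append((t, tuple(state)))
    return traj

# toy model: A switches on at rate 1; B follows A (on at rate 2 when A is on,
# off at rate 2 when A is off)
rates = [lambda s: 1.0 if s[0] == 0 else 0.0,
         lambda s: (2.0 if s[0] else 0.0) if s[1] == 0 else (0.0 if s[0] else 2.0)]
traj = gillespie_boolean(rates, [0, 0], t_max=50.0)
```

Averaging many such trajectories gives the time-dependent probability distribution over Boolean states that the paper interprets in terms of cell populations.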

  7. The process of accepting breast cancer among Chinese women: A grounded theory study.

    PubMed

    Chen, Shuang-Qin; Liu, Jun-E; Li, Zhi; Su, Ya-Li

    2017-06-01

    To describe the process by which Chinese women accept living with breast cancer. Individual interviews were conducted with 18 Chinese women who completed breast cancer treatment. Data were collected from September 2014 to January 2015 at a large tertiary teaching hospital in Beijing, China. In this grounded theory study, data were analyzed using constant comparative and coding analysis methods. In order to explain the process of accepting having breast cancer among women in China through the grounded theory study, a model that includes 5 axial categories was developed. Cognitive reconstruction emerged as the core category. The extent to which the women with breast cancer accepted having the disease was found to increase with the treatment stage and as their treatment stage progressed with time. The accepting process included five stages: non-acceptance, passive acceptance, willingness to accept, behavioral acceptance, and transcendence of acceptance. Our study using grounded theory study develops a model describing the process by which women accept having breast cancer. The model provides some intervention opportunities at every point of the process. Copyright © 2017. Published by Elsevier Ltd.

  8. Diffusion models of the flanker task: Discrete versus gradual attentional selection

    PubMed Central

    White, Corey N.; Ratcliff, Roger; Starns, Jeffrey S.

    2011-01-01

    The present study tested diffusion models of processing in the flanker task, in which participants identify a target that is flanked by items that indicate the same (congruent) or opposite response (incongruent). Single- and dual-process flanker models were implemented in a diffusion-model framework and tested against data from experiments that manipulated response bias, speed/accuracy tradeoffs, attentional focus, and stimulus configuration. There was strong mimicry among the models, and each captured the main trends in the data for the standard conditions. However, when more complex conditions were used, a single-process spotlight model captured qualitative and quantitative patterns that the dual-process models could not. Since the single-process model provided the best balance of fit quality and parsimony, the results indicate that processing in the simple versions of the flanker task is better described by gradual rather than discrete narrowing of attention. PMID:21964663
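    The gradual-narrowing idea can be illustrated with a small random-walk approximation of a shrinking-spotlight diffusion process: the flanker weight on the drift rate decays over time, so early evidence is contaminated by flankers while late evidence favors the target. All parameter values are invented, not the fitted values from the study.

```python
import random

def simulate_flanker(congruent, n_trials=2000, seed=0):
    """Random-walk approximation of a single-process shrinking-spotlight
    diffusion model. Returns the proportion of correct responses.
    Parameters (drift, bound, decay) are illustrative only."""
    rng = random.Random(seed)
    dt, sigma, bound = 0.01, 1.0, 1.0
    correct = 0
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < bound:
            w_flank = max(0.0, 0.5 - 1.0 * t)   # flanker weight shrinks to 0
            drift = 1.0 * (1 - w_flank) + (1.0 if congruent else -1.0) * w_flank
            x += drift * dt + sigma * rng.gauss(0, 1) * dt ** 0.5
            t += dt
        correct += x >= bound
    return correct / n_trials
```

On incongruent trials the early drift is pulled toward the wrong response, reproducing the classic flanker accuracy cost without a second process.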

  9. Enhancing Services to the Rural Elderly through Primary Care Centers.

    ERIC Educational Resources Information Center

    Leighton, Jeannette; Sprague, Patricia

    This paper describes a systematic, coordinated approach to the delivery of health and social services to the rural elderly of Maine provided by the Kennebec Valley Regional Health Agency. Four points of the model are described which distinguish it from other models of coordination: (1) a strong medical orientation in the assessment process; (2)…

  10. The River Basin Model: Computer Output. Water Pollution Control Research Series.

    ERIC Educational Resources Information Center

    Envirometrics, Inc., Washington, DC.

    This research report is part of the Water Pollution Control Research Series which describes the results and progress in the control and abatement of pollution in our nation's waters. The River Basin Model described is a computer-assisted decision-making tool in which a number of computer programs simulate major processes related to water use that…

  11. Backus-Gilbert inversion of travel time data

    NASA Technical Reports Server (NTRS)

    Johnson, L. E.

    1972-01-01

    Application of the Backus-Gilbert theory for geophysical inverse problems to the seismic body wave travel-time problem is described. In particular, it is shown how to generate earth models that fit travel-time data to within one standard error and, having generated such models, how to describe their degree of uniqueness. An example is given to illustrate the process.

  12. Numerical modeling of divergent detonation wave

    NASA Astrophysics Data System (ADS)

    Li, Zhiwei; Liu, Bangdi

    1987-11-01

    The indefinite nature of divergent detonations under the assumption of instantaneous stable detonation is described. In the numerical modeling method for divergent detonation, the artificial cohesiveness was improved, and the Cochran reaction rate and the JWL equations of state were used to describe the ignition process of the explosion. Several typical divergent detonation problems were computed, yielding satisfactory results.

  13. A model for process representation and synthesis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Thomas, R. H.

    1971-01-01

    The problem of representing groups of loosely connected processes is investigated, and a model for process representation useful for synthesizing complex patterns of process behavior is developed. The work has three parts. The first part isolates the concepts which form the basis for the process representation model by focusing on questions such as: What is a process? What is an event? Should one process be able to restrict the capabilities of another? The second part develops a model for process representation which captures the concepts and intuitions developed in the first part. The model presented is able to describe both the internal structure of individual processes and the interface structure between interacting processes. Much of the model's descriptive power derives from its use of the notion of process state as a vehicle for relating the internal and external aspects of process behavior. The third part demonstrates by example that the model for process representation is a useful one for synthesizing process behavior patterns. In it the model is used to define a variety of interesting process behavior patterns. The dissertation closes by suggesting how the model could be used as a semantic base for a very potent language extension facility.

  14. Selecting the process variables for filament winding

    NASA Technical Reports Server (NTRS)

    Calius, E.; Springer, G. S.

    1986-01-01

    A model is described which can be used to determine the appropriate values of the process variables for filament winding cylinders. The process variables which can be selected by the model include the winding speed, fiber tension, initial resin degree of cure, and the temperatures applied during winding, curing, and post-curing. The effects of these process variables on the properties of the cylinder during and after manufacture are illustrated by a numerical example.

  15. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE PAGES

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.; ...

    2017-09-22

    Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow process that aims at creating a material data sheet standard that describes regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a laser power, scan speed, and laser beam size combination. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction from the keyhole melting controlled regimes. This statistical framework is shown to be robust even for cases where experimental training data might be suboptimal in quality, if appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally be successfully used for building a surrogate model, which is beneficial since simulations are getting more efficient and are more practical to study the response of different materials, than to re-tool an AM machine for new material powder.

  16. Gaussian process-based surrogate modeling framework for process planning in laser powder-bed fusion additive manufacturing of 316L stainless steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tapia, Gustavo; Khairallah, Saad A.; Matthews, Manyalibo J.

    Here, Laser Powder-Bed Fusion (L-PBF) metal-based additive manufacturing (AM) is complex and not fully understood. Successful processing for one material might not necessarily apply to a different material. This paper describes a workflow process that aims at creating a material data sheet standard that describes regimes where the process can be expected to be robust. The procedure consists of building a Gaussian process-based surrogate model of the L-PBF process that predicts melt pool depth in single-track experiments given a laser power, scan speed, and laser beam size combination. The predictions are then mapped onto a power versus scan speed diagram delimiting the conduction from the keyhole melting controlled regimes. This statistical framework is shown to be robust even for cases where experimental training data might be suboptimal in quality, if appropriate physics-based filters are applied. Additionally, it is demonstrated that a high-fidelity simulation model of L-PBF can equally be successfully used for building a surrogate model, which is beneficial since simulations are getting more efficient and are more practical to study the response of different materials, than to re-tool an AM machine for new material powder.
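    The surrogate-modeling step, Gaussian process regression from (laser power, scan speed) to melt pool depth, can be sketched in miniature. The training data below are invented placeholders, not L-PBF measurements; a real application would also include laser beam size as an input and would report predictive variance, not just the mean.

```python
import math

def rbf(x1, x2, ls=(50.0, 200.0)):
    """Squared-exponential kernel over (laser power [W], scan speed [mm/s]);
    length scales are illustrative."""
    return math.exp(-sum(((a - b) / l) ** 2 for a, b, l in zip(x1, x2, ls)) / 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(X, y, x_star, noise=1e-6):
    """Zero-mean GP posterior mean at x_star."""
    K = [[rbf(a, b) + (noise if i == j else 0.0) for j, b in enumerate(X)]
         for i, a in enumerate(X)]
    alpha = solve(K, y)
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(X, alpha))

# hypothetical single-track data: (power W, speed mm/s) -> melt pool depth um
X = [(150, 500), (200, 500), (250, 500), (200, 1000), (250, 1000)]
y = [40.0, 70.0, 110.0, 45.0, 75.0]
depth = gp_predict(X, y, (225, 750))
```

Sweeping `gp_predict` over a grid of power and speed values gives the kind of process map onto which the conduction/keyhole boundary is drawn.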

  17. PHOTOCHEMICAL MODELING APPLIED TO NATURAL WATERS

    EPA Science Inventory

    The study examines the application of modeling photochemical processes in natural water systems. For many photochemical reactions occurring in natural waters, a simple photochemical model describing reaction rate as a function of intensity, radiation attenuation, reactant absorpt...
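    One standard building block of such models is the light-screening factor relating the depth-averaged photolysis rate in a well-mixed layer to the near-surface rate under Beer-Lambert attenuation. This is a generic construction from aquatic photochemistry, not necessarily the study's specific model.

```python
def screening_factor(alpha, z):
    """Ratio of depth-averaged to near-surface photolysis rate for a
    well-mixed layer of depth z (m) with base-10 diffuse attenuation
    coefficient alpha (1/m): S = (1 - 10**(-alpha*z)) / (2.303*alpha*z)."""
    if alpha * z == 0:
        return 1.0
    return (1 - 10 ** (-alpha * z)) / (2.303 * alpha * z)
```

In optically thin water the factor approaches 1 (the whole column sees near-surface light); in turbid or deep water it falls steeply, which is why attenuation dominates photochemical rates in many natural waters.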

  18. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  19. Muscle activation described with a differential equation model for large ensembles of locally coupled molecular motors.

    PubMed

    Walcott, Sam

    2014-10-01

    Molecular motors, by turning chemical energy into mechanical work, are responsible for active cellular processes. Often groups of these motors work together to perform their biological role. Motors in an ensemble are coupled and exhibit complex emergent behavior. Although large motor ensembles can be modeled with partial differential equations (PDEs) by assuming that molecules function independently of their neighbors, this assumption is violated when motors are coupled locally. It is therefore unclear how to describe the ensemble behavior of the locally coupled motors responsible for biological processes such as calcium-dependent skeletal muscle activation. Here we develop a theory to describe locally coupled motor ensembles and apply the theory to skeletal muscle activation. The central idea is that a muscle filament can be divided into two phases: an active and an inactive phase. Dynamic changes in the relative size of these phases are described by a set of linear ordinary differential equations (ODEs). As the dynamics of the active phase are described by PDEs, muscle activation is governed by a set of coupled ODEs and PDEs, building on previous PDE models. With comparison to Monte Carlo simulations, we demonstrate that the theory captures the behavior of locally coupled ensembles. The theory also plausibly describes and predicts muscle experiments from molecular to whole muscle scales, suggesting that a micro- to macroscale muscle model is within reach.
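    The two-phase idea above can be illustrated with a toy pair of linear ODEs for the active and inactive phase fractions. The rate constants are assumed values for illustration only, not those of the paper's calcium-dependent model, and the coupled PDE part is omitted:

```python
# Toy two-phase activation sketch: a filament divided into active and
# inactive phases whose relative sizes follow linear ODEs. Rates are
# invented; the paper couples these ODEs to PDEs for cross-bridge dynamics.
from scipy.integrate import solve_ivp

k_act, k_deact = 2.0, 0.5  # activation / deactivation rates (assumed)

def phases(t, y):
    active, inactive = y
    return [k_act * inactive - k_deact * active,
            k_deact * active - k_act * inactive]

sol = solve_ivp(phases, (0, 10), [0.0, 1.0])
active_ss = sol.y[0, -1]
# steady state is k_act / (k_act + k_deact) = 0.8 for these rates
print(f"steady-state active fraction: {active_ss:.2f}")
```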

  20. Answering Questions about Complex Events

    DTIC Science & Technology

    2008-12-19

    in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our ... that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a ... different event and process descriptions, ontologies, and models. 2.1.1 Logical AI: In AI, formal approaches to model the ability to reason about

  1. Land Use Change on Household Farms in the Ecuadorian Amazon: Design and Implementation of an Agent-Based Model.

    PubMed

    Mena, Carlos F; Walsh, Stephen J; Frizzelle, Brian G; Xiaozheng, Yao; Malanson, George P

    2011-01-01

    This paper describes the design and implementation of an Agent-Based Model (ABM) used to simulate land use change on household farms in the Northern Ecuadorian Amazon (NEA). The ABM simulates decision-making processes at the household level, which are examined through a longitudinal socio-economic and demographic survey conducted in 1990 and 1999. Geographic Information Systems (GIS) are used to establish spatial relationships between farms and their environment, while classified Landsat Thematic Mapper (TM) imagery is used to set initial land use/land cover conditions for the spatial simulation, assess from-to land use/land cover change patterns, and describe trajectories of land use change at the farm and landscape levels. Results from prior studies in the NEA provide insights into the key social and ecological variables, describe human behavioral functions, and examine population-environment interactions that are linked to deforestation and agricultural extensification, population migration, and demographic change. Within the architecture of the model, agents are classified as active or passive. The model comprises four modules (initialization, demography, agriculture, and migration) that operate individually but are linked through key household processes. The main outputs of the model include a spatially-explicit representation of the land use/land cover on survey and non-survey farms and at the landscape level for each annual time-step, as well as simulated socio-economic and demographic characteristics of households and communities. The work describes the design and implementation of the model and how population-environment interactions can be addressed in a frontier setting. The paper contributes to land change science by examining important pattern-process relations, advocating a spatial modeling approach that is capable of synthesizing fundamental relationships at the farm level, and linking people and environment in complex ways.
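    The skeleton of such an agent-based model is a set of household agents advanced in annual time steps. The sketch below is a deliberately minimal analogue with invented thresholds; the NEA model's four modules and GIS coupling are far richer:

```python
# Minimal agent-based sketch: households annually decide how much forest
# to clear on their farm. Labor endowments, clearing rates, and farm size
# are invented for illustration, not taken from the NEA model.
import random

random.seed(42)

class Household:
    def __init__(self, farm_ha=50.0):
        self.forest = farm_ha              # forested hectares
        self.cleared = 0.0                 # agricultural hectares
        self.labor = random.randint(1, 6)  # household labor units

    def step(self):
        # clear land in proportion to available labor, while forest remains
        demand = 0.5 * self.labor
        cleared_now = min(demand, self.forest)
        self.forest -= cleared_now
        self.cleared += cleared_now

households = [Household() for _ in range(100)]
for year in range(10):                     # annual time steps
    for hh in households:
        hh.step()

total_cleared = sum(hh.cleared for hh in households)
print(f"cleared after 10 years: {total_cleared:.0f} ha")
```

    In a full model each agent's `step` would branch through the demography, agriculture, and migration modules, and the landscape grid would feed back into the decision rules.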

  2. Water condensation: a multiscale phenomenon.

    PubMed

    Jensen, Kasper Risgaard; Fojan, Peter; Jensen, Rasmus Lund; Gurevich, Leonid

    2014-02-01

    The condensation of water is a phenomenon occurring in multiple situations in everyday life, e.g., when fog is formed or when dew forms on the grass or on windows. This means that this phenomenon plays an important role within the different fields of science including meteorology, building physics, and chemistry. In this review we address condensation models and simulations with the main focus on heterogeneous condensation of water. The condensation process is, at first, described from a thermodynamic viewpoint where the nucleation step is described by the classical nucleation theory. Further, we address the shortcomings of the thermodynamic theory in describing the nucleation and emphasize the importance of nanoscale effects. This leads to the description of condensation from a molecular viewpoint. Also presented is how the nucleation can be simulated by use of molecular models, and how the condensation process is simulated on the macroscale using computational fluid dynamics. Finally, examples of hybrid models combining molecular and macroscale models for the simulation of condensation on a surface are presented.

  3. Understanding viral video dynamics through an epidemic modelling approach

    NASA Astrophysics Data System (ADS)

    Sachak-Patwa, Rahil; Fadai, Nabil T.; Van Gorder, Robert A.

    2018-07-01

    Motivated by the hypothesis that the spread of viral videos is analogous to the spread of a disease epidemic, we formulate a novel susceptible-exposed-infected-recovered-susceptible (SEIRS) delay differential equation epidemic model to describe the popularity evolution of viral videos. Our models incorporate time-delay, in order to accurately describe the virtual contact process between individuals and the temporary immunity of individuals to videos after they have grown tired of watching them. We validate our models by fitting model parameters to viewing data from YouTube music videos, in order to demonstrate that the model solutions accurately reproduce real behaviour seen in this data. We use an SEIR model to describe the initial growth and decline of daily views, and an SEIRS model to describe the long term behaviour of the popularity of music videos. We also analyse the decay rates in the daily views of videos, determining whether they follow a power law or exponential distribution. Although we focus on viral videos, the modelling approach may be used to understand dynamics emergent from other areas of science which aim to describe consumer behaviour.
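    A stripped-down version of this idea can be written as a plain SEIR system. The sketch below omits the paper's time delays and the SEIRS immunity-loss term, and its parameters are assumed values, not fitted ones:

```python
# Minimal SEIR sketch of viral-video popularity: S = not yet exposed,
# E = exposed (video shared to them), I = actively watching/sharing,
# R = grown tired of the video. Rates are illustrative assumptions;
# the paper's model additionally uses time delays and an R -> S term.
from scipy.integrate import solve_ivp

beta, sigma, gamma = 0.5, 0.3, 0.1  # contact, incubation, recovery rates

def seir(t, y):
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

sol = solve_ivp(seir, (0, 200), [0.99, 0.01, 0.0, 0.0])
S, E, I, R = sol.y[:, -1]
print(f"final 'tired of it' fraction: {R:.2f}")
```

    Daily views would correspond to something proportional to the I compartment, which rises and then decays as the susceptible pool is exhausted.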

  4. Models of Solar Wind Structures and Their Interaction with the Earth's Space Environment

    NASA Astrophysics Data System (ADS)

    Watermann, J.; Wintoft, P.; Sanahuja, B.; Saiz, E.; Poedts, S.; Palmroth, M.; Milillo, A.; Metallinou, F.-A.; Jacobs, C.; Ganushkina, N. Y.; Daglis, I. A.; Cid, C.; Cerrato, Y.; Balasis, G.; Aylward, A. D.; Aran, A.

    2009-11-01

    The discipline of “Space Weather” is built on the scientific foundation of solar-terrestrial physics but with a strong orientation toward applied research. Models describing the solar-terrestrial environment are therefore at the heart of this discipline, for both physical understanding of the processes involved and establishing predictive capabilities of the consequences of these processes. Depending on the requirements, purely physical models, semi-empirical or empirical models are considered to be the most appropriate. This review focuses on the interaction of solar wind disturbances with geospace. We cover interplanetary space, the Earth’s magnetosphere (with the exception of radiation belt physics), the ionosphere (with the exception of radio science), the neutral atmosphere and the ground (via electromagnetic induction fields). Space weather relevant state-of-the-art physical and semi-empirical models of the various regions are reviewed. They include models for interplanetary space, its quiet state and the evolution of recurrent and transient solar perturbations (corotating interaction regions, coronal mass ejections, their interplanetary remnants, and solar energetic particle fluxes). Models of coupled large-scale solar wind-magnetosphere-ionosphere processes (global magnetohydrodynamic descriptions) and of inner magnetosphere processes (ring current dynamics) are discussed. Achievements in modeling the coupling between magnetospheric processes and the neutral and ionized upper and middle atmospheres are described. Finally we mention efforts to compile comprehensive and flexible models from selections of existing modules applicable to particular regions and conditions in interplanetary space and geospace.

  5. BIM integration in education: A case study of the construction technology project Bolt Tower Dolni Vitkovice

    NASA Astrophysics Data System (ADS)

    Venkrbec, Vaclav; Bittnerova, Lucie

    2017-12-01

    Building information modeling (BIM) can support effectiveness during many activities in the AEC industry, including the preparation of a construction-technological project. This paper presents an approach to using a building information model in higher education, especially during work on a diploma thesis and its supervision. A diploma thesis is project-based work that aims to compile a construction-technological project for a selected construction. The paper describes the use of input data and the work with them, and compares this process with standard input data such as printed design documentation. The effectiveness of using the building information model as input data for a construction-technological project is described in the conclusion.

  6. Influence of winding construction on starter-generator thermal processes

    NASA Astrophysics Data System (ADS)

    Grachev, P. Yu; Bazarov, A. A.; Tabachinskiy, A. S.

    2018-01-01

    Dynamic processes in starter-generators feature high winding overcurrents, which can lead to insulation overheating and faulty operation modes. For hybrid and electric vehicles, a new high-efficiency construction of induction machine windings is proposed. Stator thermal processes need to be considered in the most difficult operation modes. The article describes the construction features of the new compact stator windings, presents electromagnetic and thermal models of the processes in the stator windings, and explains the influence of the innovative construction on thermal processes. The models are based on the finite element method.

  7. Quantum-like model for the adaptive dynamics of the genetic regulation of E. coli's metabolism of glucose/lactose.

    PubMed

    Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2012-06-01

    We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in the bacterium Escherichia coli. Our quantum-like model can be considered a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in full detail, we propose a formal operator description. Such a description may be very useful in situations in which a detailed description of the processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by their metabolic systems. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.

  8. Describing the clinical reasoning process: application of a model of enablement to a pediatric case.

    PubMed

    Furze, Jennifer; Nelson, Kelly; O'Hare, Megan; Ortner, Amanda; Threlkeld, A Joseph; Jensen, Gail M

    2013-04-01

    Clinical reasoning is a core tenet of physical therapy practice leading to optimal patient care. The purpose of this case was to describe the outcomes, subjective experience, and reflective clinical reasoning process for a child with cerebral palsy using the International Classification of Functioning, Disability, and Health (ICF) model. Application of the ICF framework to a 9-year-old boy with spastic triplegic cerebral palsy was utilized to capture the interwoven factors present in this case. Interventions in the pool occurred twice weekly for 1 h over a 10-week period. Immediately post and 4 months post-intervention, the child made functional and meaningful gains. The family unit also developed an enjoyment of exercising together. Each individual family member described psychological, emotional, or physical health improvements. Reflection using the ICF model as a framework to discuss clinical reasoning can highlight important factors contributing to effective patient management.

  9. Functional correlation approach to operational risk in banking organizations

    NASA Astrophysics Data System (ADS)

    Kühn, Reimer; Neu, Peter

    2003-05-01

    A Value-at-Risk-based model is proposed to compute the adequate equity capital necessary to cover potential losses due to operational risks, such as human and system process failures, in banking organizations. Exploring the analogy to a lattice gas model from physics, correlations between sequential failures are modeled by functionally defined, heterogeneous couplings between mutually supportive processes. In contrast to traditional risk models for market and credit risk, where correlations are described as equal-time correlations by a covariance matrix, the dynamics of the model shows collective phenomena such as bursts and avalanches of process failures.
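    The burst-and-avalanche behavior can be illustrated with a crude Monte Carlo analogue in which each failure raises the failure probability of the processes it supports. The coupling strength and probabilities below are invented; this is a simplified stand-in for the paper's lattice-gas-style model, not its actual specification:

```python
# Monte Carlo sketch of coupled operational failures: a process's failure
# probability rises with the fraction of supporting processes currently
# down, producing avalanches. All parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_proc, n_steps = 10, 1000
base_p = 0.01   # intrinsic per-step failure probability
J = 0.3         # coupling: failed supporters raise neighbours' risk

failed = np.zeros(n_proc, dtype=bool)
losses = []
for t in range(n_steps):
    p = base_p + J * failed.mean()          # mean-field coupling
    failed = rng.random(n_proc) < p
    losses.append(failed.sum())

losses = np.array(losses)
var_99 = np.quantile(losses, 0.99)  # Value-at-Risk of per-step failure count
print(f"99% VaR of failures per step: {var_99:.0f}")
```

    With J = 0 the failure counts are independent binomials; raising J fattens the tail of the loss distribution, which is exactly what drives the capital requirement in a VaR framework.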

  10. Engineered Barrier System: Physical and Chemical Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    2004-04-26

    The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.

  11. Mathematical Modeling of Protein Misfolding Mechanisms in Neurological Diseases: A Historical Overview.

    PubMed

    Carbonell, Felix; Iturria-Medina, Yasser; Evans, Alan C

    2018-01-01

    Protein misfolding refers to a process where proteins become structurally abnormal and lose their specific 3-dimensional spatial configuration. The histopathological presence of misfolded protein (MP) aggregates has been associated as the primary evidence of multiple neurological diseases, including Prion diseases, Alzheimer's disease, Parkinson's disease, and Creutzfeldt-Jacob disease. However, the exact mechanisms of MP aggregation and propagation, as well as their impact in the long-term patient's clinical condition are still not well understood. With this aim, a variety of mathematical models has been proposed for a better insight into the kinetic rate laws that govern the microscopic processes of protein aggregation. Complementary, another class of large-scale models rely on modern molecular imaging techniques for describing the phenomenological effects of MP propagation over the whole brain. Unfortunately, those neuroimaging-based studies do not take full advantage of the tremendous capabilities offered by the chemical kinetics modeling approach. Actually, it has been barely acknowledged that the vast majority of large-scale models have foundations on previous mathematical approaches that describe the chemical kinetics of protein replication and propagation. The purpose of the current manuscript is to present a historical review about the development of mathematical models for describing both microscopic processes that occur during the MP aggregation and large-scale events that characterize the progression of neurodegenerative MP-mediated diseases.

  12. Flight crew aiding for recovery from subsystem failures

    NASA Technical Reports Server (NTRS)

    Hudlicka, E.; Corker, K.; Schudy, R.; Baron, Sheldon

    1990-01-01

    Some of the conceptual issues associated with pilot aiding systems are discussed and an implementation of one component of such an aiding system is described. It is essential that the format and content of the information the aiding system presents to the crew be compatible with the crew's mental models of the task. It is proposed that in order to cooperate effectively, both the aiding system and the flight crew should have consistent information processing models, especially at the point of interface. A general information processing strategy, developed by Rasmussen, was selected to serve as the bridge between the human and aiding system's information processes. The development and implementation of a model-based situation assessment and response generation system for commercial transport aircraft are described. The current implementation is a prototype which concentrates on engine and control surface failure situations and consequent flight emergencies. The aiding system, termed Recovery Recommendation System (RECORS), uses a causal model of the relevant subset of the flight domain to simulate the effects of these failures and to generate appropriate responses, given the current aircraft state and the constraints of the current flight phase. Since detailed information about the aircraft state may not always be available, the model represents the domain at varying levels of abstraction and uses the less detailed abstraction levels to make inferences when exact information is not available. The structure of this model is described in detail.

  13. An analysis of the least-squares problem for the DSN systematic pointing error model

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.

    1991-01-01

    A systematic pointing error model is used to calibrate antennas in the Deep Space Network. The least squares problem is described and analyzed along with the solution methods used to determine the model's parameters. Specifically studied are the rank degeneracy problems resulting from beam pointing error measurement sets that incorporate inadequate sky coverage. A least squares parameter subset selection method is described and its applicability to the systematic error modeling process is demonstrated on Voyager 2 measurement distribution.
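    The least-squares fit and the rank check for degeneracy can be sketched with synthetic data. The basis terms below (offset, axis-tilt, and gravity-sag-like terms) and the measurement distribution are illustrative assumptions, not the actual DSN model:

```python
# Hypothetical sketch of fitting a systematic pointing-error model by
# least squares. The basis functions and synthetic measurements are
# invented; inadequate sky coverage would show up as a rank deficiency.
import numpy as np

rng = np.random.default_rng(1)
az = rng.uniform(0, 2 * np.pi, 200)   # azimuth samples (rad)
el = rng.uniform(0.2, 1.4, 200)       # elevation samples (rad)

# Design matrix: constant offset, two axis-tilt terms, gravity-sag term
A = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az), np.cos(el)])
true_params = np.array([0.01, -0.005, 0.003, 0.008])   # assumed (deg)
meas = A @ true_params + rng.normal(0, 1e-4, 200)      # noisy residuals

params, _, rank, sv = np.linalg.lstsq(A, meas, rcond=None)
print("rank:", rank)                  # rank < 4 would signal degeneracy
print("fitted parameters:", np.round(params, 4))
```

    Restricting `az` and `el` to a narrow patch of sky makes the columns of `A` nearly collinear, collapsing the rank; parameter subset selection then amounts to fitting only the columns that remain well determined.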

  14. Modelling atmospheric transport of persistent organic pollutants in the Northern Hemisphere with a 3-D dynamical model: DEHM-POP

    NASA Astrophysics Data System (ADS)

    Hansen, K. M.; Christensen, J. H.; Brandt, J.; Frohn, L. M.; Geels, C.

    2004-03-01

    The Danish Eulerian Hemispheric Model (DEHM) is a 3-D dynamical atmospheric transport model originally developed to describe the atmospheric transport of sulphur into the Arctic. A new version of the model, DEHM-POP, developed to study the atmospheric transport and environmental fate of persistent organic pollutants (POPs), is presented. During environmental cycling, POPs can be deposited and re-emitted several times before reaching a final destination. A description of the exchange processes between the land/ocean surfaces and the atmosphere is included in the model to account for this multi-hop transport. The α-isomer of the pesticide hexachlorocyclohexane (α-HCH) is used as a tracer in the model development. The structure of the model and the processes included are described in detail. The results from a model simulation of the atmospheric transport for the years 1991 to 1998 are presented and evaluated against measurements. The annual averaged atmospheric concentration of α-HCH for the 1990s is well described by the model; however, the shorter-term average concentration at most of the stations is not well captured. This indicates that the present simple surface description needs to be refined to get a better description of the air-surface exchange processes of POPs.

  15. Application of a Model for Simulating the Vacuum Arc Remelting Process in Titanium Alloys

    NASA Astrophysics Data System (ADS)

    Patel, Ashish; Tripp, David W.; Fiore, Daniel

    Mathematical modeling is routinely used in the process development and production of advanced aerospace alloys to gain greater insight into system dynamics and to predict the effect of process modifications or upsets on final properties. This article describes the application of a 2-D mathematical VAR model presented at previous LMPC meetings. The impact of process parameters on melt pool geometry, solidification behavior, fluid flow, and chemistry in Ti-6Al-4V ingots is discussed. Model predictions were first validated against the measured characteristics of industrially produced ingots, and process inputs and model formulation were adjusted to match macro-etched pool shapes. The results are compared to published data in the literature. Finally, the model is used to examine ingot chemistry during successive VAR melts.

  16. Phonemic Characteristics of Apraxia of Speech Resulting from Subcortical Hemorrhage

    ERIC Educational Resources Information Center

    Peach, Richard K.; Tonkovich, John D.

    2004-01-01

    Reports describing subcortical apraxia of speech (AOS) have received little consideration in the development of recent speech processing models because the speech characteristics of patients with this diagnosis have not been described precisely. We describe a case of AOS with aphasia secondary to basal ganglia hemorrhage. Speech-language symptoms…

  17. Understanding Business Models in Health Care.

    PubMed

    Sharan, Alok D; Schroeder, Gregory D; West, Michael E; Vaccaro, Alexander R

    2016-05-01

    The increasing focus on the costs of care is forcing health care organizations to critically look at their basic set of processes and activities, to determine what type of value they can deliver. A business model describes the resources, processes, and cost assumptions that an organization makes that will lead to the delivery of a unique value proposition to a customer. As health care organizations are beginning to transform their structure in preparation for a value-based delivery system, understanding business model theory can help in the redesign process.

  18. Engineering stromal-epithelial interactions in vitro for ...

    EPA Pesticide Factsheets

    Background: Crosstalk between epithelial and stromal cells drives the morphogenesis of ectodermal organs during development and promotes normal mature adult epithelial tissue function. Epithelial-mesenchymal interactions (EMIs) have been examined using mammalian models, ex vivo tissue recombination, and in vitro co-cultures. Although these approaches have elucidated signaling mechanisms underlying morphogenetic processes and adult mammalian epithelial tissue function, they are limited by the availability of human tissue, low throughput, and human developmental or physiological relevance. Objectives: Bioengineering strategies to promote EMIs using human epithelial and mesenchymal cells have enabled the development of human in vitro models of adult epidermal and glandular tissues. In this review, we describe recent bioengineered models of human epithelial tissue and organs that can instruct the design of organotypic models of human developmental processes. Methods: We reviewed the current bioengineering literature and here describe how bioengineered EMIs have enabled the development of human in vitro epithelial tissue models. Discussion: Engineered models to promote EMIs have recapitulated the architecture, phenotype, and function of adult human epithelial tissue, and similar engineering principles could be used to develop models of developmental morphogenesis. We describe how bioengineering strategies including bioprinting and spheroid culture could be implemented to

  19. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  20. Modeling Adsorption-Desorption Processes at the Intermolecular Interactions Level

    NASA Astrophysics Data System (ADS)

    Varfolomeeva, Vera V.; Terentev, Alexey V.

    2018-01-01

    Modeling of surface adsorption and desorption processes, as well as diffusion, is of considerable interest for the physical phenomena studied under ground-test conditions. When imitating physical processes and phenomena, it is important to choose the correct parameters to describe the adsorption of gases and the formation of films on the surfaces of structural materials. In the present research the adsorption-desorption processes at the gas-solid interface are modeled with allowance for diffusion. Approaches are proposed to describe the adsorbate distribution on the solid surface at the level of intermolecular interactions. The potentials of the intermolecular interactions of water-water, water-methane, and methane-methane were used to adequately model the real physical and chemical processes. The energies were calculated by the B3LYP/aug-cc-pVDZ method. Computational algorithms for determining the average molecule area in a dense monolayer are considered here. Differences between the modeling approaches are also given: the approach proposed in this work versus the previously approved probabilistic cellular automaton (PCA) method. It has been shown that the main difference is due to certain limitations of the PCA method. The importance of accounting for intermolecular interactions via hydrogen bonding has been indicated. Further development of adsorption-desorption process modeling will make it possible to find conditions for regulating surface processes by controlling the quantity of adsorbed molecules. The proposed approach to representing the molecular system significantly shortens the calculation time in comparison with the use of atom-atom potentials. In the future, this will allow modeling of multilayer adsorption at a reasonable computational cost.

  1. A Stochastic Model of Space Radiation Transport as a Tool in the Development of Time-Dependent Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2011-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections

  2. Developing a logic model for youth mental health: participatory research with a refugee community in Beirut

    PubMed Central

    Afifi, Rema A; Makhoul, Jihad; El Hajj, Taghreed; Nakkash, Rima T

    2011-01-01

    Although logic models are now touted as an important component of health promotion planning, implementation and evaluation, there are few published manuscripts that describe the process of logic model development, and fewer which do so with community involvement, despite the increasing emphasis on participatory research. This paper describes a process leading to the development of a logic model for a youth mental health promotion intervention using a participatory approach in a Palestinian refugee camp in Beirut, Lebanon. First, a needs assessment, including quantitative and qualitative data collection was carried out with children, parents and teachers. The second phase was identification of a priority health issue and analysis of determinants. The final phase in the construction of the logic model involved development of an intervention. The process was iterative and resulted in a more grounded depiction of the pathways of influence informed by evidence. Constructing a logic model with community input ensured that the intervention was more relevant to community needs, feasible for implementation and more likely to be sustainable. PMID:21278370

  3. High-resolution modeling of a marine ecosystem using the FRESCO hydroecological model

    NASA Astrophysics Data System (ADS)

    Zalesny, V. B.; Tamsalu, R.

    2009-02-01

    The FRESCO (Finnish Russian Estonian Cooperation) mathematical model describing a marine hydroecosystem is presented. The methodology of the numerical solution is based on the method of multicomponent splitting into physical and biological processes, spatial coordinates, etc. The model is used for the reproduction of physical and biological processes proceeding in the Baltic Sea. Numerical experiments are performed with different spatial resolutions for four marine basins that are enclosed into one another: the Baltic Sea, the Gulf of Finland, the Tallinn-Helsinki water area, and Tallinn Bay. Physical processes are described by the equations of nonhydrostatic dynamics, including the k-ω parametrization of turbulence. Biological processes are described by the three-dimensional equations of an aquatic ecosystem with the use of a size-dependent parametrization of biochemical reactions. The main goal of this study is to illustrate the efficiency of the developed numerical technique and to demonstrate the importance of a high spatial resolution for water basins that have complex bottom topography, such as the Baltic Sea. Detailed information about the atmospheric forcing, bottom topography, and coastline is very important for the description of coastal dynamics and specific features of a marine ecosystem. Experiments show that the spatial inhomogeneity of hydroecosystem fields is caused by the combined effect of upwelling, turbulent mixing, surface-wave breaking, and temperature variations, which affect biochemical reactions.
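    The multicomponent splitting mentioned above advances each time step by treating the physical and biological operators separately. The sketch below shows the idea (Lie splitting of 1-D advection and exponential biological growth) with invented coefficients; FRESCO's operators are of course far more elaborate:

```python
# Operator-splitting sketch: advance a tracer one step by applying the
# transport operator and the biological operator in sequence (Lie
# splitting). Grid, speed, and growth rate are illustrative assumptions.
import numpy as np

n, dx, dt = 100, 1.0, 0.5
u, growth = 0.8, 0.05   # advection speed, biological growth rate
c = np.exp(-((np.arange(n) - 20) ** 2) / 50.0)   # initial tracer blob

def advect(c):
    # first-order upwind transport step (periodic domain, CFL = 0.4)
    return c - u * dt / dx * (c - np.roll(c, 1))

def react(c):
    # exact exponential update for the biological growth sub-step
    return c * np.exp(growth * dt)

for _ in range(50):
    c = react(advect(c))   # split step: physics first, then biology

print(f"total tracer mass: {c.sum():.2f}")
```

    Splitting lets each sub-problem use its own best solver (here an upwind scheme and an exact exponential update), at the cost of a splitting error that is first order in dt for Lie splitting.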

  4. Modeling and Analysis of Global and Regional Climate Change in Relation to Atmospheric Hydrologic Processes

    NASA Technical Reports Server (NTRS)

    Johnson, Donald R.

    1998-01-01

    The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.

  5. Legal Policy Optimizing Models

    ERIC Educational Resources Information Center

    Nagel, Stuart; Neef, Marian

    1977-01-01

    The use of mathematical models originally developed by economists and operations researchers is described for legal process research. Situations involving plea bargaining, arraignment, and civil liberties illustrate the applicability of decision theory, inventory modeling, and linear programming in operations research. (LBH)
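
    The flavor of a decision-theoretic treatment of plea bargaining can be conveyed in a few lines (a hypothetical illustration with invented numbers, not the authors' model):

```python
# A defendant weighing a plea offer against trial can compare the offered
# sentence with the expected sentence at trial (conviction probability
# times sentence if convicted). All figures here are hypothetical.

def expected_trial_sentence(p_conviction, sentence_if_convicted):
    return p_conviction * sentence_if_convicted

def accept_plea(plea_sentence, p_conviction, sentence_if_convicted):
    # accept when the certain plea sentence beats the expected trial outcome
    return plea_sentence < expected_trial_sentence(p_conviction, sentence_if_convicted)

# e.g. a 2-year offer vs. a 60% chance of a 5-year sentence at trial
decision = accept_plea(plea_sentence=2.0, p_conviction=0.6, sentence_if_convicted=5.0)
```

    Inventory-modeling and linear-programming formulations of arraignment and caseload questions follow the same pattern of casting a legal process as an optimization over a few quantified parameters.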

  6. Some Reflections on Strategic Planning in Public Libraries.

    ERIC Educational Resources Information Center

    Palmour, Vernon E.

    1985-01-01

    Presents the Public Library Association's planning model for strategic planning in public libraries. The development of the model is explained, the basic steps of the planning process are described, and improvements to the model are suggested. (CLB)

  7. Business process modeling for the Virginia Department of Transportation : a demonstration with the integrated six-year improvement program and the statewide transportation improvement program : executive summary.

    DOT National Transportation Integrated Search

    2005-01-01

    This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...

  8. The Sender-Receiver Model and the Targeting Process.

    ERIC Educational Resources Information Center

    Larson, Mark A.

    The goal of this paper is to describe how one classroom teacher uses the Sender-Receiver Communications Model to illustrate for students in a lively and memorable way the process of "targeting your audience" with medium and message. Students are used as examples of Receivers, or target audience, illustrating the potential range of…

  9. Mental Workload as a Key Factor in Clinical Decision Making

    ERIC Educational Resources Information Center

    Byrne, Aidan

    2013-01-01

    The decision making process is central to the practice of a clinician and has traditionally been described in terms of the hypothetico-deductive model. More recently, models adapted from cognitive psychology, such as the dual process and script theories have proved useful in explaining patterns of practice not consistent with purely cognitive…

  10. Capturing business requirements for the Swedish national information structure.

    PubMed

    Kajbjer, Karin; Johansson, Catharina

    2009-01-01

    As a subproject for the National Information Structure project of the National Board of Health and Welfare, four different stakeholder groups were used to capture business requirements. These were: Subjects of care, Health professionals, Managers/Research and Industry. The process is described with formulating goal models, concept, process and information models.

  11. Intellectual, Psychosocial, and Moral Development in College: Four Major Theories. Revised.

    ERIC Educational Resources Information Center

    Kurfiss, Joanne

    Four models are discussed with which to view students, educational goals, and learning environments. Each of the four theories emphasizes a unique aspect of the total development process. Piaget's model describes the development of structures and processes which characterize mature logical thinking. Perry provides a closer look at students'…

  12. The Point of Creative Frustration and the Creative Process: A New Look at an Old Model.

    ERIC Educational Resources Information Center

    Sapp, D. David

    1992-01-01

    This paper offers an extension of Graham Wallas' model of the creative process. It identifies periods of problem solving, incubation, and growth with specific points of initial idea inception, creative frustration, and illumination. Responses to creative frustration are described including denial, rationalization, acceptance of stagnation, and new…

  13. Developing Explanations and Developing Understanding: Students Explain the Phases of the Moon Using Visual Representations

    ERIC Educational Resources Information Center

    Parnafes, Orit

    2012-01-01

    This article presents a theoretical model of the process by which students construct and elaborate explanations of scientific phenomena using visual representations. The model describes progress in the underlying conceptual processes in students' explanations as a reorganization of fine-grained knowledge elements based on the Knowledge in Pieces…

  14. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable versus a requirement which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model Based Design approach and process from infancy to verification and certification are discussed

  15. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defence Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  16. Coarse-graining, Electrostatics and pH effects in phospholipid systems

    NASA Astrophysics Data System (ADS)

    Travesset, Alex; Vangaveti, Sweta

    2010-03-01

    We introduce a minimal free energy describing the interaction of charged groups and counterions including both classical electrostatic and specific interactions. The predictions of the model are compared against the standard model for describing ions next to charged interfaces, consisting of Poisson-Boltzmann theory with additional constants describing ion binding, which are specific to the counterion and the interfacial charge (``chemical binding''). It is shown that the ``chemical'' model can be appropriately described by an underlying ``physical'' model over several decades in concentration, but the extracted binding constants are not uniquely defined, as they differ depending on the particular observable quantity being studied. It is also shown that electrostatic correlations for divalent (or higher valence) ions enhance the surface charge by increasing deprotonation, an effect not properly accounted for within chemical models. The model is applied to the charged phospholipids phosphatidylserine, phosphatidic acid, and phosphoinositides, and implications for different biological processes are discussed.

  17. Dynamic occupancy models for explicit colonization processes

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Johnson, Devin S.; Altwegg, Res; Conquest, Loveday

    2016-01-01

    The dynamic, multi-season occupancy model framework has become a popular tool for modeling open populations with occupancies that change over time through local colonizations and extinctions. However, few versions of the model relate these probabilities to the occupancies of neighboring sites or patches. We present a modeling framework that incorporates this information and is capable of describing a wide variety of spatiotemporal colonization and extinction processes. A key feature of the model is that it is based on a simple set of small-scale rules describing how the process evolves. The result is a dynamic process that can account for complicated large-scale features. In our model, a site is more likely to be colonized if more of its neighbors were previously occupied and if it provides more appealing environmental characteristics than its neighboring sites. Additionally, a site without occupied neighbors may also become colonized through the inclusion of a long-distance dispersal process. Although similar model specifications have been developed for epidemiological applications, ours formally accounts for detectability using the well-known occupancy modeling framework. After demonstrating the viability and potential of this new form of dynamic occupancy model in a simulation study, we use it to obtain inference for the ongoing Common Myna (Acridotheres tristis) invasion in South Africa. Our results suggest that the Common Myna continues to enlarge its distribution and that it spreads via short-distance movement rather than long-distance dispersal. Overall, this new modeling framework provides a powerful tool for managers examining the drivers of colonization, including short- vs. long-distance dispersal, habitat quality, and distance from source populations.
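
    The small-scale rules described above — colonization more likely next to occupied sites, plus a baseline long-distance component — can be sketched as a toy grid simulation (parameter names and values are invented for illustration; the actual model also accounts for imperfect detection, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_occupancy(n=20, seasons=30, gamma0=0.05, gamma_nbr=0.2, eps=0.1):
    """Toy neighborhood-dependent occupancy dynamics.

    gamma0    : baseline (long-distance dispersal) colonization probability
    gamma_nbr : added colonization probability per occupied rook neighbor
    eps       : local extinction probability per season
    """
    z = np.zeros((n, n), dtype=int)
    z[n // 2, n // 2] = 1                      # single founding site
    for _ in range(seasons):
        # count occupied rook neighbors, zero-padding at the edges
        p = np.pad(z, 1)
        nbrs = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
        gamma = np.clip(gamma0 + gamma_nbr * nbrs, 0.0, 1.0)
        colonize = rng.random((n, n)) < gamma   # empty sites may be colonized
        survive = rng.random((n, n)) >= eps     # occupied sites may go extinct
        z = np.where(z == 1, survive, colonize).astype(int)
    return z

z = simulate_occupancy()
occupied_fraction = z.mean()
```

    Fitting such a process to detection/non-detection data, rather than simulating it forward, is what the occupancy modeling framework adds.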

  18. A coordination theory for intelligent machines

    NASA Technical Reports Server (NTRS)

    Wang, Fei-Yue; Saridis, George N.

    1990-01-01

    A formal model for the coordination level of intelligent machines is established. The framework of the coordination level investigated consists of one dispatcher and a number of coordinators. The model called coordination structure has been used to describe analytically the information structure and information flow for the coordination activities in the coordination level. Specifically, the coordination structure offers a formalism to (1) describe the task translation of the dispatcher and coordinators; (2) represent the individual process within the dispatcher and coordinators; (3) specify the cooperation and connection among the dispatcher and coordinators; (4) perform the process analysis and evaluation; and (5) provide a control and communication mechanism for the real-time monitor or simulation of the coordination process. A simple procedure for the task scheduling in the coordination structure is presented. The task translation is achieved by a stochastic learning algorithm. The learning process is measured with entropy and its convergence is guaranteed. Finally, a case study of the coordination structure with three coordinators and one dispatcher for a simple intelligent manipulator system illustrates the proposed model and the simulation of the task processes performed on the model verifies the soundness of the theory.
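
    The stochastic learning measured with entropy can be illustrated with a generic linear reward-inaction automaton (a sketch under assumed parameters, not the paper's algorithm):

```python
import math
import random

random.seed(3)

def entropy(p):
    """Shannon entropy of an action-probability vector."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def learn(success_prob, alpha=0.05, steps=4000):
    """Linear reward-inaction scheme: reinforce an action only when it succeeds."""
    p = [1.0 / len(success_prob)] * len(success_prob)   # start undecided
    for _ in range(steps):
        i = random.choices(range(len(p)), weights=p)[0]
        if random.random() < success_prob[i]:
            # move probability mass toward the rewarded action i
            p = [pj + alpha * ((1.0 if j == i else 0.0) - pj)
                 for j, pj in enumerate(p)]
    return p

p = learn([0.2, 0.9, 0.4])          # hypothetical per-coordinator success rates
h_uniform = entropy([1/3, 1/3, 1/3])
h_learned = entropy(p)              # entropy drops as the translation is learned
```

    The decreasing entropy of the action probabilities is the kind of convergence measure the abstract refers to.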

  19. ASSIST: User's manual

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1986-01-01

    Semi-Markov models can be used to compute the reliability of virtually any fault-tolerant system. However, the process of delineating all of the states and transitions in a model of a complex system can be devastatingly tedious and error-prone. The ASSIST program allows the user to describe the semi-Markov model in a high-level language. Instead of specifying the individual states of the model, the user specifies the rules governing the behavior of the system, and these are used by ASSIST to automatically generate the model. The ASSIST program is described and illustrated by examples.
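
    The core idea — generating a Markov model's states from behavior rules instead of enumerating them by hand — can be sketched as follows (a hypothetical rule and failure rate, not ASSIST's input language):

```python
from fractions import Fraction

LAMBDA = Fraction(1, 1000)  # assumed per-hour failure rate of one processor

def failure_rule(n_working):
    """Rule: while more than one processor works, any one may fail (rate n*lambda)."""
    if n_working > 1:
        return [(n_working - 1, n_working * LAMBDA)]
    return []

def generate_model(initial_state, rules):
    """Worklist expansion of the state space from transition rules."""
    states, frontier, transitions = {initial_state}, [initial_state], []
    while frontier:
        s = frontier.pop()
        for rule in rules:
            for t, rate in rule(s):
                transitions.append((s, t, rate))
                if t not in states:
                    states.add(t)
                    frontier.append(t)
    return states, transitions

# a 4-processor system: states 4 -> 3 -> 2 -> 1 are generated automatically
states, transitions = generate_model(4, [failure_rule])
```

    Adding a second rule (e.g. repair, or coverage failure) enlarges the generated model without the user ever listing states explicitly — the labor-saving point the abstract makes.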

  20. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    PubMed

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched by workflow and own symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, privacy enhancing techniques (PET) and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases, data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. 
All rights reserved.

  1. Towards Semantic Modelling of Business Processes for Networked Enterprises

    NASA Astrophysics Data System (ADS)

    Furdík, Karol; Mach, Marián; Sabol, Tomáš

    The paper presents an approach to the semantic modelling and annotation of business processes and information resources, as it was designed within the FP7 ICT EU project SPIKE to support creation and maintenance of short-term business alliances and networked enterprises. A methodology for the development of the resource ontology, as a shareable knowledge model for semantic description of business processes, is proposed. Systematically collected user requirements, conceptual models implied by the selected implementation platform as well as available ontology resources and standards are employed in the ontology creation. The process of semantic annotation is described and illustrated using an example taken from a real application case.

  2. A combustion model of vegetation burning in "Tiger" fire propagation tool

    NASA Astrophysics Data System (ADS)

    Giannino, F.; Ascoli, D.; Sirignano, M.; Mazzoleni, S.; Russo, L.; Rego, F.

    2017-11-01

    In this paper, we propose a semi-physical model for the burning of vegetation in a wildland fire. The main physical-chemical processes involved in fire spreading are modelled through a set of ordinary differential equations, which describe the combustion process as linearly related to the consumption of fuel. The water evaporation process from leaves and wood is also considered. Mass and energy balance equations are written for fuel (leaves and wood) assuming that combustion process is homogeneous in space. The model is developed with the final aim of simulating large-scale wildland fires which spread on heterogeneous landscape while keeping the computation cost very low.
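
    The structure of such a model — ODE mass balances with heat release linearly related to fuel consumption, and water evaporated alongside — might be sketched as follows (all parameter values are invented for illustration, not taken from the paper):

```python
# Explicit-Euler integration of a minimal fuel/water/energy balance:
#   dm_fuel/dt  = -k_burn * m_fuel        (combustion)
#   dm_water/dt = -k_evap * m_water       (evaporation from leaves and wood)
#   released heat is proportional to fuel consumed (linear relation).

def burn(fuel0, water0, k_burn=0.08, k_evap=0.3, h_comb=18e6, dt=0.1, t_end=60.0):
    fuel, water, released = fuel0, water0, 0.0
    t = 0.0
    while t < t_end:
        water -= k_evap * water * dt        # water loss this step
        dm = k_burn * fuel * dt             # fuel consumed this step
        fuel -= dm
        released += h_comb * dm             # energy ~ linear in fuel burnt [J]
        t += dt
    return fuel, water, released

fuel, water, energy = burn(fuel0=2.0, water0=0.5)  # kg fuel, kg water per cell
```

    Keeping the per-cell combustion model this cheap is what makes large-scale, spatially heterogeneous fire-spread simulation computationally feasible.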

  3. Cognitive Support During High-Consequence Episodes of Care in Cardiovascular Surgery.

    PubMed

    Conboy, Heather M; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Christov, Stefan C; Goldman, Julian M; Yule, Steven J; Zenati, Marco A

    2017-03-01

    Despite significant efforts to reduce preventable adverse events in medical processes, such events continue to occur at unacceptable rates. This paper describes a computer science approach that uses formal process modeling to provide situationally aware monitoring and management support to medical professionals performing complex processes. These process models represent both normative and non-normative situations, and are validated by rigorous automated techniques such as model checking and fault tree analysis, in addition to careful review by experts. Context-aware Smart Checklists are then generated from the models, providing cognitive support during high-consequence surgical episodes. The approach is illustrated with a case study in cardiovascular surgery.

  4. Conceptual Model of Iodine Behavior in the Subsurface at the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Truex, Michael J.; Lee, Brady D.; Johnson, Christian D.

    The fate and transport of 129I in the environment and potential remediation technologies are currently being studied as part of environmental remediation activities at the Hanford Site. A conceptual model describing the nature and extent of subsurface contamination, factors that control plume behavior, and factors relevant to potential remediation processes is needed to support environmental remedy decisions. Because 129I is an uncommon contaminant, relevant remediation experience and scientific literature are limited. Thus, the conceptual model needs both to describe known contaminant and biogeochemical process information and to identify aspects about which additional information is needed to effectively support remedy decisions. This document summarizes the conceptual model of iodine behavior in the subsurface environment at the Hanford Site.

  5. A Conceptual Model of the Pasadena Housing System

    NASA Technical Reports Server (NTRS)

    Hirshberg, Alan S.; Barber, Thomas A.

    1971-01-01

    During the last 5 years, there have been several attempts at applying systems analysis to complex urban problems. This paper describes one such attempt by a multidisciplinary team of students, engineers, professors, and community representatives. The Project organization is discussed and the interaction of the different disciplines (the process) described. The two fundamental analysis questions posed by the Project were: "Why do houses deteriorate?" and "Why do people move?" The analysis of these questions led to the development of a conceptual system model of housing in Pasadena. The major elements of this model are described, and several conclusions drawn from it are presented.

  6. Automated Interactive Simulation Model (AISIM) VAX Version 5.0 Training Manual.

    DTIC Science & Technology

    1987-05-29

    A Process entity is used to represent the operations, decisions, actions, or activities of the modeled system that can be decomposed. Time-consuming behavior is associated with the Action entity: an entity representing an action, activity, or decision that consumes time is created automatically by the system when an ACTION primitive is placed, and is included in Process definitions to indicate the time a certain action, process, or decision takes.

  7. On the nature of bias and defects in the software specification process

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1992-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. This paper describes the problem of bias. Additionally, this paper presents a model of the specification and design processes describing individual subprocesses in terms of precision/detail diagrams and a model of bias in multi-attribute software specifications. While studying how bias is introduced into a specification we realized that software defects and bias are dual problems of a single phenomenon. This was used to explain the large proportion of faults found during the coding phase at the Software Engineering Laboratory at NASA/GSFC.

  8. Comparison of elevation derived from insar data with dem from topography map in Son Dong, Bac Giang, Viet Nam

    NASA Astrophysics Data System (ADS)

    Nguyen, Duy

    2012-07-01

    Digital Elevation Models (DEMs) are used in many applications in the context of earth sciences such as in topographic mapping, environmental modeling, rainfall-runoff studies, landslide hazard zonation, seismic source modeling, etc. During the last years multitude of scientific applications of Synthetic Aperture Radar Interferometry (InSAR) techniques have evolved. It has been shown that InSAR is an established technique of generating high quality DEMs from space borne and airborne data, and that it has advantages over other methods for the generation of large area DEM. However, the processing of InSAR data is still a challenging task. This paper describes InSAR operational steps and processing chain for DEM generation from Single Look Complex (SLC) SAR data and compare a satellite SAR estimate of surface elevation with a digital elevation model (DEM) from Topography map. The operational steps are performed in three major stages: Data Search, Data Processing, and product Validation. The Data processing stage is further divided into five steps of Data Pre-Processing, Co-registration, Interferogram generation, Phase unwrapping, and Geocoding. The Data processing steps have been tested with ERS 1/2 data using Delft Object-oriented Interferometric (DORIS) InSAR processing software. Results of the outcome of the application of the described processing steps to real data set are presented.
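
    The central interferogram-generation step of the chain (between co-registration and phase unwrapping) reduces to one complex multiplication per pixel; a synthetic sketch (invented data, not ERS imagery):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two co-registered single-look-complex (SLC) images: the second carries an
# extra phase ramp standing in for the topographic signal. The interferogram
# is the first SLC times the complex conjugate of the second.

rows, cols = 64, 64
true_phase = np.tile(np.linspace(0.0, 6 * np.pi, cols), (rows, 1))

slc1 = np.exp(1j * rng.uniform(0, 2 * np.pi, (rows, cols)))   # random scene phase
slc2 = slc1 * np.exp(-1j * true_phase)                         # second pass, shifted

interferogram = slc1 * np.conj(slc2)       # scene phase cancels, topo phase remains
wrapped_phase = np.angle(interferogram)    # in (-pi, pi]; unwrapping recovers true_phase
```

    Phase unwrapping then removes the 2π ambiguities in `wrapped_phase`, and geocoding converts the unwrapped phase to terrain heights in a map projection.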

  9. Development of a case tool to support decision based software development

    NASA Technical Reports Server (NTRS)

    Wild, Christian J.

    1993-01-01

    A summary of the accomplishments of the research over the past year is presented. Achievements include: made demonstrations with DHC, a prototype supporting the decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery, and the porting process model; completed and submitted a paper describing the DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, a facility necessary for better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the reengineering problem; and developed a primary chart of the problems involved in the reengineering process.

  10. Data driven model generation based on computational intelligence

    NASA Astrophysics Data System (ADS)

    Gemmar, Peter; Gronz, Oliver; Faust, Christophe; Casper, Markus

    2010-05-01

    The simulation of discharge at a local gauge and the modeling of large-scale river catchments are central to estimation and decision tasks in hydrological research and in practical applications such as flood prediction or water resource management. However, modeling such processes using analytical or conceptual approaches is made difficult by both the complexity of process relations and the heterogeneity of processes. It has been shown repeatedly that unknown or assumed process relations can in principle be described by computational methods, and that system models can be derived automatically from observed behavior or measured process data. This study describes the development of hydrological process models using computational methods, including Fuzzy logic and artificial neural networks (ANN), in a comprehensive and automated manner. Methods: We consider a closed concept for the data-driven development of hydrological models based on measured (experimental) data. The concept is centered on a Fuzzy system using rules of Takagi-Sugeno-Kang type, which formulate the input-output relation in a generic structure such as R_i: IF q(t) = low AND ... THEN q(t+Δt) = a_i0 + a_i1·q(t) + a_i2·p(t-Δt_i1) + a_i3·p(t+Δt_i2) + ... . The rule's premise part (IF) describes process states involving available process information, e.g. the actual outlet q(t) is low, where low is one of several Fuzzy sets defined over the variable q(t). The rule's conclusion (THEN) estimates the expected outlet q(t+Δt) by a linear function over selected system variables, e.g. the actual outlet q(t) and previous and/or forecasted precipitation p(t±Δt_ik). In the case of river catchment modeling we use head gauges and tributary and upriver gauges in the conclusion part as well. In addition, we consider temperature and temporal (season) information in the premise part. By creating a set of rules R = {R_i | i = 1, ..., N}, the space of process states can be covered as concisely as necessary. Model adaptation is achieved by finding an optimal set A = (a_ij) of conclusion parameters with respect to a defined rating function and experimental data. To find A, we use, for example, a linear equation solver and an RMSE rating function. In practical process models, the number of Fuzzy sets and the corresponding number of rules is fairly low. Nevertheless, creating the optimal model requires some experience. Therefore, we improved this development step with methods for the automatic generation of Fuzzy sets, rules, and conclusions. Basically, the model quality depends to a great extent on the selection of the conclusion variables. The aim is that variables having the most influence on the system reaction are considered and superfluous ones are neglected. First, we use Kohonen maps, a specialized ANN, to identify relevant input variables from the large set of available system variables. A greedy algorithm selects a comprehensive set of dominant and uncorrelated variables. Next, the premise variables are analyzed with clustering methods (e.g. Fuzzy C-means), and Fuzzy sets are then derived from cluster centers and outlines. The rule base is constructed automatically by permutation of the Fuzzy sets of the premise variables. Finally, the conclusion parameters are calculated and the total coverage of the input space is tested iteratively with experimental data; rarely firing rules are combined, and coarse coverage of sensitive process states results in refined Fuzzy sets and rules. Results: The described methods were implemented and integrated in a development system for process models. A series of models has already been built, e.g. for rainfall-runoff modeling or for flood prediction (up to 72 hours) in river catchments. The models required significantly less development effort and showed better simulation results compared to conventional models. The models can be used operationally, and a simulation takes only a few minutes on a standard PC, e.g. for a gauge forecast (up to 72 hours) for the whole Mosel (Germany) river catchment.
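
    A minimal evaluation of a Takagi-Sugeno-Kang rule base of the form given above might look like this (membership functions and coefficients are invented for illustration, not taken from the described models):

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def tsk_forecast(q_t, p_t):
    """Two TSK rules: IF q(t) is low/high THEN q(t+dt) = a0 + a1*q(t) + a2*p(t)."""
    # premise memberships over the current discharge q(t)
    w_low = tri(q_t, -10.0, 0.0, 50.0)
    w_high = tri(q_t, 0.0, 50.0, 110.0)
    # linear rule conclusions (illustrative coefficients)
    y_low = 1.0 + 0.95 * q_t + 0.4 * p_t
    y_high = 5.0 + 0.90 * q_t + 0.8 * p_t
    # weighted-average defuzzification
    return (w_low * y_low + w_high * y_high) / (w_low + w_high)

q_next = tsk_forecast(q_t=20.0, p_t=3.0)   # forecast discharge one step ahead
```

    Because each conclusion is linear in the inputs, the coefficients a_ij can indeed be fitted with a linear least-squares solver, as the abstract describes.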

  11. Toward a Rational Educational Policy. An Econometric Analysis of Ontario, Canada, 1950-65 with Tests 1966-68 and Projections 1969-75.

    ERIC Educational Resources Information Center

    Handa, M. L.

    This report describes some models the author developed to investigate the simultaneous interaction of decisionmakers in a province-wide educational system and to help formulate educational policy for achieving specified enrollments and expenditures. In chapter one, the author describes the models that examine the process of simultaneous…

  12. Social Power and Influence: Understanding Its Relevance in Early Childhood Consultation

    ERIC Educational Resources Information Center

    Spino, Margie A.; Dinnebeil, Laurie A.; McInerney, William F.

    2013-01-01

    The purpose of this article is to introduce and describe a model of social power and influence developed by Erchul and Raven (1997). This model describes the decision-making process a consultant would engage in to choose, implement, and evaluate strategies that they might use to influence another person to act in a particular…

  13. Using Dispersed Modes During Model Correlation

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.; Hathcock, Megan L.

    2017-01-01

The model correlation process for the modal characteristics of a launch vehicle is well established. After a test, parameters within the nominal model are adjusted to reflect structural dynamics revealed during testing. However, a full model correlation process for a complex structure can take months of man-hours and many computational resources. If the analyst only has weeks, or even days, in which to correlate the nominal model to the experimental results, then the traditional correlation process is not suitable. This paper describes using model dispersions to assist the model correlation process and decrease its overall cost. The process creates thousands of model dispersions from the nominal model prior to the test and then compares each of them to the test data. Using mode shape and frequency error metrics, one dispersion is selected as the best match to the test data. This dispersion is further improved by using commercial model correlation software. In the three examples shown in this paper, this dispersion-based model correlation process performs well when compared to models correlated using traditional techniques and saves time in the post-test analysis.
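The mode-matching step of such a dispersion-based correlation can be illustrated with the standard Modal Assurance Criterion (MAC). The toy mode shapes, frequencies, and the equal error weighting below are assumptions for illustration; the paper's actual metrics may differ.

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion between an analytical and a test mode shape."""
    return (phi_a @ phi_e) ** 2 / ((phi_a @ phi_a) * (phi_e @ phi_e))

def best_dispersion(dispersions, test_modes, test_freqs, w_freq=0.5):
    """Rank dispersed models by combined frequency error and (1 - MAC)."""
    scores = []
    for freqs, modes in dispersions:
        freq_err = np.mean(np.abs(freqs - test_freqs) / test_freqs)
        mac_err = np.mean([1.0 - mac(m, t) for m, t in zip(modes, test_modes)])
        scores.append(w_freq * freq_err + (1 - w_freq) * mac_err)
    return int(np.argmin(scores))

# Test data: two modes of a toy 3-DOF structure
test_freqs = np.array([10.0, 25.0])
test_modes = [np.array([1.0, 2.0, 1.0]), np.array([1.0, 0.0, -1.0])]
# Two hypothetical dispersions; the second matches the test data more closely
d1 = (np.array([12.0, 28.0]), [np.array([1.0, 1.5, 1.2]), np.array([1.0, 0.5, -1.0])])
d2 = (np.array([10.2, 25.5]), [np.array([1.0, 1.9, 1.0]), np.array([1.0, 0.1, -1.0])])
print(best_dispersion([d1, d2], test_modes, test_freqs))  # index of the best match
```

With thousands of dispersions, the same scoring loop simply runs over the pre-computed dispersion library.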

  14. Nonlinear Growth Curves in Developmental Research

    PubMed Central

    Grimm, Kevin J.; Ram, Nilam; Hamagami, Fumiaki

    2011-01-01

    Developmentalists are often interested in understanding change processes and growth models are the most common analytic tool for examining such processes. Nonlinear growth curves are especially valuable to developmentalists because the defining characteristics of the growth process such as initial levels, rates of change during growth spurts, and asymptotic levels can be estimated. A variety of growth models are described beginning with the linear growth model and moving to nonlinear models of varying complexity. A detailed discussion of nonlinear models is provided, highlighting the added insights into complex developmental processes associated with their use. A collection of growth models are fit to repeated measures of height from participants of the Berkeley Growth and Guidance Studies from early childhood through adulthood. PMID:21824131
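As a rough illustration of fitting a nonlinear growth curve, the sketch below fits a three-parameter logistic model to synthetic height-like data with scipy. The data and parameter values are invented for the example and are not from the Berkeley studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, asym, rate, t_mid):
    """Logistic growth: asymptotic level, growth rate, and age of fastest growth."""
    return asym / (1.0 + np.exp(-rate * (t - t_mid)))

# Hypothetical height-like data (cm) over ages 2-20, generated from known parameters
rng = np.random.default_rng(1)
ages = np.linspace(2, 20, 30)
true = (175.0, 0.35, 11.0)
heights = logistic(ages, *true) + rng.normal(0, 1.0, ages.size)

params, _ = curve_fit(logistic, ages, heights, p0=(160.0, 0.5, 10.0))
print(np.round(params, 2))  # estimated (asymptote, rate, midpoint)
```

The estimated asymptote and midpoint map directly onto the "asymptotic levels" and "growth spurts" the abstract mentions as interpretable features of nonlinear models.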

  15. Multi-scale modeling for the transmission of influenza and the evaluation of interventions toward it.

    PubMed

    Guo, Dongmin; Li, King C; Peters, Timothy R; Snively, Beverly M; Poehling, Katherine A; Zhou, Xiaobo

    2015-03-11

Mathematical modeling of influenza epidemics is important for analyzing the main cause of an epidemic and finding effective interventions against it. An epidemic is a dynamic process. In this process, daily infections are caused by people's contacts, and the frequency of contacts is mainly influenced by their awareness of the disease. This awareness is in turn influenced by the daily illness attack rate, climate, and other environmental factors. Few existing methods have considered this dynamic process in their models. Therefore, their prediction results can hardly be explained by the mechanisms of epidemic spreading. In this paper, we developed a heterogeneous graph modeling approach (HGM) to describe the dynamic process of influenza virus transmission by taking advantage of our unique clinical data. We built a social network of the studied region and embedded an Agent-Based Model (ABM) in the HGM to describe the dynamic change of an epidemic. Our simulations are in good agreement with clinical data. Parameter sensitivity analysis showed that temperature influences the dynamics of an epidemic significantly, and system behavior analysis showed that social network degree is a critical factor determining the size of an epidemic. Finally, multiple scenarios for vaccination and school closure strategies were simulated and their performance was analyzed.
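A minimal sketch of an agent-based epidemic model on a contact network, in the spirit of the ABM embedded in the HGM, might look as follows. The network model, parameter values, and update rules are simplifying assumptions, not the authors' implementation.

```python
import random

def simulate_sir(n=200, avg_degree=8, p_trans=0.05, t_rec=7, days=60, seed=3):
    """Agent-based SIR on a random (Erdos-Renyi-style) contact network."""
    rng = random.Random(seed)
    p_edge = avg_degree / (n - 1)
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                nbrs[i].append(j)
                nbrs[j].append(i)
    state = ["S"] * n                 # susceptible, infected, or recovered
    days_inf = [0] * n
    for i in rng.sample(range(n), 5):
        state[i] = "I"                # seed infections
    for _ in range(days):
        new_inf = []
        for i in range(n):
            if state[i] == "I":
                days_inf[i] += 1
                for j in nbrs[i]:     # daily contacts drive new infections
                    if state[j] == "S" and rng.random() < p_trans:
                        new_inf.append(j)
        for i in range(n):
            if state[i] == "I" and days_inf[i] >= t_rec:
                state[i] = "R"
        for j in new_inf:
            state[j] = "I"
    return state.count("R") + state.count("I")   # epidemic final size

print(simulate_sir())
```

The abstract's finding that network degree drives epidemic size corresponds here to varying `avg_degree`; awareness- and climate-dependent contact rates would make `p_trans` time-varying.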

  16. A statistical metadata model for clinical trials' data management.

    PubMed

    Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos

    2009-08-01

We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for the pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate in the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is that we focus on presenting the structural inter-relation of these classes during dataset manipulation by proposing transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human errors and allows for the tracking of all changes made during the dataset lifecycle. The explicit modeling of processing steps improves data quality and assists in the problem of handling data collected in different clinical trials. The case study illustrates the applicability of the proposed framework, demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we provide the main considerations for implementing the proposed framework in a modern metadata-enabled information system.
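The idea of transforming data and metadata simultaneously while logging every step can be sketched as follows; the class names and the example transformation are hypothetical, not the paper's actual metadata model.

```python
from dataclasses import dataclass, field

@dataclass
class Dataset:
    """A clinical dataset together with its process-oriented metadata."""
    records: list
    metadata: dict
    history: list = field(default_factory=list)   # provenance of applied steps

def transform(ds, func, description):
    """Apply a transformation to data and metadata simultaneously, logging the step."""
    new_records = [func(r) for r in ds.records]
    new_meta = dict(ds.metadata, last_step=description)
    return Dataset(new_records, new_meta, ds.history + [description])

trial = Dataset(records=[{"sbp": 120}, {"sbp": 135}],
                metadata={"study": "hypothetical-trial-01", "unit": "mmHg"})
converted = transform(trial, lambda r: {"sbp_kpa": round(r["sbp"] * 0.133322, 1)},
                      "convert systolic BP from mmHg to kPa")
print(converted.records, converted.history)
```

Because each transformation returns a new object with an extended history, all changes over the dataset lifecycle remain traceable, which is the error-reduction property the abstract emphasizes.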

  17. Calibrating the sqHIMMELI v1.0 wetland methane emission model with hierarchical modeling and adaptive MCMC

    NASA Astrophysics Data System (ADS)

    Susiluoto, Jouni; Raivonen, Maarit; Backman, Leif; Laine, Marko; Makela, Jarmo; Peltola, Olli; Vesala, Timo; Aalto, Tuula

    2018-03-01

Estimating methane (CH4) emissions from natural wetlands is complex, and the estimates contain large uncertainties. The models used for the task are typically heavily parameterized and the parameter values are not well known. In this study, we perform a Bayesian model calibration for a new wetland CH4 emission model to improve the quality of the predictions and to understand the limitations of such models. The detailed process model that we analyze contains descriptions for CH4 production from anaerobic respiration, CH4 oxidation, and gas transportation by diffusion, ebullition, and the aerenchyma cells of vascular plants. The processes are controlled by several tunable parameters. We use a hierarchical statistical model to describe the parameters and obtain the posterior distributions of the parameters and uncertainties in the processes with adaptive Markov chain Monte Carlo (MCMC), importance resampling, and time series analysis techniques. For the estimation, the analysis utilizes measurement data from the Siikaneva flux measurement site in southern Finland. The uncertainties related to the parameters and the modeled processes are described quantitatively. At the process level, the flux measurement data are able to constrain the CH4 production processes, methane oxidation, and the different gas transport processes. The posterior covariance structures explain how the parameters and the processes are related. Additionally, the flux and flux component uncertainties are analyzed both at the annual and daily levels. The parameter posterior densities obtained provide information regarding the importance of the different processes, which is also useful for the development of wetland methane emission models other than the square root HelsinkI Model of MEthane buiLd-up and emIssion for peatlands (sqHIMMELI). The hierarchical modeling allows us to assess the effects of some of the parameters on an annual basis.
The results of the calibration and the cross validation suggest that the early spring net primary production could be used to predict parameters affecting the annual methane production. Even though the calibration is specific to the Siikaneva site, the hierarchical modeling approach is well suited for larger-scale studies, and the results of the estimation pave the way for a regional or global-scale Bayesian calibration of wetland emission models.
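A stripped-down version of such a calibration, random-walk Metropolis with a crude step-size adaptation applied to a one-parameter toy model, might look like this. The model, data, and adaptation rule are illustrative assumptions, far simpler than the hierarchical sqHIMMELI calibration.

```python
import numpy as np

def adaptive_metropolis(log_post, theta0, n=5000, seed=0):
    """Random-walk Metropolis with step-size adaptation toward ~35% acceptance."""
    rng = np.random.default_rng(seed)
    theta, lp = theta0, log_post(theta0)
    step, chain, accepts = 0.5, [], 0
    for i in range(1, n + 1):
        prop = theta + rng.normal(0, step)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
            accepts += 1
        chain.append(theta)
        if i % 100 == 0:                          # adapt the proposal scale
            rate = accepts / i
            step *= 1.1 if rate > 0.35 else 0.9
    return np.array(chain)

# Synthetic 'flux' data from a one-parameter toy model y = k * x, with k_true = 2.0
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 2.0 * x + rng.normal(0, 0.5, x.size)

def log_post(k):   # Gaussian likelihood (sigma = 0.5), flat prior
    return -0.5 * np.sum((y - k * x) ** 2) / 0.25

chain = adaptive_metropolis(log_post, theta0=0.5)
print(round(float(chain[1000:].mean()), 2))   # posterior mean after burn-in
```

The real calibration layers a hierarchical prior over many parameters and adds importance resampling; the accept/reject core, however, is the same.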

  18. Using 3-D Numerical Weather Data in Piloted Simulations

    NASA Technical Reports Server (NTRS)

    Daniels, Taumi S.

    2016-01-01

This report describes the process of acquiring and using 3-D numerical model weather data sets in NASA Langley's Research Flight Deck (RFD). A set of software tools implement the process and can be used for other purposes as well. Given time and location information of a weather phenomenon of interest, the user can download associated numerical weather model data. These data are created by the National Oceanic and Atmospheric Administration (NOAA) High Resolution Rapid Refresh (HRRR) model, and are then processed using a set of Mathworks' Matlab(TradeMark) scripts to create the usable 3-D weather data sets. Each data set includes radar reflectivity, water vapor, component winds, temperature, supercooled liquid water, turbulence, pressure, altitude, land elevation, relative humidity, and water phases. An open-source data processing program, wgrib2, is available from NOAA online, and is used along with Matlab scripts. These scripts are described with sufficient detail to make future modifications. These software tools have been used to generate 3-D weather data for various RFD experiments.

  19. Flux frequency analysis of seasonally dry ecosystem fluxes in two unique biomes of Sonora Mexico

    NASA Astrophysics Data System (ADS)

    Verduzco, V. S.; Yepez, E. A.; Robles-Morua, A.; Garatuza, J.; Rodriguez, J. C.; Watts, C.

    2013-05-01

Complex dynamics arising from the interactions of ecosystem processes make it difficult to model the behavior of ecosystem fluxes of carbon and water in response to the variation of environmental and biological drivers. Although process-oriented ecosystem models are critical tools for studying land-atmosphere fluxes, their validity depends on the appropriate parameterization of the equations describing temporal and spatial changes of model state variables and their interactions. This constraint often leads to discrepancies between model simulations and observed data that reduce model reliability, especially in arid and semiarid ecosystems. In semiarid northwestern Mexico, ecosystem processes are fundamentally controlled by the seasonality of water and the intermittence of rain pulses, conditions that require calibration of specific fitting functions to describe the response of ecosystem variables (i.e. NEE, GPP, ET, respiration) to these wetting and drying periods. The goal is to find functions that describe the magnitude of ecosystem fluxes during individual rain pulses and the seasonality of the ecosystem. Relying on five years of eddy covariance flux data from a tropical dry forest and a subtropical shrubland, we present a flux frequency analysis that describes the variation of net ecosystem exchange (NEE) of CO2 to highlight the relevance of pulse-driven dynamics controlling this flux. Preliminary results of the flux frequency analysis of NEE indicate that these ecosystems are strongly controlled by the frequency distribution of rain. Also, the outputs of semi-empirical fitting functions for NEE, GPP, ET and respiration applied to specific rain pulses do not agree with season-long, statistically generated simulations. Seasonality and the intrinsic nature of individual pulses have different effects on ecosystem flux responses. This suggests that relationships between the nature of seasonality and individual pulses can help improve the parameterization of process-oriented ecosystem models.

  20. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.

  1. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  2. Development of advanced second-generation micromirror devices fabricated in a four-level planarized surface-micromachined polycrystalline silicon process

    NASA Astrophysics Data System (ADS)

    Michalicek, M. Adrian; Comtois, John H.; Schriner, Heather K.

    1998-04-01

This paper describes the design and characterization of several types of micromirror devices, including process capabilities, device modeling, and test data resulting in deflection versus applied potential curves and surface contour measurements. These devices are the first to be fabricated in the state-of-the-art four-level planarized polysilicon process available at Sandia National Laboratories, known as the Sandia Ultra-planar Multi-level MEMS Technology. This enabling process permits the development of micromirror devices with near-ideal characteristics which have previously been unrealizable in standard three-layer polysilicon processes. This paper describes such characteristics as elevated address electrodes, various address wiring techniques, planarized mirror surfaces using Chemical Mechanical Polishing, unique post-process metallization, and the best active surface area to date.

  3. A Mathematical Model for the Middle Ear Ventilation

    NASA Astrophysics Data System (ADS)

    Molnárka, G.; Miletics, E. M.; Fücsek, M.

    2008-09-01

Otitis media is one of the most common illnesses in children; therefore, investigation of human middle ear ventilation is a topical problem. Earlier investigations, both experimental and theoretical, can be found in [1]-[3]. Here we give a new mathematical and computer model to simulate this ventilation process. This model is able to describe the diffusion and flow processes simultaneously, and therefore it gives more precise results than earlier models did. The article contains the mathematical model and some results of the simulation.

  4. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  5. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised for Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  6. Process Modeling and Dynamic Simulation for EAST Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing

    2016-06-01

In this paper, process modeling and dynamic simulation for the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. Validation of the process model has been confirmed against EAST experimental data during the cool-down process from 300 K to 80 K. Simulation results indicate that this process simulator is able to reproduce the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulsed plasma discharge. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulsed heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408).

  7. A Bayesian model for visual space perception

    NASA Technical Reports Server (NTRS)

    Curry, R. E.

    1972-01-01

    A model for visual space perception is proposed that contains desirable features in the theories of Gibson and Brunswik. This model is a Bayesian processor of proximal stimuli which contains three important elements: an internal model of the Markov process describing the knowledge of the distal world, the a priori distribution of the state of the Markov process, and an internal model relating state to proximal stimuli. The universality of the model is discussed and it is compared with signal detection theory models. Experimental results of Kinchla are used as a special case.
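The core of such a Bayesian processor, propagating a belief through an internal Markov model and updating it with a noisy proximal stimulus, can be sketched as a discrete Bayes filter. The transition and likelihood matrices below are invented for illustration.

```python
import numpy as np

# Internal model: two distal states with Markov dynamics and noisy proximal stimuli
T = np.array([[0.9, 0.1],     # state transition probabilities (rows sum to 1)
              [0.2, 0.8]])
L = np.array([[0.8, 0.2],     # P(stimulus | state): rows = state, cols = observation
              [0.3, 0.7]])
prior = np.array([0.5, 0.5])  # a priori distribution over the distal state

def bayes_filter(belief, obs):
    """One predict-update step of a discrete Bayesian state estimator."""
    predicted = belief @ T                # propagate through the Markov model
    posterior = predicted * L[:, obs]     # weight by the proximal-stimulus likelihood
    return posterior / posterior.sum()

belief = prior
for obs in [0, 0, 1, 0]:                  # a short stream of proximal stimuli
    belief = bayes_filter(belief, obs)
print(np.round(belief, 3))
```

The three elements the abstract names map directly onto `T` (the internal Markov model), `prior` (the a priori state distribution), and `L` (the state-to-stimulus model).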

  8. Chang'E-3 data pre-processing system based on scientific workflow

    NASA Astrophysics Data System (ADS)

Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following possibilities: describing a data processing task, including 1) defining input and output data, 2) defining the data relationships, 3) defining the sequence of tasks, 4) defining the communication between tasks, 5) defining mathematical formulas, and 6) defining the relationship between tasks and data; and the automatic processing of tasks. Accordingly, describing a task is the key point determining whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of process workflow constructs, including Sequence, Loop, Merge and Fork, are composable with one another; 3) to reduce the modeling complexity of the mathematical formulas using the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data were processed with CEDPS.
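The DAG-based process model can be illustrated with a topological execution of a small task graph; the task names and dependencies below are hypothetical, not the actual CEDPS pipeline.

```python
from graphlib import TopologicalSorter

# Hypothetical pre-processing tasks: mapping task -> set of prerequisite tasks
tasks = {
    "radiometric_calibration": {"read_raw"},
    "geometric_correction":   {"radiometric_calibration"},
    "mosaic":                 {"geometric_correction"},
    "read_raw":               set(),
}

def run_workflow(task_graph):
    """Execute a DAG of data-processing tasks in dependency order."""
    order = list(TopologicalSorter(task_graph).static_order())
    results = {}
    for name in order:
        inputs = [results[d] for d in task_graph[name]]   # outputs of prerequisites
        results[name] = f"{name}({','.join(inputs)})" if inputs else name
    return order, results

order, results = run_workflow(tasks)
print(order)
```

Loop, Merge, and Fork constructs would extend this scheduler, but dependency-ordered execution over the DAG is the common core.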

  9. The GEMS Model of Volunteer Administration.

    ERIC Educational Resources Information Center

    Culp, Ken, III; Deppe, Catherine A.; Castillo, Jaime X.; Wells, Betty J.

    1998-01-01

    Describes GEMS, a spiral model that profiles volunteer administration. Components include Generate, Educate, Mobilize, and Sustain, four sets of processes that span volunteer recruitment and selection to retention or disengagement. (SK)

  10. The Integration of Extrarational and Rational Learning Processes: Moving Towards the Whole Learner.

    ERIC Educational Resources Information Center

    Puk, Tom

    1996-01-01

    Discusses the dichotomy between rational and nonrational learning processes, arguing for an integration of both. Reviews information processing theory and related learning strategies. Presents a model instructional strategy that fully integrates rational and nonrational processes. Describes implications for teaching and learning of the learning…

  11. Laser welding of polymers: phenomenological model for a quick and reliable process quality estimation considering beam shape influences

    NASA Astrophysics Data System (ADS)

    Timpe, Nathalie F.; Stuch, Julia; Scholl, Marcus; Russek, Ulrich A.

    2016-03-01

This contribution presents a phenomenological, analytical model for laser welding of polymers which is suited for a quick process quality estimation by the practitioner. Besides material properties of the polymer and processing parameters like welding pressure, feed rate and laser power, the model is based on a simple few-parameter description of the size and shape of the laser power density distribution (PDD) in the processing zone. The model allows an estimation of the weld seam tensile strength. It is based on energy balance considerations within a thin sheet with the thickness of the optical penetration depth on the surface of the absorbing welding partner. The joining process itself is modelled by a phenomenological approach. The model reproduces the experimentally known process windows for the main process parameters correctly. Using the parameters describing the shape of the laser PDD, the critical dependence of the process windows on the PDD shape is predicted and compared with experiments. The adaption of the model to other laser manufacturing processes where the PDD influence can be modelled comparably is discussed.

  12. A method to investigate the diffusion properties of nuclear calcium.

    PubMed

    Queisser, Gillian; Wittum, Gabriel

    2011-10-01

    Modeling biophysical processes in general requires knowledge about underlying biological parameters. The quality of simulation results is strongly influenced by the accuracy of these parameters, hence the identification of parameter values that the model includes is a major part of simulating biophysical processes. In many cases, secondary data can be gathered by experimental setups, which are exploitable by mathematical inverse modeling techniques. Here we describe a method for parameter identification of diffusion properties of calcium in the nuclei of rat hippocampal neurons. The method is based on a Gauss-Newton method for solving a least-squares minimization problem and was formulated in such a way that it is ideally implementable in the simulation platform uG. Making use of independently published space- and time-dependent calcium imaging data, generated from laser-assisted calcium uncaging experiments, here we could identify the diffusion properties of nuclear calcium and were able to validate a previously published model that describes nuclear calcium dynamics as a diffusion process.
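The Gauss-Newton parameter identification can be sketched for a one-parameter diffusion model; the synthetic data, model form, and finite-difference Jacobian below are illustrative assumptions rather than the uG implementation.

```python
import numpy as np

def model(D, x, t):
    """1-D point-source diffusion profile for diffusion coefficient D."""
    return np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

def gauss_newton(residual, p0, n_iter=20, eps=1e-6):
    """Gauss-Newton for a scalar parameter with a finite-difference Jacobian."""
    p = p0
    for _ in range(n_iter):
        r = residual(p)
        J = (residual(p + eps) - r) / eps          # Jacobian column, shape (n,)
        step = -(J @ r) / (J @ J)                  # normal-equations solve
        p = max(p + step, 1e-3)                    # keep D physically positive
        if abs(step) < 1e-10:
            break
    return p

# Synthetic concentration measurements with known D_true = 0.3
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 40)
t = 1.0
data = model(0.3, x, t) + rng.normal(0, 0.002, x.size)

D_est = gauss_newton(lambda D: model(D, x, t) - data, p0=0.1)
print(round(float(D_est), 3))
```

The paper works with imaging data and a PDE forward model inside uG, but the least-squares structure of the inverse problem is the same.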

  13. Modelling wastewater treatment in a submerged anaerobic membrane bioreactor.

    PubMed

    Spagni, Alessandro; Ferraris, Marco; Casu, Stefania

    2015-01-01

Mathematical modelling has been widely applied to membrane bioreactor (MBR) processes. However, to date, very few studies have reported on the application of the Anaerobic Digestion Model No. 1 (ADM1) to anaerobic membrane processes. The aim of this study was to evaluate the applicability of the ADM1 to a submerged anaerobic MBR (SAMBR) treating simulated industrial wastewater composed of cheese whey and sucrose. This study demonstrated that the biological processes involved in SAMBRs can be modelled using the ADM1. Moreover, the results showed that very few modifications of the ADM1 parameters were required to reasonably fit the experimental data. In particular, adaptation of the coefficients describing the wastewater characterisation to the specific conditions and the reduction of the hydrolysis rate of particulate carbohydrate (khyd,ch) from 0.25 d(-1) (as suggested by the ADM1 for high-rate mesophilic reactors) to 0.13 d(-1) were required to fit the experimental data.

  14. Biosorption of Cu(II) by powdered anaerobic granular sludge from aqueous medium.

    PubMed

    Zhou, Xu; Chen, Chuan; Wang, Aijie; Jiang, Guangming; Liu, Lihong; Xu, Xijun; Yuan, Ye; Lee, Duu-Jung; Ren, Nanqi

    2013-01-01

Copper(II) biosorption by two pre-treated powdered anaerobic granular sludges (PAGS) (the original sludges were methanogenic anaerobic granules and denitrifying sulfide removal (DSR) anaerobic granules) was investigated through batch tests. Factors affecting the biosorption process, such as pH, temperature and initial copper concentration, were examined. The physico-chemical characteristics of the anaerobic sludge were also analyzed by Fourier transform infrared spectroscopy, scanning electron microscopy imaging, surface area measurement and elemental analysis. A second-order kinetic model was applied to describe the biosorption process and fitted the kinetic data well. The Freundlich model was used to describe the adsorption equilibrium data and also fitted them well. It was found that the methanogenic PAGS was more effective in copper(II) biosorption than the DSR PAGS, whose maximum biosorption capacity was 39.6% lower. The mechanisms behind the different biosorption capacities of the two PAGS are discussed; the results suggest that the environment and biochemical reactions during the growth of the biomass may have affected the structure of the PAGS. The methanogenic PAGS had a larger specific surface area and a higher biosorption capacity than the DSR PAGS.
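The two fits reported, pseudo-second-order kinetics and the Freundlich isotherm, are commonly estimated from their linearized forms; the sketch below does so on invented, noise-free data (the parameter values are not from the study).

```python
import numpy as np

# Hypothetical Cu(II) uptake data q(t) following pseudo-second-order kinetics:
# t/q = 1/(k*qe^2) + t/qe, so a straight-line fit of t/q vs t yields qe and k.
qe_true, k_true = 50.0, 0.004
t = np.array([5.0, 10, 20, 40, 60, 120, 240])
q = (k_true * qe_true**2 * t) / (1 + k_true * qe_true * t)

slope, intercept = np.polyfit(t, t / q, 1)
qe = 1 / slope                    # equilibrium uptake (mg/g)
k = 1 / (intercept * qe**2)       # rate constant (g/(mg*min))
print(round(qe, 1), round(k, 4))  # 50.0 0.004

# Freundlich isotherm q = Kf * C^(1/n): linear in log-log coordinates
C = np.array([2.0, 5, 10, 20, 50])
q_eq = 12.0 * C ** (1 / 2.5)
b, logKf = np.polyfit(np.log(C), np.log(q_eq), 1)
print(round(1 / b, 1), round(np.exp(logKf), 1))  # 2.5 12.0
```

With real batch data the fits would be noisy, and nonlinear regression on the original forms is often preferred over the linearizations.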

  15. A modelling approach for the heterogeneous oxidation of elastomers

    NASA Astrophysics Data System (ADS)

    Herzig, A.; Sekerakova, L.; Johlitz, M.; Lion, A.

    2017-09-01

The influence of oxygen on elastomers, known as oxidation, is one of the most important ageing processes and is becoming increasingly important for present-day applications. The interaction with thermal effects as well as with antioxidants makes oxidation of polymers a complex process. Depending on the polymer chosen and the environmental conditions, the ageing processes may behave completely differently. In many cases the influence of oxygen is limited to the surface layer of the samples, commonly referred to as diffusion-limited oxidation. For the lifetime prediction of elastomer components, it is essential to have detailed knowledge about the absorption and diffusion behaviour of oxygen molecules during thermo-oxidative ageing and how they react with the elastomer. Experimental investigations on industrially used elastomeric materials are carried out in order to develop and fit models capable of predicting the permeation and consumption of oxygen as well as changes in the mechanical properties. The latter are of prime importance for technical applications of rubber components. Oxidation does not occur homogeneously over the entire elastomeric component. Hence, material models which include ageing effects have to be extended to account for heterogeneous ageing, which depends strongly on the ageing temperature. The influence of the elevated temperatures used for accelerated ageing has to be critically analysed, and influences on the permeation and diffusion coefficients have to be taken into account. This work presents phenomenological models which describe oxygen uptake and diffusion into elastomers based on an improved understanding of the underlying chemical processes and diffusion-limiting modifications. On the one hand, oxygen uptake is modelled by means of Henry's law, in which solubility is a function of the temperature as well as of the ageing progress; the latter is an irreversible process described by an internal differential evolution equation. On the other hand, further diffusion of oxygen into the material is described by a model based on Fick's law, modified by a reaction term. The resulting diffusion-reaction equation depends on the ageing temperature as well as on the progress of ageing and is able to describe diffusion-limited oxidation.
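Such a diffusion-reaction description (Fick's law with a first-order consumption term) can be sketched with an explicit finite-difference scheme; the geometry, coefficients, and boundary conditions below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def oxidation_profile(depth=1.0, nx=51, D=1e-3, k=0.5, c_surf=1.0, t_end=20.0):
    """Explicit FD solution of dc/dt = D d2c/dx2 - k*c with a fixed surface concentration."""
    dx = depth / (nx - 1)
    dt = 0.4 * dx**2 / D                     # stability: dt <= dx^2 / (2D)
    c = np.zeros(nx)
    c[0] = c_surf                            # oxygen held at its solubility at the surface
    for _ in range(int(t_end / dt)):
        lap = np.zeros(nx)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        c[1:-1] += dt * (D * lap[1:-1] - k * c[1:-1])   # diffusion minus consumption
        c[-1] = c[-2]                        # zero-flux far boundary
    return c

c = oxidation_profile()
print(np.round(c[:5], 3))   # oxygen concentration falls off steeply with depth
```

The steep decay of the profile illustrates diffusion-limited oxidation: consumption outpaces supply beyond a thin surface layer of thickness roughly sqrt(D/k).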

  16. Effect of river flow fluctuations on riparian vegetation dynamics: Processes and models

    NASA Astrophysics Data System (ADS)

    Vesipa, Riccardo; Camporeale, Carlo; Ridolfi, Luca

    2017-12-01

    Several decades of field observations, laboratory experiments and mathematical modelling have demonstrated that the riparian environment is a disturbance-driven ecosystem, and that the main source of disturbance is river flow fluctuations. The focus of the present work has been on the key role that flow fluctuations play in determining the abundance, zonation and species composition of patches of riparian vegetation. To this aim, the scientific literature on the subject, over the last 20 years, has been reviewed. First, the most relevant ecological, morphological and chemical mechanisms induced by river flow fluctuations are described from a process-based perspective. The role of flow variability is discussed for the processes that affect the recruitment of vegetation, the vegetation during its adult life, and the morphological and nutrient dynamics occurring in the riparian habitat. Particular emphasis has been given to studies that were aimed at quantifying the effect of these processes on vegetation, and at linking them to the statistical characteristics of the river hydrology. Second, the advances made, from a modeling point of view, have been considered and discussed. The main models that have been developed to describe the dynamics of riparian vegetation have been presented. Different modeling approaches have been compared, and the corresponding advantages and drawbacks have been pointed out. Finally, attention has been paid to identifying the processes considered by the models, and these processes have been compared with those that have actually been observed or measured in field/laboratory studies.

  17. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes rigid standardisation inappropriate, but greater detail is required concerning the workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at the local level by a specialized rehabilitation centre. The model describes the organisation of rehabilitation delivery and facilitates the monitoring of recovery during the process. Indeed, a software system was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility assures easy updating as the process evolves. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Modeling of active transmembrane transport in a mixture theory framework.

    PubMed

    Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T

    2010-05-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.

  19. Accreditation of Continuing Education: The Critical Elements.

    ERIC Educational Resources Information Center

    DeSilets, Lynore D.

    1998-01-01

    Reviews the history of accreditation in nursing continuing education, describes the system and process, and identifies institutional characteristics needed before beginning the process. Uses the American Nurses Center Commission on Accreditation model. (Author/SK)

  20. Enterprise and system of systems capability development life-cycle processes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beck, David Franklin

    2014-08-01

    This report and set of appendices are a collection of memoranda originally drafted circa 2007-2009 for the purpose of describing and detailing a models-based systems engineering approach for satisfying enterprise and system-of-systems life cycle process requirements. At the time there was interest and support to move from Capability Maturity Model Integration (CMMI) Level One (ad hoc processes) to Level Three. The main thrust of the material presents a rational exposé of a structured enterprise development life cycle that uses the scientific method as a framework, with further rigor added from adapting relevant portions of standard systems engineering processes. While the approach described invokes application of the Department of Defense Architectural Framework (DoDAF), it is suitable for use with other architectural description frameworks.

  1. Physiologically based pharmacokinetic modeling of deltamethrin: Development of a rat and human diffusion-limited model

    EPA Science Inventory

    Mirfazaelian et al. (2006) developed a physiologically based pharmacokinetic (PBPK) model for the pyrethroid pesticide deltamethrin in the rat. This model describes gastrointestinal tract absorption as a saturable process mediated by phase III efflux transporters which pump delta...

  2. Modelling and calculation of flotation process in one-dimensional formulation

    NASA Astrophysics Data System (ADS)

    Amanbaev, Tulegen; Tilleuov, Gamidulla; Tulegenova, Bibigul

    2016-08-01

    Within the framework of the mechanics of multiphase media, a mathematical model of the flotation process in a dispersed mixture of liquid, solid and gas phases is constructed, taking into account the degree of mineralization of the bubble surfaces. Application of the model is demonstrated on the example of one-dimensional stationary flotation, and it is shown that the equations describing the ascent of the bubbles are singularly perturbed ("stiff"). The effects of bubble size and concentration and of the volumetric content of dispersed particles on the flotation process are analysed.

  3. High Accuracy Transistor Compact Model Calibrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hembree, Charles E.; Mar, Alan; Robertson, Perry J.

    2015-09-01

    Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic, such as current capacity. Correspondingly, when using this approach, a high degree of accuracy is not expected of the transistor models, since the set of models is a surrogate for a statistical description of the devices. Models of this type describe expected performance at the extremes of process or transistor deviations. In contrast, circuits with very stringent accuracy requirements demand modeling techniques of higher accuracy. Since such accurate models have low error in their transistor descriptions, they can be used to describe part-to-part variation as well as to give an accurate description of a single circuit instance. Models that meet these stipulations therefore also enable the quantification of margins with respect to a functional threshold, and of the uncertainties in those margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.

  4. Dementia Grief: A Theoretical Model of a Unique Grief Experience

    PubMed Central

    Blandin, Kesstan; Pepin, Renee

    2016-01-01

    Previous literature reveals a high prevalence of grief in dementia caregivers before the physical death of the person with dementia, a grief that is associated with stress, burden, and depression. To date, theoretical models and therapeutic interventions for grief in caregivers have not adequately considered the grief process, but have instead focused on grief as a symptom that manifests within the process of caregiving. The Dementia Grief Model explicates the unique process of pre-death grief in dementia caregivers. In this paper we introduce the Dementia Grief Model, describe the unique characteristics of dementia grief, and present the psychological states associated with the process of dementia grief. The model explicates an iterative grief process involving three states – separation, liminality, and re-emergence – each with a dynamic mechanism that facilitates or hinders movement through the dementia grief process. Finally, we offer potential applied research questions informed by the model. PMID:25883036

  5. Comparing an annual and daily time-step model for predicting field-scale phosphorus loss

    USDA-ARS?s Scientific Manuscript database

    Numerous models exist for describing phosphorus (P) losses from agricultural fields. The complexity of these models varies considerably ranging from simple empirically-based annual time-step models to more complex process-based daily time step models. While better accuracy is often assumed with more...

  6. Application of Hierarchy Theory to Cross-Scale Hydrologic Modeling of Nutrient Loads

    EPA Science Inventory

    We describe a model called Regional Hydrologic Modeling for Environmental Evaluation 16 (RHyME2) for quantifying annual nutrient loads in stream networks and watersheds. RHyME2 is 17 a cross-scale statistical and process-based water-quality model. The model ...

  7. WICS: A New Model for School Psychology

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2010-01-01

    This article presents a unified model for cognitive processing, WICS, which is an acronym for wisdom, intelligence, and creativity, synthesized. The model can be applied to identification/admissions, diagnosis, instruction, and assessment. I discuss why there is a need for such a model. Then I describe traditional models, after which I describe…

  8. Wyoming greater sage-grouse habitat prioritization: A collection of multi-scale seasonal models and geographic information systems land management tools

    USGS Publications Warehouse

    O'Donnell, Michael S.; Aldridge, Cameron L.; Doherty, Kevin E.; Fedy, Bradley C.

    2015-01-01

    We deliver all products described herein as online geographic information system data for visualization and downloading. We outline the data properties for each model and their data inputs, describe the process of selecting appropriate data products for multifarious applications, describe all data products and software, provide newly derived model composites, and discuss how land managers may use the models to inform future sage-grouse studies and potentially refine conservation efforts. The models, software tools, and associated opportunities for novel applications of these products should provide a suite of additional, but not exclusive, tools for assessing Wyoming Greater Sage-grouse habitats, which land managers, conservationists, and scientists can apply to myriad applications.

  9. Linking Local Scale Ecosystem Science to Regional Scale Management

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Tenhunen, J.; Peiffer, S.

    2012-04-01

    Ecosystem management with respect to sufficient water yield, a quality water supply, habitat and biodiversity conservation, and climate change effects requires substantial observational data at a range of scales. Complex interactions of local physical processes often vary over space and time, particularly in locations with extreme meteorological conditions. Modifications to local conditions (e.g., agricultural land use changes, nutrient additions, landscape management, water usage) can further affect regional ecosystem services. The international, interdisciplinary TERRECO research group is intensively investigating a variety of local processes, parameters, and conditions to link complex physical, economic, and social interactions at the regional scale. Field-based meteorology, hydrology, soil physics, plant production, solute and sediment transport, economic, and social behavior data were measured in a South Korean catchment. The data are used to parameterize a suite of models describing local- to landscape-level water, sediment, nutrient, and monetary relationships. We focus on using the agricultural and hydrological SWAT model to synthesize the experimental field data and local-scale models throughout the catchment. The approach of our study was to describe local scientific processes, link potential interrelationships between different processes, and predict environmentally efficient management efforts. The Haean catchment case study shows how research can be structured to provide cross-disciplinary scientific linkages describing complex ecosystems and landscapes that can be used for regional management evaluations and predictions.

  10. Variable classification in the LSST era: exploring a model for quasi-periodic light curves

    NASA Astrophysics Data System (ADS)

    Zinn, J. C.; Kochanek, C. S.; Kozłowski, S.; Udalski, A.; Szymański, M. K.; Soszyński, I.; Wyrzykowski, Ł.; Ulaczyk, K.; Poleski, R.; Pietrukowicz, P.; Skowron, J.; Mróz, P.; Pawlak, M.

    2017-06-01

    The Large Synoptic Survey Telescope (LSST) is expected to yield ~10^7 light curves over the course of its mission, which will require a concerted effort in automated classification. Stochastic processes provide one means of quantitatively describing variability, with the potential advantage over simple light-curve statistics that the parameters may be physically meaningful. Here, we survey a large sample of periodic, quasi-periodic and stochastic Optical Gravitational Lensing Experiment-III variables using the damped random walk (DRW; CARMA(1,0)) and quasi-periodic oscillation (QPO; CARMA(2,1)) stochastic process models. The QPO model is described by an amplitude, a period and a coherence time-scale, while the DRW has only an amplitude and a time-scale. We find that the periodic and quasi-periodic stellar variables are generally better described by a QPO than a DRW, while quasars are better described by the DRW model. There are ambiguities in interpreting the QPO coherence time due to non-sinusoidal light-curve shapes, signal-to-noise ratio, error mischaracterizations and cadence. Higher order implementations of the QPO model that better capture light-curve shapes are necessary for the coherence time to have its implied physical meaning. Independent of physical meaning, the extra parameter of the QPO model successfully distinguishes most of the classes of periodic and quasi-periodic variables we consider.
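
    The DRW (CARMA(1,0)) process mentioned above is an Ornstein-Uhlenbeck process, and its exact AR(1) discretisation makes it straightforward to simulate. A short sketch (the parameter values are illustrative, not taken from the survey):

```python
import numpy as np

def simulate_drw(tau=100.0, sigma=0.2, dt=1.0, n=5000, seed=0):
    """Damped random walk (Ornstein-Uhlenbeck, CARMA(1,0)) light curve with
    damping time-scale tau and asymptotic standard deviation sigma."""
    rng = np.random.default_rng(seed)
    rho = np.exp(-dt / tau)                    # exact one-step autocorrelation
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma)
    for i in range(1, n):
        # exact AR(1) update that preserves the stationary variance sigma^2
        x[i] = rho * x[i - 1] + rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2))
    return x

lc = simulate_drw()
r1 = np.corrcoef(lc[:-1], lc[1:])[0, 1]
print(r1 > 0.95)                               # lag-1 autocorrelation near exp(-1/tau)
```

    Because the discretisation is exact rather than an Euler approximation, the simulated curve has the correct stationary variance and exponential autocovariance at any sampling cadence, which is what makes DRW fits to irregularly sampled light curves tractable.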

  11. Abstracting event-based control models for high autonomy systems

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  12. Logistics of Trainsets Creation with the Use of Simulation Models

    NASA Astrophysics Data System (ADS)

    Sedláček, Michal; Pavelka, Hynek

    2016-12-01

    This paper focuses on rail transport in following the train formation operational processes problem using computer simulations. The problem has been solved using SIMUL8 and applied to specific train formation station in the Czech Republic. The paper describes a proposal simulation model of the train formation work. Experimental modeling with an assessment of achievements and design solution for optimizing of the train formation operational process is also presented.

  13. Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish

    DTIC Science & Technology

    2013-09-30

    statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of...aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models...processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions

  14. Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover

    NASA Technical Reports Server (NTRS)

    Dangelo, K. R.

    1974-01-01

    A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least-squares approximation is used to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included, which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
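
    The idea of combining height and gradient observations in a single least-squares surface fit can be illustrated with a quadratic patch. This is a hypothetical reconstruction of the general technique, not the report's actual procedure; the surface, sample points, and coefficients are invented for the example.

```python
import numpy as np

def fit_quadratic_surface(pts_h, h, pts_g, gx, gy):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    using height samples (pts_h, h) and gradient samples (pts_g, gx, gy)."""
    rows, rhs = [], []
    for (x, y), z in zip(pts_h, h):                        # height equations
        rows.append([1, x, y, x * x, x * y, y * y]); rhs.append(z)
    for (x, y), dzdx, dzdy in zip(pts_g, gx, gy):          # gradient equations
        rows.append([0, 1, 0, 2 * x, y, 0]); rhs.append(dzdx)   # dz/dx row
        rows.append([0, 0, 1, 0, x, 2 * y]); rhs.append(dzdy)   # dz/dy row
    coef, *_ = np.linalg.lstsq(np.array(rows, float), np.array(rhs, float), rcond=None)
    return coef

# recover the known surface z = 1 + 2x - y + 0.5x^2 from noiseless samples
true = np.array([1.0, 2.0, -1.0, 0.5, 0.0, 0.0])
pts = [(x, y) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)]
z = [true @ np.array([1, x, y, x * x, x * y, y * y]) for x, y in pts]
gx = [2.0 + x for x, y in pts]                             # dz/dx = 2 + x
gy = [-1.0 for x, y in pts]                                # dz/dy = -1
coef = fit_quadratic_surface(pts, z, pts, gx, gy)
print(np.allclose(coef, true))
```

    Stacking gradient rows alongside height rows is what constrains the fitted surface's slope directly, which is the point of the report's two-step use of gradient data.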

  15. Choice of mathematical models for technological process of glass rod drawing

    NASA Astrophysics Data System (ADS)

    Alekseeva, L. B.

    2017-10-01

    The technological process of drawing glass rods (light guides) is considered. Automated control of the drawing process reduces to a decision-making process that ensures a given quality. The drawing process is treated as a control system comprising the drawing device (the control device) and the optical-fibre forming zone (the control object). To study the processes occurring in the forming zone, mathematical models based on continuum mechanics are proposed. To assess the influence of disturbances, a transfer function is derived from the wave equation. A regression equation that adequately describes the drawing process is also obtained.

  16. The use of mechanistic descriptions of algal growth and zooplankton grazing in an estuarine eutrophication model

    NASA Astrophysics Data System (ADS)

    Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.

    2003-03-01

    A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
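
    One mechanistic description of the kind the abstract alludes to, diffusion-limited nutrient uptake by a planktonic cell, has a well-known closed form: the maximum diffusive supply to an absorbing sphere of radius r in a nutrient field of concentration c is 4*pi*D*r*c, capped by the cell's physiological maximum. A hedged sketch, with illustrative parameter values not taken from the study:

```python
import math

def uptake_rate(D=1.5e-9, r=5e-6, c=1e-3, vmax=1e-17):
    """Per-cell nutrient uptake (mol/s) as the minimum of the diffusive
    supply to an absorbing sphere, 4*pi*D*r*c, and a physiological cap vmax.
    D: diffusivity (m^2/s), r: cell radius (m), c: bulk concentration (mol/m^3)."""
    diffusive = 4.0 * math.pi * D * r * c      # Fickian supply limit
    return min(diffusive, vmax)

# at high concentration physiology limits uptake; at low concentration diffusion does
print(uptake_rate() == 1e-17, uptake_rate(c=1e-5) < 1e-17)
```

    Taking the minimum of a physical transport limit and a physiological limit is the flavour of "biomechanical" parameterisation the abstract contrasts with purely empirical rate laws.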

  17. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analysing and improving business processes. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out using UML techniques, we can expect an improvement in the efficiency of information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.

  18. Brownian Motion at Lipid Membranes: A Comparison of Hydrodynamic Models Describing and Experiments Quantifying Diffusion within Lipid Bilayers.

    PubMed

    Block, Stephan

    2018-05-22

    The capability of lipid bilayers to exhibit fluid-phase behavior is a fascinating property, which enables, for example, membrane-associated components, such as lipids (domains) and transmembrane proteins, to diffuse within the membrane. These diffusion processes are of paramount importance for cells, as they are for example involved in cell signaling processes or the recycling of membrane components, but also for recently developed analytical approaches, which use differences in the mobility for certain analytical purposes, such as in-membrane purification of membrane proteins or the analysis of multivalent interactions. Here, models describing the Brownian motion of membrane inclusions (lipids, peptides, proteins, and complexes thereof) in model bilayers (giant unilamellar vesicles, black lipid membranes, supported lipid bilayers) are summarized and model predictions are compared with the available experimental data, thereby allowing for evaluating the validity of the introduced models. It will be shown that models describing the diffusion in freestanding (Saffman-Delbrück and Hughes-Pailthorpe-White model) and supported bilayers (the Evans-Sackmann model) are well supported by experiments, though only few experimental studies have been published so far for the latter case, calling for additional tests to reach the same level of experimental confirmation that is currently available for the case of freestanding bilayers.
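
    The Saffman-Delbrück model discussed above gives the diffusion coefficient of a cylindrical membrane inclusion in closed form, D = k_B*T/(4*pi*mu_m*h) * (ln(mu_m*h/(mu_w*a)) - gamma), where mu_m is the membrane viscosity, h the bilayer thickness, mu_w the viscosity of the surrounding water, a the inclusion radius, and gamma the Euler-Mascheroni constant. A quick evaluation with typical, purely illustrative lipid-bilayer parameters:

```python
import math

def saffman_delbruck_D(T=298.0, mu_m=0.1, h=4e-9, mu_w=1e-3, a=0.5e-9):
    """Saffman-Delbrueck diffusion coefficient (m^2/s) for a cylindrical
    inclusion of radius a in a membrane of thickness h and viscosity mu_m,
    surrounded by water of viscosity mu_w (all SI units, illustrative values)."""
    kB = 1.380649e-23                        # Boltzmann constant, J/K
    gamma = 0.5772156649                     # Euler-Mascheroni constant
    return kB * T / (4 * math.pi * mu_m * h) * (math.log(mu_m * h / (mu_w * a)) - gamma)

D = saffman_delbruck_D()
print(1e-12 < D < 1e-10)   # a few um^2/s, the typical lipid-scale range
```

    Note the weak logarithmic dependence on the inclusion radius a, which is the model's signature prediction for freestanding bilayers and the feature the Hughes-Pailthorpe-White and Evans-Sackmann extensions modify for large inclusions and supported membranes, respectively.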

  19. Antineutrino Charged-Current Reactions on Hydrocarbon with Low Momentum Transfer

    NASA Astrophysics Data System (ADS)

    Gran, R.; Betancourt, M.; Elkins, M.; Rodrigues, P. A.; Akbar, F.; Aliaga, L.; Andrade, D. A.; Bashyal, A.; Bellantoni, L.; Bercellie, A.; Bodek, A.; Bravar, A.; Budd, H.; Vera, G. F. R. Caceres; Cai, T.; Carneiro, M. F.; Coplowe, D.; da Motta, H.; Dytman, S. A.; Díaz, G. A.; Felix, J.; Fields, L.; Fine, R.; Gallagher, H.; Ghosh, A.; Haider, H.; Han, J. Y.; Harris, D. A.; Henry, S.; Jena, D.; Kleykamp, J.; Kordosky, M.; Le, T.; Leistico, J. R.; Lovlein, A.; Lu, X.-G.; Maher, E.; Manly, S.; Mann, W. A.; Marshall, C. M.; McFarland, K. S.; McGowan, A. M.; Messerly, B.; Miller, J.; Mislivec, A.; Morfín, J. G.; Mousseau, J.; Naples, D.; Nelson, J. K.; Nguyen, C.; Norrick, A.; Nuruzzaman, Olivier, A.; Paolone, V.; Patrick, C. E.; Perdue, G. N.; Ramírez, M. A.; Ransome, R. D.; Ray, H.; Ren, L.; Rimal, D.; Ruterbories, D.; Schellman, H.; Salinas, C. J. Solano; Su, H.; Sultana, M.; Falero, S. Sánchez; Valencia, E.; Wolcott, J.; Wospakrik, M.; Yaeggy, B.; Minerva Collaboration

    2018-06-01

    We report on multinucleon effects in low momentum transfer (<0.8 GeV/c) antineutrino interactions on plastic (CH) scintillator. These data are from the 2010-2011 antineutrino phase of the MINERvA experiment at Fermilab. The hadronic energy spectrum of this inclusive sample is well described when a screening effect at a low energy transfer and a two-nucleon knockout process are added to a relativistic Fermi gas model of quasielastic, Δ resonance, and higher resonance processes. In this analysis, model elements introduced to describe previously published neutrino results have quantitatively similar benefits for this antineutrino sample. We present the results as a double-differential cross section to accelerate the investigation of alternate models for antineutrino scattering off nuclei.

  20. Antineutrino Charged-Current Reactions on Hydrocarbon with Low Momentum Transfer.

    PubMed

    Gran, R; Betancourt, M; Elkins, M; Rodrigues, P A; Akbar, F; Aliaga, L; Andrade, D A; Bashyal, A; Bellantoni, L; Bercellie, A; Bodek, A; Bravar, A; Budd, H; Vera, G F R Caceres; Cai, T; Carneiro, M F; Coplowe, D; da Motta, H; Dytman, S A; Díaz, G A; Felix, J; Fields, L; Fine, R; Gallagher, H; Ghosh, A; Haider, H; Han, J Y; Harris, D A; Henry, S; Jena, D; Kleykamp, J; Kordosky, M; Le, T; Leistico, J R; Lovlein, A; Lu, X-G; Maher, E; Manly, S; Mann, W A; Marshall, C M; McFarland, K S; McGowan, A M; Messerly, B; Miller, J; Mislivec, A; Morfín, J G; Mousseau, J; Naples, D; Nelson, J K; Nguyen, C; Norrick, A; Nuruzzaman; Olivier, A; Paolone, V; Patrick, C E; Perdue, G N; Ramírez, M A; Ransome, R D; Ray, H; Ren, L; Rimal, D; Ruterbories, D; Schellman, H; Salinas, C J Solano; Su, H; Sultana, M; Falero, S Sánchez; Valencia, E; Wolcott, J; Wospakrik, M; Yaeggy, B

    2018-06-01

    We report on multinucleon effects in low momentum transfer (<0.8  GeV/c) antineutrino interactions on plastic (CH) scintillator. These data are from the 2010-2011 antineutrino phase of the MINERvA experiment at Fermilab. The hadronic energy spectrum of this inclusive sample is well described when a screening effect at a low energy transfer and a two-nucleon knockout process are added to a relativistic Fermi gas model of quasielastic, Δ resonance, and higher resonance processes. In this analysis, model elements introduced to describe previously published neutrino results have quantitatively similar benefits for this antineutrino sample. We present the results as a double-differential cross section to accelerate the investigation of alternate models for antineutrino scattering off nuclei.

  1. Children's Solution Processes in Elementary Arithmetic Problems: Analysis and Improvement. Report No. 19.

    ERIC Educational Resources Information Center

    De Corte, Erik; Verschaffel, Lieven

    Design and results of an investigation attempting to analyze and improve children's solution processes in elementary addition and subtraction problems are described. As background for the study, a conceptual model was developed based on previous research. One dimension of the model relates to the characteristics of the tasks (numerical versus word…

  2. Policy Internationalization, National Variety and Governance: Global Models and Network Power in Higher Education States

    ERIC Educational Resources Information Center

    King, Roger

    2010-01-01

    This article analyzes policy convergence and the adoption of globalizing models by higher education states, a process we describe, following Thatcher (2007), as policy internationalization. This refers to processes found in many policy domains and which increasingly are exemplified in tertiary education systems too. The focus is on governmental…

  3. Toward a Model of Text Comprehension and Production.

    ERIC Educational Resources Information Center

    Kintsch, Walter; Van Dijk, Teun A.

    1978-01-01

    Described is the system of mental operations occurring in text comprehension and in recall and summarization. A processing model is outlined: 1) the meaning elements of a text become organized into a coherent whole, 2) the full meaning of the text is condensed into its gist, and 3) new texts are generated from the comprehension processes.…

  4. An Onto-Semiotic Analysis of Combinatorial Problems and the Solving Processes by University Students

    ERIC Educational Resources Information Center

    Godino, Juan D.; Batanero, Carmen; Roa, Rafael

    2005-01-01

    In this paper we describe an ontological and semiotic model for mathematical knowledge, using elementary combinatorics as an example. We then apply this model to analyze the solving process of some combinatorial problems by students with high mathematical training, and show its utility in providing a semiotic explanation for the difficulty of…

  5. Models for Selecting Chief State School Officers. Policy Memo Series, No. 1.

    ERIC Educational Resources Information Center

    Sanchez, Karen L. Van Til; Hall, Gayle C.

    The process of selecting a chief state school officer (CSSO) can be a significant means of allocating policymaking power in state educational governance. This paper examines the role of the chief state school officer and explains how that role is influenced by the selection process. Four selection models are described, along with the advantages…

  6. Multi-photon EIT

    NASA Astrophysics Data System (ADS)

    Laarits, Toomas; O'Gorman, Bryan; Crescimanno, Michael

    2008-03-01

    We describe and solve a quantum optics model for multiphoton interrogation of an electromagnetically induced transparency (EIT) resonance. Multiphoton EIT, like its well-studied Lambda-system EIT progenitor, is a generalization of the N-resonance process recently studied for atomic timekeeping. The solution of these models allows a preliminary determination of this process's utility as the basis of a frequency standard.

  7. Mesocell study area snow distributions for the Cold Land Processes Experiment (CLPX)

    Treesearch

    Glen E. Liston; Christopher A. Hiemstra; Kelly Elder; Donald W. Cline

    2008-01-01

    The Cold Land Processes Experiment (CLPX) had a goal of describing snow-related features over a wide range of spatial and temporal scales. This required linking disparate snow tools and datasets into one coherent, integrated package. Simulating realistic high-resolution snow distributions and features requires a snow-evolution modeling system (SnowModel) that can...

  8. An Interdisciplinary Approach to Designing Online Learning: Fostering Pre-Service Mathematics Teachers' Capabilities in Mathematical Modelling

    ERIC Educational Resources Information Center

    Geiger, Vince; Mulligan, Joanne; Date-Huxtable, Liz; Ahlip, Rehez; Jones, D. Heath; May, E. Julian; Rylands, Leanne; Wright, Ian

    2018-01-01

    In this article we describe and evaluate processes utilized to develop an online learning module on mathematical modelling for pre-service teachers. The module development process involved a range of professionals working within the STEM disciplines including mathematics and science educators, mathematicians, scientists, in-service and pre-service…

  9. Exposure to Community Violence: Processes That Increase the Risk for Inner-City Middle School Children

    ERIC Educational Resources Information Center

    Salzinger, Suzanne; Ng-Mak, Daisy S.; Feldman, Richard S.; Kam, Chi-Ming; Rosario, Margaret

    2006-01-01

    An ecologically framed model is presented describing processes accounting for early adolescents' exposure to community violence in high-risk neighborhoods as a function of risk factors in four ecological domains assessed in the prior year. The model was tested for hypothesized pathways along which the combined domains of risk might operate. The…

  10. Time to Teach: Teaching-Learning Processes in Primary Schools.

    ERIC Educational Resources Information Center

    Bennett, Neville

    A model of the teaching-learning process identifies and describes varied behavioral dimensions of the classroom and how they relate to pupil achievement. The model is based on the assumption that the total amount of engaged time on a particular topic is the most important determinant of achievement and has the components of: (1) quantity of…

  11. Application fields for the new Object Management Group (OMG) Standards Case Management Model and Notation (CMMN) and Decision Management Notation (DMN) in the perioperative field.

    PubMed

    Wiemuth, M; Junger, D; Leitritz, M A; Neumann, J; Neumuth, T; Burgert, O

    2017-08-01

    Medical processes can be modeled using different methods and notations. Currently used modeling systems like Business Process Model and Notation (BPMN) are not capable of describing highly flexible and variable medical processes in sufficient detail. We combined two modeling approaches, Business Process Management (BPM) and Adaptive Case Management (ACM), to be able to model non-deterministic medical processes, using the new standards Case Management Model and Notation (CMMN) and Decision Management Notation (DMN). First, we explain how CMMN, DMN and BPMN can be used to model non-deterministic medical processes. We applied this methodology to model 79 cataract operations provided by University Hospital Leipzig, Germany, and four cataract operations provided by University Eye Hospital Tuebingen, Germany. Our model consists of 85 tasks and about 20 decisions in BPMN. We were able to extend the system to cover more complex situations that might appear during an intervention. Effective modeling of the cataract intervention is possible using the combination of BPM and ACM, which makes it possible to depict complex processes with complex decisions and offers a significant advantage for modeling perioperative processes.

  12. Two stochastic models useful in petroleum exploration

    NASA Technical Reports Server (NTRS)

    Kaufman, G. M.; Bradley, P. G.

    1972-01-01

    A model of the petroleum exploration process is proposed that empirically tests the hypothesis that, at an early stage in the exploration of a basin, the process behaves like sampling without replacement; a model of the spatial distribution of petroleum reservoirs that conforms to observed facts is proposed as well. In developing the model of discovery, the following topics are discussed: probabilistic proportionality, the likelihood function, and maximum likelihood estimation. In addition, the spatial model is described: a stochastic process generating values of a sequence of random variables in a way that simulates the frequency distribution of areal extent, the geographic location, and the shape of oil deposits.
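    The sampling-without-replacement hypothesis above can be illustrated with a short sketch. This is only an illustration: the deposit sizes, the proportional-to-size selection rule, and all names below are assumptions for the example, not taken from the paper.

```python
import random

def successive_sampling(sizes, seed=0):
    """Sketch of a 'discovery' process: deposits are found one at a time,
    without replacement, with probability proportional to their remaining
    size, so large deposits tend to be discovered early in a basin's
    exploration history."""
    rng = random.Random(seed)
    remaining = list(sizes)
    order = []
    while remaining:
        r = rng.random() * sum(remaining)  # pick a point in the total mass
        cum = 0.0
        for i, s in enumerate(remaining):
            cum += s
            if r < cum:
                order.append(remaining.pop(i))  # discovered; remove from pool
                break
    return order

# five basins' worth of large, medium, and small hypothetical deposits
discovered = successive_sampling([100.0, 10.0, 1.0] * 5)
```

    Averaged over many seeds, the early entries of `discovered` are dominated by the large deposits, which is the qualitative signature the paper's empirical test looks for.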

  13. Exact solutions for network rewiring models

    NASA Astrophysics Data System (ADS)

    Evans, T. S.

    2007-03-01

    Evolving networks with a constant number of edges may be modelled using a rewiring process. These models are used to describe many real-world processes, including the evolution of cultural artifacts such as family names, the evolution of gene variations, and the popularity of strategies in simple econophysics models such as the minority game. The model is closely related to urn models used for glasses, quantum gravity and wealth distributions. The full mean-field equation for the degree distribution is found, and its exact solution and the corresponding generating function are given.
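    A minimal simulation sketch of such a constant-edge rewiring process (the parameter values and the copy-versus-innovate rule here are illustrative assumptions, not taken from the paper): each of E edges points at an "artifact" vertex; at every step one edge end is rewired, either to a uniformly random vertex (innovation) or by copying another edge's endpoint, which selects vertices with probability proportional to their degree.

```python
import random
from collections import Counter

def rewire(num_edges=1000, num_vertices=100, p_random=0.1,
           steps=50_000, seed=0):
    """Constant-edge rewiring sketch. The edge list stores the artifact end
    of each edge; the number of edges never changes, only where they point."""
    rng = random.Random(seed)
    edges = [rng.randrange(num_vertices) for _ in range(num_edges)]
    for _ in range(steps):
        i = rng.randrange(num_edges)            # edge chosen for rewiring
        if rng.random() < p_random:
            edges[i] = rng.randrange(num_vertices)      # innovation
        else:
            edges[i] = edges[rng.randrange(num_edges)]  # copy: prob ∝ degree
    return Counter(edges)  # degree distribution over artifact vertices

dist = rewire()
```

    The returned `Counter` is the degree distribution whose mean-field equation the paper solves exactly; the total degree always equals the (conserved) number of edges.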

  14. 2014 Version 7.0 Technical Support Document (TSD)

    EPA Pesticide Factsheets

    The 2014 Version 7 document describes the processing of emission inventories into inputs for the Community Multiscale Air Quality model for use in the 2014 National Air Toxics Assessment initial modeling.

  15. Parameter inference from hitting times for perturbed Brownian motion.

    PubMed

    Tamborrino, Massimiliano; Ditlevsen, Susanne; Lansky, Peter

    2015-07-01

    A latent internal process describes the state of some system, e.g. the social tension in a political conflict, the strength of an industrial component or the health status of a person. When this process reaches a predefined threshold, the process terminates and an observable event occurs, e.g. the political conflict finishes, the industrial component breaks down or the person dies. Imagine that an intervention, e.g. a political decision, maintenance of a component or a medical treatment, is applied to the process before the event occurs. How can we evaluate whether the intervention had an effect? To answer this question we describe the effect of the intervention through parameter changes of the law governing the internal process. Then, the time interval between the start of the process and the final event is divided into two subintervals: the time from the start to the instant of intervention, denoted by S, and the time between the intervention and the threshold crossing, denoted by R. The first question studied here is: what is the joint distribution of (S,R)? The theoretical expressions are provided and serve as a basis to answer the main question: can we estimate the parameters of the model from observations of S and R and compare them statistically? Maximum likelihood estimators are calculated and applied to simulated data under the assumption that the process before and after the intervention is described by the same type of model, i.e. a Brownian motion, but with different parameters. Covariates and the handling of censored observations are also incorporated into the statistical model, and the method is illustrated on lung cancer data.
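    The setting can be sketched with a small Euler-Maruyama simulation of one (S, R) observation. This is a sketch only: the drift values, threshold, and intervention time below are illustrative assumptions, and the paper's likelihood machinery is not reproduced.

```python
import random

def simulate_SR(mu_before=-0.3, mu_after=-0.1, sigma=1.0, x0=5.0,
                s_intervention=2.0, dt=0.01, t_max=500.0, seed=1):
    """A Brownian motion with drift starts at x0 and terminates when it
    first crosses the threshold 0. At time s_intervention the drift
    switches from mu_before to mu_after (the 'intervention'). Returns
    (S, R): time to intervention and time from intervention to the
    crossing. R is None if there is no crossing by t_max, and 0.0 if the
    crossing happens before the intervention."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    while t < t_max:
        if x <= 0.0:  # threshold crossed: the observable event occurs
            return min(t, s_intervention), max(t - s_intervention, 0.0)
        mu = mu_before if t < s_intervention else mu_after
        x += mu * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
        t += dt
    return s_intervention, None

S, R = simulate_SR()
```

    Repeating this over many seeds yields a sample from the joint distribution of (S, R) from which parameters could, in principle, be estimated by maximum likelihood as the paper describes.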

  16. The model of drugs distribution dynamics in biological tissue

    NASA Astrophysics Data System (ADS)

    Ginevskij, D. A.; Izhevskij, P. V.; Sheino, I. N.

    2017-09-01

    The dose distribution in Neutron Capture Therapy follows the distribution of 10B in the tissue. Modern pharmacokinetic models of drugs describe the processes occurring in notional "compartments" (blood-organ-tumor), but fail to describe the spatial distribution of the drug in the tumor and in normal tissue. A mathematical model of the spatial distribution dynamics of drugs in tissue, depending on the concentration of the drug in the blood, was developed. The modeling approach represents the biological structure as a randomly inhomogeneous medium in which the 10B distribution takes place. Parameters of the model that cannot be determined rigorously by experiment are treated as quantities governed by the laws of independent random processes. Estimates of the 10B distribution in the tumor and in healthy tissue, inside and outside the cells, are obtained.

  17. A visual metaphor describing neural dynamics in schizophrenia.

    PubMed

    van Beveren, Nico J M; de Haan, Lieuwe

    2008-07-09

    In many scientific disciplines the use of a metaphor as a heuristic aid is not uncommon. A well-known example in somatic medicine is the 'defense army metaphor' used to characterize the immune system. In fact, probably a large part of the everyday work of doctors consists of 'translating' scientific and clinical information (i.e. causes of disease, percentage of success versus risk of side-effects) into information tailored to the needs and capacities of the individual patient. The ability to do so in an effective way is at least partly what makes a clinician a good communicator. Schizophrenia is a severe psychiatric disorder which affects approximately 1% of the population. Over the last two decades a large amount of molecular-biological, imaging and genetic data has been accumulated regarding the biological underpinnings of schizophrenia. However, it remains difficult to understand how the characteristic symptoms of schizophrenia, such as hallucinations and delusions, are related to disturbances at the molecular-biological level. In general, psychiatry seems to lack a conceptual framework with sufficient explanatory power to link the mental and molecular-biological domains. Here, we present an essay-like study in which we propose to use visualized concepts stemming from the theory of dynamical complex systems as a 'visual metaphor' to bridge the mental and molecular-biological domains in schizophrenia. We first describe a computer model of neural information processing and show how the information processing in this model can be visualized using concepts from the theory of complex systems. We then describe two computer models which have been used to investigate the primary theory of schizophrenia, the neurodevelopmental model, and show how disturbed information processing in these two computer models can be presented in terms of the visual metaphor previously described. Finally, we describe the effects of dopamine neuromodulation, disturbances of which have been frequently described in schizophrenia, in terms of the same visual metaphor. The conceptual framework and metaphor described offer a heuristic tool for understanding the relationship between the mental and molecular-biological domains in an intuitive way. The concepts we present may serve to facilitate communication between researchers, clinicians and patients.

  18. SiGe BiCMOS manufacturing platform for mmWave applications

    NASA Astrophysics Data System (ADS)

    Kar-Roy, Arjun; Howard, David; Preisler, Edward; Racanelli, Marco; Chaudhry, Samir; Blaschke, Volker

    2010-10-01

    TowerJazz offers high-volume, manufacturable commercial SiGe BiCMOS technology platforms to address the mmWave market. In this paper, first, the SiGe BiCMOS process technology platforms SBC18 and SBC13 are described. These manufacturing platforms integrate a 200 GHz fT/fMAX SiGe NPN with deep trench isolation into 0.18μm and 0.13μm node CMOS processes, along with high-density 5.6fF/μm2 stacked MIM capacitors, high-value polysilicon resistors, high-Q metal resistors, lateral PNP transistors, triple-well isolation using a deep n-well for mixed-signal integration, and multiple varactors and compact high-Q inductors for RF needs. Second, design enablement tools that maximize performance and lower cost and time to market, such as scalable PSP and HICUM models, statistical and Xsigma models, reliability modeling tools, process control model tools, an inductor toolbox, and transmission line models, are described. Finally, demonstrations in silicon of mmWave applications in the areas of optical networking, mobile broadband, phased array radar, collision avoidance radar and W-band imaging are listed.

  19. Integration Framework of Process Planning based on Resource Independent Operation Summary to Support Collaborative Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo

    2004-06-01

    In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.

  20. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.

  1. Testing Strategies for Model-Based Development

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Padaki, S.; Drzal, L.T.

    The consolidation process in composites made from powder-impregnated tapes differs from that of other material forms because of the distribution of fiber and matrix in the unconsolidated state. A number of factors (e.g., time, pressure, particle size, volume fraction and viscosity) affect the efficiency of the consolidation of these tapes. This paper describes the development of a mathematical process model that determines the best set of parameters for the consolidation of a given prepreg tape.

  3. Modeling of NASA's 30/20 GHz satellite communications system

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Maples, B. W.; Stevens, G. A.

    1984-01-01

    NASA is in the process of developing technology for a 30/20 GHz satellite communications link. Currently hardware is being assembled for a test transponder. A simulation package is being developed to study the link performance in the presence of interference and noise. This requires developing models for the components of the system. This paper describes techniques used to model the components for which data is available. Results of experiments performed using these models are described. A brief overview of NASA's 30/20 GHz communications satellite program is also included.

  4. Nonparametric Bayesian models through probit stick-breaking processes

    PubMed Central

    Rodríguez, Abel; Dunson, David B.

    2013-01-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology. PMID:24358072

  5. Nonparametric Bayesian models through probit stick-breaking processes.

    PubMed

    Rodríguez, Abel; Dunson, David B

    2011-03-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology.
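    The weight construction described above can be sketched directly: stick fractions are obtained by pushing normal draws through the standard normal CDF, and each weight is the fraction of the stick that remains. The truncation level and the Gaussian mean below are illustrative choices, not the paper's.

```python
import math
import random

def norm_cdf(z):
    """Standard normal CDF Phi(z); the probit transformation maps normal
    draws through Phi to obtain stick fractions in (0, 1)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_stick_breaking(n_atoms=20, mu=0.0, seed=0):
    """Truncated probit stick-breaking sketch: v_k = Phi(z_k) with
    z_k ~ N(mu, 1), and w_k = v_k * prod_{j<k} (1 - v_j). The last weight
    takes the remainder so the truncated weights sum to one."""
    rng = random.Random(seed)
    remaining, weights = 1.0, []
    for _ in range(n_atoms - 1):
        v = norm_cdf(rng.gauss(mu, 1.0))   # stick fraction broken off
        weights.append(remaining * v)
        remaining *= 1.0 - v
    weights.append(remaining)              # remainder to the final atom
    return weights

w = probit_stick_breaking()
```

    Replacing the i.i.d. draws `rng.gauss(mu, 1.0)` with a Gaussian process over time or space is what yields the rich temporal and spatial priors the abstract refers to.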

  6. Sleep, offline processing, and vocal learning

    PubMed Central

    Margoliash, Daniel; Schmidt, Marc F

    2009-01-01

    The study of song learning and the neural song system has provided an important comparative model system for the study of speech and language acquisition. We describe some recent advances in the bird song system, focusing on the role of offline processing including sleep in processing sensory information and in guiding developmental song learning. These observations motivate a new model of the organization and role of the sensory memories in vocal learning. PMID:19906416

  7. Creating a culture of shared governance begins with developing the nurse as scholar.

    PubMed

    Donohue-Porter, Patricia

    2012-01-01

    The relationship between shared governance and nursing scholarship is investigated, with emphasis on the connection between stages of scholarly development and nursing action in the evolution of professional practice models. The scholarly image of nursing is described and four critical stages of scholarship (scholarly inquiry, conscious reflection, persistent critique, and intellectual creation) are presented. The development of nursing scholars is then discussed, with emphasis on the intellectual virtues identified by philosophers and the values articulated by nursing theorists that are foundational to this process. Shared governance is viewed holistically as a true scholarly process when these elements are in place and are used by nurses.

  8. Introducing the Equiangular Spiral by Using Logo to Model Nature.

    ERIC Educational Resources Information Center

    Boyadzhiev, Irina; Boyadzhiev, Khristo

    1992-01-01

    Describes the method for producing the equiangular spiral, the geometric curve generated by modeling an insect's orientation process to an illumination source, utilizing a LOGO Turtle program which is included. (JJK)

  9. Coupling biology and oceanography in models.

    PubMed

    Fennel, W; Neumann, T

    2001-08-01

    The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.

  10. The new car following model considering vehicle dynamics influence and numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Dihua; Liu, Hui; Zhang, Geng; Zhao, Min

    2015-12-01

    In this paper, the car-following model is investigated by considering vehicle dynamics from a cyber-physical view. Driving is a typical cyber-physical process, coupling the cyber aspect of the vehicles' information and driving decisions tightly with the dynamics and physics of the vehicles and the traffic environment. However, the influence of the physical (vehicle) side was ignored in previous car-following models. In order to describe car-following behavior in real traffic more reasonably, a new car-following model considering vehicle dynamics (D-CFM for short) is proposed. In this paper, we take the full velocity difference (FVD) car-following model as a case. The stability condition is given on the basis of control theory. The analytical method and numerical simulation results show that the new model can describe the evolution of traffic congestion. The simulations also show that vehicles exhibit a more realistic starting-process acceleration than in earlier models.
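    The base full velocity difference (FVD) model that the paper builds on can be sketched as follows. The optimal-velocity function and all parameter values are illustrative assumptions commonly used with this model family; the paper's vehicle-dynamics extension is not reproduced.

```python
import math

def optimal_velocity(headway, v_max=2.0, h_c=4.0):
    """A common optimal-velocity choice (assumed here for illustration):
    V(h) = (v_max / 2) * (tanh(h - h_c) + tanh(h_c))."""
    return 0.5 * v_max * (math.tanh(headway - h_c) + math.tanh(h_c))

def fvd_step(positions, velocities, kappa=0.4, lam=0.3,
             dt=0.1, road_len=40.0):
    """One Euler step of the FVD model on a ring road:
    dv_i/dt = kappa * (V(headway_i) - v_i) + lam * (v_ahead - v_i)."""
    n = len(positions)
    acc = []
    for i in range(n):
        j = (i + 1) % n                                  # vehicle ahead
        headway = (positions[j] - positions[i]) % road_len
        dv = velocities[j] - velocities[i]
        acc.append(kappa * (optimal_velocity(headway) - velocities[i])
                   + lam * dv)
    new_v = [v + a * dt for v, a in zip(velocities, acc)]
    new_x = [(x + v * dt) % road_len for x, v in zip(positions, new_v)]
    return new_x, new_v

# uniform flow: equal headways at the optimal velocity is a fixed point
n = 10
x = [i * 4.0 for i in range(n)]
v = [optimal_velocity(4.0)] * n
x2, v2 = fvd_step(x, v)
```

    Perturbing one vehicle's velocity from this uniform state and iterating `fvd_step` is how the congestion-evolution simulations mentioned in the abstract are typically produced.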

  11. Discrete Fracture Network Modeling and Simulation of Subsurface Transport for the Topopah Springs and Lava Flow Aquifers at Pahute Mesa, FY 15 Progress Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makedonska, Nataliia; Kwicklis, Edward Michael; Birdsell, Kay Hanson

    This progress report for fiscal year 2015 (FY15) describes the development of discrete fracture network (DFN) models for Pahute Mesa. DFN models will be used to upscale parameters for simulations of subsurface flow and transport in fractured media in Pahute Mesa. The research focuses on modeling of groundwater flow and contaminant transport using DFNs generated according to fracture characteristics observed in the Topopah Spring Aquifer (TSA) and the Lava Flow Aquifer (LFA). This work will improve the representation of radionuclide transport processes in large-scale, regulatory-focused models with a view to reduce pessimistic bounding approximations and provide more realistic contaminant boundary calculations that can be used to describe the future extent of contaminated groundwater. Our goal is to refine a modeling approach that can translate parameters to larger-scale models that account for local-scale flow and transport processes, which tend to attenuate migration.

  12. Developing a physiologically based approach for modeling plutonium decorporation therapy with DTPA.

    PubMed

    Kastl, Manuel; Giussani, Augusto; Blanchardon, Eric; Breustedt, Bastian; Fritsch, Paul; Hoeschen, Christoph; Lopez, Maria Antonia

    2014-11-01

    To develop a physiologically based compartmental approach for modeling plutonium decorporation therapy with the chelating agent diethylenetriaminepentaacetic acid (Ca-DTPA/Zn-DTPA). Model calculations were performed using the software package SAAM II (©The Epsilon Group, Charlottesville, Virginia, USA). The Luciani/Polig compartmental model, with an age-dependent description of the bone recycling processes, was used for the biokinetics of plutonium. It was slightly modified in order to account for the speciation of plutonium in blood and for the different affinities of the present chemical species for DTPA. The introduction of two separate blood compartments, describing low-molecular-weight complexes of plutonium (Pu-LW) and transferrin-bound plutonium (Pu-Tf), respectively, and one additional compartment describing plutonium in the interstitial fluids was performed successfully. The next step of the work is the modeling of the chelation process, coupling the physiologically modified structure with the biokinetic model for DTPA. Results of animal studies performed under controlled conditions will enable a better understanding of the principles of the mechanisms involved.

  13. Partial discharges and breakdown in C3F8

    NASA Astrophysics Data System (ADS)

    Koch, M.; Franck, C. M.

    2014-10-01

    Traditional searches for gases or gas mixtures to replace SF6 involve time-consuming measurements of partial discharge and breakdown behaviour for several voltage waveforms and different field configurations. Recently, a model for predicting this behaviour for SF6 was described in the literature. The model only requires basic properties of the gas, such as the critical field strength and the effective ionization coefficient, which can be obtained from swarm parameter measurements, and thermodynamic properties, which can be calculated. In this paper, we show for the well-known electronegative gas octafluoropropane (C3F8) that the model developed for SF6 can be transferred to this gas to describe its breakdown behaviour. The model can thus be beneficial in the screening of new insulation gases.

  14. A fuzzy model for assessing risk of occupational safety in the processing industry.

    PubMed

    Tadic, Danijela; Djapan, Marko; Misita, Mirjana; Stefanovic, Miladin; Milanovic, Dragan D

    2012-01-01

    Managing occupational safety in any kind of industry, especially in processing, is very important and complex. This paper develops a new method for occupational risk assessment in the presence of uncertainties. Uncertain values of hazardous factors and consequence frequencies are described with linguistic expressions defined by a safety management team. They are modeled with fuzzy sets. Consequence severities depend on current hazardous factors, and their values are calculated with the proposed procedure. The proposed model is tested with real-life data from fruit processing firms in Central Serbia.

  15. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.

  16. Incorporating redox processes improves prediction of carbon and nutrient cycling and greenhouse gas emission

    NASA Astrophysics Data System (ADS)

    Tang, Guoping; Zheng, Jianqiu; Yang, Ziming; Graham, David; Gu, Baohua; Mayes, Melanie; Painter, Scott; Thornton, Peter

    2016-04-01

    Among the coupled thermal, hydrological, geochemical, and biological processes, redox processes play major roles in carbon and nutrient cycling and greenhouse gas (GHG) emission. Increasingly, mechanistic representation of redox processes is acknowledged as necessary for accurate prediction of GHG emission in the assessment of land-atmosphere interactions. Simple organic substrates, Fe reduction, microbial reactions, and the Windermere Humic Aqueous Model (WHAM) were added to a reaction network used in the land component of an Earth system model. In conjunction with this amended reaction network, various temperature response functions used in ecosystem models were assessed for their ability to describe experimental observations from incubation tests with arctic soils. Incorporation of Fe reduction reactions improves the prediction of the lag time between CO2 and CH4 accumulation. The inclusion of the WHAM model enables us to approximately simulate the initial pH drop due to organic acid accumulation and then a pH increase due to Fe reduction without parameter adjustment. The CLM4.0, CENTURY, and Ratkowsky temperature response functions better described the observations than the Q10 method, Arrhenius equation, and ROTH-C. As electron acceptors between O2 and CO2 (e.g., Fe(III), SO42-) are often involved, our results support inclusion of these redox reactions for accurate prediction of CH4 production and consumption. Ongoing work includes improving the parameterization of organic matter decomposition to produce simple organic substrates, examining the influence of redox potential on methanogenesis under thermodynamically favorable conditions, and refining temperature response representation near the freezing point by additional model-experiment iterations. We will use the model to describe observed GHG emission at arctic and tropical sites.
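    The temperature response functions compared in the abstract differ in functional form. A small sketch of three of them (the reference temperature, activation energy, and Ratkowsky parameters below are illustrative, not the study's calibrated values):

```python
import math

def q10_factor(temp_c, t_ref=20.0, q10=2.0):
    """Q10 scaling: the rate multiplies by q10 for every 10 °C above t_ref."""
    return q10 ** ((temp_c - t_ref) / 10.0)

def arrhenius_factor(temp_c, t_ref=20.0, e_a=60_000.0, r_gas=8.314):
    """Arrhenius scaling relative to t_ref:
    exp(-(Ea / R) * (1/T - 1/T_ref)) with T in kelvin."""
    t_k, t_ref_k = temp_c + 273.15, t_ref + 273.15
    return math.exp(-e_a / r_gas * (1.0 / t_k - 1.0 / t_ref_k))

def ratkowsky_rate(temp_c, t_min=-8.0, b=0.05):
    """Ratkowsky (square-root) model: rate = (b * (T - T_min))^2 above
    T_min, zero below; useful near the freezing point where Q10 and
    Arrhenius curves do not vanish."""
    return max(0.0, b * (temp_c - t_min)) ** 2

# relative rates at 30 °C under each assumed parameterization
rates_30c = (q10_factor(30.0), arrhenius_factor(30.0), ratkowsky_rate(30.0))
```

    The practical difference for cold arctic soils is at low temperature: the Ratkowsky form goes to zero at `t_min`, while the Q10 and Arrhenius factors merely shrink, which is one reason the study's fits near freezing distinguish between them.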

  17. The patient centered medical home: mental models and practice culture driving the transformation process.

    PubMed

    Cronholm, Peter F; Shea, Judy A; Werner, Rachel M; Miller-Day, Michelle; Tufano, Jim; Crabtree, Benjamin F; Gabbay, Robert

    2013-09-01

    The Patient-Centered Medical Home (PCMH) has become a dominant model of primary care re-design. The PCMH model is a departure from more traditional models of healthcare delivery and requires significant transformation to be realized. To describe factors shaping mental models and practice culture driving the PCMH transformation process in a large multi-payer PCMH demonstration project. Individual interviews were conducted at 17 primary care practices in southeastern Pennsylvania. A total of 118 individual interviews were conducted with clinicians (N = 47), patient educators (N = 4), office administrators (N = 12), medical assistants (N = 26), front office staff (N = 7), nurses (N = 4), care managers (N = 11), social workers (N = 4), and other stakeholders (N = 3). A multi-disciplinary research team used a grounded theory approach to develop the key constructs describing factors shaping successful practice transformation. Three central themes emerged from the data related to changes in practice culture and mental models necessary for PCMH practice transformation: 1) shifting practice perspectives towards proactive, population-oriented care based in practice-patient partnerships; 2) creating a culture of self-examination; and 3) challenges to developing new roles within the practice through distribution of responsibilities and team-based care. The most tension in shifting the required mental models was displayed between clinician and medical assistant participants, revealing significant barriers towards moving away from clinician-centric care. Key factors driving the PCMH transformation process require shifting mental models at the individual level and culture change at the practice level. Transformation is based upon structural and process changes that support orientation of practice mental models towards perceptions of population health, self-assessment, and the development of shared decision-making. Staff buy-in to the new roles and responsibilities driving PCMH transformation was described as central to making sustainable change at the practice level; however, key barriers related to clinician autonomy appeared to interfere with the formation of team-based care.

  18. A biologically inspired model of bat echolocation in a cluttered environment with inputs designed from field Recordings

    NASA Astrophysics Data System (ADS)

    Loncich, Kristen Teczar

    Bat echolocation strategies and the neural processing of acoustic information, with a focus on cluttered environments, are investigated in this study. How a bat processes the dense field of echoes received while navigating and foraging in the dark is not well understood. While several models have been developed to describe the mechanisms behind bat echolocation, most are based in mathematics rather than biology, and focus on either peripheral or neural processing, not exploring how these two levels of processing are vitally connected. Current echolocation models also do not use habitat-specific acoustic input, or account for field observations of echolocation strategies. Here, a new approach to echolocation modeling is described, capturing the full picture of echolocation from signal generation to a neural picture of the acoustic scene. A biologically inspired echolocation model is developed using field research measurements of the interpulse interval timing used by a frequency-modulating (FM) bat in the wild, with a whole-method approach to modeling echolocation including habitat-specific acoustic inputs; a biologically accurate peripheral model of sound processing by the outer, middle, and inner ear; and finally a neural model incorporating established auditory pathways and neuron types with echolocation adaptations. The field recordings analyzed underscore differences in bat sonar design observed in the laboratory and in the wild, and suggest a correlation between interpulse interval groupings and increased clutter. The scenario model provides habitat- and behavior-specific echoes and is a useful tool for both modeling and behavioral studies, and the peripheral and neural models show that spike-time information and echolocation-specific neuron types can produce target localization in the midbrain.

  19. Theoretical studies in isoelectric focusing. [mathematical modeling and computer simulation for biologicals purification process]

    NASA Technical Reports Server (NTRS)

    Mosher, R. A.; Palusinski, O. A.; Bier, M.

    1982-01-01

    A mathematical model has been developed which describes the steady state in an isoelectric focusing (IEF) system with ampholytes or monovalent buffers. The model is based on the fundamental equations describing the component dissociation equilibria, mass transport due to diffusion and electromigration, electroneutrality, and the conservation of charge. The validity and usefulness of the model have been confirmed by using it to formulate buffer systems in actual laboratory experiments. The model has recently been extended to include the evolution of transient states not only in IEF but also in other modes of electrophoresis.
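    The steady-state balance described above can be sketched, in our own notation for a single spatial dimension, as a vanishing divergence of the combined diffusive and electromigration flux, together with electroneutrality:

```latex
% Flux of component i: diffusion plus electromigration (notation ours)
J_i = -D_i \frac{\partial c_i}{\partial x} + \mu_i c_i E ,
\qquad
\frac{\partial J_i}{\partial x} = 0 \quad \text{(steady state)},
\qquad
\sum_i z_i c_i = 0 \quad \text{(electroneutrality)}
```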

  20. Crowd computing: using competitive dynamics to develop and refine highly predictive models.

    PubMed

    Bentzien, Jörg; Muegge, Ingo; Hamner, Ben; Thompson, David C

    2013-05-01

    A recent application of a crowd computing platform to develop highly predictive in silico models for use in the drug discovery process is described. The platform, Kaggle™, exploits a competitive dynamic that results in model optimization as the competition unfolds. Here, this dynamic is described in detail and compared with more-conventional modeling strategies. The complete and full structure of the underlying dataset is disclosed and some thoughts as to the broader utility of such 'gamification' approaches to the field of modeling are offered. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Investigation of Biogrout processes by numerical analysis at pore scale

    NASA Astrophysics Data System (ADS)

    Bergwerff, Luke; van Paassen, Leon A.; Picioreanu, Cristian; van Loosdrecht, Mark C. M.

    2013-04-01

    Biogrout is a soil-improving process that aims to improve the strength of sandy soils. The process is based on microbially induced calcite precipitation (MICP). In this study the main process is based on denitrification facilitated by bacteria indigenous to the soil using substrates which can be derived from pretreated waste streams containing calcium salts of fatty acids and calcium nitrate, making it a cost-effective and environmentally friendly process. The goal of this research is to improve the understanding of the process by numerical analysis so that it may be improved and applied properly for varying applications, such as borehole stabilization, liquefaction prevention, levee fortification and mitigation of beach erosion. During the denitrification process many phases are present in the pore space, including a liquid phase containing solutes, crystals, bacteria forming biofilms and gas bubbles. Due to the number of phases and their dynamic changes (multiphase flow with (non-linear) reactive transport), there are many interactions, making the process very complex. To understand this complexity in the system, the interactions between these phases are studied in a reductionist approach, increasing the complexity of the system by one phase at a time. The model will initially include flow, solute transport, crystal nucleation and growth in 2D at pore scale. The flow will be described by Navier-Stokes equations. Initial study and simulations have revealed that describing crystal growth for this application on a fixed grid can introduce significant fundamental errors. Therefore a level set method will be employed to better describe the interface of developing crystals in between sand grains. Afterwards the model will be expanded to 3D to provide more realistic flow, nucleation and clogging behaviour at pore scale. Next biofilms and lastly gas bubbles may be added to the model. From the results of these pore scale models the behaviour of the system may be studied and eventually observations may be extrapolated to a larger continuum scale.

  2. Streamlined, Inexpensive 3D Printing of the Brain and Skull.

    PubMed

    Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S

    2015-01-01

    Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.

  3. On the time-homogeneous Ornstein-Uhlenbeck process in the foreign exchange rates

    NASA Astrophysics Data System (ADS)

    da Fonseca, Regina C. B.; Matsushita, Raul Y.; de Castro, Márcio T.; Figueiredo, Annibal

    2015-10-01

    Since Gaussianity and stationarity assumptions cannot be fulfilled by financial data, the time-homogeneous Ornstein-Uhlenbeck (THOU) process was introduced as a candidate model to describe time series of financial returns [1]. It is an Ornstein-Uhlenbeck (OU) process in which these assumptions are replaced by linearity and time-homogeneity. We employ the OU and THOU processes to analyze daily foreign exchange rates against the US dollar. We confirm that the OU process does not fit the data, while in most cases the patterns of the first four cumulants of the data can be described by the THOU process. However, there are some exceptions in which the data do not follow the linearity or time-homogeneity assumptions.
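    The baseline OU dynamics that the THOU process generalizes can be illustrated with a standard Euler-Maruyama simulation (a minimal sketch of the plain OU process only, with illustrative parameter values; it does not implement the THOU extension):

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n_steps, rng):
    """Euler-Maruyama path of dX = -theta*(X - mu)*dt + sigma*dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = (x[k] - theta * (x[k] - mu) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    return x

rng = np.random.default_rng(0)
path = simulate_ou(theta=2.0, mu=0.0, sigma=0.3, x0=1.0,
                   dt=0.01, n_steps=20000, rng=rng)
# The path relaxes toward mu; the stationary variance is sigma**2 / (2*theta).
```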

  4. Research procedure for buck-boost converter for small electric vehicles

    NASA Astrophysics Data System (ADS)

    Vacheva, Gergana; Hinov, Nikolay; Penev, Dimitar

    2017-12-01

    In this paper a mathematical model implemented in Matlab is developed to describe a buck-boost converter for the control of a small electric vehicle. The model is formulated as differential equations which describe the processes in the converter. Through the study of this model the optimal operating mode of a small electric vehicle can be determined. The proposed converter can be used in a wide range of applications such as small electric vehicles, smart grids and various energy storage systems.
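    The abstract does not give the equations themselves; for an ideal inverting buck-boost converter, the standard duty-cycle-averaged model (with the output voltage taken positive, and D the duty cycle) has the form:

```latex
L \frac{di_L}{dt} = D\,V_{in} - (1 - D)\,v_o ,
\qquad
C \frac{dv_o}{dt} = (1 - D)\,i_L - \frac{v_o}{R} ,
\qquad
\frac{V_o}{V_{in}} = \frac{D}{1 - D} \ \text{(steady state)}
```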

  5. Numerical and experimental modelling of the centrifugal compressor stage - setting the model of impellers with 2D blades

    NASA Astrophysics Data System (ADS)

    Matas, Richard; Syka, Tomáš; Luňáček, Ondřej

    The article describes results from the research and development of a radial compressor stage. The experimental compressor and the numerical models used are briefly described. In the first part, comparisons of the characteristics obtained experimentally and by numerical simulations for the stage with a vaneless diffuser are presented. In the second part, the results for the stage with a vaned diffuser are presented. The results are relevant for further studies in the research and development process.

  6. Hydrologic controls on the development of equilibrium soil depths

    NASA Astrophysics Data System (ADS)

    Nicotina, L.; Tarboton, D. G.; Tesfa, T. K.; Rinaldo, A.

    2010-12-01

    The object of the present work was the study of the coevolution of runoff production and geomorphological processes and its effects on the formation of equilibrium soil depth, focusing on their mutual feedbacks. The primary goal of this work is to describe spatial patterns of soil depth resulting, under the hypothesis of dynamic equilibrium, from long-term interactions between hydrologic forcings and soil production, erosion and sediment transport processes. These processes dominate the formation of actual soil depth patterns, which represent the boundary condition for water redistribution; thus this paper also attempts to set the premises for decoding their individual roles and mutual interactions in shaping the hydrologic response of a catchment. The relevance of the study stems from the massive improvement in hydrologic predictions for ungauged basins that would be achieved by directly using soil depths derived from geomorphic features remotely measured and objectively manipulated. Moreover, the setup of a coupled hydrologic-geomorphologic approach represents a first step in the study of such interactions, and in particular of the effects of soil moisture in determining soil production functions. Hydrological processes are here described by explicitly accounting for local soil depths and detailed catchment topography from high-resolution digital terrain models (DTM). Geomorphological processes are described by means of well-studied geomorphic transport laws. Soil depth is assumed, in the exponential soil production function, to be a proxy for all the mechanisms that induce mechanical disruption of bedrock and its conversion into soil. This formulation, although empirical, has been widely used in the literature and is currently accepted. The modeling approach is applied to the semi-arid Dry Creek Experimental Watershed, located near Boise, Idaho, USA. Modeled soil depths are compared with field data obtained from an extensive survey of the catchment. Our results show the ability of the model to describe properly the mean soil depth and the broad features of the distribution of measured data. However, local comparisons show significant scatter whose origin is discussed.

  7. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper therefore proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  8. MODELING THE ELECTROLYTIC DECHLORINATION OF TRICHLOROETHYLENE IN A GRANULAR GRAPHITE-PACKED REACTOR

    EPA Science Inventory

    A comprehensive reactor model was developed for the electrolytic dechlorination of trichloroethylene (TCE) at a granular-graphite cathode. The reactor model describes the dynamic processes of TCE dechlorination and adsorption, and the formation and dechlorination of all the major...

  9. Carbon Dynamics and Export from Flooded Wetlands: A Modeling Approach

    EPA Science Inventory

    Described in this article is development and validation of a process based model for carbon cycling in flooded wetlands, called WetQual-C. The model considers various biogeochemical interactions affecting C cycling, greenhouse gas emissions, organic carbon export and retention. ...

  10. Models, Part V: Composition Models.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2003-01-01

    Describes four models: The Authoring Cycle, a whole language approach that reflects the inquiry process; I-Search, an approach to research that uses the power of student interests; Cultural Celebration, using local heritage topics; and Science Lab Report, for the composition of a lab report. (LRW)

  11. A physiologically based kinetic model for bacterial sulfide oxidation.

    PubMed

    Klok, Johannes B M; de Graaff, Marco; van den Bosch, Pim L F; Boelee, Nadine C; Keesman, Karel J; Janssen, Albert J H

    2013-02-01

    In the biotechnological process for hydrogen sulfide removal from gas streams, a variety of oxidation products can be formed. Under natron-alkaline conditions, sulfide is oxidized by haloalkaliphilic sulfide oxidizing bacteria via flavocytochrome c oxidoreductase. From previous studies, it was concluded that the oxidation-reduction state of cytochrome c is a direct measure for the bacterial end-product formation. Given this physiological feature, incorporation of the oxidation state of cytochrome c in a mathematical model for the bacterial oxidation kinetics will yield a physiologically based model structure. This paper presents a physiologically based model, describing the dynamic formation of the various end-products in the biodesulfurization process. It consists of three elements: 1) Michaelis-Menten kinetics combined with 2) a cytochrome c driven mechanism describing 3) the rate determining enzymes of the respiratory system of haloalkaliphilic sulfide oxidizing bacteria. The proposed model is successfully validated against independent data obtained from biological respiration tests and bench scale gas-lift reactor experiments. The results demonstrate that the model is a powerful tool to describe product formation for haloalkaliphilic biomass under dynamic conditions. The model predicts a maximum S⁰ formation of about 98 mol%. A future challenge is the optimization of this bioprocess by improving the dissolved oxygen control strategy and reactor design. Copyright © 2012 Elsevier Ltd. All rights reserved.
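    The first model element, Michaelis-Menten kinetics, has a familiar closed form; a minimal sketch with illustrative parameters (the cytochrome c driven mechanism of the full model is not reproduced here):

```python
def michaelis_menten(s, v_max, k_m):
    """Michaelis-Menten rate: v = v_max * s / (k_m + s).

    s is the substrate (here sulfide) concentration; v_max and k_m are
    illustrative placeholders, not the paper's fitted values.
    """
    return v_max * s / (k_m + s)

# At s == k_m the rate is half of v_max; at large s it saturates toward v_max.
half_rate = michaelis_menten(1.0, 2.0, 1.0)
```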

  12. Mass balance modelling of contaminants in river basins: a flexible matrix approach.

    PubMed

    Warren, Christopher; Mackay, Don; Whelan, Mick; Fox, Kay

    2005-12-01

    A novel and flexible approach is described for simulating the behaviour of chemicals in river basins. A number (n) of river reaches are defined and their connectivity is described by entries in an n x n matrix. Changes in segmentation can be readily accommodated by altering the matrix entries, without the need for model revision. Two models are described. The simpler QMX-R model only considers advection and an overall loss due to the combined processes of volatilization, net transfer to sediment and degradation. The rate constant for the overall loss is derived from fugacity calculations for a single segment system. The more rigorous QMX-F model performs fugacity calculations for each segment and explicitly includes the processes of advection, evaporation, water-sediment exchange and degradation in both water and sediment. In this way chemical exposure in all compartments (including equilibrium concentrations in biota) can be estimated. Both models are designed to serve as intermediate-complexity exposure assessment tools for river basins with relatively low data requirements. By considering the spatially explicit nature of emission sources and the changes in concentration which occur with transport in the channel system, the approach offers significant advantages over simple one-segment simulations while being more readily applicable than more sophisticated, highly segmented, GIS-based models.
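    The matrix bookkeeping behind the simpler QMX-R variant can be sketched as a linear steady-state solve (a hypothetical illustration of the approach; the function and parameter names are ours, and a first-order survival factor exp(-k*tau) stands in for the combined loss processes):

```python
import numpy as np

def reach_outflows(conn, emissions, k, tau):
    """Steady-state chemical outflow from each river reach (illustrative sketch).

    conn[i, j] = 1 if reach j discharges into reach i; emissions are direct
    loads to each reach; k is an overall first-order loss rate constant and
    tau the residence time, so a fraction exp(-k*tau) survives each reach.
    Solves outflow = survive @ (conn @ outflow + emissions) as a linear system.
    """
    n = len(emissions)
    survive = np.diag(np.exp(-k * tau))  # fraction surviving each reach
    return np.linalg.solve(np.eye(n) - survive @ conn, survive @ emissions)

# Two reaches in series: reach 0 discharges into reach 1.
conn = np.array([[0.0, 0.0], [1.0, 0.0]])
out = reach_outflows(conn, emissions=np.array([10.0, 0.0]),
                     k=np.array([0.1, 0.1]), tau=np.array([2.0, 2.0]))
```

    Re-segmenting the basin then only changes the entries of `conn`, which mirrors the flexibility claimed for the matrix formulation.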

  13. Yes, the GIGP Really Does Work--And Is Workable!

    ERIC Educational Resources Information Center

    Burrell, Quentin L.; Fenton, Michael R.

    1993-01-01

    Discusses the generalized inverse Gaussian-Poisson (GIGP) process for informetric modeling. Negative binomial distribution is discussed, construction of the GIGP process is explained, zero-truncated GIGP is considered, and applications of the process with journals, library circulation statistics, and database index terms are described. (50…

  14. SEIPS-based process modeling in primary care.

    PubMed

    Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T

    2017-04-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. SEIPS-Based Process Modeling in Primary Care

    PubMed Central

    Wooldridge, Abigail R.; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter

    2016-01-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. PMID:28166883

  16. An Instructional Approach to Modeling in Microevolution.

    ERIC Educational Resources Information Center

    Thompson, Steven R.

    1988-01-01

    Describes an approach to teaching population genetics and evolution and some of the ways models can be used to enhance understanding of the processes being studied. Discusses the instructional plan, and the use of models including utility programs and analysis with models. Provided are a basic program and sample program outputs. (CW)

  17. Urban tree growth modeling

    Treesearch

    E. Gregory McPherson; Paula J. Peper

    2012-01-01

    This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...

  18. Identifiability Of Systems With Modeling Errors

    NASA Technical Reports Server (NTRS)

    Hadaegh, Yadolah "Fred"; Bekey, George A.

    1988-01-01

    Advances in theory of modeling errors reported. Recent paper on errors in mathematical models of deterministic linear or weakly nonlinear systems. Extends theoretical work described in NPO-16661 and NPO-16785. Presents concrete way of accounting for difference in structure between mathematical model and physical process or system that it represents.

  19. Implementation of the Interteaching Model: Implications for Staff

    ERIC Educational Resources Information Center

    Chester, Andrea; Kienhuis, Mandy; Wilson, Peter

    2015-01-01

    This article describes the process of implementing a teaching innovation, the interteaching model, in a second-year psychology course. Interteaching is an evidence-based model that uses guided independent learning and reciprocal peer-tutoring to enhance student engagement and learning. The model shifts the focus from lectures to tutorials:…

  20. Mechanistic modelling of drug release from a polymer matrix using magnetic resonance microimaging.

    PubMed

    Kaunisto, Erik; Tajarobi, Farhad; Abrahmsen-Alami, Susanna; Larsson, Anette; Nilsson, Bernt; Axelsson, Anders

    2013-03-12

    In this paper a new model describing drug release from a polymer matrix tablet is presented. The utilization of the model is described as a two-step process: initially, polymer parameters are obtained from a previously published pure-polymer dissolution model; the results are then combined with drug parameters obtained from literature data in the new model to predict solvent and drug concentration profiles and polymer and drug release profiles. The modelling approach was applied to the case of an HPMC matrix highly loaded with mannitol (model drug). The results showed that the drug release rate can be successfully predicted using the suggested modelling approach. However, the model was not able to accurately predict the polymer release profile, possibly due to the sparse amount of usable pure-polymer dissolution data. In addition to the case study, a sensitivity analysis of the model parameters relevant to drug release was performed. The analysis revealed important information that can be useful in the drug formulation process. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Fractal modeling of fluidic leakage through metal sealing surfaces

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Chen, Xiaoqian; Huang, Yiyong; Chen, Yong

    2018-04-01

    This paper investigates the fluidic leak rate through metal sealing surfaces by developing fractal models for the contact process and the leakage process. An improved model is established to describe the seal-contact interface of two rough metal surfaces. The contact model divides the deformed regions by classifying the asperities of different characteristic lengths into the elastic, elastic-plastic and plastic regimes. Using the improved contact model, the leakage channel under the contact surface is mathematically modeled based on fractal theory. The leakage model obtains the leak rate using fluid transport theory in porous media, considering that the pore-forming percolation channels can be treated as a combination of filled tortuous capillaries. The effects of fractal structure, surface material and gasket size on the contact process and leakage process are analyzed through numerical simulations for sealed ring gaskets.
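    A common capillary-bundle form for such leakage models (notation ours; the paper's exact formulation may differ) combines Hagen-Poiseuille flow through a tortuous capillary of diameter \lambda with a fractal size distribution of capillaries:

```latex
q(\lambda) = \frac{\pi \,\Delta p\, \lambda^4}{128\, \mu\, L_t(\lambda)} ,
\qquad
-\,dN = D_f\, \lambda_{max}^{D_f}\, \lambda^{-(D_f + 1)}\, d\lambda ,
\qquad
Q = \int_{\lambda_{min}}^{\lambda_{max}} q(\lambda)\,(-\,dN)
```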

  2. Pharmacokinetic Model of the Transport of Fast-Acting Insulin From the Subcutaneous and Intradermal Spaces to Blood.

    PubMed

    Lv, Dayu; Kulkarni, Sandip D; Chan, Alice; Keith, Stephen; Pettis, Ron; Kovatchev, Boris P; Farhi, Leon S; Breton, Marc D

    2015-07-01

    Pharmacokinetic (PK) models describing the transport of insulin from the injection site to blood assist clinical decision making and are part of in silico platforms for developing and testing of insulin delivery strategies for treatment of patients with diabetes. The ability of these models to accurately describe all facets of the in vivo insulin transport is therefore critical for their application. Here, we propose a new model of fast-acting insulin analogs transport from the subcutaneous and intradermal spaces to blood that can accommodate clinically observed biphasic appearance and delayed clearance of injected insulin, 2 phenomena that are not captured by existing PK models. To develop the model we compare 9 insulin transport PK models which describe hypothetical insulin delivery pathways potentially capable of approximating biphasic appearance of exogenous insulin. The models are tested with respect to their ability to describe clinical data from 10 healthy volunteers which received 1 subcutaneous and 2 intradermal insulin injections on 3 different occasions. The optimal model, selected based on information and posterior identifiability criteria, assumes that insulin is delivered at the administrative site and is then transported to the bloodstream via 2 independent routes (1) diffusion-like process to the blood and (2) combination of diffusion-like processes followed by an additional compartment before entering the blood. This optimal model accounts for biphasic appearance and delayed clearance of exogenous insulin. It agrees better with the clinical data as compared to commonly used models and is expected to improve the in silico development and testing of insulin treatment strategies, including artificial pancreas systems. © 2015 Diabetes Technology Society.
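    The selected two-route structure can be caricatured as a linear compartment model (a toy sketch only: the compartment layout is simplified and all rate constants are hypothetical, not the paper's fitted values):

```python
import numpy as np

def insulin_appearance(dose, k1, k2, k3, ke, dt=0.1, t_end=480.0):
    """Toy two-route sketch of insulin appearance in plasma.

    Depot D empties via a fast route (k1, directly to plasma I) and a slow
    route (k2, through an intermediate compartment X emptying at rate k3);
    plasma insulin is cleared at rate ke. Forward-Euler integration.
    """
    n = int(t_end / dt)
    d, x, i = dose, 0.0, 0.0
    plasma = np.empty(n)
    for step in range(n):
        dd = -(k1 + k2) * d
        dx = k2 * d - k3 * x
        di = k1 * d + k3 * x - ke * i
        d, x, i = d + dt * dd, x + dt * dx, i + dt * di
        plasma[step] = i
    return plasma

curve = insulin_appearance(dose=1.0, k1=0.05, k2=0.03, k3=0.01, ke=0.1)
# Plasma insulin rises, peaks, and decays; the slow route prolongs the tail.
```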

  3. Markov Chain-Like Quantum Biological Modeling of Mutations, Aging, and Evolution.

    PubMed

    Djordjevic, Ivan B

    2015-08-24

    Recent evidence suggests that quantum mechanics is relevant in photosynthesis, magnetoreception, enzymatic catalytic reactions, olfactory reception, photoreception, genetics, electron transfer in proteins, and evolution, to mention a few. In our recent paper published in Life, we derived the operator-sum representation of a biological channel based on codon basekets and determined the quantum channel model suitable for study of the quantum biological channel capacity. However, this model is essentially memoryless and is not able to properly model the propagation of mutation errors in time, the process of aging, and the evolution of genetic information through generations. To solve these problems, we propose novel quantum mechanical models to accurately describe the creation of spontaneous, induced, and adaptive mutations and their propagation in time. The different biological channel models with memory proposed in this paper include: (i) a Markovian classical model, (ii) a Markovian-like quantum model, and (iii) a hybrid quantum-classical model. We then apply these models in a study of aging and the evolution of quantum biological channel capacity through generations. We also discuss key differences of these models with respect to a multilevel symmetric channel-based Markovian model and a Kimura model-based Markovian process. These models are quite general and applicable to many open problems in biology, not only biological channel capacity, which is the main focus of the paper. We show that the famous quantum Master equation approach, commonly used to describe different biological processes, is just the first-order approximation of the proposed quantum Markov chain-like model, when the observation interval tends to zero. One of the important implications of this model is that the aging phenotype becomes determined by different underlying transition probabilities in both programmed and random (damage) Markov chain-like models of aging, which are mutually coupled.
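    The classical Markovian model (i) amounts to propagating a base distribution through a transition matrix generation by generation; a minimal sketch (the 4-state chain and the per-generation mutation rate are illustrative, not the paper's):

```python
import numpy as np

def propagate(p0, transition, generations):
    """Propagate a distribution over states through a Markov chain.

    transition[i, j] = P(state i -> state j); rows sum to 1, and the
    distribution evolves as p_{t+1} = p_t @ transition.
    """
    p = np.asarray(p0, dtype=float)
    for _ in range(generations):
        p = p @ transition
    return p

# Illustrative 4-state (A, C, G, T) chain with a small symmetric mutation rate.
mu = 1e-3
T = np.full((4, 4), mu / 3)
np.fill_diagonal(T, 1.0 - mu)
p = propagate([1.0, 0.0, 0.0, 0.0], T, generations=1000)
# The initially pure distribution slowly relaxes toward uniform.
```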

  4. Markov Chain-Like Quantum Biological Modeling of Mutations, Aging, and Evolution

    PubMed Central

    Djordjevic, Ivan B.

    2015-01-01

    Recent evidence suggests that quantum mechanics is relevant in photosynthesis, magnetoreception, enzymatic catalytic reactions, olfactory reception, photoreception, genetics, electron-transfer in proteins, and evolution; to mention few. In our recent paper published in Life, we have derived the operator-sum representation of a biological channel based on codon basekets, and determined the quantum channel model suitable for study of the quantum biological channel capacity. However, this model is essentially memoryless and it is not able to properly model the propagation of mutation errors in time, the process of aging, and evolution of genetic information through generations. To solve for these problems, we propose novel quantum mechanical models to accurately describe the process of creation spontaneous, induced, and adaptive mutations and their propagation in time. Different biological channel models with memory, proposed in this paper, include: (i) Markovian classical model, (ii) Markovian-like quantum model, and (iii) hybrid quantum-classical model. We then apply these models in a study of aging and evolution of quantum biological channel capacity through generations. We also discuss key differences of these models with respect to a multilevel symmetric channel-based Markovian model and a Kimura model-based Markovian process. These models are quite general and applicable to many open problems in biology, not only biological channel capacity, which is the main focus of the paper. We will show that the famous quantum Master equation approach, commonly used to describe different biological processes, is just the first-order approximation of the proposed quantum Markov chain-like model, when the observation interval tends to zero. 
One of the important implications of this model is that the aging phenotype becomes determined by different underlying transition probabilities in both programmed and random (damage) Markov chain-like models of aging, which are mutually coupled. PMID:26305258
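    As a toy illustration of the classical end of the spectrum these models span, the sketch below propagates a nucleotide distribution through a memoryless Markov chain across generations; the transition matrix and mutation rate are invented for the example and are not taken from the paper.

```python
import numpy as np

# Toy classical Markov chain over nucleotide states {A, C, G, T}.
# P[i][j] = probability that base i mutates to base j in one generation.
# The mutation rate is hypothetical, not taken from the paper.
mu = 1e-2
P = np.full((4, 4), mu / 3)
np.fill_diagonal(P, 1.0 - mu)

p0 = np.array([1.0, 0.0, 0.0, 0.0])  # start: definitely base A

def propagate(p0, P, n):
    # Distribution after n generations: p_n = p_0 . P^n
    return p0 @ np.linalg.matrix_power(P, n)

p100 = propagate(p0, P, 100)
print(p100)        # mass leaks from A toward the uniform distribution
print(p100.sum())  # rows of P sum to 1, so probability is conserved
```

With a symmetric matrix like this one the chain relaxes toward the uniform distribution; a model with memory, as in the paper, would make the effective transition probabilities depend on history.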

  5. Parent Management Training-Oregon Model (PMTO™) in Mexico City: Integrating Cultural Adaptation Activities in an Implementation Model

    PubMed Central

    Baumann, Ana A.; Domenech Rodríguez, Melanie M.; Amador, Nancy G.; Forgatch, Marion S.; Parra-Cardona, J. Rubén

    2015-01-01

    This article describes the process of cultural adaptation at the start of the implementation of the Parent Management Training intervention-Oregon model (PMTO) in Mexico City. The implementation process was guided by the model, and the cultural adaptation of PMTO was theoretically guided by the cultural adaptation process (CAP) model. During the adaptation, we uncovered the potential for the CAP to be embedded in the implementation process, taking into account broader training and economic challenges and opportunities. We discuss how cultural adaptation and implementation processes are inextricably linked and iterative, how maintaining a collaborative relationship with the treatment developer has guided our work and helped expand our research efforts, and how building human capital to implement PMTO in Mexico supported implementation efforts of PMTO elsewhere in the United States. PMID:26052184

  6. Parent Management Training-Oregon Model (PMTO™) in Mexico City: Integrating Cultural Adaptation Activities in an Implementation Model.

    PubMed

    Baumann, Ana A; Domenech Rodríguez, Melanie M; Amador, Nancy G; Forgatch, Marion S; Parra-Cardona, J Rubén

    2014-03-01

    This article describes the process of cultural adaptation at the start of the implementation of the Parent Management Training intervention-Oregon model (PMTO) in Mexico City. The implementation process was guided by the model, and the cultural adaptation of PMTO was theoretically guided by the cultural adaptation process (CAP) model. During the adaptation, we uncovered the potential for the CAP to be embedded in the implementation process, taking into account broader training and economic challenges and opportunities. We discuss how cultural adaptation and implementation processes are inextricably linked and iterative, how maintaining a collaborative relationship with the treatment developer has guided our work and helped expand our research efforts, and how building human capital to implement PMTO in Mexico supported implementation efforts of PMTO elsewhere in the United States.

  7. Developing a semantic web model for medical differential diagnosis recommendation.

    PubMed

    Mohammed, Osama; Benlamri, Rachid

    2014-10-01

    In this paper we describe a novel model for differential diagnosis designed to make recommendations by utilizing semantic web technologies. The model is a response to a number of requirements, ranging from incorporating essential clinical diagnostic semantics to integrating data mining into the process of identifying candidate diseases that best explain a set of clinical features. We introduce two major components, which we find essential to the construction of an integral differential diagnosis recommendation model: the evidence-based recommender component and the proximity-based recommender component. Both approaches are driven by disease diagnosis ontologies designed specifically to enable the generation of diagnostic recommendations: the disease symptom ontology and the patient ontology. The evidence-based diagnosis process develops dynamic rules based on standardized clinical pathways. The proximity-based component employs data mining to provide clinicians with diagnosis predictions and to generate new diagnosis rules from training datasets. This article describes the integration of these two components with the developed diagnosis ontologies to form a novel medical differential diagnosis recommendation model, and provides test cases from the implementation of the overall model, which show quite promising diagnostic recommendation results.
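    The paper's proximity-based component is driven by ontologies and data mining; as a loose, hypothetical stand-in, the snippet below ranks candidate diseases by Jaccard overlap between observed findings and invented disease feature sets (none of the disease names or scores come from the paper).

```python
# Toy proximity-style ranking: score each disease by the Jaccard overlap
# between observed findings and that disease's known feature set.
# The knowledge base is invented for this example.
knowledge = {
    "influenza": {"fever", "cough", "myalgia", "headache"},
    "migraine":  {"headache", "nausea", "photophobia"},
    "pneumonia": {"fever", "cough", "dyspnea", "chest pain"},
}

def rank(findings):
    scores = {
        d: len(findings & feats) / len(findings | feats)  # Jaccard proximity
        for d, feats in knowledge.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

ranked = rank({"fever", "cough", "headache"})
print(ranked)  # influenza scores highest for this presentation
```

A real system would of course draw candidate features from the disease symptom ontology and learn rules from training data, as the paper describes.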

  8. Learner Performance Accounting: A Tri-Cycle Process

    ERIC Educational Resources Information Center

    Brown, Thomas C.; McCleary, Lloyd E.

    1973-01-01

    The Tri-Cycle Process described in the model permits for the first time an integrated system for designing an individualized instructional system that would permit a rational, diagnosis-prescription-evaluation system keyed to an accounting system. (Author)

  9. A unified dislocation density-dependent physical-based constitutive model for cold metal forming

    NASA Astrophysics Data System (ADS)

    Schacht, K.; Motaman, A. H.; Prahl, U.; Bleck, W.

    2017-10-01

    Dislocation-density-dependent physical-based constitutive models of metal plasticity, while computationally efficient and history-dependent, can accurately account for varying process parameters such as strain, strain rate, and temperature; different loading modes such as continuous deformation, creep, and relaxation; microscopic metallurgical processes; and varying chemical composition within an alloy family. Since these models are founded on the essential phenomena dominating the deformation, they have a large range of usability and validity. They are also suitable for manufacturing-chain simulations, since they can efficiently compute the cumulative effect of the various manufacturing processes by following the material state through the entire manufacturing chain, including interpass periods, and give a realistic prediction of the material behavior and final product properties. In the physical-based constitutive model of cold metal plasticity introduced in this study, the physical processes influencing cold and warm plastic deformation in polycrystalline metals are described using physical/metallurgical internal variables such as dislocation density and effective grain size. The evolution of these internal variables is calculated using equations that describe the physical processes dominating the material behavior during cold plastic deformation. For validation, the model is numerically implemented in a general implicit isotropic elasto-viscoplasticity algorithm as a user-defined material subroutine (UMAT) in ABAQUS/Standard and used for finite element simulation of upsetting tests and a complete cold forging cycle of a case-hardenable MnCr steel.
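    A minimal sketch of the kind of dislocation-density evolution such models are built on, here a textbook Kocks-Mecking storage/recovery law with Taylor hardening; all constants are assumed for illustration and are not the paper's calibrated values.

```python
import numpy as np

# Kocks-Mecking-type sketch of a dislocation-density-based flow stress:
#   d(rho)/d(eps) = k1*sqrt(rho) - k2*rho   (storage minus recovery)
#   sigma = sigma0 + alpha*M*G*b*sqrt(rho)  (Taylor hardening)
# All constants are illustrative, not the paper's UMAT calibration.
k1, k2 = 1e8, 10.0      # storage / dynamic-recovery coefficients (assumed)
alpha, M = 0.3, 3.06    # interaction constant, Taylor factor
G, b = 80e9, 2.5e-10    # shear modulus [Pa], Burgers vector [m]
sigma0 = 50e6           # friction stress [Pa]

rho = 1e12              # initial dislocation density [1/m^2]
deps = 1e-4
strain, stress = [], []
for i in range(5000):   # integrate up to 0.5 true strain
    rho += (k1 * np.sqrt(rho) - k2 * rho) * deps
    strain.append((i + 1) * deps)
    stress.append(sigma0 + alpha * M * G * b * np.sqrt(rho))

# Hardening rate decreases as recovery balances storage (saturation
# density is (k1/k2)^2 = 1e14 with these constants).
print(stress[0] / 1e6, stress[-1] / 1e6)  # MPa
```

In a full model of this class, k1 and k2 would themselves depend on temperature, strain rate, and composition, which is what gives such models their range of validity.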

  10. Geodynamics branch data base for main magnetic field analysis

    NASA Technical Reports Server (NTRS)

    Langel, Robert A.; Baldwin, R. T.

    1991-01-01

    The data sets used in geomagnetic field modeling at GSFC are described. Data are measured and obtained from a variety of sources. For clarity, data sets from different sources are categorized and processed separately. The data base is composed of magnetic observatory data, surface data, high-quality aeromagnetic data, high-quality total-intensity marine data, satellite data, and repeat data. These individual data categories are described in detail in a series of notebooks in the Geodynamics Branch, GSFC. This catalog reviews the original data sets, the processing history, and the final data sets available for each category of the data base, and is to be used as a reference manual for the notebooks. Each data type used in geomagnetic field modeling has a varying level of complexity, requiring specialized processing routines for satellite and observatory data and two general routines for processing aeromagnetic, marine, land survey, and repeat data.

  11. A standard protocol for describing individual-based and agent-based models

    USGS Publications Warehouse

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.

  12. Real-time simulation of the retina allowing visualization of each processing stage

    NASA Astrophysics Data System (ADS)

    Teeters, Jeffrey L.; Werblin, Frank S.

    1991-08-01

    The retina computes to let us see, but can we see the retina compute? Until now, the answer has been no, because the unconscious nature of the processing hides it from our view. Here the authors describe a method of seeing computations performed throughout the retina. This is achieved by using neurophysiological data to construct a model of the retina and using a special-purpose image processing computer (PIPE) to implement the model in real time. Processing in the model is organized into stages corresponding to the computations performed by each retinal cell type; the final stage is the transient (change-detecting) ganglion cell. A CCD camera forms the input image, and the activity of a selected retinal cell type is the output, which is displayed on a TV monitor. By changing the retinal cell type driving the monitor, the progressive transformations of the image by the retina can be observed. These simulations demonstrate the ubiquitous presence of temporal and spatial variations in the patterns of activity generated by the retina and fed into the brain. These dynamical aspects make the patterns very different from those generated by the common DOG (difference of Gaussians) model of the receptive field. Because the retina is so successful in biological vision systems, the processing described here may be useful in machine vision.
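    For contrast with the dynamic retina model, the static DOG receptive field mentioned above can be sketched in a few lines; kernel size and sigmas below are arbitrary choices, not values from the paper.

```python
import numpy as np

# Difference-of-Gaussians (DOG) receptive field: a narrow excitatory
# center minus a broader inhibitory surround. Sizes/sigmas are arbitrary.
def dog_kernel(size=21, sigma_c=1.5, sigma_s=4.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_c**2)) / (2 * np.pi * sigma_c**2)
    surround = np.exp(-r2 / (2 * sigma_s**2)) / (2 * np.pi * sigma_s**2)
    return center - surround

k = dog_kernel()
# The kernel responds to contrast, not uniform illumination: its taps
# nearly cancel, so a constant image produces (near) zero output.
print(k.max(), k.sum())
```

Convolving an image with this kernel gives the static, spatial band-pass behavior; the retina model in the paper adds the temporal dynamics that a fixed DOG filter cannot capture.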

  13. Population density equations for stochastic processes with memory kernels

    NASA Astrophysics Data System (ADS)

    Lai, Yi Ming; de Kamps, Marc

    2017-06-01

    We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism with a generalization, the generalized Montroll-Weiss equation, a recent result from random network theory that describes a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky- and quadratic-integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to accurately model the jump responses of both neuron models to both excitatory and inhibitory input, under the assumption that all inputs are generated by one renewal process.
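    The gamma-distributed interspike intervals used as input in the paper's experiments can be generated as a renewal process; the rates and shape parameters below are illustrative only.

```python
import numpy as np

# Renewal spike train with gamma-distributed interspike intervals.
# shape=1 recovers the Poisson (Markov) case; shape>1 is more regular
# and makes the input a non-Markovian renewal process.
rng = np.random.default_rng(0)

def gamma_spike_train(rate_hz, shape, t_max):
    # mean ISI = 1/rate  ->  scale = 1/(rate*shape)
    isis = rng.gamma(shape, 1.0 / (rate_hz * shape),
                     size=int(rate_hz * t_max * 2))
    times = np.cumsum(isis)
    return times[times < t_max]

poisson_like = gamma_spike_train(100.0, 1.0, 10.0)
regular = gamma_spike_train(100.0, 5.0, 10.0)

# Both trains have ~100 Hz mean rate, but the gamma(5) train has a much
# lower coefficient of variation of its ISIs (CV = 1/sqrt(shape)).
for t in (poisson_like, regular):
    isi = np.diff(t)
    print(len(t) / 10.0, isi.std() / isi.mean())
```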

  14. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    The literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes that extract, transform, and load (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH, and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to identify requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing techniques, and was evaluated by exemplarily modeling an existing ETL process and through a second expert survey. Nine experts participated in the first survey; the literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH, and secondary data users.

  15. Mathematical model of the loan portfolio dynamics in the form of Markov chain considering the process of new customers attraction

    NASA Astrophysics Data System (ADS)

    Bozhalkina, Yana

    2017-12-01

    A mathematical model of loan portfolio structure change in the form of a Markov chain is explored. The model considers in one scheme the process of customer attraction, customer selection based on credit score, and loan repayment. It describes the dynamics of the structure and volume of the loan portfolio, which allows medium-term forecasts of profitability and risk. Within the model, corrective actions by bank management to increase lending volumes or to reduce risk are formalized.
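    A minimal sketch of such a loan-portfolio Markov chain with a new-customer inflow term; the states and transition probabilities below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Illustrative (made-up) monthly transition matrix over loan states:
# 0=current, 1=30-days late, 2=90+ days late, 3=repaid, 4=default.
# Repaid and default are absorbing states.
P = np.array([
    [0.90, 0.05, 0.00, 0.05, 0.00],
    [0.40, 0.40, 0.15, 0.03, 0.02],
    [0.05, 0.15, 0.55, 0.05, 0.20],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

x = np.array([1000.0, 0, 0, 0, 0])  # portfolio starts with 1000 current loans
new_customers = 50.0                 # attracted and approved per month

for month in range(24):
    x = x @ P
    x[0] += new_customers            # inflow of newly issued loans

print(x.round(1))                    # portfolio structure after two years
print(x.sum())                       # 1000 + 24*50 loans accounted for
```

Management actions of the kind the paper formalizes would enter as changes to the inflow term (lending volume) or to the transition probabilities (credit-score cutoffs, collections policy).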

  16. Improvement of radiology services based on the process management approach.

    PubMed

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  17. Physical Modeling of Contact Processes on the Cutting Tools Surfaces of STM When Turning

    NASA Astrophysics Data System (ADS)

    Belozerov, V. A.; Uteshev, M. H.

    2016-08-01

    This article describes the creation of an optimization model of the fine turning of superalloys and steels with STM cutting tools on CNC machines, flexible manufacturing units (GPM), and machining centers. The optimization model makes it possible to link the contact processes on the front and back surfaces of the STM tool simultaneously, and thus to manage the contact processes and the dynamic strength of the cutting tool tip. The optimization model for managing the dynamic strength of STM cutters during fine turning is based on a previously developed thermomechanical (physical, heat) model, which enables a systematic thermomechanical approach to choosing STM grades (domestic and foreign) for cutting tools intended for the fine turning of heat-resistant alloys and steels.

  18. Complex Networks in Psychological Models

    NASA Astrophysics Data System (ADS)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing neural-network models to describe mechanisms associated with mental processes in terms of a neurocomputational substrate. These models are examples of real-world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain the development of cortical map structure and the dynamics of memory access, and to unify different mental processes into a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns in the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through normal to neurotic behavior, and creativity.

  19. User's guide to the western spruce budworm modeling system

    Treesearch

    Nicholas L. Crookston; J. J. Colbert; Paul W. Thomas; Katharine A. Sheehan; William P. Kemp

    1990-01-01

    The Budworm Modeling System is a set of four computer programs: The Budworm Dynamics Model, the Prognosis-Budworm Dynamics Model, the Prognosis-Budworm Damage Model, and the Parallel Processing-Budworm Dynamics Model. Input to the first three programs and the output produced are described in this guide. A guide to the fourth program will be published separately....

  20. Educational Process Reflection (EPR): An Evaluation of a Model for Professional Development Concerning Social Interaction and Educational Climate in the Swedish Preschool

    ERIC Educational Resources Information Center

    Bygdeson-Larsson, Kerstin

    2006-01-01

    Educational process reflection (EPR) is a professional development model aimed at supporting preschool teachers reflecting on and changing their practice. A particular focus is on interaction between practitioners and children, and between the children themselves. In this article, I first describe the theoretical frameworks that helped shape EPR,…

  1. One Factor or Two Parallel Processes? Comorbidity and Development of Adolescent Anxiety and Depressive Disorder Symptoms

    ERIC Educational Resources Information Center

    Hale, William W., III; Raaijmakers, Quinten A. W.; Muris, Peter; van Hoof, Anne; Meeus, Wim H. J.

    2009-01-01

    Background: This study investigates whether anxiety and depressive disorder symptoms of adolescents from the general community are best described by a model that assumes they are indicative of one general factor or by a model that assumes they are two distinct disorders with parallel growth processes. Additional analyses were conducted to explore…

  2. A Comprehensive Participative Planning Model for Small Liberal Arts Colleges: Morrison, Renfro, and Boucher Meet Madan Capoor. AIR 1992 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Popovics, Alexander J.; Jonas, Peter M.

    This paper describes the use of a comprehensive participative planning model for colleges and universities that includes processes of environmental scanning, proposed by J. Morrison and others, and key elements of the Objective-Based Assessment, Planning, and Resource Allocation System (OAPRAS) proposed by M. Capoor. The process is explained…

  3. The application of a unique flow modeling technique to complex combustion systems

    NASA Astrophysics Data System (ADS)

    Waslo, J.; Hasegawa, T.; Hilt, M. B.

    1986-06-01

    This paper describes the application of a unique three-dimensional water flow modeling technique to the study of complex fluid flow patterns within an advanced gas turbine combustor. The visualization technique uses light scattering, coupled with real-time image processing, to determine flow fields. Additional image processing is used to make concentration measurements within the combustor.

  4. Use of a Process Analysis Tool for Diagnostic Study on Fine Particulate Matter Predictions in the U.S.-Part II: Analysis and Sensitivity Simulations

    EPA Science Inventory

    Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...

  5. Predicting indoor pollutant concentrations, and applications to air quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenzetti, David M.

    Because most people spend more than 90% of their time indoors, predicting exposure to airborne pollutants requires models that incorporate the effect of buildings. Buildings affect the exposure of their occupants in a number of ways, both by design (for example, filters in ventilation systems remove particles) and incidentally (for example, sorption on walls can reduce peak concentrations, but prolong exposure to semivolatile organic compounds). Furthermore, building materials and occupant activities can generate pollutants. Indoor air quality depends not only on outdoor air quality, but also on the design, maintenance, and use of the building. For example, "sick building" symptoms such as respiratory problems and headaches have been related to the presence of air-conditioning systems, to carpeting, to low ventilation rates, and to high occupant density (1). The physical processes of interest apply even in simple structures such as homes. Indoor air quality models simulate the processes, such as ventilation and filtration, that control pollutant concentrations in a building. Section 2 describes the modeling approach, and the important transport processes in buildings. Because advection usually dominates among the transport processes, Sections 3 and 4 describe methods for predicting airflows. The concluding section summarizes the application of these models.
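    The well-mixed single-zone mass balance underlying the simplest models of this kind can be sketched in a few lines; all parameter values below are assumed for illustration.

```python
# Minimal well-mixed single-zone indoor air quality mass balance:
#   V dC/dt = Q*(1-eta)*C_out - Q*C + S - k_d*V*C
# (ventilation supply through a filter, exhaust, indoor source,
# first-order deposition). All parameter values are illustrative.
V = 250.0      # zone volume [m^3]
Q = 125.0      # ventilation airflow [m^3/h] (0.5 air changes per hour)
eta = 0.6      # filter removal efficiency on incoming air
C_out = 20.0   # outdoor concentration [ug/m^3]
S = 100.0      # indoor source strength [ug/h]
k_d = 0.2      # first-order deposition rate [1/h]

C, dt = 0.0, 0.001
for _ in range(int(48 / dt)):  # forward-Euler integration over 48 hours
    dCdt = (Q * (1 - eta) * C_out - Q * C + S - k_d * V * C) / V
    C += dCdt * dt

# Analytic steady state for comparison:
C_ss = (Q * (1 - eta) * C_out + S) / (Q + k_d * V)
print(C, C_ss)  # after 48 h the zone is essentially at steady state
```

Multizone models extend this by coupling many such balances through interzonal airflows, which is why the airflow prediction methods of Sections 3 and 4 matter.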

  6. Analytical results for a stochastic model of gene expression with arbitrary partitioning of proteins

    NASA Astrophysics Data System (ADS)

    Tschirhart, Hugo; Platini, Thierry

    2018-05-01

    In biophysics, the search for analytical solutions of stochastic models of cellular processes is often a challenging task. In recent work on models of gene expression, it was shown that a mapping based on partitioning of Poisson arrivals (PPA-mapping) can lead to exact solutions for previously unsolved problems. While the approach can be used in general when the model involves Poisson processes corresponding to creation or degradation, current applications of the method and new results derived using it have been limited to date. In this paper, we present the exact solution of a variation of the two-stage model of gene expression (with time dependent transition rates) describing the arbitrary partitioning of proteins. The methodology proposed makes full use of the PPA-mapping by transforming the original problem into a new process describing the evolution of three biological switches. Based on a succession of transformations, the method leads to a hierarchy of reduced models. We give an integral expression of the time dependent generating function as well as explicit results for the mean, variance, and correlation function. Finally, we discuss how results for time dependent parameters can be extended to the three-stage model and used to make inferences about models with parameter fluctuations induced by hidden stochastic variables.
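    The two-stage model of gene expression the paper builds on can be simulated directly with a Gillespie algorithm (here with constant rates, unlike the paper's time-dependent case); the rate constants are illustrative.

```python
import random

# Gillespie simulation of the classic two-stage gene expression model:
# mRNA is created/degraded; proteins are translated from mRNA and
# degraded. Rate constants are illustrative only.
random.seed(1)
k_m, g_m = 2.0, 1.0    # mRNA birth / degradation rates
k_p, g_p = 10.0, 0.1   # translation rate per mRNA / protein degradation

def gillespie(t_max):
    t, m, p = 0.0, 0, 0
    while True:
        rates = [k_m, g_m * m, k_p * m, g_p * p]
        total = sum(rates)
        t += random.expovariate(total)  # time to next reaction
        if t > t_max:
            return m, p
        r = random.uniform(0, total)    # pick which reaction fires
        if r < rates[0]:
            m += 1
        elif r < rates[0] + rates[1]:
            m -= 1
        elif r < rates[0] + rates[1] + rates[2]:
            p += 1
        else:
            p -= 1

# Theory: <m> = k_m/g_m = 2 and <p> = <m>*k_p/g_p = 200 at stationarity.
samples = [gillespie(100.0) for _ in range(20)]
mean_p = sum(p for _, p in samples) / len(samples)
print(mean_p)
```

The PPA-mapping in the paper trades this kind of brute-force sampling for exact generating-function results, including for time-dependent rates.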

  7. Neurolinguistically constrained simulation of sentence comprehension: integrating artificial intelligence and brain theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigley, H.M.

    1982-01-01

    An artificial intelligence approach to the simulation of neurolinguistically constrained processes in sentence comprehension is developed using control strategies for simulation of cooperative computation in associative networks. The desirability of this control strategy in contrast to ATN and production system strategies is explained. A first pass implementation of HOPE, an artificial intelligence simulation model of sentence comprehension, constrained by studies of aphasic performance, psycholinguistics, neurolinguistics, and linguistic theory, is described. Claims that the model could serve as a basis for sentence production simulation and for a model of language acquisition as associative learning are discussed. HOPE is a model that performs in a normal state and includes a lesion simulation facility. HOPE is also a research tool. Its modifiability and use as a tool to investigate hypothesized causes of degradation in comprehension performance by aphasic patients are described. Issues of using behavioral constraints in modelling and obtaining appropriate data for simulated process modelling are discussed. Finally, problems of validation of the simulation results are raised, and issues of how to interpret clinical results to define the evolution of the model are discussed. Conclusions with respect to the feasibility of artificial intelligence simulation process modelling are discussed based on the current state of research.

  8. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422
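    As a small example of one reviewed formalism, the sketch below exhaustively finds the attractors of a toy synchronous Boolean network; the three-gene wiring is hypothetical.

```python
from itertools import product

# Tiny synchronous Boolean network, one of the formalisms reviewed.
# Toy 3-gene circuit (hypothetical wiring):
#   A' = not C ;  B' = A ;  C' = A and B
def step(state):
    a, b, c = state
    return (not c, a, a and b)

# Exhaustively find attractors by iterating from every initial state.
attractors = set()
for init in product([False, True], repeat=3):
    seen, s = [], init
    while s not in seen:
        seen.append(s)
        s = step(s)
    cycle = tuple(seen[seen.index(s):])  # the periodic part of the orbit
    # Canonicalize the cycle by its lexicographically smallest rotation,
    # so the same attractor is counted once regardless of entry point.
    attractors.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))

print(attractors)
```

For this wiring every initial state falls into a single limit cycle; richer formalisms in the review (Petri nets, ODEs, rule-based models) trade this exhaustive enumerability for quantitative detail.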

  9. Issues in knowledge representation to support maintainability: A case study in scientific data preparation

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Kandt, R. Kirk; Roden, Joseph; Burleigh, Scott; King, Todd; Joy, Steve

    1992-01-01

    Scientific data preparation is the process of extracting usable scientific data from raw instrument data. This task involves noise detection (and subsequent noise classification and flagging or removal), extracting data from compressed forms, and construction of derivative or aggregate data (e.g. spectral densities or running averages). A software system called PIPE provides intelligent assistance to users developing scientific data preparation plans using a programming language called Master Plumber. PIPE provides this assistance capability by using a process description to create a dependency model of the scientific data preparation plan. This dependency model can then be used to verify syntactic and semantic constraints on processing steps to perform limited plan validation. PIPE also provides capabilities for using this model to assist in debugging faulty data preparation plans. In this case, the process model is used to focus the developer's attention upon those processing steps and data elements that were used in computing the faulty output values. Finally, the dependency model of a plan can be used to perform plan optimization and runtime estimation. These capabilities allow scientists to spend less time developing data preparation procedures and more time on scientific analysis tasks. Because the scientific data processing modules (called fittings) evolve to match scientists' needs, issues regarding maintainability are of prime importance in PIPE. This paper describes the PIPE system and describes how issues in maintainability affected the knowledge representation used in PIPE to capture knowledge about the behavior of fittings.

  10. A review of the integrate-and-fire neuron model: II. Inhomogeneous synaptic input and network properties.

    PubMed

    Burkitt, A N

    2006-08-01

    The integrate-and-fire neuron model describes the state of a neuron in terms of its membrane potential, which is determined by the synaptic inputs and the injected current that the neuron receives. When the membrane potential reaches a threshold, an action potential (spike) is generated. This review considers the model in which the synaptic input varies periodically and is described by an inhomogeneous Poisson process, with both current and conductance synapses. The focus is on the mathematical methods that allow the output spike distribution to be analyzed, including first passage time methods and the Fokker-Planck equation. Recent interest in the response of neurons to periodic input has in part arisen from the study of stochastic resonance, which is the noise-induced enhancement of the signal-to-noise ratio. Networks of integrate-and-fire neurons behave in a wide variety of ways and have been used to model a variety of neural, physiological, and psychological phenomena. The properties of the integrate-and-fire neuron model with synaptic input described as a temporally homogeneous Poisson process are reviewed in an accompanying paper (Burkitt in Biol Cybern, 2006).
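    A minimal simulation of the model under discussion, a leaky integrate-and-fire neuron driven by homogeneous Poisson input (the setting of the accompanying paper); parameter values are illustrative.

```python
import random

# Leaky integrate-and-fire neuron with Poisson synaptic input:
#   tau dV/dt = -V, plus a jump of size w per input spike;
#   emit a spike and reset to 0 when V crosses threshold.
# Parameter values are illustrative only.
random.seed(42)
tau, V_th, w = 0.020, 1.0, 0.05  # 20 ms membrane; 20 inputs to threshold
rate_in = 2000.0                  # total input rate [spikes/s]
dt, t_max = 1e-4, 10.0

V, out_spikes = 0.0, 0
for _ in range(int(t_max / dt)):
    V += -V / tau * dt                  # passive leak
    if random.random() < rate_in * dt:  # Poisson arrival in this time bin
        V += w
    if V >= V_th:
        V = 0.0
        out_spikes += 1

print(out_spikes / t_max, "output spikes/s")
```

The analytic first-passage-time and Fokker-Planck methods reviewed in the paper give this output rate (and the full interspike-interval distribution) without simulation.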

  11. Interactions of social, terrestrial, and marine sub-systems in the Galapagos Islands, Ecuador.

    PubMed

    Walsh, Stephen J; Mena, Carlos F

    2016-12-20

    Galapagos is often cited as an example of the conflicts that are emerging between resource conservation and economic development in island ecosystems, as the pressures associated with tourism threaten nature, including the iconic and emblematic species, unique terrestrial landscapes, and special marine environments. In this paper, two projects are described that rely upon dynamic systems models and agent-based models to examine human-environment interactions. We use a theoretical context rooted in complexity theory to guide the development of our models that are linked to social-ecological dynamics. The goal of this paper is to describe key elements, relationships, and processes to inform and enhance our understanding of human-environment interactions in the Galapagos Islands of Ecuador. By formalizing our knowledge of how systems operate and the manner in which key elements are linked in coupled human-natural systems, we specify rules, relationships, and rates of exchange between social and ecological features derived through statistical functions and/or functions specified in theory or practice. The processes described in our models also have practical applications in that they emphasize how political policies generate different human responses and model outcomes, many detrimental to the social-ecological sustainability of the Galapagos Islands.

  12. Interactions of social, terrestrial, and marine sub-systems in the Galapagos Islands, Ecuador

    PubMed Central

    Walsh, Stephen J.; Mena, Carlos F.

    2016-01-01

    Galapagos is often cited as an example of the conflicts that are emerging between resource conservation and economic development in island ecosystems, as the pressures associated with tourism threaten nature, including the iconic and emblematic species, unique terrestrial landscapes, and special marine environments. In this paper, two projects are described that rely upon dynamic systems models and agent-based models to examine human–environment interactions. We use a theoretical context rooted in complexity theory to guide the development of our models that are linked to social–ecological dynamics. The goal of this paper is to describe key elements, relationships, and processes to inform and enhance our understanding of human–environment interactions in the Galapagos Islands of Ecuador. By formalizing our knowledge of how systems operate and the manner in which key elements are linked in coupled human–natural systems, we specify rules, relationships, and rates of exchange between social and ecological features derived through statistical functions and/or functions specified in theory or practice. The processes described in our models also have practical applications in that they emphasize how political policies generate different human responses and model outcomes, many detrimental to the social–ecological sustainability of the Galapagos Islands. PMID:27791072
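
    The dynamic-systems side of such coupled human-natural models can be caricatured as follows. The equations, rates, and parameters below are invented for the sketch and are not taken from the projects described: an ecological resource grows logistically while an exogenously growing tourism sector harvests it.

```python
def simulate_coupled(years=50, r=0.3, K=1.0, alpha=0.02, g=0.05):
    """Minimal dynamic-systems sketch of a coupled human-natural system:
    a logistically growing ecological resource (carrying capacity K,
    growth rate r) is harvested in proportion to tourism pressure,
    which itself grows at rate g. All parameters are illustrative."""
    resource, tourism = 1.0, 0.1
    series = []
    for _ in range(years):
        harvest = alpha * tourism * resource       # rate of exchange between
        resource += r * resource * (1.0 - resource / K) - harvest
        tourism += g * tourism                     # exogenous tourism growth
        series.append((resource, tourism))
    return series
```

    Agent-based versions of such models replace the aggregate tourism variable with individual agents following rules, but the social-ecological coupling has the same shape.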

  13. Modeling of InP metalorganic chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Black, Linda R.; Clark, Ivan O.; Kui, J.; Jesser, William A.

    1991-01-01

    The growth of InP by metalorganic chemical vapor deposition (MOCVD) in a horizontal reactor is being modeled with a commercially available computational fluid dynamics modeling code. The mathematical treatment of the MOCVD process has four primary areas of concern: 1) transport phenomena, 2) chemistry, 3) boundary conditions, and 4) numerical solution methods. The transport processes involved in CVD are described by conservation of total mass, momentum, energy, and atomic species. Momentum conservation is described by a generalized form of the Navier-Stokes equation for a Newtonian fluid and laminar flow. The effect of Soret diffusion on the transport of particular chemical species and on the predicted deposition rate is examined. Both gas-phase and surface chemical reactions are employed in the model. Boundary conditions are specified at the inlet and walls of the reactor for temperature, fluid flow and chemical species. The coupled set of equations described above is solved by a finite difference method over a nonuniform rectilinear grid in both two and three dimensions. The results of the 2-D computational model are presented for gravity levels of zero- and one-g. The predicted growth rates at one-g are compared to measured growth rates on fused silica substrates.
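
    The species-transport part of such a model can be illustrated with a heavily simplified sketch: a one-dimensional explicit finite-difference scheme for a single advected and diffusing species, with a fixed inlet concentration and an absorbing boundary standing in for the depositing surface. All values are illustrative; the actual model solves the fully coupled multidimensional equations.

```python
def advect_diffuse_1d(c0, u, D, dx, dt, steps):
    """Explicit finite-difference update for 1-D advection-diffusion,
    dc/dt = -u*dc/dx + D*d2c/dx2, with a fixed inlet concentration and
    an absorbing (c = 0) outlet standing in for a depositing surface.
    Stable when u*dt/dx + 2*D*dt/dx**2 <= 1."""
    c = list(c0)
    n = len(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -u * (c[i] - c[i - 1]) / dx              # upwind convection
            dif = D * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dx ** 2
            new[i] = c[i] + dt * (adv + dif)
        new[0], new[-1] = c[0], 0.0        # inlet fixed, surface absorbs
        c = new
    return c
```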

  14. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for the preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions, (2) engineering synthesis, and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and the associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented, and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here to the evaluation of six different design concepts for a wing spar.
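
    The constrained-selection core of such an optimization step can be caricatured in a few lines. The candidate tuples and limits below are hypothetical stand-ins for the report's manufacturability, cost, and performance evaluations, not values from TASPI.

```python
def select_design(candidates, cost_limit, stress_limit):
    """Toy version of the framework's optimization step: choose the
    minimum-weight design subject to cost and performance constraints.
    Each candidate is a (name, weight, cost, stress) tuple produced by
    the (here hypothetical) manufacturability, cost, and performance
    evaluations."""
    feasible = [c for c in candidates
                if c[2] <= cost_limit and c[3] <= stress_limit]
    if not feasible:
        return None                            # no design meets the constraints
    return min(feasible, key=lambda c: c[1])   # minimize weight
```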

  15. Learning to read aloud: A neural network approach using sparse distributed memory

    NASA Technical Reports Server (NTRS)

    Joglekar, Umesh Dwarkanath

    1989-01-01

    An attempt to solve a text-to-phoneme mapping problem is described; the problem does not appear amenable to solution by standard algorithmic procedures. Experiments based on a model of distributed processing, sparse distributed memory (SDM), are also described; this model can be used in an iterative supervised learning mode to solve the problem. Additional improvements aimed at obtaining better performance are suggested.
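
    A minimal sketch of a sparse distributed memory follows. The sizes, activation radius, and autoassociative usage are illustrative choices for the sketch, not the paper's configuration: data written at an address is distributed over every hard location within a Hamming radius, and reading takes a majority vote over the same set.

```python
import random

def make_sdm(n_bits=64, n_hard=500, radius=26, seed=0):
    """Minimal sparse distributed memory (SDM) with random binary hard
    locations and bipolar counters; returns (write, read) closures."""
    rng = random.Random(seed)
    hard = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_hard)]
    counters = [[0] * n_bits for _ in range(n_hard)]

    def activated(addr):
        # indices of hard locations within the Hamming radius of addr
        return [i for i, h in enumerate(hard)
                if sum(a != b for a, b in zip(addr, h)) <= radius]

    def write(addr, data):
        for i in activated(addr):
            for j, bit in enumerate(data):
                counters[i][j] += 1 if bit else -1   # bipolar counter update

    def read(addr):
        sums = [0] * n_bits
        for i in activated(addr):
            for j in range(n_bits):
                sums[j] += counters[i][j]
        return [1 if s > 0 else 0 for s in sums]     # majority vote per bit

    return write, read
```

    Because the write is spread over many locations, a read from a slightly corrupted address still recovers the stored pattern, which is what makes the iterative supervised learning mode workable.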

  16. Effect of electrolyte nature on kinetics of remazol yellow G removal by electrocoagulation

    NASA Astrophysics Data System (ADS)

    Rajabi, M.; Bagheri-Roochi, M.; Asghari, A.

    2011-10-01

    The present study describes an electrocoagulation process for the removal of remazol yellow G from dye solutions using iron as the anode and steel as the cathode. Pseudo-first-order, pseudo-second-order, and intraparticle diffusion models were used to analyze the kinetic data obtained at different concentrations under different conditions. The adsorption kinetics was well described by the pseudo-second-order kinetic model.
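
    The pseudo-second-order model referred to above, q_t = k2*qe^2*t / (1 + k2*qe*t), is commonly fitted through its linearized form t/q_t = 1/(k2*qe^2) + t/qe. A sketch of that fit (generic textbook procedure, not the study's exact analysis):

```python
def fit_pseudo_second_order(t, qt):
    """Fit the linearized pseudo-second-order kinetic model
    t/q_t = 1/(k2*qe^2) + t/qe by ordinary least squares on (t, t/qt).
    Returns (qe, k2): equilibrium uptake and the rate constant."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    sx, sy = sum(t), sum(y)
    sxx = sum(xi * xi for xi in t)
    sxy = sum(xi * yi for xi, yi in zip(t, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope = 1/qe
    intercept = (sy - slope * sx) / n                   # intercept = 1/(k2*qe^2)
    qe = 1.0 / slope
    k2 = slope * slope / intercept
    return qe, k2
```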

  17. Automated method for the systematic interpretation of resonance peaks in spectrum data

    DOEpatents

    Damiano, B.; Wood, R.T.

    1997-04-22

    A method is described for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e., measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system. 1 fig.
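
    The workflow can be caricatured in a few lines. The "resonator" model and the single linear unit below are illustrative stand-ins for the patent's mathematical model and neural network: model-generated (spectral feature, input parameter) pairs train a unit that inverts the mapping.

```python
import random

def train_on_model_data(n_samples=200, epochs=300, lr=0.05, seed=3):
    """Sketch of the patented workflow: a mathematical model generates
    (spectral feature, model input parameter) training pairs, then a
    trainable unit learns to recover the parameter from the feature.
    Here the 'model' is a resonator whose squared peak frequency equals
    a stiffness parameter k, and a single linear unit trained by
    gradient descent stands in for the neural network."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_samples):
        k = rng.uniform(1.0, 4.0)       # model input parameter (condition)
        peak_f = k ** 0.5               # measurable phenomenon: resonance peak
        data.append((peak_f ** 2, k))   # spectral feature -> parameter pair
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, k in data:
            err = (w * x + b) - k       # prediction error on this pair
            w -= lr * err * x           # gradient-descent updates
            b -= lr * err
    return lambda x: w * x + b          # trained feature -> condition map
```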

  18. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans.

    PubMed

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2006-10-01

    This letter describes a data acquisition setup for recording, and processing, running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and in obtaining a good signal-to-noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber optical microphones and a noise-canceling filter. Two noise cancellation methods are described including a novel approach using a pulse sequence specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given.

  19. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans (L)

    PubMed Central

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2007-01-01

    This letter describes a data acquisition setup for recording, and processing, running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and in obtaining a good signal-to-noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber optical microphones and a noise-canceling filter. Two noise cancellation methods are described including a novel approach using a pulse sequence specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given. PMID:17069275
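
    A generic adaptive noise canceller in the LMS style illustrates the idea of removing noise predictable from a correlated reference channel. This is a textbook sketch under that assumption, not the authors' pulse-sequence-specific method.

```python
def lms_cancel(primary, reference, n_taps=4, mu=0.05):
    """Adaptive noise cancellation: an LMS filter predicts the noise
    component of the `primary` (speech + scanner noise) channel from a
    correlated `reference` channel; the residual e is the cleaned
    signal. Generic textbook LMS, with illustrative tap count and step."""
    w = [0.0] * n_taps
    cleaned = []
    for n in range(len(primary)):
        # most recent n_taps reference samples, zero-padded at the start
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))     # noise estimate
        e = primary[n] - y                           # residual = signal estimate
        w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, x)]
        cleaned.append(e)
    return cleaned
```

    The letter's pulse-sequence-specific approach goes further by exploiting that the gradient noise is known in advance from the sequence, rather than only estimated adaptively.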

  20. Identification of the states of the processes at liquid cathodes under potentiostatic conditions using semantic diagram models

    NASA Astrophysics Data System (ADS)

    Smirnov, G. B.; Markina, S. E.; Tomashevich, V. G.

    2012-08-01

    A technique is described for constructing semantic diagram models of the electrolysis at a liquid cathode in a salt halide melt under potentiostatic conditions that are intended for identifying the static states of this system that correspond to certain combinations of the electrode processes or the processes occurring in the volumes of salt and liquid-metal phases. Examples are given for the discharge of univalent and polyvalent metals.
