Sample records for discrete event process

  1. A Summary of Some Discrete-Event System Control Problems

    NASA Astrophysics Data System (ADS)

    Rudie, Karen

    A summary of the area of control of discrete-event systems is given. In this research area, automata and formal language theory are used as tools to model physical problems that arise in technological and industrial systems. The key ingredients of discrete-event control problems are a process that can be modeled by an automaton, events in that process that cannot be disabled or prevented from occurring, and a controlling agent that manipulates the events that can be disabled to guarantee that the process under control generates either all the strings in some prescribed language or as many strings as possible in some prescribed language. When multiple controlling agents act on a process, decentralized control problems arise. In decentralized discrete-event systems, it is presumed that the agents effecting control cannot each see all event occurrences. Partial observation leads to some problems that cannot be solved in polynomial time and some others that are not even decidable.
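    The ingredients listed above can be made concrete in a small sketch: a toy plant automaton, a partition of events into controllable and uncontrollable, and a supervisor that disables controllable events leading out of a legal state set. The plant, event names, and specification below are invented for illustration and are not taken from the record.

```python
# Illustrative sketch of supervisory control; the plant, events, and
# legal-state specification are invented, not taken from the record.

CONTROLLABLE = {"load", "start", "dump"}   # events a supervisor may disable
UNCONTROLLABLE = {"break"}                 # events that can never be disabled

# Toy plant automaton: state -> {event: next_state}
PLANT = {
    "idle":   {"load": "loaded"},
    "loaded": {"start": "busy", "dump": "down", "break": "down"},
    "busy":   {"break": "down"},
    "down":   {},
}

LEGAL_STATES = {"idle", "loaded", "busy"}  # specification: avoid "down"

def enabled_events(state):
    """Enable an event unless it is controllable and would drive the
    plant out of the legal state set; uncontrollable events are always
    enabled when the plant can execute them."""
    return {ev for ev, nxt in PLANT[state].items()
            if ev in UNCONTROLLABLE or nxt in LEGAL_STATES}
```

    Note that the supervisor disables "dump" (controllable, leading to the illegal state) but cannot disable "break", mirroring the controllability constraint described above.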

  2. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each process; and programming each agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
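    The claimed message loop can be sketched roughly as follows; the ProcessAgent class and its responses are hypothetical stand-ins for the three claimed message types (clock tick, resources received, request for output production), not the patented implementation.

```python
# Hypothetical sketch of the message loop from the claim; the agent
# class and event handling are invented approximations.

class ProcessAgent:
    """One distributed agent associated with a manufacturing process."""
    def __init__(self, name):
        self.name = name
        self.resources = 0   # inputs on hand
        self.produced = 0    # outputs completed

    def handle(self, event):
        """Each discrete event triggers a programmed response."""
        if event == "clock_tick":
            pass                      # advance any internal schedule
        elif event == "resources_received":
            self.resources += 1
        elif event == "request_output" and self.resources > 0:
            self.resources -= 1
            self.produced += 1

agents = [ProcessAgent(f"proc{i}") for i in range(3)]
# Message loop: each discrete event is transmitted to every agent.
for event in ["resources_received", "clock_tick", "request_output"]:
    for agent in agents:
        agent.handle(event)
```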

  3. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

    Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation in progress. Combines best of optimistic and conservative synchronization strategies while avoiding major disadvantages. Well suited for modeling communication networks, for large-scale war games, for simulated flights of aircraft, for simulations of computer equipment, for mathematical modeling, for interactive engineering simulations, and for depictions of flows of information.
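    A much-simplified sketch of the idea behind Breathing Time Buckets: events are processed optimistically in time order, but only those earlier than the event horizon (the earliest timestamp of any newly generated event) are committed in a cycle. SPEEDES itself runs this across parallel nodes; the function below is a single-cycle, single-process illustration with invented data.

```python
import heapq

def breathing_time_bucket(pending, generate):
    """One simplified synchronization cycle: process pending events
    optimistically in time order, but commit only those earlier than the
    event horizon -- the earliest timestamp among newly generated events.
    `generate(t)` returns timestamps of events spawned by an event at t."""
    heapq.heapify(pending)
    horizon = float("inf")
    committed, spawned = [], []
    while pending and pending[0] < horizon:
        t = heapq.heappop(pending)
        committed.append(t)
        for nt in generate(t):
            horizon = min(horizon, nt)
            spawned.append(nt)
    for nt in spawned:                 # carry new events into the next cycle
        heapq.heappush(pending, nt)
    return committed, pending

# Toy run: each event at time t spawns one event at t + 3.
cycle_committed, cycle_pending = breathing_time_bucket([1, 2, 5], lambda t: [t + 3])
```

    In the toy run, events at times 1 and 2 commit, while the event at 5 waits for the next cycle because the horizon (4) has "breathed in" ahead of it.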

  4. An algebra of discrete event processes

    NASA Technical Reports Server (NTRS)

    Heymann, Michael; Meyer, George

    1991-01-01

    This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.

  5. A network of discrete events for the representation and analysis of diffusion dynamics.

    PubMed

    Pintus, Alberto M; Pazzona, Federico G; Demontis, Pierfranco; Suffritti, Giuseppe B

    2015-11-14

    We developed a coarse-grained description of the phenomenology of diffusive processes, in terms of a space of discrete events and its representation as a network. Once a proper classification of the discrete events underlying the diffusive process is carried out, their transition matrix is calculated on the basis of molecular dynamics data. This matrix can be represented as a directed, weighted network where nodes represent discrete events, and the weight of edges is given by the probability that one follows the other. The structure of this network reflects dynamical properties of the process of interest in such features as its modularity and the entropy rate of nodes. As an example of the applicability of this conceptual framework, we discuss here the physics of diffusion of small non-polar molecules in a microporous material, in terms of the structure of the corresponding network of events, and explain on this basis the diffusivity trends observed. A quantitative account of these trends is obtained by considering the contribution of the various events to the displacement autocorrelation function.
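    The transition-matrix construction can be sketched in a few lines: count which event follows which in an observed sequence and normalize each row into probabilities, giving the weighted directed network described above. The toy event sequence below stands in for events classified from molecular dynamics data.

```python
from collections import Counter, defaultdict

def event_transition_matrix(sequence):
    """Estimate P(next event | current event) from an observed sequence,
    yielding the edge weights of the directed network of events."""
    counts = defaultdict(Counter)
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Toy sequence standing in for events classified from MD trajectories.
P = event_transition_matrix(["hop", "stay", "hop", "hop", "stay", "hop"])
```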

  6. Discrete event simulation and the resultant data storage system response in the operational mission environment of Jupiter-Saturn /Voyager/ spacecraft

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1978-01-01

    The Data Storage Subsystem Simulator (DSSSIM) simulating (by ground software) occurrence of discrete events in the Voyager mission is described. Functional requirements for Data Storage Subsystems (DSS) simulation are discussed, and discrete event simulation/DSSSIM processing is covered. Four types of outputs associated with a typical DSSSIM run are presented, and DSSSIM limitations and constraints are outlined.

  7. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.

  8. Non-Lipschitz Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, M.; Meyers, R.

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to aid specific areas of information processing. A main objective is to demonstrate that the mathematical formalism for DED can be based upon the terminal model of Newtonian dynamics, which allows one to relax Lipschitz conditions at some discrete points.

  9. Terminal Dynamics Approach to Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail; Meyers, Ronald

    1995-01-01

    This paper presents and discusses a mathematical formalism for simulation of discrete event dynamics (DED) - a special type of 'man-made' system designed to serve specific purposes of information processing. The main objective of this work is to demonstrate that the mathematical formalism for DED can be based upon a terminal model of Newtonian dynamics which allows one to relax Lipschitz conditions at some discrete points.

  10. Attention and working memory: two basic mechanisms for constructing temporal experiences

    PubMed Central

    Marchetti, Giorgio

    2014-01-01

    Various kinds of observations show that the ability of human beings to both consciously relive past events – episodic memory – and conceive future events entails an active process of construction. This construction process also underpins many other important aspects of conscious human life, such as perceptions, language, and conscious thinking. This article provides an explanation of what makes the constructive process possible and how it works. The process mainly relies on attentional activity, which has a discrete and periodic nature, and working memory, which allows for the combination of discrete attentional operations. An explanation is also provided of how past and future events are constructed. PMID:25177305

  11. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
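    The conventional threshold detector mentioned in the introduction can be sketched as an upward-crossing scan whose event-time resolution is limited by the sampling interval dt; the signal and threshold below are invented, not taken from the record's recordings.

```python
def detect_events(signal, threshold, dt=1.0):
    """Log an event time at each upward crossing of the threshold.
    Event-time resolution is limited by the sampling interval dt."""
    return [i * dt for i in range(1, len(signal))
            if signal[i - 1] < threshold <= signal[i]]

# Invented sample record: two upward crossings of threshold 1.
events = detect_events([0, 2, 0, 3, 1], threshold=1)
```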

  12. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
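    One way to picture the combination is a loop that integrates a continuous system-dynamics variable while discrete events perturb it. The morale variable, the rates, and the event multiplier below are invented for illustration and are not the authors' model.

```python
def run_hybrid(duration, dt, slip_times):
    """Integrate a continuous system-dynamics variable (morale relaxing
    toward 1.0) while discrete events (schedule slips) knock it down.
    All rates and the 0.8 event multiplier are invented."""
    morale, t, log = 1.0, 0.0, []
    events = sorted(slip_times)
    while t < duration:
        while events and events[0] <= t:
            events.pop(0)
            morale *= 0.8                        # discrete event effect
        morale += dt * 0.1 * (1.0 - morale)      # continuous SD update
        t += dt
        log.append(morale)
    return log

# One slip at t = 0, then continuous recovery over two half-unit steps.
log = run_hybrid(1.0, 0.5, [0.0])
```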

  13. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the capability of comparing log files.

  14. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DES) users have long been faced with a three-way trade-off among execution time, model fidelity, and the number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.

  15. A discrete event simulation tool to support and predict hospital and clinic staffing.

    PubMed

    DeRienzo, Christopher M; Shaw, Ryan J; Meanor, Phillip; Lada, Emily; Ferranti, Jeffrey; Tanaka, David

    2017-06-01

    We demonstrate how to develop a simulation tool to help healthcare managers and administrators predict and plan for staffing needs in a hospital neonatal intensive care unit using administrative data. We developed a discrete event simulation model of nursing staff needed in a neonatal intensive care unit and then validated the model against historical data. The process flow was translated into a discrete event simulation model. Results demonstrated that the model can be used to give a respectable estimate of annual admissions, transfers, and deaths based upon two different staffing levels. The discrete event simulation tool model can provide healthcare managers and administrators with (1) a valid method of modeling patient mix, patient acuity, staffing needs, and costs in the present state and (2) a forecast of how changes in a unit's staffing, referral patterns, or patient mix would affect a unit in a future state.
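    A minimal discrete event sketch of the staffing idea: admissions and discharges are the events, census is the state, and required nurses follow a fixed patient-to-nurse ratio. All numbers, including the ratio and lengths of stay, are invented and are not from the validated model.

```python
import heapq

def simulate_nicu(arrival_times, los, ratio=2):
    """Admissions at the given times each stay `los` time units; required
    nurses at any instant is ceil(census / ratio). Returns the peak
    nurse requirement over the run. All numbers are invented."""
    events = [(t, +1) for t in arrival_times] + \
             [(t + los, -1) for t in arrival_times]
    heapq.heapify(events)
    census = peak_nurses = 0
    while events:
        _, delta = heapq.heappop(events)
        census += delta
        peak_nurses = max(peak_nurses, -(-census // ratio))  # ceiling div
    return peak_nurses

# Three overlapping admissions -> census peaks at 3 -> 2 nurses needed.
peak = simulate_nicu([0, 1, 2], los=5)
```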

  16. Disaster Response Modeling Through Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Wang, Jeffrey; Gilmer, Graham

    2012-01-01

    Organizations today are required to plan against a rapidly changing, high-cost environment. This is especially true for first responders to disasters and other incidents, where critical decisions must be made in a timely manner to save lives and resources. Discrete-event simulations enable organizations to make better decisions by visualizing complex processes and the impact of proposed changes before they are implemented. A discrete-event simulation using Simio software has been developed to effectively analyze and quantify the imagery capabilities of domestic aviation resources conducting relief missions. This approach has helped synthesize large amounts of data to better visualize process flows, manage resources, and pinpoint capability gaps and shortfalls in disaster response scenarios. Simulation outputs and results have supported decision makers in the understanding of high risk locations, key resource placement, and the effectiveness of proposed improvements.

  17. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.

  18. Synchronous Parallel System for Emulation and Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.

  19. Requirements analysis for a hardware, discrete-event, simulation engine accelerator

    NASA Astrophysics Data System (ADS)

    Taylor, Paul J., Jr.

    1991-12-01

    An analysis of a general Discrete Event Simulation (DES), executing on the distributed architecture of an eight-node Intel iPSC/2 hypercube, was performed. The most time-consuming portions of the general DES algorithm were determined to be the functions associated with message passing of required simulation data between processing nodes of the hypercube architecture. A behavioral description, using the IEEE standard VHSIC Hardware Description Language (VHDL), for a general DES hardware accelerator is presented. The behavioral description specifies the operational requirements for a DES coprocessor to augment the hypercube's execution of DES simulations. The DES coprocessor design implements the functions necessary to perform distributed discrete event simulations using a conservative time synchronization protocol.

  20. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.

  1. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment

    PubMed Central

    Khan, Md. Ashfaquzzaman; Herbordt, Martin C.

    2011-01-01

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations. PMID:21822327

  2. Parallel Discrete Molecular Dynamics Simulation With Speculation and In-Order Commitment.

    PubMed

    Khan, Md Ashfaquzzaman; Herbordt, Martin C

    2011-07-20

    Discrete molecular dynamics simulation (DMD) uses simplified and discretized models enabling simulations to advance by event rather than by timestep. DMD is an instance of discrete event simulation and so is difficult to scale: even in this multi-core era, all reported DMD codes are serial. In this paper we discuss the inherent difficulties of scaling DMD and present our method of parallelizing DMD through event-based decomposition. Our method is microarchitecture inspired: speculative processing of events exposes parallelism, while in-order commitment ensures correctness. We analyze the potential of this parallelization method for shared-memory multiprocessors. Achieving scalability required extensive experimentation with scheduling and synchronization methods to mitigate serialization. The speed-up achieved for a variety of system sizes and complexities is nearly 6× on an 8-core and over 9× on a 12-core processor. We present and verify analytical models that account for the achieved performance as a function of available concurrency and architectural limitations.

  3. Analysis of Phase-Type Stochastic Petri Nets With Discrete and Continuous Timing

    NASA Technical Reports Server (NTRS)

    Jones, Robert L.; Goode, Plesent W. (Technical Monitor)

    2000-01-01

    The Petri net formalism is useful in studying many discrete-state, discrete-event systems exhibiting concurrency, synchronization, and other complex behavior. As a bipartite graph, the net can conveniently capture salient aspects of the system. As a mathematical tool, the net can specify an analyzable state space. Indeed, one can reason about certain qualitative properties (from state occupancies) and how they arise (the sequence of events leading there). By introducing deterministic or random delays, the model is forced to sojourn in states some amount of time, giving rise to an underlying stochastic process, one that can be specified in a compact way and is capable of providing quantitative, probabilistic measures. We formalize a new non-Markovian extension to the Petri net that captures both discrete and continuous timing in the same model. The approach affords efficient, stationary analysis in most cases and efficient transient analysis under certain restrictions. Moreover, this new formalism has the added benefit of modeling fidelity, stemming from the simultaneous capture of discrete- and continuous-time events (as opposed to capturing only one and approximating the other). We show how the underlying stochastic process, which is non-Markovian, can be resolved into simpler Markovian problems that enjoy efficient solutions. Solution algorithms are provided that can be easily programmed.
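    The basic Petri net mechanics referred to above (token counts in places, transitions enabled by their input markings) can be sketched as a single firing rule; the timing extensions the paper formalizes are omitted, and the places below are invented.

```python
def fire(marking, pre, post):
    """Fire one Petri net transition: enabled iff every input place holds
    enough tokens; firing removes input tokens and deposits output tokens.
    Returns the new marking, or None if the transition is not enabled."""
    if any(marking.get(p, 0) < n for p, n in pre.items()):
        return None
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```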

  4. The use of discrete-event simulation modeling to compare handwritten and electronic prescribing systems.

    PubMed

    Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim

    2013-01-01

    Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.

  5. An Empirical Study of Combining Communicating Processes in a Parallel Discrete Event Simulation

    DTIC Science & Technology

    1990-12-01

    Fragmentary indexed excerpts. Recoverable content: a discussion of the cost/performance criteria which typically make up computer resource acquisition decisions; the hypothesis that the way communicating processes are combined into logical processes has a significant impact on simulation performance; and an event-loop fragment, roughly: while the next-event queue is not empty, pop the next event, set the logical-process clock to the event time, simulate the event (e.g., a departure), consume the event, and enqueue any new events.

  6. Nonlinear Control and Discrete Event Systems

    NASA Technical Reports Server (NTRS)

    Meyer, George; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; at the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.

  7. Continuous and discrete extreme climatic events affecting the dynamics of a high-arctic reindeer population.

    PubMed

    Chan, Kung-Sik; Mysterud, Atle; Øritsland, Nils Are; Severinsen, Torbjørn; Stenseth, Nils Chr

    2005-10-01

    Climate at northern latitudes is currently changing both with regard to the mean and to the temporal variability at any given site, increasing the frequency of extreme events such as cold and warm spells. Here we use a conceptually new modelling approach with two different dynamic terms for the climatic effects on a Svalbard reindeer population (the Brøggerhalvøya population), which underwent an extreme icing event ("locked pastures") with an 80% reduction in population size during one winter (1993/94). One term captures the continuous and linear effect depending upon the Arctic Oscillation, and another the discrete (rare) "event" process. The introduction of an "event" parameter describing the discrete extreme winter resulted in a more parsimonious model. Such an approach may be useful in strongly age-structured ungulate populations, with young and very old individuals being particularly prone to mortality factors during adverse conditions (resulting in a population structure that differs before and after extreme climatic events). A simulation study demonstrates that our approach is able to properly detect the ecological effects of such extreme climate events.
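    The two-term structure - a continuous climate covariate plus a discrete event term - can be sketched as below. The growth rate and covariate coefficient are invented placeholders; only the 80% crash figure is taken from the record.

```python
import math

def project(n0, ao_index, icing_events, r=0.1, b=-0.05, crash=0.8):
    """Yearly projection with a continuous climate term (AO index) and a
    discrete event term: an extreme winter removes `crash` of the herd.
    The 80% crash matches the record; r and b are invented placeholders."""
    n = n0
    for a, iced in zip(ao_index, icing_events):
        n *= math.exp(r + b * a)   # continuous, AO-dependent growth
        if iced:
            n *= (1.0 - crash)     # discrete "locked pastures" event
    return n
```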

  8. Stochastic Adaptive Estimation and Control.

    DTIC Science & Technology

    1994-10-26

    Fragmentary indexed excerpts. Recoverable content: a citation of Marcus, "Language Stability and Stabilizability of Discrete Event Dynamical Systems," SIAM Journal on Control and Optimization, 31, September 1993; work on the hierarchical control of flexible manufacturing systems, in which the model involves a hybrid process in continuous time; and a treatment of the average-cost control problem for discrete-time Markov processes covering finite through Borel state and action spaces.

  9. LISP based simulation generators for modeling complex space processes

    NASA Technical Reports Server (NTRS)

    Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing

    1987-01-01

    The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.

  10. Small-scale plasticity critically needs a new mechanics description

    NASA Astrophysics Data System (ADS)

    Ngan, Alfonso H. W.

    2013-06-01

    Continuum constitutive laws describe the plastic deformation of materials as a smooth, continuously differentiable process. However, provided that the measurement is done with a fine enough resolution, the plastic deformation of real materials is often found to comprise discrete events usually nanometric in size. For bulk-sized specimens, such nanoscale events are minute compared with the specimen size, and so their associated strain changes are negligibly small, and this is why the continuum laws work well. However, when the specimen size is in the micrometer scale or smaller, the strain changes due to the discrete events could be significant, and the continuum description would be highly unsatisfactory. Yet, because of the advent of microtechnology and nanotechnology, small-sized materials will be increasingly used, and so there is a strong need to develop suitable replacement descriptions for plasticity of small materials. As the occurrence of the discrete plastic events is also strongly stochastic, their satisfactory description should also be one of a probabilistic, rather than deterministic, nature.

  11. Timing Processes Are Correlated when Tasks Share a Salient Event

    ERIC Educational Resources Information Center

    Zelaznik, Howard N.; Rosenbaum, David A.

    2010-01-01

    Event timing is manifested when participants make discrete movements such as repeatedly tapping a key. Emergent timing is manifested when participants make continuous movements such as repeatedly drawing a circle. Here we pursued the possibility that providing salient perceptual events to mark the completion of time intervals could allow circle…

  12. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation

    DTIC Science & Technology

    2016-09-01

    Approved for public release; distribution is unlimited. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation, by Timothy A. Curling. The work pairs optimization modeling with discrete-event simulation; this construct can potentially provide an effective means of improving order management decisions. However…

  13. Network Science Research Laboratory (NSRL) Discrete Event Toolkit

    DTIC Science & Technology

    2016-01-01

    ARL-TR-7579, January 2016, US Army Research Laboratory: Network Science Research Laboratory (NSRL) Discrete Event Toolkit, by Theron Trout and Andrew J Toth, Computational and Information Sciences Directorate, ARL.

  14. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    DTIC Science & Technology

    2016-03-24

    AFIT-ENV-MS-16-M-166: Analysis of Inpatient Hospital Staff Mental Workload by Means of Discrete-Event Simulation, by Erich W… Approved for public release; distribution unlimited.

  15. Program For Simulation Of Trajectories And Events

    NASA Technical Reports Server (NTRS)

    Gottlieb, Robert G.

    1992-01-01

    Universal Simulation Executive (USE) program accelerates and eases generation of application programs for numerical simulation of continuous trajectories interrupted by or containing discrete events. Developed for simulation of multiple spacecraft trajectories with such events as one spacecraft crossing the equator, two spacecraft meeting or parting, or the firing of a rocket engine. USE also simulates operation of a chemical batch-processing factory. Written in Ada.

  16. Supervisory Control of Discrete Event Systems Modeled by Mealy Automata with Nondeterministic Output Functions

    NASA Astrophysics Data System (ADS)

    Ushio, Toshimitsu; Takai, Shigemasa

    Supervisory control is a general framework for the logical control of discrete event systems. A supervisor assigns a set of control-disabled controllable events based on observed events so that the controlled discrete event system generates specified languages. In conventional supervisory control, it is assumed that observed events are determined by internal events deterministically. However, this assumption does not hold in discrete event systems with sensor errors or in mobile systems, where each observed event depends not only on an internal event but also on the state just before the occurrence of the internal event. In this paper, we model such a discrete event system by a Mealy automaton with a nondeterministic output function. We introduce two kinds of supervisors: one assigns each control action based on a permissive policy and the other based on an anti-permissive one. We show necessary and sufficient conditions for the existence of each supervisor. Moreover, we discuss the relationship between the supervisors in the case that the output function is deterministic.
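
    The basic control mechanism described above, in its classical deterministic-observation form rather than the Mealy-automaton variant of this paper, can be sketched as follows. The toy plant, its event sets, and the specification are all illustrative assumptions:

```python
# A minimal sketch of supervisory control: the plant is an automaton,
# events split into controllable and uncontrollable sets, and the
# supervisor disables controllable events so the closed loop stays
# inside a legal state set. The toy plant below is illustrative only.

PLANT = {  # state -> {event: next_state}
    "idle": {"start": "busy"},
    "busy": {"finish": "idle", "break": "down"},
    "down": {"repair": "idle"},
}
CONTROLLABLE = {"start", "repair"}   # events the supervisor may disable
LEGAL_STATES = {"idle", "busy"}      # specification: avoid "down" when possible

def enabled(state):
    """Events the supervisor allows in `state`: uncontrollable events
    (like "break") always pass through, which is exactly why the legal
    language cannot always be enforced; a controllable event is disabled
    if it would leave the legal state set."""
    out = set()
    for ev, nxt in PLANT[state].items():
        if ev not in CONTROLLABLE:
            out.add(ev)              # cannot be disabled or prevented
        elif nxt in LEGAL_STATES:
            out.add(ev)
    return out
```

    Note that `enabled("busy")` still contains "break": an uncontrollable event cannot be disabled, only anticipated, which is the central constraint of the framework.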

  17. Discrete-Event Simulation in Chemical Engineering.

    ERIC Educational Resources Information Center

    Schultheisz, Daniel; Sommerfeld, Jude T.

    1988-01-01

    Gives examples, descriptions, and uses for various types of simulation systems, including the Flowtran, Process, Aspen Plus, Design II, GPSS, Simula, and Simscript. Explains similarities in simulators, terminology, and a batch chemical process. Tables and diagrams are included. (RT)

  18. Simulation studies of vestibular macular afferent-discharge patterns using a new, quasi-3-D finite volume method

    NASA Technical Reports Server (NTRS)

    Ross, M. D.; Linton, S. W.; Parnas, B. R.

    2000-01-01

    A quasi-three-dimensional finite-volume numerical simulator was developed to study passive voltage spread in vestibular macular afferents. The method, borrowed from computational fluid dynamics, discretizes events transpiring in small volumes over time. The afferent simulated had three calyces with processes. The number of processes and synapses, and direction and timing of synapse activation, were varied. Simultaneous synapse activation resulted in shortest latency, while directional activation (proximal to distal and distal to proximal) yielded most regular discharges. Color-coded visualizations showed that the simulator discretized events and demonstrated that discharge produced a distal spread of voltage from the spike initiator into the ending. The simulations indicate that directional input, morphology, and timing of synapse activation can affect discharge properties, as must also distal spread of voltage from the spike initiator. The finite volume method has generality and can be applied to more complex neurons to explore discrete synaptic effects in four dimensions.

  19. Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations

    DTIC Science & Technology

    2016-06-01

    Modeling Anti-Air Warfare With Discrete Event Simulation and Analyzing Naval Convoy Operations, by Ali E. Opcin, June 2016 (Master's thesis; thesis advisor: Arnold H. Buss). In this study, a discrete event simulation (DES) was built by modeling ships, and their sensors and weapons, to simulate convoy operations under…

  20. Chemical Dosing and First-Order Kinetics

    ERIC Educational Resources Information Center

    Hladky, Paul W.

    2011-01-01

    College students encounter a variety of first-order phenomena in their mathematics and science courses. Introductory chemistry textbooks that discuss first-order processes, usually in conjunction with chemical kinetics or radioactive decay, stop at single, discrete dose events. Although single-dose situations are important, multiple-dose events,…

  1. Time Warp Operating System (TWOS)

    NASA Technical Reports Server (NTRS)

    Bellenot, Steven F.

    1993-01-01

    Designed to support parallel discrete-event simulation, TWOS is a complete implementation of the Time Warp mechanism, a distributed protocol for virtual-time synchronization based on process rollback and message annihilation.
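
    The two core Time Warp mechanisms, state saving and rollback, can be sketched in a few lines (message annihilation via anti-messages and re-execution of rolled-back events are omitted for brevity; all names are illustrative, not TWOS internals):

```python
class OptimisticLP:
    """Toy logical process illustrating Time Warp's state saving and
    rollback. A real implementation would also re-execute events that
    were undone and send anti-messages to cancel their outputs."""
    def __init__(self):
        self.lvt = 0             # local virtual time
        self.state = 0
        self.saved = [(0, 0)]    # (timestamp, state) checkpoints

    def process(self, ts, delta):
        if ts < self.lvt:        # straggler message: roll back
            while self.saved and self.saved[-1][0] >= ts:
                self.saved.pop()                 # discard bad checkpoints
            self.lvt, self.state = self.saved[-1]  # restore earlier state
        self.state += delta      # execute the event optimistically
        self.lvt = ts
        self.saved.append((ts, self.state))      # checkpoint for later rollback
```

    For example, processing events at virtual times 5 and 10 and then receiving a straggler at time 7 restores the checkpoint taken at time 5 before executing the straggler.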

  2. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    PubMed

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
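
    The event-queue mechanics underlying such a DES can be sketched in a few lines. This is a toy accrual model with hypothetical parameters, not the SAS macro framework described above:

```python
import heapq
import random

def simulate_accrual(n_patients=18, mean_interarrival=14.0,
                     eval_time=28.0, seed=1):
    """Toy discrete event simulation of phase I accrual: patients arrive
    by an exponential interarrival process and each is then observed for
    a fixed evaluation window. Returns total study duration (days).
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    events = []                       # priority queue of (time, kind)
    t = 0.0
    for _ in range(n_patients):       # schedule all arrival events
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, "arrive"))
    end = 0.0
    while events:                     # next-event time advance
        time, kind = heapq.heappop(events)
        if kind == "arrive":          # arrival schedules an evaluation
            heapq.heappush(events, (time + eval_time, "evaluated"))
        else:
            end = max(end, time)      # study ends at the last evaluation
    return end
```

    Comparing designs then amounts to running such a loop under each enrollment rule and contrasting the resulting duration distributions.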

  3. Performance bounds on parallel self-initiating discrete-event

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models is considered. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. The interest is in the effects of that communication on synchronization. The performance of various synchronization protocols is considered by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp. The analysis also quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.

  4. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1977-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are obtained. The approach is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. A general representation for optimum estimates and recursive equations for minimum mean squared error (MMSE) estimates are obtained. MMSE estimates are nonlinear functions of the observations. The problem of estimating the rate of a DTJP when the rate is a random variable with a probability density function of the form cx^k(1-x)^m is considered, and it is shown that the MMSE estimates are linear in this case. This class of density functions explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.

  5. Optimal estimation for discrete time jump processes

    NASA Technical Reports Server (NTRS)

    Vaca, M. V.; Tretter, S. A.

    1978-01-01

    Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are derived. The approach used is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. Thus a general representation is obtained for optimum estimates, and recursive equations are derived for minimum mean-squared error (MMSE) estimates. In general, MMSE estimates are nonlinear functions of the observations. The problem is considered of estimating the rate of a DTJP when the rate is a random variable with a beta probability density function and the jump amplitudes are binomially distributed. It is shown that the MMSE estimates are linear. The class of beta density functions is rather rich and explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.
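
    The linearity result is easy to verify directly: with a Beta(a, b) prior on the rate and binomially distributed jump counts, the posterior-mean (MMSE) estimate is linear in the observed count. A minimal sketch, with illustrative prior parameters:

```python
def mmse_rate_estimate(k, n, a=2.0, b=3.0):
    """MMSE (posterior mean) estimate of a rate p with a Beta(a, b)
    prior after observing k jumps in n trials (binomial likelihood).
    The closed form (a + k) / (a + b + n) is affine in k, which is the
    linearity property noted in the abstract. Prior parameters a, b
    here are illustrative assumptions."""
    return (a + k) / (a + b + n)
```

    Successive counts k, k+1, k+2 change the estimate by the same increment 1/(a+b+n), confirming that the optimal estimator coincides with a linear one for this prior family.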

  6. An integrated logit model for contamination event detection in water distribution systems.

    PubMed

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated by the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection framework on a training data set using genetic algorithms. The process of fusing the individual indicator probabilities, which is left out of focus in many existing event detection system models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
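
    The fusion step can be sketched with a simple binary logit: per-indicator anomaly scores are combined through a logistic function whose weights would, in the paper's framework, be estimated by maximum likelihood jointly with the rest of the detector. The weights and bias below are illustrative constants, not fitted values:

```python
from math import exp

def fused_alarm_probability(indicator_scores, weights, bias):
    """Logistic (discrete choice) fusion of per-indicator alarm scores
    into a single event probability, replacing ad hoc unification
    heuristics. weights/bias are illustrative; in practice they are
    estimated by maximum likelihood on training data."""
    z = bias + sum(w * s for w, s in zip(weights, indicator_scores))
    return 1.0 / (1.0 + exp(-z))     # probability of a contamination event
```

    The fused probability rises monotonically as more indicators fire, which is the behavior a heuristic "any alarm triggers" rule cannot grade.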

  7. Symbolic Processing Combined with Model-Based Reasoning

    NASA Technical Reports Server (NTRS)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  8. A Simulation of Alternatives for Wholesale Inventory Replenishment

    DTIC Science & Technology

    2016-03-01

    …algorithmic details. The last method is a mixed-integer, linear optimization model. Comparative Inventory Simulation, a discrete event simulation model, is designed to find fill rates achieved for each National Item… Subject terms: simulation; event graphs; reorder point; fill-rate; backorder; discrete event simulation; wholesale inventory optimization model.

  9. An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation

    DOE PAGES

    Nutaro, James

    2014-11-03

    In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events and that comprise differential-algebraic systems with high index. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.

  10. Implementing ARFORGEN: Installation Capability and Feasibility Study of Meeting ARFORGEN Guidelines

    DTIC Science & Technology

    2007-07-26

    …aligning troop requirements with the Army's new strategic mission, the force stabilization element of ARFORGEN was developed to raise the morale of… a discrete event simulation model developed for the project to mirror the reset process. The Unit Reset model is implemented in Java as a discrete… and transportation. Further, the typical installation support staff is manned by a Table of Distribution and Allowance (TDA) designed to…

  11. A Calculus of Macro-Events: Progress Report

    DTIC Science & Technology

    2000-01-01

    Angelo Montanari, Dipartimento di Matematica e Informatica, Università di Udine, Via delle Scienze 206, 33100 Udine… and process iteration. This proposal builds on work by Chittaro and Montanari [10] on modeling discrete processes. The set of constructors of the… situations; in many cases the occurrence of an event happens over a period of time [24]. Capturing this possibility enables finer models, as we can now…

  12. Hybrid Markov-mass action law model for cell activation by rare binding events: Application to calcium induced vesicular release at neuronal synapses.

    PubMed

    Guerrier, Claire; Holcman, David

    2016-10-18

    Binding of molecules, ions or proteins to small target sites is a generic step of cell activation. This process relies on rare stochastic events where a particle located in a large bulk has to find small and often hidden targets. We present here a hybrid discrete-continuum model that takes into account a stochastic regime governed by rare events and a continuous regime in the bulk. The rare discrete binding events are modeled by a Markov chain for the encounter of small targets by few Brownian particles, for which the arrival time is Poissonian. The large ensemble of particles is described by mass action laws. We use this novel model to predict the time distribution of vesicular release at neuronal synapses. Vesicular release is triggered by the binding of few calcium ions that can originate either from the synaptic bulk or from the entry through calcium channels. We report here that the distribution of release time is bimodal although it is triggered by a single fast action potential. While the first peak follows a stimulation, the second corresponds to the random arrival over much longer time of ions located in the synaptic terminal to small binding vesicular targets. To conclude, the present multiscale stochastic modeling approach allows studying cellular events based on integrating discrete molecular events over several time scales.
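
    The rare-event half of such a hybrid model can be caricatured in a few lines: each of a few ions reaches its small target after an exponentially distributed (Poissonian) search time, and release fires once all sites are bound. All parameter values below are illustrative assumptions, not the paper's calibrated rates:

```python
import random

def time_to_release(n_sites=5, rate_per_ion=0.8, seed=7):
    """Toy version of the discrete rare-binding regime: a few Brownian
    particles independently find small targets with Poissonian arrival
    times; vesicular release is triggered when every site is bound.
    The continuum (mass action) half of the hybrid model is omitted."""
    rng = random.Random(seed)
    arrivals = [rng.expovariate(rate_per_ion) for _ in range(n_sites)]
    return max(arrivals)    # release once the last binding site is filled
```

    Because the release time is the maximum of a few exponential arrivals, its distribution has a long right tail, which is one ingredient behind the delayed second mode of release the abstract describes.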

  13. On the role of fluids in stick-slip dynamics of saturated granular fault gouge using a coupled computational fluid dynamics-discrete element approach

    NASA Astrophysics Data System (ADS)

    Dorostkar, Omid; Guyer, Robert A.; Johnson, Paul A.; Marone, Chris; Carmeliet, Jan

    2017-05-01

    The presence of fault gouge has considerable influence on slip properties of tectonic faults and the physics of earthquake rupture. The presence of fluids within faults also plays a significant role in faulting and earthquake processes. In this paper, we present 3-D discrete element simulations of dry and fluid-saturated granular fault gouge and analyze the effect of fluids on stick-slip behavior. Fluid flow is modeled using computational fluid dynamics based on the Navier-Stokes equations for an incompressible fluid and modified to take into account the presence of particles. Analysis of a long time train of slip events shows that the (1) drop in shear stress, (2) compaction of granular layer, and (3) the kinetic energy release during slip all increase in magnitude in the presence of an incompressible fluid, compared to dry conditions. We also observe that on average, the recurrence interval between slip events is longer for fluid-saturated granular fault gouge compared to the dry case. This observation is consistent with the occurrence of larger events in the presence of fluid. It is found that the increase in kinetic energy during slip events for saturated conditions can be attributed to the increased fluid flow during slip. Our observations emphasize the important role that fluid flow and fluid-particle interactions play in tectonic fault zones and show in particular how discrete element method (DEM) models can help understand the hydromechanical processes that dictate fault slip.

  14. Input-output identification of controlled discrete manufacturing systems

    NASA Astrophysics Data System (ADS)

    Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques

    2014-03-01

    The automated construction of discrete event models from observations of external system's behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method allows processing a large quantity of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DESs. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.

  15. Swarming Reconnaissance Using Unmanned Aerial Vehicles in a Parallel Discrete Event Simulation

    DTIC Science & Technology

    2004-03-01

    …Topics include data distribution management (DDM) and the Breathing Time Warp (BTW) algorithm with rollback. …events based on the process algorithm. Data proxies/distribution management is the vital portion of the SPEEDES implementation that allows objects…

  16. A computational framework for prime implicants identification in noncoherent dynamic systems.

    PubMed

    Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico

    2015-01-01

    Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.

  17. The cost of conservative synchronization in parallel discrete event simulations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1990-01-01

    The performance of a synchronous conservative parallel discrete-event simulation protocol is analyzed. The class of simulation models considered is oriented around a physical domain and possesses a limited ability to predict future behavior. A stochastic model is used to show that as the volume of simulation activity in the model increases relative to a fixed architecture, the complexity of the average per-event overhead due to synchronization, event list manipulation, lookahead calculations, and processor idle time approach the complexity of the average per-event overhead of a serial simulation. The method is therefore within a constant factor of optimal. The analysis demonstrates that on large problems--those for which parallel processing is ideally suited--there is often enough parallel workload so that processors are not usually idle. The viability of the method is also demonstrated empirically, showing how good performance is achieved on large problems using a thirty-two node Intel iPSC/2 distributed memory multiprocessor.
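
    The lookahead-based safe bound at the heart of such conservative protocols can be sketched as follows (an illustrative helper, not the specific protocol analyzed above): a process may execute only events timestamped no later than the minimum, over its input channels, of each neighbor's clock plus that channel's lookahead.

```python
import heapq

def run_conservative(pending_times, neighbor_clocks, lookahead):
    """Conservative synchronization sketch: compute the safe bound from
    neighbor clocks and lookahead, then process only local events whose
    timestamps do not exceed it. Events beyond the bound must wait for
    the neighbors to advance. Names and values are illustrative."""
    bound = min(c + lookahead for c in neighbor_clocks)
    heap = list(pending_times)
    heapq.heapify(heap)
    done = []
    while heap and heap[0] <= bound:
        done.append(heapq.heappop(heap))   # safe to execute in order
    return done, bound
```

    Larger lookahead widens the safe window per round, which is why the analysis above finds performance so sensitive to lookahead ability.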

  18. Adaptation as process: the future of Darwinism and the legacy of Theodosius Dobzhansky.

    PubMed

    Depew, David J

    2011-03-01

    Conceptions of adaptation have varied in the history of genetic Darwinism depending on whether what is taken to be focal is the process of adaptation, adapted states of populations, or discrete adaptations in individual organisms. I argue that Theodosius Dobzhansky's view of adaptation as a dynamical process contrasts with so-called "adaptationist" views of natural selection figured as "design-without-a-designer" of relatively discrete, enumerable adaptations. Correlated with these respectively process- and product-oriented approaches to adaptive natural selection are divergent pictures of organisms themselves as developmental wholes or as "bundles" of adaptations. While even process versions of genetical Darwinism are insufficiently sensitive to the fact that much of the variation on which adaptive selection works consists of changes in the timing, rate, or location of ontogenetic events, I argue that articulations of the Modern Synthesis influenced by Dobzhansky are more easily reconciled with the recent shift to evolutionary developmentalism than are versions that make discrete adaptations central. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    PubMed

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
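
    The triggering rule itself is simple to illustrate. A minimal sketch with a scalar state and a fixed threshold, with the HDP/actor-critic learning of the paper omitted:

```python
def event_triggered_steps(states, threshold=0.5):
    """Count control-law updates under an event-triggered policy: the
    control is recomputed only when the gap between the current state
    and the state at the last trigger exceeds the threshold; otherwise
    the previous control is held. Scalar states and a constant
    threshold are simplifying assumptions for illustration."""
    last = states[0]
    updates = 1                       # control computed at the first sample
    for x in states[1:]:
        if abs(x - last) > threshold:  # event-triggered condition violated
            last = x                   # trigger: recompute control, reset gap
            updates += 1
    return updates
```

    Over a trajectory of five samples with one large jump, only two updates occur where a periodic scheme would perform five, which is the computation and transmission saving the abstract refers to.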

  20. Improving the Teaching of Discrete-Event Control Systems Using a LEGO Manufacturing Prototype

    ERIC Educational Resources Information Center

    Sanchez, A.; Bucio, J.

    2012-01-01

    This paper discusses the usefulness of employing LEGO as a teaching-learning aid in a post-graduate-level first course on the control of discrete-event systems (DESs). The final assignment of the course is presented, which asks students to design and implement a modular hierarchical discrete-event supervisor for the coordination layer of a…

  1. Safety Discrete Event Models for Holonic Cyclic Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Ciufudean, Calin; Filote, Constantin

    In this paper the expression "holonic cyclic manufacturing systems" refers to complex assembly/disassembly systems or fork/join systems, kanban systems, and in general, to any discrete event system that transforms raw material and/or components into products. Such a system is said to be cyclic if it provides the same sequence of products indefinitely. This paper considers the scheduling of holonic cyclic manufacturing systems and describes a new approach using the Petri net formalism. We propose an approach to frame the optimum schedule of holonic cyclic manufacturing systems in order to maximize the throughput while minimizing the work in process. We also propose an algorithm to verify the optimum schedule.

  2. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    …modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. DES models are often referred to as "next-event" (Law and Kelton 2000); DTS is commonly referred to as "time-step." Many combat models use DTS as their simulation time advance mechanism.
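
    The two time-advance mechanisms can be contrasted in a few lines; a sketch assuming a sparse event set over a long horizon (illustrative values, not a combat model):

```python
import heapq

def next_event_count(event_times, horizon):
    """Next-event (DES) advance: the clock jumps directly from one
    scheduled event to the next, so work is proportional to the
    number of events."""
    heap = list(event_times)
    heapq.heapify(heap)
    hops = 0
    while heap and heap[0] <= horizon:
        heapq.heappop(heap)
        hops += 1
    return hops                       # one loop iteration per event

def time_step_count(event_times, horizon, dt=1.0):
    """Time-step (DTS) advance: the clock ticks in fixed increments and
    every tick is scanned for events, whether or not any occurred, so
    work is proportional to the number of ticks."""
    ticks, t, seen = 0, 0.0, 0
    while t < horizon:
        t += dt
        ticks += 1
        seen += sum(1 for e in event_times if t - dt < e <= t)
    return ticks, seen
```

    With two events over a 100-unit horizon, the next-event loop iterates twice while the time-step loop iterates one hundred times, which is the efficiency trade-off (against DTS's fixed-resolution simplicity) that such comparisons examine.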

  3. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  4. NasoNet, modeling the spread of nasopharyngeal cancer with networks of probabilistic events in discrete time.

    PubMed

    Galán, S F; Aguado, F; Díez, F J; Mira, J

    2002-07-01

    The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.

  5. Enhancement of the Logistics Battle Command Model: Architecture Upgrades and Attrition Module Development

    DTIC Science & Technology

    2017-01-05

    ... module. Subject terms: logistics, attrition, discrete event simulation, Simkit, LBC. ... stochastics, and discrete event model programmed in Java, building largely on the Simkit library. The primary purpose of the LBC model is to support ... equations makes them incompatible with the discrete event construct of LBC. Bullard further advances this methodology by developing a stochastic ...

  6. A Simulation of Readiness-Based Sparing Policies

    DTIC Science & Technology

    2017-06-01

    ... variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the ... available in the optimization tools. Subject terms: readiness-based sparing, discrete event simulation, optimization, multi-indenture.

  7. Modeling a Million-Node Slim Fly Network Using Parallel Discrete-Event Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, Noah; Carothers, Christopher; Mubarak, Misbah

    As supercomputers close in on exascale performance, the increased number of processors and processing power translates to an increased demand on the underlying network interconnect. The Slim Fly network topology, a new low-diameter and low-latency interconnection network, is gaining interest as one possible solution for next-generation supercomputing interconnect systems. In this paper, we present a high-fidelity flit-level Slim Fly model leveraging the Rensselaer Optimistic Simulation System (ROSS) and Co-Design of Exascale Storage (CODES) frameworks. We validate our Slim Fly model against the Kathareios et al. Slim Fly model results provided at moderately sized network scales. We further scale the model size up to an unprecedented 1 million compute nodes; and through visualization of network simulation metrics such as link bandwidth, packet latency, and port occupancy, we gain insight into the network behavior at the million-node scale. We also show linear strong scaling of the Slim Fly model on an Intel cluster, achieving a peak event rate of 36 million events per second using 128 MPI tasks to process 7 billion events. Detailed analysis of the underlying discrete-event simulation performance shows that a million-node Slim Fly model simulation can execute in 198 seconds on the Intel cluster.

  8. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  9. Kennedy Space Center Orion Processing Team Planning for Ground Operations

    NASA Technical Reports Server (NTRS)

    Letchworth, Gary; Schlierf, Roland

    2011-01-01

    Topics in this presentation are: Constellation Ares I/Orion/Ground Ops Elements; Orion Ground Operations Flow; and Orion Operations Planning Process and Toolset Overview, including: (1) Orion Concept of Operations by Phase, (2) Ops Analysis Capabilities Overview, (3) Operations Planning Evolution, (4) Functional Flow Block Diagrams, (5) Operations Timeline Development, (6) Discrete Event Simulation (DES) Modeling, and (7) Ground Operations Planning Document Database (GOPDb). Using Operations Planning Tools for Operability Improvements includes: (1) Kaizen/Lean Events, (2) Mockups, and (3) Human Factors Analysis.

  10. A mathematical approach for evaluating Markov models in continuous time without discrete-event simulation.

    PubMed

    van Rosmalen, Joost; Toy, Mehlika; O'Mahony, James F

    2013-08-01

    Markov models are a simple and powerful tool for analyzing the health and economic effects of health care interventions. These models are usually evaluated in discrete time using cohort analysis. The use of discrete time assumes that changes in health states occur only at the end of a cycle period. Discrete-time Markov models only approximate the process of disease progression, as clinical events typically occur in continuous time. The approximation can yield biased cost-effectiveness estimates for Markov models with long cycle periods and if no half-cycle correction is made. The purpose of this article is to present an overview of methods for evaluating Markov models in continuous time. These methods use mathematical results from stochastic process theory and control theory. The methods are illustrated using an applied example on the cost-effectiveness of antiviral therapy for chronic hepatitis B. The main result is a mathematical solution for the expected time spent in each state in a continuous-time Markov model. It is shown how this solution can account for age-dependent transition rates and discounting of costs and health effects, and how the concept of tunnel states can be used to account for transition rates that depend on the time spent in a state. The applied example shows that the continuous-time model yields more accurate results than the discrete-time model but does not require much computation time and is easily implemented. In conclusion, continuous-time Markov models are a feasible alternative to cohort analysis and can offer several theoretical and practical advantages.
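    The expected-time solution described above can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the two-state generator, horizon, and function name below are invented, and the sketch uses Van Loan's augmented-matrix identity to turn the integral of the transition-probability matrix exp(Qs) into a single matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

def expected_occupancy(Q, T):
    """Expected time spent in each state of a continuous-time Markov
    chain over [0, T], one row per starting state.

    Uses Van Loan's trick: the top-right block of
    expm([[Q, I], [0, 0]] * T) equals the integral of expm(Q s) ds
    over [0, T], avoiding any discrete-event simulation or cycle-based
    cohort approximation."""
    n = Q.shape[0]
    M = np.zeros((2 * n, 2 * n))
    M[:n, :n] = Q
    M[:n, n:] = np.eye(n)
    return expm(M * T)[:n, n:]

# Hypothetical two-state example: leave state 0 at rate 1, state 1 at rate 2.
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])
occ = expected_occupancy(Q, T=30.0)
# Each row sums to the horizon T; occupancy fractions approach the
# stationary distribution (2/3, 1/3) as T grows.
```

    Age-dependent rates or discounting, as discussed in the article, would need a piecewise-constant generator or an augmented state, but the same matrix-exponential machinery applies on each interval.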

  11. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next-generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand DES capabilities to address KSC's planning needs.
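    The definition above, a system whose state changes at discrete points in time with possibly random inputs, reduces to a short loop over a time-ordered future-event list. A minimal sketch follows; the machine fail/repair example and all names and rates are invented for illustration and are not drawn from the KSC models.

```python
import heapq
import random

def simulate(horizon, seed=0):
    """Minimal discrete-event engine: pop the earliest event from a
    time-ordered future-event list, update state, schedule successors.
    Models a single machine that fails (mean uptime 10) and is
    repaired (mean downtime 2); returns availability and the event log."""
    rng = random.Random(seed)
    fel = [(rng.expovariate(1 / 10.0), "fail")]  # future-event list
    uptime, last, state = 0.0, 0.0, "up"
    log = []
    while fel:
        t, ev = heapq.heappop(fel)
        if t > horizon:
            break
        if state == "up":
            uptime += t - last
        last, state = t, ("down" if ev == "fail" else "up")
        log.append((t, ev))
        nxt, mean = ("repair", 2.0) if ev == "fail" else ("fail", 10.0)
        heapq.heappush(fel, (t + rng.expovariate(1 / mean), nxt))
    if state == "up":
        uptime += horizon - last
    return uptime / horizon, log

availability, log = simulate(horizon=1000.0)
# Long-run availability approaches 10 / (10 + 2) ≈ 0.833.
```

    Real DES packages add entities, queues, resources, and statistics collection on top of exactly this loop.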

  12. Determining A Purely Symbolic Transfer Function from Symbol Streams: Theory and Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, Christopher H

    Transfer function modeling is a standard technique in classical Linear Time Invariant systems and Statistical Process Control. The work of Box and Jenkins was seminal in developing methods for identifying parameters associated with classical (r, s, k) transfer functions. Discrete event systems are often used for modeling hybrid control structures and high-level decision problems; examples include discrete-time, discrete-strategy repeated games. For these games, a discrete transfer function in the form of an accurate hidden Markov model of input-output relations could be used to derive optimal response strategies. In this paper, we develop an algorithm for creating probabilistic Mealy machines that act as transfer function models for discrete event dynamic systems (DEDS). Our models are defined by three parameters, (l1, l2, k), just as the Box-Jenkins transfer function models: l1 is the maximal input history length to consider, l2 is the maximal output history length to consider, and k is the response lag. Using related results, we show that our Mealy machine transfer functions are optimal in the sense that they maximize the mutual information between the current known state of the DEDS and the next observed input/output pair.
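    A count-based estimate of such an input-output model fits in a few lines. The sketch below is only a guess at the (l1, l2, k) parameterization described in the abstract, not the paper's algorithm; the fit_mealy function and the echo example are invented for illustration.

```python
from collections import Counter, defaultdict

def fit_mealy(inputs, outputs, l1=1, l2=1, k=0):
    """Estimate P(y_t | last l1 inputs lagged by k, last l2 outputs)
    by counting over paired symbol streams -- a crude, hypothetical
    stand-in for a probabilistic Mealy-machine transfer function."""
    counts = defaultdict(Counter)
    for t in range(max(l1 + k, l2), len(outputs)):
        ctx = (tuple(inputs[t - k - l1:t - k]), tuple(outputs[t - l2:t]))
        counts[ctx][outputs[t]] += 1
    return {ctx: {y: n / sum(c.values()) for y, n in c.items()}
            for ctx, c in counts.items()}

# Toy stream where the output echoes the previous input.
inputs = [0, 1, 1, 0, 1, 0, 0, 1] * 5
outputs = [0] + inputs[:-1]
model = fit_mealy(inputs, outputs, l1=1, l2=1, k=0)
# Every observed context predicts its output with probability 1.
```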

  13. On the role of fluids in stick-slip dynamics of saturated granular fault gouge using a coupled computational fluid dynamics-discrete element approach: STICK-SLIP IN SATURATED FAULT GOUGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorostkar, Omid; Guyer, Robert A.; Johnson, Paul A.

    The presence of fault gouge has considerable influence on slip properties of tectonic faults and the physics of earthquake rupture. The presence of fluids within faults also plays a significant role in faulting and earthquake processes. In this study, we present 3-D discrete element simulations of dry and fluid-saturated granular fault gouge and analyze the effect of fluids on stick-slip behavior. Fluid flow is modeled using computational fluid dynamics based on the Navier-Stokes equations for an incompressible fluid, modified to take into account the presence of particles. Analysis of a long train of slip events shows that (1) the drop in shear stress, (2) the compaction of the granular layer, and (3) the kinetic energy release during slip all increase in magnitude in the presence of an incompressible fluid, compared to dry conditions. We also observe that, on average, the recurrence interval between slip events is longer for fluid-saturated granular fault gouge than for the dry case. This observation is consistent with the occurrence of larger events in the presence of fluid. The increase in kinetic energy during slip events under saturated conditions can be attributed to the increased fluid flow during slip. Finally, our observations emphasize the important role that fluid flow and fluid-particle interactions play in tectonic fault zones and show in particular how discrete element method (DEM) models can help understand the hydromechanical processes that dictate fault slip.

  14. On the role of fluids in stick-slip dynamics of saturated granular fault gouge using a coupled computational fluid dynamics-discrete element approach: STICK-SLIP IN SATURATED FAULT GOUGE

    DOE PAGES

    Dorostkar, Omid; Guyer, Robert A.; Johnson, Paul A.; ...

    2017-05-01

    The presence of fault gouge has considerable influence on slip properties of tectonic faults and the physics of earthquake rupture. The presence of fluids within faults also plays a significant role in faulting and earthquake processes. In this study, we present 3-D discrete element simulations of dry and fluid-saturated granular fault gouge and analyze the effect of fluids on stick-slip behavior. Fluid flow is modeled using computational fluid dynamics based on the Navier-Stokes equations for an incompressible fluid, modified to take into account the presence of particles. Analysis of a long train of slip events shows that (1) the drop in shear stress, (2) the compaction of the granular layer, and (3) the kinetic energy release during slip all increase in magnitude in the presence of an incompressible fluid, compared to dry conditions. We also observe that, on average, the recurrence interval between slip events is longer for fluid-saturated granular fault gouge than for the dry case. This observation is consistent with the occurrence of larger events in the presence of fluid. The increase in kinetic energy during slip events under saturated conditions can be attributed to the increased fluid flow during slip. Finally, our observations emphasize the important role that fluid flow and fluid-particle interactions play in tectonic fault zones and show in particular how discrete element method (DEM) models can help understand the hydromechanical processes that dictate fault slip.

  15. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex so that an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica-tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica-models can be connected to Simulink-models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica-tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary Cells.
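    The discrete, token-based half of such a hybrid net comes down to markings and firing rules, which can be sketched compactly. The sketch below is a toy Python analogue only (the library described above is Modelica-based); the step function and the substrate/product net are invented.

```python
def step(marking, transitions):
    """Fire the first enabled transition of a discrete Petri net.
    Transitions map a name to (pre, post) dicts of {place: arc weight};
    returns (fired name or None, new marking)."""
    for name, (pre, post) in transitions.items():
        if all(marking.get(p, 0) >= w for p, w in pre.items()):
            new = dict(marking)
            for p, w in pre.items():
                new[p] -= w
            for p, w in post.items():
                new[p] = new.get(p, 0) + w
            return name, new
    return None, marking

# Toy reaction net: one 'react' transition converting substrate to product.
net = {"react": ({"substrate": 1}, {"product": 1})}
marking = {"substrate": 2, "product": 0}
for _ in range(2):
    fired, marking = step(marking, net)
# marking is now {"substrate": 0, "product": 2}; 'react' is disabled.
```

    The hybrid extension replaces some token counts with continuous levels governed by differential equations, with discrete firings appearing as events to the ODE solver.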

  16. Borrowing as a Process in the Standardization of Language.

    ERIC Educational Resources Information Center

    Byron, Janet

    This paper suggests that new approaches are needed in the study of language standardization. One such approach is the consideration of standardization in terms of processes, i.e., in terms of series of related events, rather than as a group of unrelated discrete happenings. Borrowing is one recurring feature in language standardization, and in…

  17. Evaluation of the Navy's Sea/Shore Flow Policy

    DTIC Science & Technology

    2016-06-01

    CNA developed an independent Discrete-Event Simulation model to evaluate and assess the effect of ... a more steady manning level, but the variability remains, even if the system is optimized. In building a Discrete-Event Simulation model, we ... steady-state model. In FY 2014, CNA developed a Discrete-Event Simulation model to evaluate the impact of sea/shore flow policy (the DES-SSF model ...

  18. Application of simulation models for the optimization of business processes

    NASA Astrophysics Data System (ADS)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools to the optimization of business processes, in particular the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it performs process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.

  19. Modeling Temporal Processes in Early Spacecraft Design: Application of Discrete-Event Simulations for Darpa's F6 Program

    NASA Technical Reports Server (NTRS)

    Dubos, Gregory F.; Cornford, Steven

    2012-01-01

    While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time.This paper discusses the use of discrete-event models to simulate spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".

  20. Joint modeling of longitudinal data and discrete-time survival outcome.

    PubMed

    Qiu, Feiyou; Stein, Catherine M; Elston, Robert C

    2016-08-01

    A predictive joint shared parameter model is proposed for discrete time-to-event and longitudinal data. A discrete survival model with frailty and a generalized linear mixed model for the longitudinal data are joined to predict the probability of events. This joint model focuses on predicting discrete time-to-event outcome, taking advantage of repeated measurements. We show that the probability of an event in a time window can be more precisely predicted by incorporating the longitudinal measurements. The model was investigated by comparison with a two-step model and a discrete-time survival model. Results from both a study on the occurrence of tuberculosis and simulated data show that the joint model is superior to the other models in discrimination ability, especially as the latent variables related to both survival times and the longitudinal measurements depart from 0. © The Author(s) 2013.

  1. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level changes using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations. © 2013 by the American College of Medical Quality.

  2. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  3. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

    ... and full-scale experimental verifications towards ground-satellite quantum key distribution ... Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION. Jeffrey D. Morris ... Presented to the Faculty, Department of Systems ...

  4. Evaluation of the Navy's Sea/Shore Flow Policy

    DTIC Science & Technology

    2016-06-01

    CNA developed an independent Discrete-Event Simulation model to evaluate and assess the effect of alternative sea/shore flow policies. In this study ... remains, even if the system is optimized. In building a Discrete-Event Simulation model, we discovered key factors that should be included in the ... Discrete-Event Simulation model to evaluate the impact of sea/shore flow policy (the DES-SSF model) and compared the results with the SSFM for one ...

  5. Systems Operation Studies for Automated Guideway Transit Systems : Detailed Station Model Functional Specifications

    DOT National Transportation Integrated Search

    1981-07-01

    The Detailed Station Model (DSM) is a discrete event model representing the interrelated queueing processes associated with vehicle and passenger activities in an AGT station. The DSM will provide operational and performance measures of alternative s...

  6. System Operations Studies for Automated Guideway Transit Systems : Detailed Station Model User's Manual

    DOT National Transportation Integrated Search

    1981-07-01

    The Detailed Station Model (DSM) is a discrete event model representing the interrelated queueing processes associated with vehicle and passenger activities in an AGT station. The DSM will provide operational and performance measures of alternative s...

  7. Computer simulation of the metastatic progression.

    PubMed

    Wedemann, Gero; Bethge, Anja; Haustein, Volker; Schumacher, Udo

    2014-01-01

    A novel computer model based on a discrete event simulation procedure describes quantitatively the processes underlying the metastatic cascade. Analytical functions describe the size of the primary tumor and the metastases, while a rate function models the intravasation events of the primary tumor and metastases. Events describe the behavior of the malignant cells until the formation of new metastases. The results of the computer simulations are in quantitative agreement with clinical data determined from a patient with hepatocellular carcinoma in the liver. The model provides a more detailed view on the process than a conventional mathematical model. In particular, the implications of interventions on metastasis formation can be calculated.

  8. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ɛ -machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
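    For contrast with the continuous-time case treated in the paper, the discrete-time, finite-state entropy rate has a simple closed form, h = -Σ_i π_i Σ_j P_ij log2 P_ij, where π is the stationary distribution. A sketch of that discrete-time analogue follows; the example chains are invented, and this is not the authors' semi-Markov machinery.

```python
import numpy as np

def entropy_rate(P):
    """Shannon entropy rate (bits/step) of a stationary finite Markov
    chain with transition matrix P: the stationary distribution pi is
    the eigenvector of P.T for eigenvalue 1, and
    h = -sum_i pi_i sum_j P_ij log2 P_ij."""
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi = pi / pi.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)
    return float(-pi @ terms.sum(axis=1))

h_fair = entropy_rate(np.array([[0.5, 0.5],
                                [0.5, 0.5]]))
# Fair-coin chain: exactly 1 bit per step.
```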

  9. Simulating subduction zone earthquakes using discrete element method: a window into elusive source processes

    NASA Astrophysics Data System (ADS)

    Blank, D. G.; Morgan, J.

    2017-12-01

    Large earthquakes that occur on convergent plate margin interfaces have the potential to cause widespread damage and loss of life. Recent observations reveal that a wide range of different slip behaviors take place along these megathrust faults, which demonstrates both their complexity and our limited understanding of fault processes and their controls. Numerical modeling provides a useful tool for simulating earthquakes and related slip events, and for making direct observations and correlations among the properties and parameters that might control them. Further analysis of these phenomena can lead to a more complete understanding of the underlying mechanisms that accompany the nucleation of large earthquakes, and of what might trigger them. In this study, we use the discrete element method (DEM) to create numerical analogs to subduction megathrusts with heterogeneous fault friction. Displacement boundary conditions are applied in order to simulate tectonic loading, which in turn induces slip along the fault. A wide range of slip behaviors are observed, ranging from creep to stick slip. We are able to characterize slip events by duration, stress drop, rupture area, and slip magnitude, and to correlate the relationships among these quantities. These characterizations allow us to develop a catalog of rupture events, both spatially and temporally, for comparison with slip processes on natural faults.

  10. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have been ported to microcomputers. Graphics and animation capabilities have been added to many of these languages to help users build models and evaluate simulation results. Even with these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive, intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the assistant, its elements, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant and the advantages and disadvantages are discussed.

  11. Reversible Parallel Discrete-Event Execution of Large-scale Epidemic Outbreak Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

    The spatial scale, runtime speed and behavioral detail of epidemic outbreak simulations together require the use of large-scale parallel processing. In this paper, an optimistic parallel discrete event execution of a reaction-diffusion simulation model of epidemic outbreaks is presented, with an implementation over the µsik simulator. Rollback support is achieved with the development of a novel reversible model that combines reverse computation with a small amount of incremental state saving. Parallel speedup and other runtime performance metrics of the simulation are tested on a small (8,192-core) Blue Gene/P system, while scalability is demonstrated on 65,536 cores of a large Cray XT5 system. Scenarios representing large population sizes (up to several hundred million individuals in the largest case) are exercised.

  12. Design of Flight Vehicle Management Systems

    NASA Technical Reports Server (NTRS)

    Meyer, George; Aiken, Edwin W. (Technical Monitor)

    1994-01-01

    As the operation of large systems becomes ever more dependent on extensive automation, the need for an effective solution to the problem of design and validation of the underlying software becomes more critical. Large systems possess much detailed structure, typically hierarchical, and they are hybrid. Information processing at the top of the hierarchy is by means of formal logic and sentences; on the bottom it is by means of simple scalar differential equations and functions of time; and in the middle it is by an interacting mix of nonlinear multi-axis differential equations and automata, and functions of time and discrete events. The lecture will address the overall problem as it relates to flight vehicle management, describe the middle level, and offer a design approach that is based on Differential Geometry and Discrete Event Dynamic Systems Theory.

  13. Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach

    NASA Astrophysics Data System (ADS)

    Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer

    2016-11-01

    This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the highest energy-consuming industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, which are the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and the costs associated with fuel and electricity consumption. The paper starts with an overview of challenges in energy consumption within the facilities of an automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements toward energy-saving objectives and reduced costs.

  14. Parallel discrete-event simulation schemes with heterogeneous processing elements.

    PubMed

    Kim, Yup; Kwon, Ikhyun; Chae, Huiseung; Yook, Soon-Hyung

    2014-07-01

    To understand the effects of nonidentical processing elements (PEs) on parallel discrete-event simulation (PDES) schemes, two stochastic growth models, the restricted solid-on-solid (RSOS) model and the Family model, are investigated by simulation. The RSOS model represents the PDES scheme governed by the Kardar-Parisi-Zhang equation (KPZ scheme); the Family model represents the scheme governed by the Edwards-Wilkinson equation (EW scheme). Two kinds of distributions of nonidentical PEs are considered: in the first kind, the computing capacities of the PEs do not differ much, whereas in the second kind the capacities are extremely widespread. The KPZ scheme on complex networks shows synchronizability and scalability regardless of the kind of PEs. The EW scheme never shows synchronizability for a random configuration of PEs of the first kind; however, by regularizing the arrangement of PEs of the first kind, the EW scheme can be made to show synchronizability. In contrast, the EW scheme never shows synchronizability for any configuration of PEs of the second kind.
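
    The conservative update rule underlying such PDES schemes can be sketched as follows. This is a toy version of the standard Korniss-style virtual-time model (an assumption here, not the paper's exact setup), with identical PEs for brevity:

```python
import random

def pdes_virtual_times(n_pe=100, sweeps=1000, seed=1):
    # Basic conservative scheme: PE i may process its next event, advancing
    # its local virtual time tau[i] by a random exponential amount, only
    # when tau[i] does not exceed either ring neighbour's time.  The
    # virtual-time surface then roughens like a KPZ-class interface.
    rng = random.Random(seed)
    tau = [0.0] * n_pe
    for _ in range(sweeps):
        for i in range(n_pe):
            if tau[i] <= tau[i - 1] and tau[i] <= tau[(i + 1) % n_pe]:
                tau[i] += rng.expovariate(1.0)  # heterogeneous PEs would
                                                # draw with PE-specific rates
    return tau
```

    The spread (width) of the returned times measures desynchronization; nonidentical PEs, as studied in the paper, would correspond to drawing the increments with PE-dependent rates.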

  15. Using Discrete Event Simulation to Model Integrated Commodities Consumption for a Launch Campaign of the Space Launch System

    NASA Technical Reports Server (NTRS)

    Leonard, Daniel; Parsons, Jeremy W.; Cates, Grant

    2014-01-01

    In May 2013, NASA's GSDO Program requested a study to develop a discrete event simulation (DES) model that analyzes the launch campaign process of the Space Launch System (SLS) from an integrated commodities perspective. The scope of the study includes launch countdown and scrub turnaround and focuses on four core launch commodities: hydrogen, oxygen, nitrogen, and helium. Previously, the commodities were only analyzed individually and deterministically for their launch support capability, but this study was the first to integrate them to examine the impact of their interactions on a launch campaign as well as the effects of process variability on commodity availability. The study produced a validated DES model with Rockwell Arena that showed that Kennedy Space Center's ground systems were capable of supporting a 48-hour scrub turnaround for the SLS. The model will be maintained and updated to provide commodity consumption analysis of future ground system and SLS configurations.

  16. The Seeds of Time: Why Classroom Dialogue Needs a Temporal Analysis

    ERIC Educational Resources Information Center

    Mercer, Neil

    2008-01-01

    The process of teaching and learning in school has a natural long-term trajectory and cannot be understood only as a series of discrete educational events. Classroom talk plays an important role in mediating this long-term process, and in this article I argue that more attention should be given to the temporal dimension of classroom dialogue, both…

  17. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.

  18. Object-oriented models of cognitive processing.

    PubMed

    Mather, G

    2001-05-01

    Information-processing models of vision and cognition are inspired by procedural programming languages. Models that emphasize object-based representations are closely related to object-oriented programming languages. The concepts underlying object-oriented languages provide a theoretical framework for cognitive processing that differs markedly from that offered by procedural languages. This framework is well-suited to a system designed to deal flexibly with discrete objects and unpredictable events in the world.

  19. Running Parallel Discrete Event Simulators on Sierra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnes, P. D.; Jefferson, D. R.

    2015-12-03

    In this proposal we consider porting the ROSS/Charm++ simulator and the discrete event models that run under its control so that they run on the Sierra architecture and make efficient use of the Volta GPUs.

  20. High-speed event detector for embedded nanopore bio-systems.

    PubMed

    Huang, Yiyun; Magierowski, Sebastian; Ghafar-Zadeh, Ebrahim; Wang, Chengjie

    2015-08-01

    Biological measurements of microscopic phenomena often deal with discrete-event signals. The ability to automatically carry out such measurements at high-speed in a miniature embedded system is desirable but compromised by high-frequency noise along with practical constraints on filter quality and sampler resolution. This paper presents a real-time event-detection method in the context of nanopore sensing that helps to mitigate these drawbacks and allows accurate signal processing in an embedded system. Simulations show at least a 10× improvement over existing on-line detection methods.
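
    A threshold-with-minimum-duration detector illustrates the kind of discrete-event extraction involved; this is a generic sketch, not the paper's method, and the names are hypothetical:

```python
def detect_events(signal, threshold, min_len=3):
    # Report (start, end) index pairs where the signal stays below
    # `threshold` for at least `min_len` consecutive samples, so that
    # single-sample noise spikes are not counted as events.
    events, start = [], None
    for i, x in enumerate(signal):
        if x < threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_len:
                events.append((start, i))
            start = None
    if start is not None and len(signal) - start >= min_len:
        events.append((start, len(signal)))
    return events
```

    On a synthetic trace with an open-pore level of 1.0 and one blockade dipping to 0.2, a threshold of 0.5 recovers the blockade while rejecting one-sample glitches.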

  1. Autonomous control of production networks using a pheromone approach

    NASA Astrophysics Data System (ADS)

    Armbruster, D.; de Beer, C.; Freitag, M.; Jagalski, T.; Ringhofer, C.

    2006-04-01

    The flow of parts through a production network is usually pre-planned by a central control system. Such central control fails in the presence of highly fluctuating demand and/or unforeseen disturbances. To manage such dynamic networks with low work-in-progress and short throughput times, an autonomous control approach is proposed. Autonomous control means decentralized routing by the autonomous parts themselves. The parts' decisions are based on backward-propagated information about the throughput times of finished parts on different routes, so routes with shorter throughput times attract further parts. This process can be compared to ants leaving pheromones on their way to communicate with following ants. The paper focuses on a mathematical description of such autonomously controlled production networks. A fluid model with limited service rates in a general network topology is derived and compared to a discrete-event simulation model. Whereas the discrete-event simulation of production networks is straightforward, the formulation of the addressed scenario in terms of a fluid model is challenging. Here it is shown how several problems in the fluid-model formulation (e.g. discontinuities) can be handled mathematically. Finally, simulation results for the pheromone-based control with both the discrete-event simulation model and the fluid model are presented for a time-dependent influx.
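
    The backward-propagated throughput-time mechanism can be sketched as a small router in which an exponential moving average plays the role of an evaporating pheromone trail (class and parameter names are illustrative assumptions, not the paper's formulation):

```python
class PheromoneRouter:
    # Decentralized routing sketch: finished parts report their throughput
    # time back to the route's entry point; new parts pick the route with
    # the shortest smoothed time.  Routes start at 0.0, so unexplored
    # routes are tried first -- a crude form of exploration.
    def __init__(self, routes, alpha=0.5):
        self.avg = {r: 0.0 for r in routes}  # smoothed throughput time
        self.alpha = alpha                   # evaporation / smoothing rate

    def choose(self):
        return min(self.avg, key=self.avg.get)

    def report(self, route, throughput_time):
        a = self.alpha
        self.avg[route] = (1 - a) * self.avg[route] + a * throughput_time
```

    Raising `alpha` makes old reports "evaporate" faster, which is what lets the routing adapt to disturbances and fluctuating demand.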

  2. United States Marine Corps Motor Transport Mechanic-to-Equipment Ratio

    DTIC Science & Technology

    time motor transport equipment remains in maintenance at the organizational command level. This thesis uses a discrete event simulation model of the...applied to a single experiment that allows for assessment of risk of not achieving the objective. Inter-arrival time, processing time, work schedule

  3. Electronic circuit detects left ventricular ejection events in cardiovascular system

    NASA Technical Reports Server (NTRS)

    Gebben, V. D.; Webb, J. A., Jr.

    1972-01-01

    Electronic circuit processes arterial blood pressure waveform to produce discrete signals that coincide with beginning and end of left ventricular ejection. Output signals provide timing signals for computers that monitor cardiovascular systems. Circuit operates reliably for heart rates between 50 and 200 beats per minute.

  4. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    PubMed

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
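
    The core DICE loop, conditions whose levels are updated only at event times, can be sketched as follows. This is a minimal illustration of the idea, not Caro's MS Excel implementation, and all names are hypothetical:

```python
import heapq
import itertools

def run_dice(conditions, events, horizon):
    # `conditions` maps name -> level; `events` is a list of (time, fn).
    # Each fn(conditions, schedule) may update condition levels and/or
    # schedule further events, discretely integrating the two.
    seq = itertools.count()  # tie-breaker so equal times never compare fns
    agenda = [(t, next(seq), fn) for t, fn in events]
    heapq.heapify(agenda)

    def schedule(t, fn):
        heapq.heappush(agenda, (t, next(seq), fn))

    while agenda and agenda[0][0] <= horizon:
        t, _, fn = heapq.heappop(agenda)
        fn(conditions, schedule)
    return conditions
```

    A disease-progression event, for instance, would raise a `disease` condition level and schedule a later cost-accruing event; valuations (utility, cost) can then be read off the condition levels at any time.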

  5. Flexible Programmes in Higher Professional Education: Expert Validation of a Flexible Educational Model

    ERIC Educational Resources Information Center

    Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.

    2010-01-01

    In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…

  6. Educational Attainment as Process: Using Hierarchical Discrete-Time Event History Analysis to Model Rate of Progress

    ERIC Educational Resources Information Center

    Bahr, Peter Riley

    2009-01-01

    Variables that address student enrollment patterns (e.g., persistence, enrollment inconsistency, completed credit hours, course credit load, course completion rate, procrastination) constitute a longstanding fixture of analytical strategies in educational research, particularly research that focuses on explaining variation in academic outcomes.…

  7. Discrete-event system simulation on small and medium enterprises productivity improvement

    NASA Astrophysics Data System (ADS)

    Sulistio, J.; Hidayah, N. A.

    2017-12-01

    Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need an analysis and evaluation of the production process in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can solve complex problems due to the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the averages of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop several alternatives. The article concludes that the productivity of the SME production floor increased by up to 50% after adding capacity to the dyeing and drying machines.

  8. Patient flow improvement for an ophthalmic specialist outpatient clinic with aid of discrete event simulation and design of experiment.

    PubMed

    Pan, Chong; Zhang, Dali; Kon, Audrey Wan Mei; Wai, Charity Sue Lea; Ang, Woo Boon

    2015-06-01

    Continuous improvement in process efficiency for specialist outpatient clinic (SOC) systems is increasingly being demanded due to the growth of the patient population in Singapore. In this paper, we propose a discrete event simulation (DES) model to represent the patient and information flow in an ophthalmic SOC system in the Singapore National Eye Centre (SNEC). Different improvement strategies to reduce the turnaround time for patients in the SOC were proposed and evaluated with the aid of the DES model and the Design of Experiment (DOE). Two strategies for better patient appointment scheduling and one strategy for dilation-free examination are estimated to have a significant impact on turnaround time for patients. One of the improvement strategies has been implemented in the actual SOC system in the SNEC with promising improvement reported.

  9. Changes in the Martian atmosphere induced by auroral electron precipitation

    NASA Astrophysics Data System (ADS)

    Shematovich, V. I.; Bisikalo, D. V.; Gérard, J.-C.; Hubert, B.

    2017-09-01

    Typical auroral events in the Martian atmosphere, such as the discrete and diffuse auroral emissions detected by UV spectrometers onboard ESA Mars Express and NASA MAVEN, are investigated. Auroral electron kinetic energy distribution functions and energy spectra of the upward and downward electron fluxes are obtained by electron transport calculations using a kinetic Monte Carlo model. These characteristics of the auroral electron fluxes make it possible to calculate both the precipitation-induced changes in the atmosphere and the observed manifestations of auroral events on Mars, in particular the intensities of discrete and diffuse auroral emissions in the UV and visible wavelength ranges (Soret et al., 2016; Bisikalo et al., 2017; Gérard et al., 2017). For these auroral-event conditions the analysis is carried out, and the contribution of the fluxes of precipitating electrons to the heating and ionization of the Martian atmosphere is estimated. Numerical calculations show that in the case of discrete auroral events the residual crustal magnetic field leads to a significant increase in the upward electron fluxes, which in turn decreases the heating and ionization rates of the atmospheric gas in comparison with calculations that neglect the residual magnetic field. It is shown that all the above-mentioned impact factors of auroral electron precipitation should be taken into account both in photochemical models of the Martian atmosphere and in the interpretation of observations of the chemical composition and its variations using the ACS instrument onboard ExoMars.

  10. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system

    PubMed Central

    2010-01-01

    Background The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the “in silico” stochastic event based modeling approach to find the molecular dynamics of the system. Results In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Conclusions Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. 
We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics. PMID:21143785

  11. Discrete diffusion models to study the effects of Mg2+ concentration on the PhoPQ signal transduction system.

    PubMed

    Ghosh, Preetam; Ghosh, Samik; Basu, Kalyan; Das, Sajal K; Zhang, Chaoyang

    2010-12-01

    The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain a reasonable accuracy of the biological process and also reduces the computational overhead. This objective motivates the use of new methods that can transform the problem from energy and affinity based modeling to information theory based modeling. To achieve this, we transform all dynamics within the cell into a random event time, which is specified through an information domain measure like probability distribution. This allows us to use the "in silico" stochastic event based modeling approach to find the molecular dynamics of the system. In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extra-cellular Mg2+ concentration in the two component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information domain measure of the molecular transport process by estimating the statistical parameters of inter-arrival time between molecules/ions coming to a cell receptor as external signal. This model transforms the diffusion process into the information theory measure of stochastic event completion time to get the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in-silico effects of this external trigger on the PhoPQ system. Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. Also, the proposed simulation framework can incorporate the stochasticity in cellular environments to a certain degree of accuracy. 
We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics.

  12. Complex discrete dynamics from simple continuous population models.

    PubMed

    Gamarra, Javier G P; Solé, Ricard V

    2002-05-01

    Nonoverlapping generations have classically been modelled with difference equations in order to account for the discrete nature of reproductive events. However, other events, such as resource consumption or mortality, are continuous and take place in within-generation time. We therefore assume a realistic hybrid two-dimensional ODE model of resources and consumers with discrete reproduction events. Numerical and analytical approaches show that the resulting dynamics resembles a Ricker map, including the period-doubling route to chaos. Stochastic simulations with a handling-time parameter for indirect competition among juveniles may affect the qualitative behaviour of the model.
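
    The Ricker map that the hybrid model resembles is easy to reproduce. The sketch below iterates n -> n*exp(r*(1-n)), discards transients, and returns the attractor (parameter values are illustrative, not the paper's):

```python
import math

def ricker_orbit(n0, r, transients=300, keep=8):
    # Iterate the Ricker map, discard `transients` steps, then return the
    # next `keep` points of the orbit (the attractor).
    n = n0
    for _ in range(transients):
        n = n * math.exp(r * (1 - n))
    orbit = []
    for _ in range(keep):
        n = n * math.exp(r * (1 - n))
        orbit.append(n)
    return orbit
```

    At r = 1.5 the orbit settles on the fixed point n = 1; past the first period-doubling at r = 2 it alternates between two values, and further doublings lead to chaos.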

  13. The Hungtsaiping landslide:A kinematic model based on morphology

    NASA Astrophysics Data System (ADS)

    Huang, W.-K.; Chu, H.-K.; Lo, C.-M.; Lin, M.-L.

    2012-04-01

    A large, deep-seated landslide at Hungtsaiping was triggered by the magnitude-7.3 1999 Chi-Chi earthquake. Extensive site investigations of the landslide were conducted, including field reconnaissance, geophysical exploration, borehole logs, and laboratory experiments. Thick colluvium was found around the landslide area, indicating the occurrence of a large ancient landslide. This study presents the catastrophic landslide event which occurred during the Chi-Chi earthquake and clarifies the mechanism of the 1999 landslide, which cannot be revealed from the underground exploration data alone. The research includes investigations of the landslide kinematic process and the deposition geometry. A 3D discrete element program, PFC3D, was used to model the kinematic process that led to the landslide. The proposed procedure enables a rational and efficient way to simulate the landslide dynamic process. Keywords: Hungtsaiping catastrophic landslide, kinematic process, deposition geometry, discrete element method

  14. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.

  15. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
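
    The division of labor described, passive simulation objects that only hold state and active event objects with their own time stamps that read and modify it, can be sketched as follows. This is a loose, hypothetical illustration of the pattern, not the patented system:

```python
import heapq
import itertools

class SimObject:
    # Passive self-contained simulation object: it only holds state;
    # event objects are restricted to reading and changing it.
    def __init__(self, **state):
        self.state = dict(state)

class Event:
    # Active event object carrying its own time stamp and behavior.
    def __init__(self, time, target, action):
        self.time, self.target, self.action = time, target, action

def simulate(events, until):
    seq = itertools.count()  # tie-breaker for events with equal times
    q = [(e.time, next(seq), e) for e in events]
    heapq.heapify(q)
    processed = 0
    while q and q[0][0] <= until:
        _, _, e = heapq.heappop(q)
        for new in e.action(e.target) or []:  # actions may emit new events
            heapq.heappush(q, (new.time, next(seq), new))
        processed += 1
    return processed
```

    In the parallel setting of the patent, events emitted for objects on other nodes would become inter-node messages instead of local heap pushes.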

  16. Discrete Event Simulation of Distributed Team Communication

    DTIC Science & Technology

    2012-03-22

    performs, and auditory information that is provided through multiple audio devices with speech response. This paper extends previous discrete event workload...2008, pg. 1) notes that “Architecture modeling furnishes abstrac- tions for use in managing complexities, allowing engineers to visualise the proposed

  17. Application of Petri Nets in Bone Remodeling

    PubMed Central

    Li, Lingxi; Yokota, Hiroki

    2009-01-01

    Understanding the mechanism of bone remodeling is a challenging task for both life scientists and model builders, since this highly interactive and nonlinear process can seldom be grasped by simple intuition. A set of ordinary differential equations (ODEs) has been built for simulating bone formation as well as bone resorption. Although solving ODEs numerically can provide useful predictions of dynamical behaviors in a continuous time frame, an actual bone remodeling process in living tissues is driven by discrete events of molecular and cellular interactions. Thus, an event-driven tool such as Petri nets (PNs), which may dynamically and graphically mimic individual molecular collisions or cellular interactions, seems well suited to augment the existing ODE-based systems analysis. Here, we applied PNs to expand the ODE-based approach and examined discrete, dynamical behaviors of key regulatory molecules and bone cells. PNs have been used in many engineering areas, but their application to biological systems remains to be explored. Our PN model was based on 8 ODEs that described an osteoprotegerin-linked molecular pathway consisting of 4 types of bone cells. The model allowed us to conduct both qualitative and quantitative evaluations and to examine homeostatic equilibrium states. The results support the view that PN models assist the understanding of an event-driven bone remodeling mechanism through PN-specific constructs such as places, transitions, and firings. PMID:19838338
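
    The firing rule that drives such event-driven PN models can be sketched directly. The example net used in the test below is a made-up bone-resorption toy, not the authors' 8-ODE model:

```python
def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    # Firing consumes `pre` tokens and produces `post` tokens -- one
    # discrete event, e.g. one cellular interaction.
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m
```

    Repeatedly choosing and firing enabled transitions yields the discrete trajectories that the PN analysis examines alongside the continuous ODE solutions.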

  18. Deciding the liveness for a subclass of weighted Petri nets based on structurally circular wait

    NASA Astrophysics Data System (ADS)

    Liu, GuanJun; Chen, LiJing

    2016-05-01

    Weighted Petri nets, as a kind of formal language, are widely used to model and verify discrete event systems related to resource allocation, such as flexible manufacturing systems. The System of Simple Sequential Processes with Multi-Resources (S3PMR), a subclass of weighted Petri nets and an important extension of the well-known System of Simple Sequential Processes with Resources, can model many discrete event systems in which (1) multiple processes may run in parallel and (2) each execution step of each process may use multiple units from multiple resource types. This paper gives a necessary and sufficient condition for the liveness of S3PMR. A new structural concept called Structurally Circular Wait (SCW) is proposed for S3PMR, and a Blocking Marking (BM) associated with an SCW is defined. It is proven that a marked S3PMR is live if and only if no SCW has a BM. We use an example of a multi-processor system-on-chip to show that SCW and BM can precisely characterise the (partial) deadlocks of S3PMR, and two further examples show the advantages of SCW in preventing deadlocks in S3PMR. These results are significant for further research on the deadlock problem.

  19. Markov-chain model of classified atomistic transition states for discrete kinetic Monte Carlo simulations.

    PubMed

    Numazawa, Satoshi; Smith, Roger

    2011-10-01

    Classical harmonic transition state theory is considered and applied in discrete lattice cells with hierarchical transition levels. The scheme is then used to determine transitions that can be applied in a lattice-based kinetic Monte Carlo (KMC) atomistic simulation model. The model results in an effective reduction of KMC simulation steps by utilizing a classification scheme of transition levels for thermally activated atomistic diffusion processes. Thermally activated atomistic movements are considered as local transition events constrained in potential energy wells over certain local time periods. These processes are represented by Markov chains of multidimensional Boolean valued functions in three-dimensional lattice space. The events inhibited by the barriers under a certain level are regarded as thermal fluctuations of the canonical ensemble and accepted freely. Consequently, the fluctuating system evolution process is implemented as a Markov chain of equivalence class objects. It is shown that the process can be characterized by the acceptance of metastable local transitions. The method is applied to a problem of Au and Ag cluster growth on a rippled surface. The simulation predicts the existence of a morphology-dependent transition time limit from a local metastable to stable state for subsequent cluster growth by accretion. Excellent agreement with observed experimental results is obtained.
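
    The rejection-free selection step at the heart of lattice KMC can be sketched as follows (a generic BKL/Gillespie-style step; the paper's classification scheme of transition levels would determine which rates enter the list):

```python
import math
import random

def kmc_step(rates, rng=random):
    # Rejection-free (n-fold way) step: choose event i with probability
    # rate_i / total, then advance the clock by an exponentially
    # distributed increment with mean 1 / total.
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            break
    dt = -math.log(1.0 - rng.random()) / total
    return i, dt
```

    Grouping transitions into hierarchical levels, as the abstract describes, reduces the number of distinct rates this step must scan, which is where the reduction in KMC simulation steps comes from.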

  20. Modeling Airport Ground Operations using Discrete Event Simulation (DES) and X3D Visualization

    DTIC Science & Technology

    2008-03-01

    scenes. It is written in open-source Java and XML using the Netbeans platform, which gave the features of being suitable as standalone applications...and as a plug-in module for the Netbeans integrated development environment (IDE). X3D Graphics is the tool used for the creation of...process is shown in Figure 2. To create a new event graph in Viskit, first, the Viskit tool must be launched via Netbeans or from the executable

  1. Generalized Detectability for Discrete Event Systems

    PubMed Central

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432
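
    The state-estimation idea behind detectability can be sketched for fully observable events. This is a simplification for illustration: real detectability also closes the estimate under unobservable transitions, which this toy omits:

```python
def observe_step(estimate_set, transitions, label):
    # From every state we might currently be in, keep the targets of
    # transitions that carry the observed event label.
    return {dst for s in estimate_set
                for (src, lbl, dst) in transitions
                if src == s and lbl == label}

def estimate(initial, transitions, observed):
    # Current-state estimate after a sequence of observations; the system
    # is "detected" (in this simplified sense) when the set is a singleton.
    est = set(initial)
    for label in observed:
        est = observe_step(est, transitions, label)
    return est
```

    In the test below the nondeterministic 'a' step leaves two candidate states, but the subsequent 'b' collapses the estimate to one, which is the phenomenon detectability formalizes.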

  2. Decoding Overlapping Memories in the Medial Temporal Lobes Using High-Resolution fMRI

    ERIC Educational Resources Information Center

    Chadwick, Martin J.; Hassabis, Demis; Maguire, Eleanor A.

    2011-01-01

    The hippocampus is proposed to process overlapping episodes as discrete memory traces, although direct evidence for this in human episodic memory is scarce. Using green-screen technology we created four highly overlapping movies of everyday events. Participants were scanned using high-resolution fMRI while recalling the movies. Multivariate…

  3. Simulation of a master-slave event set processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comfort, J.C.

    1984-03-01

Event set manipulation may consume a considerable amount of the computation time spent in performing a discrete-event simulation. One way of minimizing this time is to allow event set processing to proceed in parallel with the remainder of the simulation computation. The paper describes a multiprocessor simulation computer, in which all non-event set processing is performed by the principal processor (called the host). Event set processing is coordinated by a front end processor (the master) and actually performed by several other functionally identical processors (the slaves). A trace-driven simulation program modeling this system was constructed, and was run with trace output taken from two different simulation programs. Output from this simulation suggests that a significant reduction in run time may be realized by this approach. Sensitivity analysis was performed on the significant parameters to the system (number of slave processors, relative processor speeds, and interprocessor communication times). A comparison between actual and simulation run times for a one-processor system was used to assist in the validation of the simulation. 7 references.
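The event-set processing being offloaded here is, at its core, maintenance of a time-ordered pending-event list. A minimal single-processor version (a generic sketch, not the paper's master-slave design) is just a priority queue:

```python
import heapq

class EventList:
    """Minimal central event list: a binary heap of (time, seq, name)."""
    def __init__(self):
        self._heap = []
        self._seq = 0
    def schedule(self, time, name):
        heapq.heappush(self._heap, (time, self._seq, name))
        self._seq += 1          # tie-breaker: FIFO among equal timestamps
    def next_event(self):
        return heapq.heappop(self._heap)

ev = EventList()
ev.schedule(5.0, 'depart')
ev.schedule(2.0, 'arrive')
ev.schedule(9.0, 'end')
order = [ev.next_event()[2] for _ in range(3)]   # time order, not insertion order
```

Every `schedule`/`next_event` pair costs O(log n), which is exactly the overhead the master-slave architecture tries to hide behind the host's other computation.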

  4. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.

  5. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  6. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
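As a taste of what such a model involves, here is a minimal single-scanner queue sketch (hypothetical parameters; the article's models and software packages are not reproduced here): patients arrive at random, wait if the scanner is busy, and the mean wait summarizes the workflow.

```python
import random

def simulate_scanner(n_patients, mean_interarrival, mean_scan, seed=1):
    """Single-scanner FCFS queue: returns the mean patient wait (minutes)."""
    rng = random.Random(seed)
    clock, free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / mean_interarrival)   # next arrival
        start = max(clock, free_at)        # wait only if the scanner is busy
        waits.append(start - clock)
        free_at = start + rng.expovariate(1.0 / mean_scan)  # scan finishes
    return sum(waits) / len(waits)

mean_wait = simulate_scanner(500, mean_interarrival=12.0, mean_scan=9.0)
```

Commercial DES packages add graphical model building, animation, and statistics collection on top of exactly this kind of arrival/service logic.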

  7. Parallel discrete-event simulation of FCFS stochastic queueing networks

    NASA Technical Reports Server (NTRS)

    Nicol, David M.

    1988-01-01

Physical systems are inherently parallel. Intuition suggests that simulations of these systems may be amenable to parallel execution. The parallel execution of a discrete-event simulation requires careful synchronization of processes in order to ensure the execution's correctness; this synchronization can degrade performance. Largely negative results were recently reported in a study which used a well-known synchronization method on queueing network simulations. Discussed here is a synchronization method (appointments), which has proven effective on simulations of FCFS queueing networks. The key concept behind appointments is the provision of lookahead. Lookahead is a prediction of a processor's future behavior, based on an analysis of the processor's simulation state. It is shown how lookahead can be computed for FCFS queueing network simulations; performance data are given that demonstrate the method's effectiveness under moderate to heavy loads; and performance tradeoffs between the quality of lookahead and the cost of computing it are discussed.
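The lookahead computation can be made concrete. One standard way to obtain lookahead in FCFS networks is to pre-sample each job's service time when it joins the queue; under that assumption (illustrative values below), the k-th pending departure cannot occur before the cumulative service of the first k pending jobs has elapsed:

```python
from itertools import accumulate

def fcfs_lookahead(clock, pending_service_times):
    """Earliest possible departure times at a FCFS server: the k-th
    queued job cannot leave before the clock plus the cumulative
    service of the first k queued jobs."""
    return [clock + c for c in accumulate(pending_service_times)]

# A server at t=10 with pre-sampled service times 2.0, 1.5 and 4.0 can
# promise its neighbors (an "appointment") that nothing departs before t=12.
departures = fcfs_lookahead(10.0, [2.0, 1.5, 4.0])
```

The first entry of this list is the appointment: neighboring processors may safely simulate up to that time without waiting for further messages.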

  8. SIGMA--A Graphical Approach to Teaching Simulation.

    ERIC Educational Resources Information Center

    Schruben, Lee W.

    1992-01-01

    SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…

  9. Developing Flexible Discrete Event Simulation Models in an Uncertain Policy Environment

    NASA Technical Reports Server (NTRS)

    Miranda, David J.; Fayez, Sam; Steele, Martin J.

    2011-01-01

On February 1st, 2010 U.S. President Barack Obama submitted to Congress his proposed budget request for Fiscal Year 2011. This budget included significant changes to the National Aeronautics and Space Administration (NASA), including the proposed cancellation of the Constellation Program. This change proved to be controversial, and Congressional approval of the program's official cancellation would take many months to complete. During this same period an end-to-end discrete event simulation (DES) model of Constellation operations was being built through the joint efforts of Productivity Apex Inc. (PAI) and Science Applications International Corporation (SAIC) teams under the guidance of NASA. The uncertainty regarding the Constellation program presented a major challenge to the DES team: to continue the development of this program-of-record simulation while at the same time remaining prepared for possible changes to the program. This required the team to rethink how it would develop its model and make the model flexible enough to support possible future vehicles while at the same time specific enough to support the program-of-record. The challenge was compounded by the fact that the model was being developed through the traditional DES process-orientation, which lacks the flexibility of object-oriented approaches. The team met this challenge through significant pre-planning that led to the "modularization" of the model's structure: identifying what was generic, finding natural logic break points, and standardizing the interlogic numbering system. The outcome of this work was a model that not only was ready to be easily modified to support any future rocket programs, but also was extremely structured and organized in a way that facilitated rapid verification. This paper discusses in detail the process the team followed to build this model and the many advantages this method provides builders of traditional process-oriented discrete event simulations.

  10. Modeling and Simulation of Metallurgical Process Based on Hybrid Petri Net

    NASA Astrophysics Data System (ADS)

    Ren, Yujuan; Bao, Hong

    2016-11-01

In order to achieve the goals of energy saving and emission reduction in iron and steel enterprises, an increasing number of modeling and simulation technologies are used to study and analyse the metallurgical production process. In this paper, the basic principles of Hybrid Petri nets are used to model and analyse the metallurgical process. Firstly, the definition of the Hybrid Petri Net System of Metallurgical Process (MPHPNS) and its modeling theory are proposed. Secondly, a model of MPHPNS based on material flow is constructed. The dynamic flow of materials and the real-time change of each technological state in the metallurgical process are simulated vividly using this model. The simulation realizes interaction between the continuous event dynamic system and the discrete event dynamic system at the same level, and plays a positive role in production decision-making.

  11. SPEEDES - A multiple-synchronization environment for parallel discrete-event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeff S.

    1992-01-01

    Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES) is a unified parallel simulation environment. It supports multiple-synchronization protocols without requiring users to recompile their code. When a SPEEDES simulation runs on one node, all the extra parallel overhead is removed automatically at run time. When the same executable runs in parallel, the user preselects the synchronization algorithm from a list of options. SPEEDES currently runs on UNIX networks and on the California Institute of Technology/Jet Propulsion Laboratory Mark III Hypercube. SPEEDES also supports interactive simulations. Featured in the SPEEDES environment is a new parallel synchronization approach called Breathing Time Buckets. This algorithm uses some of the conservative techniques found in Time Bucket synchronization, along with the optimism that characterizes the Time Warp approach. A mathematical model derived from first principles predicts the performance of Breathing Time Buckets. Along with the Breathing Time Buckets algorithm, this paper discusses the rules for processing events in SPEEDES, describes the implementation of various other synchronization protocols supported by SPEEDES, describes some new ones for the future, discusses interactive simulations, and then gives some performance results.

  12. Robust inference in discrete hazard models for randomized clinical trials.

    PubMed

    Nguyen, Vinh Q; Gillen, Daniel L

    2012-10-01

    Time-to-event data in which failures are only assessed at discrete time points are common in many clinical trials. Examples include oncology studies where events are observed through periodic screenings such as radiographic scans. When the survival endpoint is acknowledged to be discrete, common methods for the analysis of observed failure times include the discrete hazard models (e.g., the discrete-time proportional hazards and the continuation ratio model) and the proportional odds model. In this manuscript, we consider estimation of a marginal treatment effect in discrete hazard models where the constant treatment effect assumption is violated. We demonstrate that the estimator resulting from these discrete hazard models is consistent for a parameter that depends on the underlying censoring distribution. An estimator that removes the dependence on the censoring mechanism is proposed and its asymptotic distribution is derived. Basing inference on the proposed estimator allows for statistical inference that is scientifically meaningful and reproducible. Simulation is used to assess the performance of the presented methodology in finite samples.
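For intuition, the discrete-time survival function underlying such hazard models is a simple product over assessment times (a generic sketch, not the authors' proposed estimator):

```python
def discrete_survival(hazards):
    """S(t_k) = prod_{j<=k} (1 - h_j) for per-interval hazards h_j,
    e.g. fitted by a discrete-time (logistic) proportional hazards model."""
    surv, curve = 1.0, []
    for h in hazards:
        surv *= 1.0 - h
        curve.append(surv)
    return curve

# A constant hazard of 0.2 per scheduled scan: survival after three
# scans is 0.8 ** 3 = 0.512.
curve = discrete_survival([0.2, 0.2, 0.2])
```

In the paper's setting the fitted per-interval hazards, and hence this product, can be distorted by the censoring distribution when the treatment effect is not constant, which motivates the proposed censoring-robust estimator.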

  13. Integrated G and C Implementation within IDOS: A Simulink Based Reusable Launch Vehicle Simulation

    NASA Technical Reports Server (NTRS)

    Fisher, Joseph E.; Bevacqua, Tim; Lawrence, Douglas A.; Zhu, J. Jim; Mahoney, Michael

    2003-01-01

The implementation of multiple Integrated Guidance and Control (IG&C) algorithms per flight phase within a vehicle simulation poses a daunting task to coordinate algorithm interactions with the other G&C components and with vehicle subsystems. Currently being developed by Universal Space Lines LLC (USL) under contract from NASA, the Integrated Development and Operations System (IDOS) contains a high fidelity Simulink vehicle simulation, which provides a means to test cutting edge G&C technologies. Combining the modularity of this vehicle simulation and Simulink's built-in primitive blocks provides a quick way to implement algorithms. To add discrete-event functionality to the unfinished IDOS simulation, Vehicle Event Manager (VEM) and Integrated Vehicle Health Monitoring (IVHM) subsystems were created to provide discrete-event and pseudo-health monitoring processing capabilities. Matlab's Stateflow is used to create the IVHM and Event Manager subsystems and to implement a supervisory logic controller referred to as the Auto-commander as part of the IG&C to coordinate the control system adaptation and reconfiguration and to select the control and guidance algorithms for a given flight phase. Manual creation of the Stateflow charts for all of these subsystems is a tedious and time-consuming process. The Stateflow Auto-builder was developed as a Matlab-based software tool for the automatic generation of a Stateflow chart from information contained in a database. This paper describes the IG&C, VEM and IVHM implementations in IDOS. In addition, this paper describes the Stateflow Auto-builder.

  14. Discrete event simulation: the preferred technique for health economic evaluations?

    PubMed

    Caro, Jaime J; Möller, Jörgen; Getsios, Denis

    2010-12-01

    To argue that discrete event simulation should be preferred to cohort Markov models for economic evaluations in health care. The basis for the modeling techniques is reviewed. For many health-care decisions, existing data are insufficient to fully inform them, necessitating the use of modeling to estimate the consequences that are relevant to decision-makers. These models must reflect what is known about the problem at a level of detail sufficient to inform the questions. Oversimplification will result in estimates that are not only inaccurate, but potentially misleading. Markov cohort models, though currently popular, have so many limitations and inherent assumptions that they are inadequate to inform most health-care decisions. An event-based individual simulation offers an alternative much better suited to the problem. A properly designed discrete event simulation provides more accurate, relevant estimates without being computationally prohibitive. It does require more data and may be a challenge to convey transparently, but these are necessary trade-offs to provide meaningful and valid results. In our opinion, discrete event simulation should be the preferred technique for health economic evaluations today. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).

  15. Simulating Sustainment for an Unmanned Logistics System Concept of Operation in Support of Distributed Operations

    DTIC Science & Technology

    2017-06-01

This thesis uses a designed experiment to model and explore a ship-to-shore logistics process supporting dispersed units via three types of ULSs, which vary primarily in… Keywords: …systems, simulation, discrete event simulation, design of experiments, data analysis, simplekit, nearly orthogonal and balanced designs.

  16. Generating Discrete Power-Law Distributions from a Death- Multiple Immigration Population Process

    NASA Astrophysics Data System (ADS)

    Matthews, J. O.; Jakeman, E.; Hopcraft, K. I.

    2003-04-01

We consider the evolution of a simple population process governed by deaths and multiple immigrations that arrive with rates particular to their order. For a particular choice of rates, the equilibrium solution has a discrete power-law form. The model is a generalization of a process investigated previously where immigrants arrived in pairs [1]. The general properties of this model are discussed in a companion paper. The population is initiated with precisely M individuals present and evolves to an equilibrium distribution with a power-law tail. However, the power-law tails of the equilibrium distribution are established immediately, so that moments and correlation properties of the population are undefined for any non-zero time. The technique we develop to characterize this process utilizes external monitoring that counts the emigrants leaving the population in specified time intervals. This counting distribution also possesses a power-law tail for all sampling times, and the resulting time series exhibits two features worthy of note: first, a large variation in the strength of the signal, reflecting the power-law PDF; and second, intermittency of the emissions. We show that counting with a detector of finite dynamic range naturally regularizes the fluctuations, in effect 'clipping' the events. All previously undefined characteristics, such as the mean, the autocorrelation, and the probabilities of the time to the first event and of the time between events, become well defined and are derived. These properties, although obtained by discarding much data, nevertheless possess embedded power-law regimes that characterize the population in a way analogous to box-averaging determination of fractal dimension.
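The process itself is straightforward to simulate with a standard Gillespie-style algorithm (illustrative rates below; the particular rate choice that yields the discrete power-law equilibrium is given in the paper): each individual dies at a fixed per-capita rate, and batches of r immigrants arrive at rates specific to r.

```python
import random

def simulate_dmi(n0, t_end, death_rate, batch_rates, seed=7):
    """Death/multiple-immigration population: each individual dies at
    rate `death_rate`; a batch of r immigrants arrives at rate
    batch_rates[r]. Returns the population at time t_end."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    while True:
        events = [(-1, death_rate * n)]              # one death: n -> n-1
        events += [(r, a) for r, a in batch_rates.items()]  # batch: n -> n+r
        total = sum(rate for _, rate in events)
        t += rng.expovariate(total)                  # time to next event
        if t >= t_end:
            return n
        x = rng.random() * total                     # pick event by rate
        for change, rate in events:
            if x < rate:
                n += change
                break
            x -= rate

final_n = simulate_dmi(n0=5, t_end=50.0, death_rate=1.0,
                       batch_rates={1: 0.5, 2: 0.25, 3: 0.125})
```

Counting the departures (the `change == -1` events) in fixed windows would reproduce the externally monitored counting distribution discussed above.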

  17. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
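The optimization loop can be sketched generically. Below, a toy genetic algorithm searches over integer resource levels, with a deterministic stand-in for the simulation's cost function (the paper's model, operators, and constraint handling are not reproduced here):

```python
import random

def ga_minimize(cost, bounds, pop_size=20, gens=30, seed=0):
    """Toy GA over integer decision vectors; `cost` may be a (stochastic)
    discrete event simulation response in the real application."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.randint(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)                       # best (lowest cost) first
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(len(bounds))     # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(bounds))       # integer point mutation
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.choice([-1, 1])))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

# Hypothetical stand-in cost: two staffing levels whose ideal values are 3 and 7.
best = ga_minimize(lambda x: (x[0] - 3) ** 2 + (x[1] - 7) ** 2,
                   [(0, 10), (0, 10)])
```

Because selection and crossover need only cost comparisons, no continuity or differentiability of the simulation response is required, which is the property the paper exploits.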

  18. A Software Development Simulation Model of a Spiral Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  19. Core Competencies and the Prevention of School Failure and Early School Leaving

    ERIC Educational Resources Information Center

    Bradshaw, Catherine P.; O'Brennan, Lindsey M.; McNeely, Clea A.

    2008-01-01

    There is an increasing awareness that school failure and early school leaving are processes, rather than discrete events, that often co-occur and can have lasting negative effects on children's development. Most of the literature has focused on risk factors for failure and dropout rather than on the promotion of competencies that can increase…

  20. A Process Improvement Study on a Military System of Clinics to Manage Patient Demand and Resource Utilization Using Discrete-Event Simulation, Sensitivity Analysis, and Cost-Benefit Analysis

    DTIC Science & Technology

    2015-03-12

The study examines three clinics, including the Audiology Clinic and the Optometry Clinic. Methodology overview: the overarching research goal is to identify feasible solutions to…

  1. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Yasar, Murat; Tolani, Devendra; Ray, Asok; Shah, Neerav; Litt, Jonathan S.

    2004-01-01

    This paper presents a hierarchical application of Discrete Event Supervisory (DES) control theory for intelligent decision and control of a twin-engine aircraft propulsion system. A dual layer hierarchical DES controller is designed to supervise and coordinate the operation of two engines of the propulsion system. The two engines are individually controlled to achieve enhanced performance and reliability, necessary for fulfilling the mission objectives. Each engine is operated under a continuously varying control system that maintains the specified performance and a local discrete-event supervisor for condition monitoring and life extending control. A global upper level DES controller is designed for load balancing and overall health management of the propulsion system.

  2. Discrete Events as Units of Perceived Time

    ERIC Educational Resources Information Center

    Liverence, Brandon M.; Scholl, Brian J.

    2012-01-01

    In visual images, we perceive both space (as a continuous visual medium) and objects (that inhabit space). Similarly, in dynamic visual experience, we perceive both continuous time and discrete events. What is the relationship between these units of experience? The most intuitive answer may be similar to the spatial case: time is perceived as an…

  3. Modeling Repeatable Events Using Discrete-Time Data: Predicting Marital Dissolution

    ERIC Educational Resources Information Center

    Teachman, Jay

    2011-01-01

    I join two methodologies by illustrating the application of multilevel modeling principles to hazard-rate models with an emphasis on procedures for discrete-time data that contain repeatable events. I demonstrate this application using data taken from the 1995 National Survey of Family Growth (NSFG) to ascertain the relationship between multiple…

  4. Modeling and simulation of count data.

    PubMed

    Plan, E L

    2014-08-13

Count data, or numbers of events per time interval, are discrete data arising from repeated time-to-event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
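The Poisson building block is easy to sketch (a generic illustration, not the tutorial's pharmacometric models): simulate per-interval counts, then check the variance-to-mean ratio, the basic overdispersion diagnostic.

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's method: number of events in one interval with mean lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(42)
counts = [poisson_sample(4.0, rng) for _ in range(20000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
dispersion = var / mean   # ~1 for Poisson data; well above 1 flags overdispersion
```

A dispersion ratio substantially above 1 in real trial data would motivate the negative binomial or other overdispersed members of the model family mentioned in the tutorial.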

  5. Analysis hierarchical model for discrete event systems

    NASA Astrophysics Data System (ADS)

    Ciortea, E. M.

    2015-11-01

This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models can be applied on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot's timing: transport and transmission times obtained by on-the-spot measurement yield graphics showing the average time for the transport activity, for individual sets of finished products.

  6. Activation of alpha2 adrenergic receptors suppresses fear conditioning: expression of c-Fos and phosphorylated CREB in mouse amygdala.

    PubMed

    Davies, M Frances; Tsui, Janet; Flannery, Judy A; Li, Xiangqi; DeLorey, Timothy M; Hoffman, Brian B

    2004-02-01

    alpha(2) adrenergic agonists such as dexmedetomidine generally suppress noradrenergic transmission and have sedative, analgesic, and antihypertensive properties. Considering the importance of the neurotransmitter norepinephrine in forming memories for fearful events, we have investigated the acute and chronic effects of dexmedetomidine on discrete cue and contextual fear conditioning in mice. When administered before training, dexmedetomidine (10-20 microg/kg, i.p.) selectively suppressed discrete cue fear conditioning without affecting contextual memory. This behavioral change was associated with a decrease in memory retrieval-induced expression of c-Fos and P-CREB in the lateral, basolateral, and central nuclei of the amygdala. Dexmedetomidine's action on discrete cue memory did not occur in alpha(2A) adrenoceptor knockout (KO) mice. When dexmedetomidine was administered after training, it suppressed contextual memory, an effect that did not occur in alpha(2A) adrenoceptor KO mice. We conclude that dexmedetomidine, acting at alpha(2A) adrenoceptors, must be present during the encoding process to decrease discrete cue fear memory; however, its ability to suppress contextual memory is likely the result of blocking the consolidation process. The ability of alpha(2) agonists to suppress fear memory may be a valuable property clinically in order to suppress the formation of memories during stressful situations.

  7. Nonlinear dynamic failure process of tunnel-fault system in response to strong seismic event

    NASA Astrophysics Data System (ADS)

    Yang, Zhihua; Lan, Hengxing; Zhang, Yongshuang; Gao, Xing; Li, Langping

    2013-03-01

Strong earthquakes and faults have a significant effect on the stability of underground tunnel structures. This study used a 3-Dimensional Discrete Element model and the real records of ground motion in the Wenchuan earthquake to investigate the dynamic response of a tunnel-fault system. The typical tunnel-fault system was composed of one planned railway tunnel and one seismically active fault. The discrete numerical model was carefully calibrated by comparing field-survey and numerical results of ground motion. It was then used to examine detailed quantitative information on the dynamic response characteristics of the tunnel-fault system, including stress distribution, strain, vibration velocity and the tunnel failure process. The intensive tunnel-fault interaction during seismic loading induces dramatic stress redistribution and stress concentration at the intersection of the tunnel and the fault. The behavior of the tunnel-fault system is characterized by a complicated nonlinear dynamic failure process in response to a real strong seismic event. It can be qualitatively divided into 5 main stages in terms of its stress, strain and rupturing behaviors: (1) strain localization, (2) rupture initiation, (3) rupture acceleration, (4) spontaneous rupture growth and (5) stabilization. This study provides insight into the further stability estimation of underground tunnel structures under the combined effect of strong earthquakes and faults.

  8. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

This paper proposes a representation model of quality state changes in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state changes and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational algorithm for redesigning the assembly process in consideration of manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating quality state changes. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on simulation of the quality state change.

  9. Activating clinical trials: a process improvement approach.

    PubMed

    Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin

    2016-02-24

    The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increments on the number of trials, arriving to the Office of Clinical Research, would increase activation time by 11 %. Also, incrementing the efficiency of contract and budget development would reduce the activation time by 28 %. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. 
The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
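    The discrete-event model described above can be sketched in a few lines. The sketch below treats activation as a tandem of single-server sub-processes (the three stages and all rates are invented stand-ins, not figures from the study) and uses the Lindley recursion, which for FIFO single-server stages is equivalent to a full event-calendar simulation.

```python
import random

def mean_activation_time(n_trials=2000, mean_interarrival=30.0,
                         stage_means=(15.0, 20.0, 8.0), seed=1):
    """Tandem single-server queues via the Lindley recursion:
    a trial starts each stage only when both the trial and the
    stage are free; all durations are exponential (days)."""
    rng = random.Random(seed)
    arrival = 0.0
    stage_free = [0.0] * len(stage_means)   # when each stage next idles
    sojourns = []
    for _ in range(n_trials):
        arrival += rng.expovariate(1.0 / mean_interarrival)
        done = arrival
        for k, mean in enumerate(stage_means):
            start = max(done, stage_free[k])          # wait for the stage
            done = start + rng.expovariate(1.0 / mean)
            stage_free[k] = done
        sojourns.append(done - arrival)
    return sum(sojourns) / len(sojourns)
```

    Raising the arrival rate or slowing one stage immediately moves the mean sojourn time, which is exactly the kind of what-if question the study's simulation answers.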

  10. Temporal integration: intentional sound discrimination does not modulate stimulus-driven processes in auditory event synthesis.

    PubMed

    Sussman, Elyse; Winkler, István; Kreuzer, Judith; Saher, Marieke; Näätänen, Risto; Ritter, Walter

    2002-12-01

    Our previous study showed that the auditory context could influence whether two successive acoustic changes occurring within the temporal integration window (approximately 200 ms) were pre-attentively encoded as a single auditory event or as two discrete events (Cogn Brain Res 12 (2001) 431). The aim of the current study was to assess whether top-down processes could influence the stimulus-driven processes in determining what constitutes an auditory event. The electroencephalogram (EEG) was recorded from 11 scalp electrodes to frequently occurring standard and infrequently occurring deviant sounds. Within the stimulus blocks, deviants occurred either only in pairs (successive feature changes) or both singly and in pairs. Event-related potential indices of change and target detection, the mismatch negativity (MMN) and the N2b component, respectively, were compared with the simultaneously measured performance in discriminating the deviants. Even though subjects could voluntarily distinguish the two successive auditory feature changes from each other, which was also indicated by the elicitation of the N2b target-detection response, top-down processes did not modify the event organization reflected by the MMN response. Top-down processes can extract elemental auditory information from a single integrated acoustic event, but the extraction occurs at a later processing stage than the one whose outcome is indexed by MMN. Initial processes of auditory event-formation are fully governed by the context within which the sounds occur. Perception of the deviants as two separate sound events (the top-down effect) thus occurred without a corresponding change in the stimulus-driven sound organization, which initially represented the same deviants as one event (indexed by the MMN).

  11. Relation of Parallel Discrete Event Simulation algorithms with physical models

    NASA Astrophysics Data System (ADS)

    Shchur, L. N.; Shchur, L. V.

    2015-09-01

    We extend the concept of local simulation times in parallel discrete event simulation (PDES) in order to take into account the architecture of current hardware and software in high-performance computing. We briefly review previous research on the mapping of PDES onto physical problems, and emphasise how physical results may help to predict the behaviour of parallel algorithms.

  12. A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris

    2017-04-01

    Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the interacting mechanisms of the dynamic earthquake rupture within the Earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.

  13. Hypercube technology

    NASA Technical Reports Server (NTRS)

    Parker, Jay W.; Cwik, Tom; Ferraro, Robert D.; Liewer, Paulett C.; Patterson, Jean E.

    1991-01-01

    The JPL-designed MARKIII hypercube supercomputer has been in application service since June 1988 and has had successful application to a broad problem set including electromagnetic scattering, discrete event simulation, plasma transport, matrix algorithms, neural network simulation, image processing, and graphics. Currently, problems that are not homogeneous are being attempted, and, through this involvement with real-world applications, the software is evolving to handle this heterogeneous class of problems efficiently.

  14. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2018-06-01

    Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT crucial to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated performance evaluation of the HIS through the combination of formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (through the amount of information available during the consultation to formulate the diagnosis). The method also allows determination of the most cost-effective ICT elements to improve care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.

  15. Seismic Characterization of EGS Reservoirs

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.

    2014-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS data set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  16. A discrete element and ray framework for rapid simulation of acoustical dispersion of microscale particulate agglomerations

    NASA Astrophysics Data System (ADS)

    Zohdi, T. I.

    2016-03-01

    In industry, particle-laden fluids, such as particle-functionalized inks, are constructed by adding fine-scale particles to a liquid solution, in order to achieve desired overall properties in both liquid and (cured) solid states. However, oftentimes undesirable particulate agglomerations arise due to some form of mutual attraction stemming from near-field forces, stray electrostatic charges, process ionization and mechanical adhesion. For proper operation of industrial processes involving particle-laden fluids, it is important to carefully break up and disperse these agglomerations. One approach is to target high-frequency acoustical pressure-pulses to break up such agglomerations. The objective of this paper is to develop a computational model and corresponding solution algorithm to enable rapid simulation of the effect of acoustical pulses on an agglomeration composed of a collection of discrete particles. Because of the complex agglomeration microstructure, containing gaps and interfaces, this type of system is extremely difficult to mesh and simulate using continuum-based methods, such as the finite difference time domain or the finite element method. Accordingly, a computationally-amenable discrete element/discrete ray model is developed which captures the primary physical events in this process, such as the reflection and absorption of acoustical energy, and the induced forces on the particulate microstructure. The approach utilizes a staggered, iterative solution scheme to calculate the power transfer from the acoustical pulse to the particles and the subsequent changes (breakup) of the pulse due to the particles. Three-dimensional examples are provided to illustrate the approach.

  17. Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J; Twilley, K; Murvosh, H

    2003-03-03

    For the purpose of proof-testing a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a system of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beam forming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.
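    Matched-field processing correlates recordings against a full wavefield model; a stripped-down relative, delay-and-sum beamforming, conveys the localization idea. The sketch below is entirely synthetic (the sensor layout, wave speed, and pulse shape are invented, not taken from the Yucca Mountain deployment): it grid-searches the candidate source position whose predicted travel-time delays best align the traces.

```python
import numpy as np

def locate_source(recs, sensors, fs, c, grid):
    """Delay-and-sum: for each candidate position, undo the predicted
    travel-time delay at every sensor and keep the position that
    produces the most coherent (largest-peak) stack."""
    n = recs.shape[1]
    best_x, best_power = None, -np.inf
    for x in grid:
        stack = np.zeros(n)
        for rec, s in zip(recs, sensors):
            d = int(round(fs * abs(s - x) / c))   # delay in samples
            stack[:n - d] += rec[d:]
        power = stack.max()
        if power > best_power:
            best_x, best_power = x, power
    return best_x

# synthetic check: four line-array sensors record a delayed Gaussian pulse
fs, c = 2000.0, 340.0                       # sample rate (Hz), wave speed (m/s)
sensors = np.array([0.0, 50.0, 100.0, 150.0])   # sensor positions (m)
t = np.arange(4000) / fs
true_x = 80.0
recs = np.stack([np.exp(-((t - 0.5 - abs(s - true_x) / c) * 200.0) ** 2)
                 for s in sensors])
estimate = locate_source(recs, sensors, fs, c, np.arange(0.0, 151.0, 1.0))
```

    Real matched-field processing replaces the straight-ray delay model with modeled Green's functions, which is what makes it robust in complex geology.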

  18. Improving Our Ability to Evaluate Underlying Mechanisms of Behavioral Onset and Other Event Occurrence Outcomes: A Discrete-Time Survival Mediation Model

    PubMed Central

    Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.

    2015-01-01

    The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470
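    The discrete-time survival part of such a model is typically estimated by logistic regression on a "person-period" data set, in which each subject contributes one row per epoch at risk. A minimal sketch of that data expansion (the record format is invented for illustration; the mediation step would simply add mediator columns to these rows before fitting):

```python
def person_period(subjects):
    """Expand (subject_id, last_epoch, event_occurred) records into
    person-period rows: one row per epoch the subject was at risk,
    with the binary outcome equal to 1 only in the event epoch.
    Censored subjects contribute all-zero outcome rows."""
    rows = []
    for sid, last_epoch, event_occurred in subjects:
        for epoch in range(1, last_epoch + 1):
            y = 1 if (event_occurred and epoch == last_epoch) else 0
            rows.append({"id": sid, "epoch": epoch, "event": y})
    return rows

# subject 1 experiences onset in epoch 3; subject 2 is censored after epoch 2
rows = person_period([(1, 3, True), (2, 2, False)])
```

    Fitting a logistic model to the expanded rows, with epoch indicators as the baseline hazard, recovers the discrete-time survival model that the mediation framework builds on.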

  19. Aspects regarding at 13C isotope separation column control using Petri nets system

    NASA Astrophysics Data System (ADS)

    Boca, M. L.; Ciortea, M. E.

    2015-11-01

    This paper is intended to show that Petri nets can also be applied in the chemical industry. Linear programming and Petri net modeling, in particular of discrete event systems for isotope separation, are used with the purpose of representing and controlling events in real time through graphical representations. In this paper we simulate the control of a 13C isotope separation column using Petri nets. The major problem with 13C comes from the difficulty of obtaining it and raising its natural fraction. Carbon isotopes can be obtained using many methods, one of them being the cryogenic distillation of carbon monoxide. Only a few aspects regarding operating conditions and the construction of such cryogenic plants are known today, and even less information is available as far as the separation process modeling and control are concerned. In fact, efficient control of the carbon monoxide distillation process is a necessity for large-scale 13C production. For the classic distillation process, some models for carbon isotope separation have been proposed, some based on mass, component and energy balance equations, others on nonlinear wave theory or the Cohen equations. Petri nets were used to model the system because it is a discrete event system. Using the non-timed Petri model and the model with auxiliary times, the transport stream was divided into sections, and these sections are analyzed successively. Because of the complexity of the system and the large amount of calculation required, it was not possible to analyze the system as a unitary whole; a first attempt to model it as such led to the model blocking during simulation because of the large processing times.
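    The abstract does not give its net, but the token-game semantics it relies on is compact. A minimal sketch of place/transition firing follows; the 'feed' transition and place names are invented stand-ins for column feed and withdrawal steps, not the paper's actual model.

```python
class PetriNet:
    """Minimal place/transition net: pre and post map each transition
    to {place: arc_weight}; m is the current marking (tokens per place)."""
    def __init__(self, pre, post, marking):
        self.pre, self.post = pre, post
        self.m = dict(marking)

    def enabled(self, t):
        # a transition is enabled when every input place holds enough tokens
        return all(self.m.get(p, 0) >= w for p, w in self.pre[t].items())

    def fire(self, t):
        if not self.enabled(t):
            raise ValueError(f"transition {t!r} not enabled")
        for p, w in self.pre[t].items():
            self.m[p] -= w                          # consume input tokens
        for p, w in self.post[t].items():
            self.m[p] = self.m.get(p, 0) + w        # produce output tokens

# toy net: 'feed' moves one token from 'feed_tank' to 'column'
net = PetriNet(pre={'feed': {'feed_tank': 1}},
               post={'feed': {'column': 1}},
               marking={'feed_tank': 2, 'column': 0})
net.fire('feed')
```

    Timed and sectioned variants, as used in the paper, attach delays to transitions and replicate this structure per column section.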

  20. Hierarchical Discrete Event Supervisory Control of Aircraft Propulsion Systems

    DTIC Science & Technology

    2004-11-01

    Murat Yasar, Devendra Tolani, and Asok Ray, The Pennsylvania State University, University Park, Pennsylvania 16802; Neerav Shah, Glenn Research Center

  1. Multithreaded Stochastic PDES for Reactions and Diffusions in Neurons.

    PubMed

    Lin, Zhongwei; Tropper, Carl; Mcdougal, Robert A; Patoary, Mohammand Nazrul Ishlam; Lytton, William W; Yao, Yiping; Hines, Michael L

    2017-07-01

    Cells exhibit stochastic behavior when the number of molecules is small. Hence a stochastic reaction-diffusion simulator capable of working at scale can provide a more accurate view of molecular dynamics within the cell. This paper describes a parallel discrete event simulator, Neuron Time Warp-Multi Thread (NTW-MT), developed for the simulation of reaction diffusion models of neurons. To the best of our knowledge, this is the first parallel discrete event simulator oriented towards stochastic simulation of chemical reactions in a neuron. The simulator was developed as part of the NEURON project. NTW-MT is optimistic and thread-based, which attempts to capitalize on multi-core architectures used in high performance machines. It makes use of a multi-level queue for the pending event set and a single roll-back message in place of individual anti-messages to disperse contention and decrease the overhead of processing rollbacks. Global Virtual Time is computed asynchronously both within and among processes to get rid of the overhead for synchronizing threads. Memory usage is managed in order to avoid locking and unlocking when allocating and de-allocating memory and to maximize cache locality. We verified our simulator on a calcium buffer model. We examined its performance on a calcium wave model, comparing it to the performance of a process based optimistic simulator and a threaded simulator which uses a single priority queue for each thread. Our multi-threaded simulator is shown to achieve superior performance to these simulators. Finally, we demonstrated the scalability of our simulator on a larger CICR model and a more detailed CICR model.
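    NTW-MT parallelizes optimistic (Time Warp) execution of stochastic chemical kinetics; the sequential kernel being parallelized is, conceptually, Gillespie's stochastic simulation algorithm. A minimal SSA sketch on a toy reversible isomerization (the species, rates, and horizon are invented for illustration, not drawn from the calcium models in the paper):

```python
import random

def gillespie_ssa(stoich, propensities, x0, t_end, seed=0):
    """Gillespie SSA: draw the time to the next reaction from an
    exponential with rate a0 = sum of propensities, then pick which
    reaction fires in proportion to its propensity."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    while t < t_end:
        a = [prop(x) for prop in propensities]
        a0 = sum(a)
        if a0 == 0.0:
            break                      # no reaction can fire
        t += rng.expovariate(a0)
        if t >= t_end:
            break
        r, acc = rng.random() * a0, 0.0
        for j, aj in enumerate(a):
            acc += aj
            if r < acc:
                for species, change in stoich[j]:
                    x[species] += change
                break
    return x

# toy model: reversible isomerization A <-> B, k_fwd = 1.0, k_rev = 0.5
stoich = [[(0, -1), (1, +1)],          # A -> B
          [(0, +1), (1, -1)]]          # B -> A
props = [lambda x: 1.0 * x[0], lambda x: 0.5 * x[1]]
final = gillespie_ssa(stoich, props, [100, 0], t_end=50.0)
```

    In a Time Warp setting, many such simulators run optimistically on sub-volumes and roll back when a diffusion event arrives in their past, which is the contention NTW-MT's multi-level queue and single roll-back message are designed to reduce.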

  2. Simulation modeling for the health care manager.

    PubMed

    Kennedy, Michael H

    2009-01-01

    This article addresses the use of simulation software to solve administrative problems faced by health care managers. Spreadsheet add-ins, process simulation software, and discrete event simulation software are available at a range of costs and complexity. All use the Monte Carlo method to realistically integrate probability distributions into models of the health care environment. Problems typically addressed by health care simulation modeling are facility planning, resource allocation, staffing, patient flow and wait time, routing and transportation, supply chain management, and process improvement.
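    The Monte Carlo idea common to all three software classes fits in a few lines. The sketch below estimates how often daily patient demand exceeds staffed capacity under a Poisson demand model; the demand mean and capacity are invented numbers for illustration, not figures from the article.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's method: multiply uniform draws until the running
    product falls below e^(-lam); the count of draws - 1 is Poisson."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def overflow_probability(n_days=50_000, mean_demand=48.0, capacity=55, seed=3):
    """Monte Carlo estimate of the fraction of days on which demand
    exceeds capacity -- the kind of question spreadsheet add-ins answer."""
    rng = random.Random(seed)
    over = sum(poisson_sample(rng, mean_demand) > capacity
               for _ in range(n_days))
    return over / n_days

p_over = overflow_probability()
```

    Process and discrete-event simulation tools extend the same sampling idea with time, queues, and resources, which is what makes them suitable for patient-flow and wait-time questions.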

  3. Knowledge-based simulation for aerospace systems

    NASA Technical Reports Server (NTRS)

    Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.

    1988-01-01

    Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic-processing/numeric-processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and capable of ensuring that all possible behaviors are considered.

  4. THE RELATIONSHIP BETWEEN THE SEPTEMBER 2017 MARS GLOBAL AURORA EVENT AND CRUSTAL MAGNETIC FIELDS

    NASA Astrophysics Data System (ADS)

    Nasr, Camella-Rosa; Schneider, Nick; Connour, Kyle; Jain, Sonal; Deighan, Justin; Jakosky, Bruce; MAVEN/IUVS Team

    2018-01-01

    In September 2017, the Imaging UltraViolet Spectrograph (IUVS) on the MAVEN spacecraft observed global aurora on Mars caused by a surprisingly strong solar energetic particle event. Widespread “diffuse aurora” have previously been detected on Mars through more limited observations (Schneider et al., Science 350, (2015); DOI: 10.1126/science.aad0313), but recent observations established complete coverage of the observable portion of Mars’ nightside. The aurora was global due to Mars’s lack of a global magnetic field, which allowed energetic electrons from the Sun to directly precipitate into the atmosphere. On September 11th, IUVS detected aurora more than 25 times brighter than any prior IUVS observation, with high SNR detections of aurora at the limb and against the disk of the planet. Fainter auroral emission was seen around the nightside limb over 13 orbits spanning nearly 3 days. On September 14th, during the declining phase of the event, faint linear features and patches were detected by the spacecraft, which were higher than the noise floor, with a similar spatial distribution to “discrete aurora” patches observed on Mars by the SPICAM instrument on the Mars Express spacecraft (Bertaux et al., Nature 435, doi:10.1038/nature03603). Discrete aurora occur near areas of the crust affected by the magnetism left over from Mars’ once-strong dipole field. Emission is limited to regions of the crustal magnetic field where the field lines are likely to be open to solar wind interactions. Those regions are concentrated in Mars’ southern hemisphere centered on 180 degrees east longitude. We studied the localized emissions on 14 September to determine whether there might be a connection between the observed diffuse aurora event and discrete auroral processes. First, we investigated the localized emissions to confirm that the observed signal was consistent with expected auroral spectra. 
Second, their locations were projected on a map of the crustal magnetic fields to determine if they occurred near open magnetic field lines. We will report on the results of these two studies, and the ramifications for Mars auroral processes.

  5. A non-orthogonal decomposition of flows into discrete events

    NASA Astrophysics Data System (ADS)

    Boxx, Isaac; Lewalle, Jacques

    1998-11-01

    This work is based on the formula for the inverse Hermitian wavelet transform. A signal can be interpreted as a (non-unique) superposition of near-singular, partially overlapping events arising from Dirac functions and/or their derivatives combined with diffusion. (No dynamics is implied: dimensionless diffusion is related to the definition of the analyzing wavelets.) These events correspond to local maxima of spectral energy density. We successfully fitted model events of various orders on a succession of fields, ranging from elementary signals to one-dimensional hot-wire traces. We document edge effects, event overlap and its implications for the algorithm. The interpretation of the discrete singularities as flow events (such as coherent structures) and the fundamental non-uniqueness of the decomposition are discussed. The dynamics of these events will be examined in the companion paper.

  6. Searching for a Link Between Suprathermal Ions and Solar Wind Parameters During Quiet Times.

    NASA Astrophysics Data System (ADS)

    Nickell, J.; Desai, M. I.; Dayeh, M. A.

    2017-12-01

    The acceleration processes that suprathermal particles undergo are largely ambiguous. The two prevailing acceleration processes are: 1) continuous acceleration in interplanetary space due to i) bulk velocity fluctuations (e.g., Fahr et al. 2012), ii) magnetic compressions (e.g., Fisk and Gloeckler 2012), iii) magnetic field waves and turbulence (e.g., Zhang and Lee 2013), and iv) reconnection between magnetic islands (e.g., Drake et al. 2014); and 2) discrete acceleration that occurs in discrete solar events such as CIRs, CME-driven shocks, and flares (e.g., Reames 1999, Desai et al. 2008). Using data from ACE/ULEIS during solar cycles 23 and 24 (1997-present), we examine the solar wind and magnetic field parameters during quiet times (e.g., Dayeh et al. 2017) in an attempt to gain insights into the acceleration processes of the suprathermal particle population. In particular, we look for compression regions by performing comparative studies between solar wind and magnetic field parameters during quiet times in interplanetary space.

  7. Rough Mill Simulations Reveal That Productivity When Processing Short Lumber Can Be High

    Treesearch

    Janice K. Wiedenbeck; Philip A. Araman

    1995-01-01

    Handling rates and costs associated with using short-length lumber (less than 8 ft. long) in furniture and cabinet industry rough mills have been assumed to be prohibitive. Discrete-event systems simulation models of both a crosscut-first and gang-rip-first rough mill were built to measure the effect of lumber length on equipment utilization and the volume and value of...

  8. Cross-Paradigm Simulation Modeling: Challenges and Successes

    DTIC Science & Technology

    2011-12-01

    is also highlighted. 2.1 Discrete-Event Simulation Discrete-event simulation (DES) is a modeling method for stochastic, dynamic models where...which almost anything can be coded; models can be incredibly detailed. Most commercial DES software has a graphical interface which allows the user to...results. Although the above definition is the commonly accepted definition of DES, there are two different worldviews that dominate DES modeling today: a

  9. The fundamental theorem of asset pricing under default and collateral in finite discrete time

    NASA Astrophysics Data System (ADS)

    Alvarez-Samaniego, Borys; Orrillo, Jaime

    2006-08-01

    We consider a financial market where time and uncertainty are modeled by a finite event-tree. The event-tree has a length of N, a unique initial node at the initial date, and a continuum of branches at each node of the tree. Prices and returns of the J assets are modeled, respectively, by an R^{2J} x R^{2J}-valued stochastic process. In this framework we prove a version of the Fundamental Theorem of Asset Pricing which applies to defaultable securities backed by exogenous collateral suffering a contingent linear depreciation.
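    In the simplest finite setting, one period with two states and two assets, the theorem's "no arbitrage iff strictly positive state prices exist" can be checked directly. The payoffs and prices below are invented for illustration; the paper's event-tree, default, and collateral structure is far richer.

```python
def state_prices_2x2(D, p):
    """Solve D^T q = p for the state-price vector q by Cramer's rule.
    D[i][j] is the payoff of asset j in state i; p[j] is asset j's
    current price. In this complete 2x2 market, absence of arbitrage
    is equivalent to both components of q being strictly positive."""
    (d11, d12), (d21, d22) = D
    det = d11 * d22 - d21 * d12
    q1 = (p[0] * d22 - d21 * p[1]) / det
    q2 = (d11 * p[1] - p[0] * d12) / det
    return q1, q2

# bond pays 1 in both states, price 0.95; stock pays 2 or 0.5, price 1.1
q = state_prices_2x2(D=[[1.0, 2.0], [1.0, 0.5]], p=[0.95, 1.1])
```

    Normalizing q by its sum gives the risk-neutral probabilities; a negative component would signal an arbitrage opportunity. Default and collateral modify the payoff matrix rather than the logic of this equivalence.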

  10. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip.

    PubMed

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes.

  11. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    PubMed Central

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  12. Not so secret agents: Event-related potentials to semantic roles in visual event comprehension.

    PubMed

    Cohn, Neil; Paczynski, Martin; Kutas, Marta

    2017-12-01

    Research across domains has suggested that agents, the doers of actions, have a processing advantage over patients, the receivers of actions. We hypothesized that agents as "event builders" for discrete actions (e.g., throwing a ball, punching) build on cues embedded in their preparatory postures (e.g., reaching back an arm to throw or punch) that lead to (predictable) culminating actions, and that these cues afford frontloading of event structure processing. To test this hypothesis, we compared event-related brain potentials (ERPs) to averbal comic panels depicting preparatory agents (ex. reaching back an arm to punch) that cued specific actions with those to non-preparatory agents (ex. arm to the side) and patients that did not cue any specific actions. We also compared subsequent completed action panels (ex. agent punching patient) across conditions, where we expected an inverse pattern of ERPs indexing the differential costs of processing completed actions as a function of preparatory cues. Preparatory agents evoked a greater frontal positivity (600-900ms) relative to non-preparatory agents and patients, while subsequent completed action panels following non-preparatory agents elicited a smaller frontal positivity (600-900ms). These results suggest that preparatory (vs. non-) postures may differentially impact the processing of agents and subsequent actions in real time. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
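    The "true answers" used for comparison come from solving the chemical master equation exactly on a truncated (finite-buffer) state space. For the first of the four examples, the birth-death process, that reduces to one linear solve; the rates and buffer size below are invented for illustration.

```python
import numpy as np

def birth_death_steady_state(k_birth=1.0, k_death=0.5, buffer_n=40):
    """Exact stationary distribution of the CME for 0 -> X (rate k_birth)
    and X -> 0 (rate k_death * n), truncated at n = buffer_n.
    Solves pi Q = 0 with the normalization sum(pi) = 1."""
    size = buffer_n + 1
    Q = np.zeros((size, size))           # generator: Q[i, j] = rate i -> j
    for n in range(size):
        if n < buffer_n:
            Q[n, n + 1] = k_birth        # birth
        if n > 0:
            Q[n, n - 1] = k_death * n    # death
        Q[n, n] = -Q[n].sum()            # diagonal balances each row
    A = Q.T.copy()
    A[-1, :] = 1.0                       # replace one balance row by sum = 1
    b = np.zeros(size)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

pi = birth_death_steady_state()          # stationary law ~ Poisson(2), truncated
```

    With the exact tail probabilities in hand, the variance and bias of the importance-sampling estimates can be judged directly, which is the comparison the paper performs for all four networks.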

  14. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

  15. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises in multistable networks and systems with complex probability landscapes. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm, the adaptively biased sequential importance sampling (ABSIS) method, for efficient probability estimation. By adopting a look-ahead strategy and enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions and can adjust the bias adaptively at different steps of the sampling process, with the bias determined by the outcome of exhaustively generated short paths. In addition, only two bias parameters need to be determined, regardless of the number of reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the recently developed finite buffer discrete chemical master equation (dCME) method to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare-event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events in other stochastic networks with complex probability landscapes.
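The mechanics behind such weighted SSA schemes can be illustrated compactly: select the next reaction from biased propensities, and multiply a likelihood-ratio weight so the estimator stays unbiased. The sketch below uses a fixed bias parameter rather than the adaptive, look-ahead bias of ABSIS, and estimates a rare hitting probability for the birth-death process, one of the paper's test networks:

```python
import random

def propensities(x, k_birth=1.0, k_death=1.0):
    """Mass-action propensities for a birth-death process X -> X+1, X -> X-1."""
    return [k_birth, k_death * x]

def weighted_ssa_rare_event(x0, target, bias=2.0, n_paths=20000, seed=0):
    """Estimate P(reach `target` before 0) from x0 by importance-sampled SSA.

    Reaction selection is biased by multiplying the birth propensity by
    `bias`; the likelihood-ratio weight w compensates, keeping the
    estimator unbiased.  (A fixed bias, unlike ABSIS's adaptive one.)
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, w = x0, 1.0
        while 0 < x < target:
            a = propensities(x)
            a_sum = sum(a)
            b = [a[0] * bias, a[1]]          # biased propensities
            b_sum = sum(b)
            r = 0 if rng.random() < b[0] / b_sum else 1
            # weight update: true selection prob / biased selection prob
            w *= (a[r] / a_sum) / (b[r] / b_sum)
            x += 1 if r == 0 else -1
        if x == target:
            total += w
    return total / n_paths
```

For k_birth = k_death = 1, a gambler's-ruin calculation gives the exact probability of reaching 5 before 0 from x = 1 as 1/34 ≈ 0.029, so the estimate can be checked against a known answer.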

  16. PREFACE: 4th Symposium on Prospects in the Physics of Discrete Symmetries (DISCRETE2014)

    NASA Astrophysics Data System (ADS)

    Di Domenico, Antonio; Mavromatos, Nick E.; Mitsou, Vasiliki A.; Skliros, Dimitri P.

    2015-07-01

    The DISCRETE 2014: Fourth Symposium on Prospects in the Physics of Discrete Symmetries took place at King's College London, Strand Campus, London WC2R 2LS, from Tuesday 2 December until Saturday 6 December 2014. This was the fourth edition of the DISCRETE conference series, a biennial event held previously in Valencia (Discrete'08), Rome (Discrete2010) and Lisbon (Discrete2012). The topics covered at the DISCRETE series of conferences are: T, C, P, CP symmetries; accidental symmetries (B, L conservation); CPT symmetry, decoherence and entangled states; Lorentz symmetry breaking (phenomenology and current bounds); neutrino mass and mixing; implications for cosmology and astroparticle physics; dark matter searches; and experimental prospects at the LHC and new facilities. In DISCRETE 2014 we also introduced two new topics: cosmological aspects of non-commutative space-times, and PT-symmetric Hamiltonians (non-Hermitian but with real eigenvalues), a topic with wide applications in particle physics and beyond. The conference was opened by the King's College London Vice Principal for Research and Innovation, Mr Chris Mottershead, followed by a welcome address by the Chair of DISCRETE 2014 (Professor Nick E. Mavromatos). After these introductory talks, the scientific programme of the DISCRETE 2014 symposium started. Following the tradition of the DISCRETE series of conferences, the talks (138 in total) were divided into plenary review talks (25), invited research talks (50) and shorter presentations (63), selected by the conveners of each session in consultation with the organisers from the submitted abstracts. We were fortunate to have very high-quality, thought-stimulating and interesting talks at all levels, which, together with the discussions among the participants, made the conference quite enjoyable. There were 152 registered participants for the event.

  17. Interpreting Significant Discrete-Time Periods in Survival Analysis.

    ERIC Educational Resources Information Center

    Schumacker, Randall E.; Denson, Kathleen B.

    Discrete-time survival analysis is a new method for educational researchers to employ when looking at the timing of certain educational events. Previous continuous-time methods do not allow for the flexibility inherent in a discrete-time method. Because both time-invariant and time-varying predictor variables can now be used, the interaction of…
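In discrete-time survival analysis, the standard first step is expanding each subject's record into one row per period at risk with a binary event indicator, to which a logistic hazard model is then fitted. A sketch of that person-period expansion (the field layout is illustrative, not from the cited work):

```python
def person_period(records):
    """Expand subject-level survival records into person-period rows.

    Each input record is (id, last_period_observed, event_occurred); the
    output has one row per period at risk with a 0/1 event indicator --
    the format a discrete-time (logistic) hazard model is fitted to.
    """
    rows = []
    for sid, last, event in records:
        for t in range(1, last + 1):
            rows.append({"id": sid, "period": t,
                         "event": int(event and t == last)})
    return rows

# e.g. one subject with an event in period 3, one censored after period 2
data = person_period([("s1", 3, True), ("s2", 2, False)])
```

Time-varying predictors are then simply additional columns on these rows, which is what gives the discrete-time method its flexibility.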

  18. Reducing Patient Waiting Times for Radiation Therapy and Improving the Treatment Planning Process: a Discrete-event Simulation Model (Radiation Treatment Planning).

    PubMed

    Babashov, V; Aivas, I; Begen, M A; Cao, J Q; Rodrigues, G; D'Souza, D; Lock, M; Zaric, G S

    2017-06-01

    We analysed the radiotherapy planning process at the London Regional Cancer Program to determine the bottlenecks and to quantify the effect of specific resource levels with the goal of reducing waiting times. We developed a discrete-event simulation model of a patient's journey from the point of referral to a radiation oncologist to the start of radiotherapy, considering the sequential steps and resources of the treatment planning process. We measured the effect of several resource changes on the ready-to-treat to treatment (RTTT) waiting time and on the percentage treated within a 14 calendar day target. Increasing the number of dosimetrists by one reduced the mean RTTT by 6.55%, leading to 84.92% of patients being treated within the 14 calendar day target. Adding one more oncologist decreased the mean RTTT from 10.83 to 10.55 days, whereas a 15% increase in arriving patients increased the waiting time by 22.53%. The model was relatively robust to the changes in quantity of other resources. Our model identified sensitive and non-sensitive system parameters. A similar approach could be applied by other cancer programmes, using their respective data and individualised adjustments, which may be beneficial in making the most effective use of limited resources. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
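A discrete-event model of this kind reduces, in its simplest form, to sampling arrivals and service completions and applying the Lindley-style recurrence for waiting time. A minimal single-resource sketch (the exponential distributions and parameters are illustrative placeholders, not LRCP data):

```python
import random

def simulate_queue(n_patients, mean_interarrival, mean_service, seed=0):
    """Minimal discrete-event simulation of one planning resource (FIFO).

    Returns the mean waiting time: each patient starts service at the
    later of their arrival time and the time the server becomes free.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    server_free = 0.0
    waits = []
    for a in arrivals:
        start = max(a, server_free)           # Lindley-style recurrence
        waits.append(start - a)
        server_free = start + rng.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)
```

Raising utilization (more arrivals per unit of service capacity) sharply increases the mean wait, which is the qualitative effect the paper quantifies with its 15% arrival-increase scenario.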

  19. Parallel discrete event simulation using shared memory

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments using the Chandy-Misra distributed-simulation algorithm to simulate networks of queues is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation for most queueing network models.

  20. Inhomogeneous point-process entropy: An instantaneous measure of complexity in discrete systems

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-01

    Measures of entropy have been widely used to characterize complexity, particularly in physiological dynamical systems modeled in discrete time. Current approaches associate these measures with finite single values within an observation window, and thus cannot characterize the system's evolution at each moment in time. Here, we propose a new definition of approximate and sample entropy based on inhomogeneous point-process theory. The discrete time series is modeled through probability density functions, which characterize and predict the time until the next event occurs as a function of the past history. Laguerre expansions of the Wiener-Volterra autoregressive terms account for the long-term nonlinear information. As the proposed measures of entropy are instantaneously defined through probability functions, the novel indices are able to provide instantaneous tracking of the system complexity. The new measures are tested on synthetic data, as well as on real data gathered from heartbeat dynamics of healthy subjects and patients with heart failure, and from gait recordings of short walks by young and elderly subjects. Results show that instantaneous complexity is able to effectively track the system dynamics and is not affected by statistical noise properties.
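For context, the classic windowed sample entropy that these instantaneous indices generalize can be written compactly. The version below is a common simplified variant (Chebyshev distance, tolerance r in the units of the series, commonly set to 0.2 times its standard deviation), not the point-process formulation of the paper:

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Windowed sample entropy: -ln(A/B), where B counts pairs of
    length-m templates matching within tolerance r (Chebyshev distance)
    and A counts the same for length m+1.  A simplified common variant,
    not the instantaneous point-process definition."""
    def count(mm):
        templ = [series[i:i + mm] for i in range(len(series) - mm + 1)]
        n = 0
        for i in range(len(templ)):
            for j in range(i + 1, len(templ)):
                if max(abs(a - b) for a, b in zip(templ[i], templ[j])) <= r:
                    n += 1
        return n
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

Because it needs a whole observation window, this index yields one number per window; the paper's contribution is an entropy defined at every instant instead.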

  1. The hippocampus and appetitive Pavlovian conditioning: effects of excitotoxic hippocampal lesions on conditioned locomotor activity and autoshaping.

    PubMed

    Ito, Rutsuko; Everitt, Barry J; Robbins, Trevor W

    2005-01-01

    The hippocampus (HPC) is known to be critically involved in the formation of associations between contextual/spatial stimuli and behaviorally significant events, playing a pivotal role in learning and memory. However, increasing evidence indicates that the HPC is also essential for more basic motivational processes. The amygdala, by contrast, is important for learning about the motivational significance of discrete cues. This study investigated the effects of excitotoxic lesions of the rat HPC and the basolateral amygdala (BLA) on the acquisition of a number of appetitive behaviors known to be dependent on the formation of Pavlovian associations between a reward (food) and discrete stimuli or contexts: (1) conditioned/anticipatory locomotor activity to food delivered in a specific context and (2) autoshaping, where rats learn to show conditioned discriminated approach to a discrete visual CS+. While BLA lesions had minimal effects on conditioned locomotor activity, hippocampal lesions facilitated the development of both conditioned activity to food and autoshaping behavior, suggesting that hippocampal lesions may have increased the incentive motivational properties of food and associated conditioned stimuli, consistent with the hypothesis that the HPC is involved in inhibitory processes in appetitive conditioning. (c) 2005 Wiley-Liss, Inc.

  2. A hybrid-system model of the coagulation cascade: simulation, sensitivity, and validation.

    PubMed

    Makin, Joseph G; Narayanan, Srini

    2013-10-01

    The process of human blood clotting involves a complex interaction of continuous-time/continuous-state processes and discrete-event/discrete-state phenomena, where the former comprise the various chemical rate equations and the latter comprise both threshold-limited behaviors and binary states (presence/absence of a chemical). Whereas previous blood-clotting models used only continuous dynamics and perforce addressed only portions of the coagulation cascade, we capture both continuous and discrete aspects by modeling it as a hybrid dynamical system. The model was implemented as a hybrid Petri net, a graphical modeling language that extends ordinary Petri nets to cover continuous quantities and continuous-time flows. The primary focus is simulation: (1) fidelity to the clinical data in terms of clotting-factor concentrations and elapsed time; (2) reproduction of known clotting pathologies; and (3) fine-grained predictions which may be used to refine clinical understanding of blood clotting. Next we examine sensitivity to rate-constant perturbation. Finally, we propose a method for titrating between reliance on the model and on prior clinical knowledge. For simplicity, we confine these last two analyses to a critical purely-continuous subsystem of the model.

  3. A Discrete Events Delay Differential System Model for Transmission of Vancomycin-Resistant Enterococcus (VRE) in Hospitals

    DTIC Science & Technology

    2010-09-19

    estimated directly from the surveillance data ... Infection control measures were implemented in the form of health care worker hand-hygiene before and after ... hospital infections, is used to motivate possibilities of modeling nosocomial infection dynamics. This is done in the context of hospital monitoring and ... model development. Key Words: Delay equations, discrete events, nosocomial infection dynamics, surveillance data, inverse problems, parameter

  4. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.

  5. Improving Aircraft Refueling Procedures at Naval Air Station Oceana

    DTIC Science & Technology

    2012-06-01

    Station (NAS) Oceana, VA, using aircraft waiting time for fuel as a measure of performance. We develop a computer-assisted discrete-event simulation to ... server queue, with general interarrival and service time distributions; gpm: gallons per minute; JDK: Java development kit; M/M/1: single-server queue

  6. Parallel Stochastic discrete event simulation of calcium dynamics in neuron.

    PubMed

    Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W

    2017-09-26

    The intracellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small, and their calcium concentrations so low, that one extra molecule diffusing in by chance can make a nontrivial difference in concentration (percentage-wise). These rare events can affect the dynamics discretely in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding than existing deterministic models because they capture its behavior at a molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), which is intended for use in the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer and a calcium wave model. The calcium buffer model is employed in order to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.

  7. Clustered mixed nonhomogeneous Poisson process spline models for the analysis of recurrent event panel data.

    PubMed

    Nielsen, J D; Dean, C B

    2008-09-01

    A flexible semiparametric model for analyzing longitudinal panel count data arising from mixtures is presented. Panel count data refers here to count data on recurrent events collected as the number of events that have occurred within specific follow-up periods. The model assumes that the counts for each subject are generated by mixtures of nonhomogeneous Poisson processes with smooth intensity functions modeled with penalized splines. Time-dependent covariate effects are also incorporated into the process intensity using splines. Discrete mixtures of these nonhomogeneous Poisson process spline models extract functional information from underlying clusters representing hidden subpopulations. The motivating application is an experiment to test the effectiveness of pheromones in disrupting the mating pattern of the cherry bark tortrix moth. Mature moths arise from hidden, but distinct, subpopulations and monitoring the subpopulation responses was of interest. Within-cluster random effects are used to account for correlation structures and heterogeneity common to this type of data. An estimating equation approach to inference requiring only low moment assumptions is developed and the finite sample properties of the proposed estimating functions are investigated empirically by simulation.
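Event times from a nonhomogeneous Poisson process, such as the smooth intensities modeled here, can be simulated by Lewis-Shedler thinning: generate candidate points at a dominating constant rate and accept each with probability λ(t)/λ_max. A sketch with an arbitrary sinusoidal intensity (not the paper's penalized-spline intensity):

```python
import math, random

def nhpp_thinning(intensity, lam_max, t_end, seed=0):
    """Sample event times of a nonhomogeneous Poisson process on (0, t_end]
    by Lewis-Shedler thinning; `lam_max` must dominate `intensity`."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)        # candidate at the dominating rate
        if t > t_end:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)                 # accept with prob lambda(t)/lam_max

# example: a sinusoidal "seasonal" intensity, bounded above by 4.0
events = nhpp_thinning(lambda t: 2.0 + 2.0 * math.sin(t), 4.0, 100.0)
```

Panel counts of the kind analyzed in the paper are then obtained by counting how many sampled event times fall within each follow-up period.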

  8. Modelling tidewater glacier calving: from detailed process models to simple calving laws

    NASA Astrophysics Data System (ADS)

    Benn, Doug; Åström, Jan; Zwinger, Thomas; Todd, Joe; Nick, Faezeh

    2017-04-01

    The simple calving laws currently used in ice sheet models do not adequately reflect the complexity and diversity of calving processes. To be effective, calving laws must be grounded in a sound understanding of how calving actually works. We have developed a new approach to formulating calving laws, using a) the Helsinki Discrete Element Model (HiDEM) to explicitly model fracture and calving processes, and b) the full-Stokes continuum model Elmer/Ice to identify critical stress states associated with HiDEM calving events. A range of observed calving processes emerges spontaneously from HiDEM in response to variations in ice-front buoyancy and the size of subaqueous undercuts, and we show that HiDEM calving events are associated with characteristic stress patterns simulated in Elmer/Ice. Our results open the way to developing calving laws that properly reflect the diversity of calving processes, and provide a framework for a unified theory of the calving process continuum.

  9. Persistent Surveillance of Transient Events with Unknown Statistics

    DTIC Science & Technology

    2016-12-18

    different bird species by a documentary maker is shown in Fig. 1. Additional examples of scenarios following this setting include robots patrolling the...persistent monitoring application in which a documentary maker would like to monitor three different species of birds appearing in three discrete, species...specific locations. Bird sightings at each location follow a stochastic process with a rate that is initially unknown to the documentary maker and must

  10. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an events-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.

  11. Time Warp Operating System, Version 2.5.1

    NASA Technical Reports Server (NTRS)

    Bellenot, Steven F.; Gieselman, John S.; Hawley, Lawrence R.; Peterson, Judy; Presley, Matthew T.; Reiher, Peter L.; Springer, Paul L.; Tupman, John R.; Wedel, John J., Jr.; Wieland, Frederick P.; hide

    1993-01-01

    Time Warp Operating System, TWOS, is special purpose computer program designed to support parallel simulation of discrete events. Complete implementation of Time Warp software mechanism, which implements distributed protocol for virtual synchronization based on rollback of processes and annihilation of messages. Supports simulations and other computations in which both virtual time and dynamic load balancing used. Program utilizes underlying resources of operating system. Written in C programming language.

  12. Using movement and intentions to understand human activity.

    PubMed

    Zacks, Jeffrey M; Kumar, Shawn; Abrams, Richard A; Mehta, Ritesh

    2009-08-01

    During perception, people segment continuous activity into discrete events. They do so in part by monitoring changes in features of an ongoing activity. Characterizing these features is important for theories of event perception and may be helpful for designing information systems. The three experiments reported here asked whether the body movements of an actor predict when viewers will perceive event boundaries. Body movements were recorded using a magnetic motion tracking system and compared with viewers' segmentation of his activity into events. Changes in movement features were strongly associated with segmentation. This was more true for fine-grained than for coarse-grained boundaries, and was strengthened when the stimulus displays were reduced from live-action movies to simplified animations. These results suggest that movement variables play an important role in the process of segmenting activity into meaningful events, and that the influence of movement on segmentation depends on the availability of other information sources.

  13. The effect of existing turbulence on stratified shear instability

    NASA Astrophysics Data System (ADS)

    Kaminski, Alexis; Smyth, William

    2017-11-01

    Ocean turbulence is an essential process governing, for example, heat uptake by the ocean. In the stably-stratified ocean interior, this turbulence occurs in discrete events driven by vertical variations of the horizontal velocity. Typically, these events have been modelled by assuming an initially laminar stratified shear flow which develops wavelike instabilities, becomes fully turbulent, and then relaminarizes into a stable state. However, in the real ocean there is always some level of turbulence left over from previous events, and it is not yet understood how this turbulence impacts the evolution of future mixing events. Here, we perform a series of direct numerical simulations of turbulent events developing in stratified shear flows that are already at least weakly turbulent. We do so by varying the amplitude of the initial perturbations, and examine the subsequent development of the instability and the impact on the resulting turbulent fluxes. This work is supported by NSF Grant OCE1537173.

  14. Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)

    1999-01-01

    Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.

  15. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.

  16. A New ’Availability-Payment’ Model for Pricing Performance-Based Logistics Contracts

    DTIC Science & Technology

    2014-04-30

    maintenance network connected to the inventory and Original Equipment Manufacturer (OEM) used in this paper. The input to the Petri net in Figure 2 is the ... contract structures. The model developed in this paper uses an affine controller to drive a discrete event simulator (Petri net) that produces availability and cost measures. The model is used to explore the optimum availability assessment

  17. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined; we conclude that six hours of training are required to teach these skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of desktop modeling and simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.

  18. Multiple Autonomous Discrete Event Controllers for Constellations

    NASA Technical Reports Server (NTRS)

    Esposito, Timothy C.

    2003-01-01

    The Multiple Autonomous Discrete Event Controllers for Constellations (MADECC) project is an effort within the National Aeronautics and Space Administration Goddard Space Flight Center's (NASA/GSFC) Information Systems Division to develop autonomous positioning and attitude control for constellation satellites. It will be accomplished using traditional control theory and advanced coordination algorithms developed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL). This capability will be demonstrated in the discrete event control test-bed located at JHU/APL. This project will be modeled for the Leonardo constellation mission, but is intended to be adaptable to any constellation mission. To develop a common software architecture, the controllers will only model very high-level responses. For instance, after determining that a maneuver must be made, the MADECC system will output a delta-V (velocity change) value. Lower-level systems must then decide which thrusters to fire, and for how long, to achieve that delta-V.

  19. Lyapunov Stability of Fuzzy Discrete Event Systems

    NASA Astrophysics Data System (ADS)

    Liu, Fuchun; Qiu, Daowen

    Fuzzy discrete event systems (FDESs), as a generalization of (crisp) discrete event systems (DESs), may better deal with problems of fuzziness, impreciseness, and subjectivity. Qiu, Cao and Ying, and Liu and Qiu developed the theory of FDESs. As a continuation of Qiu's work, this paper deals with the Lyapunov stability of FDESs, generalizing some main results for crisp DESs. We formalize the notion of reachability of fuzzy states defined on a metric space. A linear algorithm for computing the r-reachable fuzzy state set is presented. We then introduce definitions of stability and asymptotic stability in the sense of Lyapunov to guarantee convergence of the behaviors of a fuzzy automaton to the desired fuzzy states when the system engages in some illegal behaviors that can be tolerated. In particular, we present a necessary and sufficient condition for stability, and another for asymptotic stability, of FDESs.
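In the FDES formalism, fuzzy states are row vectors over crisp states, fuzzy events are matrices, and transitions are taken by max-min composition. The sketch below computes an r-step reachable fuzzy state set by brute force; the paper's linear algorithm is more refined, so treat this only as an illustration of the definitions (the example event matrix is invented):

```python
def maxmin(state, matrix):
    """Max-min composition of a fuzzy state (row vector) with a fuzzy
    event matrix -- the transition rule used in FDES models."""
    n = len(matrix[0])
    return [max(min(state[i], matrix[i][j]) for i in range(len(state)))
            for j in range(n)]

def reachable(state, events, r):
    """All fuzzy states reachable from `state` in at most r event steps
    (brute-force breadth-first expansion)."""
    frontier, seen = [tuple(state)], {tuple(state)}
    for _ in range(r):
        nxt = []
        for s in frontier:
            for m in events:
                t = tuple(maxmin(list(s), m))
                if t not in seen:
                    seen.add(t)
                    nxt.append(t)
        frontier = nxt
    return seen

# illustrative two-state fuzzy automaton with one fuzzy event
event = [[0.0, 0.8], [0.3, 0.0]]
rs = reachable([1.0, 0.0], [event], 5)
```

Because max-min composition only selects among existing membership grades, the reachable set here closes after a few steps, which is what makes reachability computations on FDESs tractable.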

  20. Conditions for extinction events in chemical reaction networks with discrete state spaces.

    PubMed

    Johnston, Matthew D; Anderson, David F; Craciun, Gheorghe; Brijder, Robert

    2018-05-01

    We study chemical reaction networks with discrete state spaces and present sufficient conditions on the structure of the network that guarantee the system exhibits an extinction event. The conditions we derive involve creating a modified chemical reaction network called a domination-expanded reaction network and then checking properties of this network. Unlike previous results, our analysis allows algorithmic implementation via systems of equalities and inequalities and suggests sequences of reactions which may lead to extinction events. We apply the results to several networks including an EnvZ-OmpR signaling pathway in Escherichia coli.
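For a small network, the discrete state space itself can be searched exhaustively: enumerate the states reachable under the reaction vectors and test whether a state in which the species of interest has died out is reachable. The toy network below is invented for illustration and is not the paper's domination-expanded construction:

```python
from collections import deque

def reachable_states(init, reactions, cap=50):
    """BFS over the discrete state space of a reaction network.

    `reactions` is a list of (need, delta) tuples of per-species counts:
    a reaction is enabled when every needed reactant count is available.
    `cap` bounds each coordinate so the search stays finite.
    """
    init = tuple(init)
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        for need, delta in reactions:
            if all(s[i] >= n for i, n in enumerate(need)):
                t = tuple(x + d for x, d in zip(s, delta))
                if all(0 <= x <= cap for x in t) and t not in seen:
                    seen.add(t)
                    queue.append(t)
    return seen

# toy network on species (A, B):  A + B -> 2B,  B -> 0
rxns = [((1, 1), (-1, 1)), ((0, 1), (0, -1))]
states = reachable_states((3, 1), rxns)
# extinction event: a reachable state in which B (the catalyst) is gone
extinct = any(b == 0 for _, b in states)
```

Here B can die out (e.g. the state (3, 0) is reachable, and no reaction is enabled there), which is exactly the kind of extinction event the paper's structural conditions are designed to detect without enumeration.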

  1. Temporal and Rate Coding for Discrete Event Sequences in the Hippocampus.

    PubMed

    Terada, Satoshi; Sakurai, Yoshio; Nakahara, Hiroyuki; Fujisawa, Shigeyoshi

    2017-06-21

    Although the hippocampus is critical to episodic memory, neuronal representations supporting this role, especially relating to nonspatial information, remain elusive. Here, we investigated rate and temporal coding of hippocampal CA1 neurons in rats performing a cue-combination task that requires the integration of sequentially provided sound and odor cues. The majority of CA1 neurons displayed sensory cue-, combination-, or choice-specific (simply, "event"-specific) elevated discharge activities, which were sustained throughout the event period. These event cells underwent transient theta phase precession at event onset, followed by sustained phase locking to the early theta phases. As a result of this unique single neuron behavior, the theta sequences of CA1 cell assemblies of the event sequences had discrete representations. These results help to update the conceptual framework for space encoding toward a more general model of episodic event representations in the hippocampus. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Discrete event command and control for networked teams with multiple missions

    NASA Astrophysics Data System (ADS)

    Lewis, Frank L.; Hudas, Greg R.; Pang, Chee Khiang; Middleton, Matthew B.; McMurrough, Christopher

    2009-05-01

    During mission execution in military applications, the TRADOC Pamphlet 525-66 Battle Command and Battle Space Awareness capabilities prescribe expectations that networked teams will perform in a reliable manner under changing mission requirements, varying resource availability and reliability, and resource faults. In this paper, a Command and Control (C2) structure is presented that allows for computer-aided execution of the networked team decision-making process, control of force resources, shared resource dispatching, and adaptability to change based on battlefield conditions. A mathematically justified networked computing environment is provided called the Discrete Event Control (DEC) Framework. DEC has the ability to provide the logical connectivity among all team participants including mission planners, field commanders, war-fighters, and robotic platforms. The proposed data management tools are developed and demonstrated on a simulation study and an implementation on a distributed wireless sensor network. The results show that the tasks of multiple missions are correctly sequenced in real-time, and that shared resources are suitably assigned to competing tasks under dynamically changing conditions without conflicts and bottlenecks.

  3. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.
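    A discrete event simulation of the kind integrated here advances a clock through an ordered event list rather than fixed time steps. A minimal, illustrative single-server sketch (not the study's orthopedic-department model; arrival times and the service time are placeholders):

    ```python
    import heapq

    def simulate_clinic(arrivals, service_time):
        """Toy single-server DES driven by an event heap of
        (time, tie_break, kind, patient_id) tuples.
        Returns each patient's waiting time (service start minus arrival)."""
        events = [(t, 0, "arrive", i) for i, t in enumerate(arrivals)]
        heapq.heapify(events)
        waiting = []          # FIFO of (arrival_time, patient_id)
        server_free = True
        waits = {}
        seq = len(events)     # tie-breaker so tuples never compare strings
        while events:
            t, _, kind, pid = heapq.heappop(events)
            if kind == "arrive":
                if server_free:
                    server_free = False
                    waits[pid] = 0.0
                    heapq.heappush(events, (t + service_time, seq, "depart", pid))
                    seq += 1
                else:
                    waiting.append((t, pid))
            else:  # departure frees the server for the next queued patient
                if waiting:
                    t_arr, nxt = waiting.pop(0)
                    waits[nxt] = t - t_arr
                    heapq.heappush(events, (t + service_time, seq, "depart", nxt))
                    seq += 1
                else:
                    server_free = True
        return [waits[i] for i in range(len(arrivals))]
    ```

    For example, three patients arriving at t = 0, 1, 2 with a 5-unit service time wait 0, 4 and 8 units respectively; an agent-based layer, as in the study, would additionally give each patient individual behavior.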

  4. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department

    PubMed Central

    Kittipittayakorn, Cholada

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606

  5. Reducing elective general surgery cancellations at a Canadian hospital

    PubMed Central

    Azari-Rad, Solmaz; Yontef, Alanna L.; Aleman, Dionne M.; Urbach, David R.

    2013-01-01

    Background In Canadian hospitals, which are typically financed by global annual budgets, overuse of operating rooms is a financial risk that is frequently managed by cancelling elective surgical procedures. It is uncertain how different scheduling rules affect the rate of elective surgery cancellations. Methods We used discrete event simulation modelling to represent perioperative processes at a hospital in Toronto, Canada. We tested the effects of the following 3 scenarios on the number of surgical cancellations: scheduling surgeons’ operating days based on their patients’ average length of stay in hospital, sequencing surgical procedures by average duration and variance, and increasing the number of post-surgical ward beds. Results The number of elective cancellations was reduced by scheduling surgeons whose patients had shorter average lengths of stay in hospital earlier in the week, sequencing shorter surgeries and those with less variance in duration earlier in the day, and by adding up to 2 additional beds to the postsurgical ward. Conclusion Discrete event simulation modelling can be used to develop strategies for improving efficiency in operating rooms. PMID:23351498

  6. Primary task event-related potentials related to different aspects of information processing

    NASA Technical Reports Server (NTRS)

    Munson, Robert C.; Horst, Richard L.; Mahaffey, David L.

    1988-01-01

    The results of two studies which investigated the relationships between cognitive processing and components of transient event-related potentials (ERPs) are presented in a task in which mental workload was manipulated. The task involved the monitoring of an array of discrete readouts for values that went out of bounds, and was somewhat analogous to tasks performed in cockpits. The ERPs elicited by the changing readouts varied with the number of readouts being monitored, the number of monitored readouts that were close to going out of bounds, and whether or not the change took a monitored readout out of bounds. Moreover, different regions of the waveform differentially reflected these effects. The results confirm the sensitivity of scalp-recorded ERPs to the cognitive processes affected by mental workload and suggest the possibility of extracting useful ERP indices of primary task performance in a wide range of man-machine settings.

  7. Ultralow dose effects in ion-beam induced grafting of polymethylmethacrylate (PMMA)

    NASA Astrophysics Data System (ADS)

    Corelli, J. C.; Steckl, A. J.; Pulver, D.; Randall, J. N.

    We have investigated the process of image enhancement in high resolution lithography through polymer grafting techniques. Sensitivity gains of 10^3-10^4 were obtained for H+, X-ray, e-beam and deep-UV irradiations. Ultralow dose effects in 60 keV H+ irradiated PMMA have been observed through the use of acrylic acid (AA) monomer grafting with irradiated PMMA. At conventional doses of 10^10 cm^-2 an inner structure of each feature is revealed. At doses of (1-2) x 10^9 cm^-2, discrete events within the exposed regions are observable. This is the first time that individual events have been observable in a lithography process and sets the upper limit in the useful sensitivity of the resist and ion lithography process. This effect is directly observable only with ions, because of their higher efficiency per particle than either photons or electrons.

  8. Optimal Discrete Event Supervisory Control of Aircraft Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan (Technical Monitor); Ray, Asok

    2004-01-01

    This report presents an application of the recently developed theory of optimal Discrete Event Supervisory (DES) control that is based on a signed real measure of regular languages. The DES control techniques are validated on an aircraft gas turbine engine simulation test bed. The test bed is implemented on a networked computer system in which two computers operate in the client-server mode. Several DES controllers have been tested for engine performance and reliability.

  9. Discrete-Event Simulation Unmasks the Quantum Cheshire Cat

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; Lippert, Thomas; Raedt, Hans De

    2017-05-01

    It is shown that discrete-event simulation accurately reproduces the experimental data of a single-neutron interferometry experiment [T. Denkmayr et al., Nat. Commun. 5, 4492 (2014)] and provides a logically consistent, paradox-free, cause-and-effect explanation of the quantum Cheshire cat effect without invoking the notion that the neutron and its magnetic moment separate. Describing the experimental neutron data using weak-measurement theory is shown to be useless for unravelling the quantum Cheshire cat effect.

  10. 40 CFR 86.1370-2007 - Not-To-Exceed test procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... that include discrete regeneration events and that send a recordable electronic signal indicating the start and end of the regeneration event, determine the minimum averaging period for each NTE event that... averaging period is used to determine whether the individual NTE event is a valid NTE event. For engines...

  11. Event Segmentation Improves Event Memory up to One Month Later

    ERIC Educational Resources Information Center

    Flores, Shaney; Bailey, Heather R.; Eisenberg, Michelle L.; Zacks, Jeffrey M.

    2017-01-01

    When people observe everyday activity, they spontaneously parse it into discrete meaningful events. Individuals who segment activity in a more normative fashion show better subsequent memory for the events. If segmenting events effectively leads to better memory, does asking people to attend to segmentation improve subsequent memory? To answer…

  12. U.S. Marine Corps Communication-Electronics School Training Process: Discrete-Event Simulation and Lean Options

    DTIC Science & Technology

    2007-12-01

    acknowledged that Continuous Improvement (CI), or Kaizen in Japanese, is practiced in some way, shape, or form by most if not all Fortune 500 companies... greater resistance in the individualistic U.S. culture. Kaizen generally involves methodical examination and testing, followed by the adoption of new... or streamlined procedures, including scrupulous measurement and changes based on statistical deviation formulas. Kaizen appears to be a perfect fit

  13. Parallel processing of general and specific threat during early stages of perception

    PubMed Central

    2016-01-01

    Differential processing of threat can consummate as early as 100 ms post-stimulus. Moreover, early perception not only differentiates threat from non-threat stimuli but also distinguishes among discrete threat subtypes (e.g. fear, disgust and anger). Combining spatial-frequency-filtered images of fear, disgust and neutral scenes with high-density event-related potentials and intracranial source estimation, we investigated the neural underpinnings of general and specific threat processing in early stages of perception. Conveyed in low spatial frequencies, fear and disgust images evoked convergent visual responses with similarly enhanced N1 potentials and dorsal visual (middle temporal gyrus) cortical activity (relative to neutral cues; peaking at 156 ms). Nevertheless, conveyed in high spatial frequencies, fear and disgust elicited divergent visual responses, with fear enhancing and disgust suppressing P1 potentials and ventral visual (occipital fusiform) cortical activity (peaking at 121 ms). Therefore, general and specific threat processing operates in parallel in early perception, with the ventral visual pathway engaged in specific processing of discrete threats and the dorsal visual pathway in general threat processing. Furthermore, selectively tuned to distinctive spatial-frequency channels and visual pathways, these parallel processes underpin dimensional and categorical threat characterization, promoting efficient threat response. These findings thus lend support to hybrid models of emotion. PMID:26412811

  14. Neural correlates of object-in-place learning in hippocampus and prefrontal cortex.

    PubMed

    Kim, Jangjin; Delcasso, Sébastien; Lee, Inah

    2011-11-23

    Hippocampus and prefrontal cortex (PFC) process spatiotemporally discrete events while maintaining goal-directed task demands. Although some studies have reported that neural activities in the two regions are coordinated, such observations have rarely been reported in an object-place paired-associate (OPPA) task in which animals must learn an object-in-place rule. In this study, we recorded single units and local field potentials simultaneously from the CA1 subfield of the hippocampus and PFC as rats learned that Object A, but not Object B, was rewarded in Place 1, but not in Place 2 (vice versa for Object B). Both hippocampus and PFC are required for normal performance in this task. PFC neurons fired in association with the regularity of the occurrence of a certain type of event independent of space, whereas neuronal firing in CA1 was spatially localized for representing a discrete place. Importantly, the differential firing patterns were observed in tandem with common learning-related changes in both regions. Specifically, once OPPA learning occurred and rats used an object-in-place strategy, (1) both CA1 and PFC neurons exhibited spatially more similar and temporally more synchronized firing patterns, (2) spiking activities in both regions were more phase locked to theta rhythms, and (3) CA1-medial PFC coherence in theta oscillation was maximal before entering a critical place for decision making. The results demonstrate differential as well as common neural dynamics between hippocampus and PFC in acquiring the OPPA task and strongly suggest that both regions form a unified functional network for processing an episodic event.

  15. Neural correlates of object-in-place learning in hippocampus and prefrontal cortex

    PubMed Central

    Kim, Jangjin; Delcasso, Sébastien; Lee, Inah

    2011-01-01

    Hippocampus and prefrontal cortex (PFC) process spatiotemporally discrete events while maintaining goal-directed task demands. Although some studies have reported that neural activities in the two regions are coordinated, such observations have rarely been reported in an object-place paired-associate (OPPA) task in which animals must learn an object-in-place rule. In this study, we recorded single units and local field potentials simultaneously from the CA1 subfield of the hippocampus and PFC as rats learned that object A, but not object B, was rewarded in place 1, but not in place 2 (vice versa for object B). Both hippocampus and PFC are required for normal performance in this task. PFC neurons fired in association with the regularity of the occurrence of a certain type of event independent of space, whereas neuronal firing in CA1 was spatially localized for representing a discrete place. Importantly, the differential firing patterns were observed in tandem with common learning-related changes in both regions. Specifically, once OPPA learning occurred and rats used an object-in-place strategy, (i) both CA1 and PFC neurons exhibited spatially more similar and temporally more synchronized firing patterns, (ii) spiking activities in both regions were more phase-locked to theta rhythms, (iii) CA1-mPFC coherence in theta oscillation was maximal before entering a critical place for decision making. The results demonstrate differential as well as common neural dynamics between hippocampus and PFC in acquiring the OPPA task and strongly suggest that both regions form a unified functional network for processing an episodic event. PMID:22114269

  16. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
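    The conservative rule at the heart of the Chandy-Misra algorithm is that a logical process may execute only events it can prove safe: those timestamped no later than the minimum clock across its input channels (null messages advance those clocks to avoid deadlock). A minimal sketch of that safety rule, with illustrative channel names and event tuples:

    ```python
    def safe_time(channel_clocks):
        """Lower bound on any timestamp this logical process could still
        receive: the minimum clock over all input channels."""
        return min(channel_clocks.values())

    def executable(pending, channel_clocks):
        """Pending (timestamp, payload) events that are provably safe to
        execute now, in timestamp order."""
        bound = safe_time(channel_clocks)
        return sorted(e for e in pending if e[0] <= bound)
    ```

    With input channel clocks {"A": 10, "B": 7}, only events timestamped at or before 7 may run; everything later must wait until channel B's clock advances, which is precisely where null messages come in.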

  17. Hybrid Architectural Framework for C4ISR and Discrete-Event Simulation (DES) to Support Sensor-Driven Model Synthesis in Real-World Scenarios

    DTIC Science & Technology

    2013-09-01

    which utilizes FTA and then loads it into a DES engine to generate simulation results. Figure 21. This simulation architecture is... While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer... C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM

  18. Cargo-mediated regulation of a rapid Rab4-dependent recycling pathway.

    PubMed

    Yudowski, Guillermo A; Puthenveedu, Manojkumar A; Henry, Anastasia G; von Zastrow, Mark

    2009-06-01

    Membrane trafficking is well known to regulate receptor-mediated signaling processes, but less is known about whether signaling receptors conversely regulate the membrane trafficking machinery. We investigated this question by focusing on the beta-2 adrenergic receptor (B2AR), a G protein-coupled receptor whose cellular signaling activity is controlled by ligand-induced endocytosis followed by recycling. We used total internal reflection fluorescence microscopy (TIR-FM) and tagging with a pH-sensitive GFP variant to image discrete membrane trafficking events mediating B2AR endo- and exocytosis. Within several minutes after initiating rapid endocytosis of B2ARs by the adrenergic agonist isoproterenol, we observed bright "puffs" of locally increased surface fluorescence intensity representing discrete Rab4-dependent recycling events. These events reached a constant frequency in the continuous presence of isoproterenol, and agonist removal produced a rapid (observed within 1 min) and pronounced (approximately twofold) increase in recycling event frequency. This regulation required receptor signaling via the cAMP-dependent protein kinase (PKA) and a specific PKA consensus site located in the carboxyl-terminal cytoplasmic tail of the B2AR itself. B2AR-mediated regulation was not restricted to this membrane cargo, however, as transferrin receptors packaged in the same population of recycling vesicles were similarly affected. In contrast, net recycling measured over a longer time interval (10 to 30 min) was not detectably regulated by B2AR signaling. These results identify rapid regulation of a specific recycling pathway by a signaling receptor cargo.

  19. Discrete Time Rescaling Theorem: Determining Goodness of Fit for Discrete Time Statistical Models of Neural Spiking

    PubMed Central

    Haslinger, Robert; Pipa, Gordon; Brown, Emery

    2010-01-01

    One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time rescaling theorem provides a goodness of fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model’s spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies upon assumptions of continuously defined time and instantaneous events. However spikes have finite width and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time rescaling theorem which analytically corrects for the effects of finite resolution. This allows us to define a rescaled time which is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting Generalized Linear Models (GLMs) to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false positive rate of the KS test and greatly increasing the reliability of model evaluation based upon the time rescaling theorem. PMID:20608868

  20. Discrete time rescaling theorem: determining goodness of fit for discrete time statistical models of neural spiking.

    PubMed

    Haslinger, Robert; Pipa, Gordon; Brown, Emery

    2010-10-01

    One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time-rescaling theorem provides a goodness-of-fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model's spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov-Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies on assumptions of continuously defined time and instantaneous events. However, spikes have finite width, and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time-rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time-rescaling theorem that analytically corrects for the effects of finite resolution. This allows us to define a rescaled time that is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting generalized linear models to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false-positive rate of the KS test and greatly increasing the reliability of model evaluation based on the time-rescaling theorem.
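    The naive rescaling that the paper shows to fail at coarse bins can be sketched as follows: between consecutive spikes, accumulate q_k = -log(1 - p_k) over the model's per-bin spike probabilities, which approximates the integrated intensity. This sketch deliberately omits the paper's two corrections (the simulation-estimated reference distribution and the analytic discrete-time theorem); the probabilities are illustrative.

    ```python
    import math

    def rescaled_times(spike_bins, p):
        """Naive discrete-time rescaling of a binned spike train.
        spike_bins: sorted bin indices containing spikes.
        p: per-bin spike probabilities from a fitted model.
        Between consecutive spikes, sum q_k = -log(1 - p_k) over the
        intervening bins (through the spiking bin); under a correct model
        with fine bins the results are approximately Exp(1)-distributed."""
        taus, last = [], None
        for k in spike_bins:
            if last is not None:
                taus.append(sum(-math.log(1.0 - pj) for pj in p[last + 1 : k + 1]))
            last = k
        return taus
    ```

    A KS test of these values against the Exp(1) distribution completes the goodness-of-fit check; the paper's point is that at coarse temporal resolution this naive version rejects even a correct model, which its two adaptations repair.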

  1. Variable selection in discrete survival models including heterogeneity.

    PubMed

    Groll, Andreas; Tutz, Gerhard

    2017-04-01

    Several variable selection procedures are available for continuous time-to-event data. However, if time is measured in a discrete way and therefore many ties occur, models for continuous time are inadequate. We propose penalized likelihood methods that perform efficient variable selection in discrete survival modeling with explicit modeling of the heterogeneity in the population. The method is based on a combination of ridge- and lasso-type penalties that are tailored to the case of discrete survival. The performance is studied in simulation studies and an application to the birth of the first child.

  2. Analyzing time-ordered event data with missed observations.

    PubMed

    Dokter, Adriaan M; van Loon, E Emiel; Fokkema, Wimke; Lameris, Thomas K; Nolet, Bart A; van der Jeugd, Henk P

    2017-09-01

    A common problem with observational datasets is that not all events of interest may be detected. For example, observing animals in the wild can be difficult when animals move, hide, or cannot be closely approached. We consider time series of events recorded in conditions where events are occasionally missed by observers or observational devices. These time series are not restricted to behavioral protocols, but can be any cyclic or recurring process where discrete outcomes are observed. Undetected events cause biased inferences on the process of interest, and statistical analyses are needed that can identify and correct the compromised detection processes. Missed observations in time series lead to observed time intervals between events at multiples of the true inter-event time, which conveys information on their detection probability. We derive the theoretical probability density function for observed intervals between events that includes a probability of missed detection. Methodology and software tools are provided for analysis of event data with potential observation bias and its removal. The methodology was applied to simulation data and a case study of defecation rate estimation in geese, which is commonly used to estimate their digestive throughput and energetic uptake, or to calculate goose usage of a feeding site from dropping density. Simulations indicate that at a moderate chance of missing arrival events (p = 0.3), uncorrected arrival intervals were biased upward by up to a factor of 3, while parameter values corrected for missed observations were within 1% of their true simulated value. A field case study shows that not accounting for missed observations leads to substantial underestimates of the true defecation rate in geese, and spurious rate differences between sites, which are introduced by differences in observational conditions. These results show that the derived methodology can be used to effectively remove observational biases in time-ordered event data.
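    If each true event is missed independently with probability p_miss, the observed interval between two detections spans a geometric number of true inter-event times, which is one elementary way to see why uncorrected intervals are biased upward. A sketch of that simplified relationship (our toy independence model, not the paper's full probability density function):

    ```python
    def observed_interval_pmf(p_miss, k):
        """P(the next *detected* event is the k-th subsequent true event),
        i.e. the observed interval equals k true inter-event times:
        a geometric distribution with success probability 1 - p_miss."""
        return (1.0 - p_miss) * p_miss ** (k - 1)

    def mean_interval_inflation(p_miss):
        """Expected observed interval divided by the true inter-event time:
        the geometric mean count 1 / (1 - p_miss)."""
        return 1.0 / (1.0 - p_miss)
    ```

    At p_miss = 0.3, 70% of observed intervals equal the true interval, 21% equal twice it, and the mean observed interval is inflated by a factor of about 1.43; the paper's method recovers such detection probabilities from the multimodal interval distribution itself.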

  3. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.

  4. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low-level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  5. Budget impact analysis of thrombolysis for stroke in Spain: a discrete event simulation model.

    PubMed

    Mar, Javier; Arrospide, Arantzazu; Comas, Mercè

    2010-01-01

    Thrombolysis within the first 3 hours after the onset of symptoms of a stroke has been shown to be a cost-effective treatment because treated patients are 30% more likely than nontreated patients to have no residual disability. The objective of this study was to calculate by means of a discrete event simulation model the budget impact of thrombolysis in Spain. The budget impact analysis was based on stroke incidence rates and the estimation of the prevalence of stroke-related disability in Spain and its translation to hospital and social costs. A discrete event simulation model was constructed to represent the flow of patients with stroke in Spain. If 10% of patients with stroke from 2000 to 2015 would receive thrombolytic treatment, the prevalence of dependent patients in 2015 would decrease from 149,953 to 145,922. For the first 6 years, the cost of intervention would surpass the savings. Nevertheless, the number of cases in which patient dependency was avoided would steadily increase, and after 2006 the cost savings would be greater, with a widening difference between the cost of intervention and the cost of nonintervention, until 2015. The impact of thrombolysis on society's health and social budget indicates a net benefit after 6 years, and the improvement in health grows continuously. The validation of the model demonstrates the adequacy of the discrete event simulation approach in representing the epidemiology of stroke to calculate the budget impact.

  6. Comparisons of discrete and integrative sampling accuracy in estimating pulsed aquatic exposures.

    PubMed

    Morrison, Shane A; Luttbeg, Barney; Belden, Jason B

    2016-11-01

    Most current-use pesticides have short half-lives in the water column, and thus the most relevant exposure scenarios for many aquatic organisms are pulsed exposures. Quantifying exposure using discrete water samples may not be accurate, as few studies are able to sample frequently enough to accurately determine time-weighted average (TWA) concentrations of short aquatic exposures. Integrative sampling methods that continuously sample freely dissolved contaminants over time intervals (such as integrative passive samplers) have been demonstrated to be a promising measurement technique. We conducted several modeling scenarios to test the assumption that integrative methods may require far fewer samples for accurate estimation of peak 96-h TWA concentrations. We compared the accuracies of discrete point samples and integrative samples while varying sampling frequencies across a range of contaminant water half-lives (t50 = 0.5, 2, and 8 d). Differences in the predictive accuracy of discrete point samples and integrative samples were greatest at low sampling frequencies. For example, when the half-life was 0.5 d, discrete point samples required 7 sampling events to ensure median values > 50% and no sampling events reporting highly inaccurate results (defined as < 10% of the true 96-h TWA). Across all water half-lives investigated, integrative sampling required only two samples to prevent highly inaccurate results and measurements resulting in median values > 50% of the true concentration. Regardless, the need for integrative sampling diminished as water half-life increased. For an 8-d water half-life, two discrete samples produced accurate estimates and median values greater than those obtained for two integrative samples. Overall, integrative methods are the more accurate method for monitoring contaminants with short water half-lives due to the reduced frequency of extreme values, especially with uncertainties around the timing of pulsed events. However, the acceptability of discrete sampling methods for providing accurate concentration measurements increases with increasing aquatic half-lives. Copyright © 2016 Elsevier Ltd. All rights reserved.
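
    The 96-h TWA comparison can be illustrated numerically. The sketch below, under the assumption of a single exponentially decaying pulse, compares discrete grab-sample estimates against the exact TWA; the half-life mirrors the 0.5-d scenario but the concentration values are invented.

```python
import math

# Hypothetical illustration: estimate the 96-h time-weighted average (TWA)
# concentration of an exponentially decaying pulse from n evenly spaced
# discrete grab samples, and compare with the exact TWA. The initial
# concentration and half-life are assumed values.

def true_twa(c0, half_life_h, window_h=96.0):
    k = math.log(2) / half_life_h
    return c0 * (1 - math.exp(-k * window_h)) / (k * window_h)

def discrete_twa(c0, half_life_h, n_samples, window_h=96.0):
    k = math.log(2) / half_life_h
    # Grab samples at the midpoints of n equal sub-intervals.
    times = [(i + 0.5) * window_h / n_samples for i in range(n_samples)]
    return sum(c0 * math.exp(-k * t) for t in times) / n_samples

c0, t_half = 10.0, 12.0          # 0.5-d water half-life, in hours
exact = true_twa(c0, t_half)
est2 = discrete_twa(c0, t_half, 2)
est7 = discrete_twa(c0, t_half, 7)
# More frequent discrete sampling brings the estimate closer to the exact TWA.
```

    With only two grab samples most of the early, high-concentration portion of the pulse is missed, which is exactly why the record finds integrative samplers advantageous for short half-lives.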

  7. Fault tree analysis: NiH2 aerospace cells for LEO mission

    NASA Technical Reports Server (NTRS)

    Klein, Glenn C.; Rash, Donald E., Jr.

    1992-01-01

    The Fault Tree Analysis (FTA) is one of several reliability analyses or assessments applied to battery cells to be utilized in typical Electric Power Subsystems for spacecraft in low Earth orbit missions. FTA is generally the process of reviewing and analytically examining a system or equipment in such a way as to emphasize the lower level fault occurrences which directly or indirectly contribute to the major fault or top level event. This qualitative FTA addresses the potential of occurrence for five specific top level events: hydrogen leakage through either discrete leakage paths or through pressure vessel rupture; and four distinct modes of performance degradation - high charge voltage, suppressed discharge voltage, loss of capacity, and high pressure.
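
    Although the cited FTA is qualitative, the gate logic it rests on quantifies naturally: lower-level fault probabilities combine through AND/OR gates up to the top-level event. A minimal sketch, assuming independent basic events with invented probabilities (not values from the NiH2 analysis):

```python
# Hypothetical quantitative companion to a qualitative fault tree: combine
# basic-event probabilities through AND/OR gates. All probabilities are
# illustrative and assume independent basic events.

def p_or(*ps):
    """P(at least one of the events) for independent events."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """P(all of the events) for independent events."""
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

# Top event: hydrogen leakage via a discrete leak path OR vessel rupture.
p_leak_path = p_and(1e-3, 1e-2)   # hypothetical: seal defect AND screening miss
p_rupture = 1e-6                  # hypothetical rupture probability
p_top = p_or(p_leak_path, p_rupture)
```

    The same two gate functions cover each of the five top-level events the record lists; only the tree of basic events underneath changes.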

  8. Exploration Supply Chain Simulation

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Exploration Supply Chain Simulation project was chartered by the NASA Exploration Systems Mission Directorate to develop a software tool, with proper data, to quantitatively analyze supply chains for future program planning. This tool is a discrete-event simulation that uses the basic supply chain concepts of planning, sourcing, making, delivering, and returning. This supply chain perspective is combined with other discrete or continuous simulation factors. Discrete resource events (such as launch or delivery reviews) are represented as organizational functional units. Continuous resources (such as civil service or contractor program functions) are defined as enabling functional units. Concepts of fixed and variable costs are included in the model to allow the discrete events to interact with cost calculations. The definition file is intrinsic to the model, but a blank start can be initiated at any time. The current definition file represents the Orion Ares I crew launch vehicle. Parameters stretch from Kennedy Space Center across and into other program entities (Michoud Assembly Facility, Alliant Techsystems, Stennis Space Center, Johnson Space Center, etc.), though these will only gain detail as the file continues to evolve. The Orion Ares I file definition in the tool continues to evolve, and analysis from this tool is expected in 2008. This is the first application of such business-driven modeling to a NASA/government aerospace-contractor endeavor.

  9. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis with error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff were 13.99%, 19.48% and 12.28%, respectively. Meanwhile, the flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
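
    For products and quotients of independent quantities, the law of propagation of uncertainties reduces to adding relative uncertainties in quadrature. A minimal sketch with assumed component values (not the study's figures):

```python
import math

# Sketch of uncertainty propagation for a product/quotient quantity:
# relative uncertainties of independent error sources add in quadrature.
# The component values below are assumed, not the study's results.

def combined_relative_uncertainty(*components):
    """components: relative uncertainties (as fractions) of independent sources."""
    return math.sqrt(sum(c * c for c in components))

# e.g. an event load L = C * V inherits uncertainty from concentration and volume
u_conc = 0.10   # assumed 10% relative uncertainty in the EMC
u_vol = 0.07    # assumed 7% relative uncertainty in the event volume
u_load = combined_relative_uncertainty(u_conc, u_vol)
```

    Because the terms add in quadrature, the largest single source (here the concentration) dominates the combined uncertainty.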

  10. Women's bleeding patterns: ability to recall and predict menstrual events. World Health Organization Task Force on Psychosocial Research in Family, Planning, Special Programme of Research, Development and Research Training in Human Reproduction.

    PubMed

    1981-01-01

    Objective records of the occurrence of menstrual bleeding were compared with women's subjective assessments of the timing and duration of these events. The number of days a woman experienced bleeding during each episode was relatively constant; however, the length of the bleeding episode varied greatly among the 13 cultures studied. A greater understanding of menstrual patterns is possible if the pattern is seen as a succession of discrete events rather than as a whole. A more careful use of terminology relating to these discrete events would provide greater understanding of menstruation for the woman concerned and those advising her. The methodology employed in the collection of data about menstrual events among illiterate women is described, and suggestions are given as to how such information can be most efficiently obtained.

  11. Hybrid stochastic simplifications for multiscale gene networks.

    PubMed

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-09-07

    Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene network dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3] which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach.
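
    The pure jump processes that such hybrid simplifications start from are typically simulated exactly with Gillespie's algorithm, whose cost scales with the number of jumps. A minimal sketch for a birth-death mRNA model with illustrative rates:

```python
import random

# Minimal Gillespie (exact jump-process) simulation of a birth-death mRNA
# model: transcription at constant rate k_tx, degradation at rate k_deg per
# molecule. Rates are illustrative, not from the cited networks.

def gillespie_birth_death(k_tx=10.0, k_deg=1.0, t_end=50.0, seed=1):
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        a_tx, a_deg = k_tx, k_deg * n       # reaction propensities
        a_total = a_tx + a_deg
        t += rng.expovariate(a_total)       # exponential time to next jump
        if t > t_end:
            return n
        if rng.random() * a_total < a_tx:   # choose which reaction fires
            n += 1
        else:
            n -= 1

n_final = gillespie_birth_death()
# Stationary copy number is Poisson with mean k_tx / k_deg = 10.
```

    Every jump is simulated individually, which is exactly the cost the hybrid and averaged simplifications in the record aim to avoid for fast, high-copy-number species.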

  12. Population density approach for discrete mRNA distributions in generalized switching models for stochastic gene expression.

    PubMed

    Stinchcombe, Adam R; Peskin, Charles S; Tranchina, Daniel

    2012-06-01

    We present a generalization of a population density approach for modeling and analysis of stochastic gene expression. In the model, the gene of interest fluctuates stochastically between an inactive state, in which transcription cannot occur, and an active state, in which discrete transcription events occur; and the individual mRNA molecules are degraded stochastically in an independent manner. This sort of model, in its simplest form with exponential dwell times, has been used to explain experimental estimates of the discrete distribution of random mRNA copy number. In our generalization, the random dwell times in the inactive and active states, T_{0} and T_{1}, respectively, are independent random variables drawn from any specified distributions. Consequently, the probability per unit time of switching out of a state depends on the time since entering that state. Our method exploits a connection between the fully discrete random process and a related continuous process. We present numerical methods for computing steady-state mRNA distributions and an analytical derivation of the mRNA autocovariance function. We find that empirical estimates of the steady-state mRNA probability mass function from Monte Carlo simulations of laboratory data do not allow one to distinguish between underlying models with exponential and nonexponential dwell times in some relevant parameter regimes. However, in these parameter regimes and where the autocovariance function has negative lobes, the autocovariance function disambiguates the two types of models. Our results strongly suggest that temporal data beyond the autocovariance function is required in general to characterize gene switching.
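
    The generalized switching idea can be sketched directly: draw the dwell times T_{0} and T_{1} from arbitrary distributions (gamma here, in place of exponential) and check that the long-run fraction of time the gene is active approaches E[T_{1}]/(E[T_{0}]+E[T_{1}]). All parameters are assumed for the sketch.

```python
import random

# Sketch of a generalized (non-exponential) switching model: the gene
# alternates between inactive and active states with gamma-distributed dwell
# times T0 and T1. Shape/scale parameters below are illustrative only.

def fraction_active(shape0, scale0, shape1, scale1, n_cycles=20000, seed=7):
    rng = random.Random(seed)
    t_off = sum(rng.gammavariate(shape0, scale0) for _ in range(n_cycles))
    t_on = sum(rng.gammavariate(shape1, scale1) for _ in range(n_cycles))
    return t_on / (t_on + t_off)

frac = fraction_active(2.0, 1.0, 3.0, 0.5)   # E[T0] = 2.0, E[T1] = 1.5
# Long-run active fraction should approach 1.5 / 3.5.
```

    With gamma dwell times the hazard of leaving a state depends on the time since entering it, which is the memory property exponential dwell times cannot express.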

  13. Energy thresholds of discrete breathers in thermal equilibrium and relaxation processes.

    PubMed

    Ming, Yi; Ling, Dong-Bo; Li, Hui-Min; Ding, Ze-Jun

    2017-06-01

    So far, only the energy thresholds of single discrete breathers in nonlinear Hamiltonian systems have been analytically obtained. In this work, the energy thresholds of discrete breathers in thermal equilibrium and the energy thresholds of long-lived discrete breathers which can remain after a long time relaxation are analytically estimated for nonlinear chains. These energy thresholds are size dependent. The energy thresholds of discrete breathers in thermal equilibrium are the same as the previous analytical results for single discrete breathers. The energy thresholds of long-lived discrete breathers in relaxation processes are different from the previous results for single discrete breathers but agree well with the published numerical results known to us. Because real systems are either in thermal equilibrium or in relaxation processes, the obtained results could be important for experimental detection of discrete breathers.

  14. It's Deja Vu All over Again: Using Multiple-Spell Discrete-Time Survival Analysis.

    ERIC Educational Resources Information Center

    Willett, John B.; Singer, Judith D.

    1995-01-01

    The multiple-spell discrete-time survival analysis method is introduced and illustrated using longitudinal data on exit from and reentry into the teaching profession. The method is applicable to many educational problems involving the sequential occurrence of disparate events or episodes. (SLD)
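
    Discrete-time survival analysis rests on a person-period data layout: each person contributes one record per time period at risk, with the event indicator set in the period the spell ends, and the hazard is then fit by logistic regression on period (and spell) predictors. A minimal sketch of the expansion, with hypothetical spells:

```python
# Sketch of the person-period expansion used in discrete-time survival
# analysis. The input spells are hypothetical (e.g. teachers and their
# years until exiting the profession); censored spells get no event row.

def person_period(spells):
    """spells: list of (person_id, duration_in_periods, event_observed)."""
    rows = []
    for pid, duration, observed in spells:
        for period in range(1, duration + 1):
            event = 1 if (observed and period == duration) else 0
            rows.append((pid, period, event))
    return rows

rows = person_period([("A", 3, True), ("B", 2, False)])
# Person A: three at-risk records, event fires in period 3.
# Person B: censored after period 2, so no event row.
```

    For the multiple-spell case, the same expansion is repeated per spell (exit, reentry, exit again), with a spell index added as a predictor.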

  15. Eye Movements Reveal the Influence of Event Structure on Reading Behavior

    ERIC Educational Resources Information Center

    Swets, Benjamin; Kurby, Christopher A.

    2016-01-01

    When we read narrative texts such as novels and newspaper articles, we segment information presented in such texts into discrete events, with distinct boundaries between those events. But do our eyes reflect this event structure while reading? This study examines whether eye movements during the reading of discourse reveal how readers respond…

  16. 40 CFR 1042.515 - Test procedures related to not-to-exceed standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (g) For engines equipped with emission controls that include discrete regeneration events, if a regeneration event occurs during the NTE test, the averaging period must be at least as long as the time between the events multiplied by the number of full regeneration events within the sampling period. This...
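
    The rule in paragraph (g) is a simple product. A worked illustration with invented numbers:

```python
# Hypothetical illustration of the paragraph (g) rule: the averaging period
# must be at least the time between regeneration events multiplied by the
# number of full regeneration events within the sampling period.

def min_averaging_period(time_between_events_h, n_full_events):
    return time_between_events_h * n_full_events

# e.g. regeneration every 10 hours, 2 full events in the sampling period
required = min_averaging_period(10.0, 2)   # at least 20 hours
```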

  17. 40 CFR 1042.515 - Test procedures related to not-to-exceed standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (g) For engines equipped with emission controls that include discrete regeneration events, if a regeneration event occurs during the NTE test, the averaging period must be at least as long as the time between the events multiplied by the number of full regeneration events within the sampling period. This...

  18. 40 CFR 1042.515 - Test procedures related to not-to-exceed standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (g) For engines equipped with emission controls that include discrete regeneration events, if a regeneration event occurs during the NTE test, the averaging period must be at least as long as the time between the events multiplied by the number of full regeneration events within the sampling period. This...

  19. 40 CFR 1042.515 - Test procedures related to not-to-exceed standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (g) For engines equipped with emission controls that include discrete regeneration events, if a regeneration event occurs during the NTE test, the averaging period must be at least as long as the time between the events multiplied by the number of full regeneration events within the sampling period. This...

  20. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPI's), such as average waiting times according to acuity, average stay times and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the patient's attention model.
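
    Waiting-time KPIs of this kind can be estimated even with a single-server sketch. The version below uses the Lindley recursion rather than a full event calendar, and the arrival and service rates are assumed values, not the emergency room's:

```python
import random

# Single-server waiting-time sketch via the Lindley recursion: each patient
# starts service at the later of their arrival time and the moment the
# server frees up. Arrival/service rates are illustrative (per hour).

def average_wait(arrival_rate=4.0, service_rate=5.0, n_patients=50000, seed=3):
    rng = random.Random(seed)
    t_arrival, server_free, total_wait = 0.0, 0.0, 0.0
    for _ in range(n_patients):
        t_arrival += rng.expovariate(arrival_rate)
        start = max(t_arrival, server_free)
        total_wait += start - t_arrival
        server_free = start + rng.expovariate(service_rate)
    return total_wait / n_patients

w = average_wait()
# For an M/M/1 queue the theoretical mean queue wait is
# lambda / (mu * (mu - lambda)) = 4 / (5 * 1) = 0.8 hours.
```

    A real ER model layers acuity classes, multiple servers, and priority rules on top of this, but the KPI estimation principle is the same.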

  1. Can discrete event simulation be of use in modelling major depression?

    PubMed Central

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-01-01

    Background Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real world disease progression, as well as its economic implications, as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes. PMID:17147790
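
    The history-dependence argument can be made concrete: if the relapse hazard grows with the number of prior episodes, a DES simply carries the episode count as a patient attribute, whereas a memoryless Markov chain needs a separate health state per episode count. A sketch with invented rates:

```python
import random

# Sketch of history-dependent event sampling in a DES: the relapse rate
# rises with each prior episode, so the time to the next episode depends on
# the patient's history. All rates are illustrative, not fitted values.

def simulate_relapses(base_rate=0.2, per_episode=0.1, horizon_years=10.0, seed=5):
    rng = random.Random(seed)
    t, episodes = 0.0, 0
    while True:
        rate = base_rate + per_episode * episodes   # history-dependent hazard
        t += rng.expovariate(rate)                  # time to next episode
        if t > horizon_years:
            return episodes
        episodes += 1

counts = [simulate_relapses(seed=s) for s in range(1000)]
mean_episodes = sum(counts) / len(counts)
```

    One patient attribute replaces what would otherwise be a ladder of "1 prior episode", "2 prior episodes", ... health states in a Markov model.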

  2. Can discrete event simulation be of use in modelling major depression?

    PubMed

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden, and adequate modelling requires a framework designed to depict real world disease progression, as well as its economic implications, as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable for tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states, which would render the analysis entirely too complex. We show that DES resolves these weaknesses and that its flexibility allows patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients' attitude towards treatment, together with any disease-related events (adverse events, suicide attempts, etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.

  3. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation is real, or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  4. Simulating Operations at a Spaceport

    NASA Technical Reports Server (NTRS)

    Nevins, Michael R.

    2007-01-01

    SPACESIM is a computer program for detailed simulation of operations at a spaceport. SPACESIM is being developed to greatly improve existing spaceports and to aid in designing, building, and operating future spaceports, given that there is a worldwide trend in spaceport operations from very expensive, research-oriented launches to more frequent commercial launches. From an operational perspective, future spaceports are expected to resemble current airports and seaports, for which it is necessary to resolve issues of safety, security, efficient movement of machinery and people, cost effectiveness, timeliness, and maximizing effectiveness in utilization of resources. Simulations can be performed, for example, to (1) simultaneously analyze launches of reusable and expendable rockets and identify bottlenecks arising from competition for limited resources or (2) perform what-if scenario analyses to identify optimal scenarios prior to making large capital investments. SPACESIM includes an object-oriented discrete-event-simulation engine. (Discrete-event simulation has been used to assess processes at modern seaports.) The simulation engine is built upon the Java programming language for maximum portability. Extensible Markup Language (XML) is used for storage of data to enable industry-standard interchange of data with other software. A graphical user interface facilitates creation of scenarios and analysis of data.

  5. Music as Environment: An Ecological and Biosemiotic Approach

    PubMed Central

    Reybrouck, Mark

    2014-01-01

    This paper provides an attempt to conceive of music in terms of a sounding environment. Starting from a definition of music as a collection of vibrational events, it introduces the distinction between discrete-symbolic representations as against analog-continuous representations of the sounds. The former makes it possible to conceive of music in terms of a Humboldt system, the latter in terms of an experiential approach. Both approaches, further, are not opposed to each other, but are complementary to some extent. There is, however, a distinction to be drawn between the bottom-up approach to auditory processing of environmental sounds and music, which is continuous and proceeding in real time, as against the top-down approach, which is proceeding at a level of mental representation by applying discrete symbolic labels to vibrational events. The distinction is discussed against the background of phylogenetic and ontogenetic claims, with a major focus on the innate auditory capabilities of the fetus and neonate and the gradual evolution from mere sensory perception of sound to sense-making and musical meaning. The latter, finally, is elaborated on the basis of the operational concepts of affordance and functional tone, thus bringing together some older contributions from ecology and biosemiotics. PMID:25545707

  6. Discrete Event Modeling and Massively Parallel Execution of Epidemic Outbreak Phenomena

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Seal, Sudip K

    2011-01-01

    In complex phenomena such as epidemiological outbreaks, the intensity of inherent feedback effects and the significant role of transients in the dynamics make simulation the only effective method for proactive, reactive or post-facto analysis. The spatial scale, runtime speed, and behavioral detail needed in detailed simulations of epidemic outbreaks make it necessary to use large-scale parallel processing. Here, an optimistic parallel execution of a new discrete event formulation of a reaction-diffusion simulation model of epidemic propagation is presented to dramatically increase the fidelity and speed with which epidemiological simulations can be performed. Rollback support needed during optimistic parallel execution is achieved by combining reverse computation with a small amount of incremental state saving. Parallel speedup of over 5,500 and other runtime performance metrics of the system are observed with weak-scaling execution on a small (8,192-core) Blue Gene / P system, while scalability with a weak-scaling speedup of over 10,000 is demonstrated on 65,536 cores of a large Cray XT5 system. Scenarios representing large population sizes exceeding several hundreds of millions of individuals in the largest cases are successfully exercised to verify model scalability.

  7. In-situ acoustic signature monitoring in additive manufacturing processes

    NASA Astrophysics Data System (ADS)

    Koester, Lucas W.; Taheri, Hossein; Bigelow, Timothy A.; Bond, Leonard J.; Faierson, Eric J.

    2018-04-01

    Additive manufacturing is a rapidly maturing process for the production of complex metallic, ceramic, polymeric, and composite components. The processes used are numerous, and with the complex geometries involved this can make quality control and standardization of the process and inspection difficult. Acoustic emission measurements have been used previously to monitor a number of processes including machining and welding. The authors have identified acoustic signature measurement as a potential means of monitoring metal additive manufacturing processes using process noise characteristics and those discrete acoustic emission events characteristic of defect growth, including cracks and delamination. Results of acoustic monitoring for a metal additive manufacturing process (directed energy deposition) are reported. The work investigated correlations between acoustic emissions and process noise with variations in machine state and deposition parameters, and provided proof of concept data that such correlations do exist.

  8. Hydrological disposition of flash flood and debris flows events in an Alpine watershed in Austria

    NASA Astrophysics Data System (ADS)

    Prenner, David; Kaitna, Roland; Mostbauer, Karin; Hrachowitz, Markus

    2017-04-01

    Debris flows and flash floods including intensive bedload transport represent severe hazards in the Alpine environment of Austria. For neither of these processes are explicit rainfall thresholds available, even for specific regions. This may be due to insufficient data on the temporal and spatial variation of precipitation, but probably also due to variations in the geomorphic and hydrological disposition of a watershed to produce such processes in the course of a rainfall event. In this contribution we investigate the importance of the hydrological system state for triggering debris flows and flash floods in the Ill/Suggadin watershed (500 km2), Austria, by analyzing the effects of dynamics in system state variables such as soil moisture, snow pack, or ground water level. The analysis is based on a semi-distributed conceptual rainfall-runoff model, spatially discretizing the watershed according to the available precipitation observations, elevation, topographic considerations and land cover. Input data are available from six weather stations on a daily basis reaching back to 1947. A Thiessen polygon decomposition results in six individual precipitation zones with a maximum area of about 130 km2. Elevation-specific behavior of temperature and precipitation is captured through an elevation-resolved computation every 200 m. Spatial heterogeneity is considered by distinct hydrological response units for bare rock, forest, grassland, and riparian zone. To reduce numerical smearing in the hydrological results, the implicit Euler scheme was used to discretize the balance equations. For model calibration we utilized runoff hydrographs, snow cover data as well as prior parameter and process constraints. The obtained hydrological output variables are linked to documented flash flood and debris flow events by means of a multivariate logistic regression. We present a summary of the daily hydrological disposition towards a flash flood or debris flow event in each precipitation zone of the Ill/Suggadin region over almost 65 years. Furthermore, we provide an interpretation of the observed hydrological trigger patterns and a frequency ranking. The outcomes of this study should lead to improved forecasting and differentiation of the trigger conditions leading to debris flows and flash floods.
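
    The multivariate logistic regression step maps hydrological state variables to a daily trigger probability. A minimal sketch with hypothetical predictors and coefficients (not the fitted values):

```python
import math

# Sketch of a multivariate logistic regression disposition model: map
# hydrological state variables (soil moisture, snowmelt here) to a daily
# probability of a debris-flow/flash-flood day. All coefficients and
# variable choices below are hypothetical, not the study's fitted model.

def trigger_probability(soil_moisture, snowmelt, b0=-4.0, b1=3.0, b2=2.0):
    z = b0 + b1 * soil_moisture + b2 * snowmelt
    return 1.0 / (1.0 + math.exp(-z))

p_dry = trigger_probability(0.2, 0.0)   # low hydrological disposition
p_wet = trigger_probability(0.9, 0.5)   # high hydrological disposition
```

    The same rainfall amount then lands on very different trigger probabilities depending on the watershed's state, which is the central point of the record.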

  9. An accelerated algorithm for discrete stochastic simulation of reaction-diffusion systems using gradient-based diffusion and tau-leaping.

    PubMed

    Koh, Wonryull; Blackwell, Kim T

    2011-04-21

    Stochastic simulation of reaction-diffusion systems enables the investigation of stochastic events arising from the small numbers and heterogeneous distribution of molecular species in biological cells. Stochastic variations in intracellular microdomains and in diffusional gradients play a significant part in the spatiotemporal activity and behavior of cells. Although an exact stochastic simulation that simulates every individual reaction and diffusion event gives a most accurate trajectory of the system's state over time, it can be too slow for many practical applications. We present an accelerated algorithm for discrete stochastic simulation of reaction-diffusion systems designed to improve the speed of simulation by reducing the number of time-steps required to complete a simulation run. This method is unique in that it employs two strategies that have not been incorporated in existing spatial stochastic simulation algorithms. First, diffusive transfers between neighboring subvolumes are based on concentration gradients. This treatment necessitates sampling of only the net or observed diffusion events from higher to lower concentration gradients rather than sampling all diffusion events regardless of local concentration gradients. Second, we extend the non-negative Poisson tau-leaping method that was originally developed for speeding up nonspatial or homogeneous stochastic simulation algorithms. This method calculates each leap time in a unified step for both reaction and diffusion processes while satisfying the leap condition that the propensities do not change appreciably during the leap and ensuring that leaping does not cause molecular populations to become negative. Numerical results are presented that illustrate the improvement in simulation speed achieved by incorporating these two new strategies.
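
    A tau-leap replaces event-by-event simulation with Poisson-distributed reaction counts over a leap of length tau. The sketch below shows the idea for a simple birth-death process; it clamps populations at zero rather than implementing the cited non-negative variant, and all rates are illustrative:

```python
import math
import random

# Tau-leaping sketch for a birth-death process: instead of simulating each
# reaction event, fire Poisson(rate * tau) reactions per leap. Clamping at
# zero is a simplification of the non-negative variant; rates are assumed.

def poisson(rng, lam):
    # Knuth's method; adequate for the small lam * tau values used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap(n0=50, k_birth=10.0, k_death=0.2, tau=0.1, n_steps=500, seed=11):
    rng = random.Random(seed)
    n = n0
    for _ in range(n_steps):
        births = poisson(rng, k_birth * tau)
        deaths = poisson(rng, k_death * n * tau)
        n = max(0, n + births - deaths)   # clamp so the population stays >= 0
    return n

n_final = tau_leap()
# Steady-state mean population is k_birth / k_death = 50.
```

    Each leap replaces on the order of k_birth * tau + k_death * n * tau individual events with two Poisson draws, which is the source of the speedup; the leap condition requires that propensities change little within tau.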

  10. Modeling effectiveness of management practices for flood mitigation using GIS spatial analysis functions in Upper Ciliwung watershed

    NASA Astrophysics Data System (ADS)

    Darma Tarigan, Suria

    2016-01-01

    Flooding is caused by excessive rainfall flowing downstream as cumulative surface runoff. A flooding event is the result of complex interaction among natural system components such as rainfall events, land use, soil, topography, and channel characteristics. Modeling flooding events as the outcome of the interaction of those components is a central theme in watershed management. Such models are usually used to test the performance of various management practices in flood mitigation. There are various types of management practices for flood mitigation, including vegetative and structural practices. Existing hydrological models such as SWAT and HEC-HMS have limited ability to accommodate discrete management practices such as infiltration wells, small farm reservoirs, and silt pits, owing to the lumped structure of these models. The aim of this research is to use the raster spatial analysis functions of a Geo-Information System (RGIS-HM) to model flooding events in the Ciliwung watershed and to simulate the impact of discrete management practices on surface runoff reduction. The model was validated using data from the Ciliwung watershed flood event of 29 January 2004, for which hourly hydrograph and rainfall data were available. Validation gave good results, with a Nash-Sutcliffe efficiency of 0.8. We also compared the RGIS-HM with the Netlogo Hydrological Model (NL-HM). The RGIS-HM has similar capability to the NL-HM in simulating discrete management practices at watershed scale.

  11. Discretization of Continuous Time Discrete Scale Invariant Processes: Estimation and Spectra

    NASA Astrophysics Data System (ADS)

    Rezakhah, Saeid; Maleki, Yasaman

    2016-07-01

    By imposing a flexible sampling scheme, we obtain a discretization of continuous-time discrete scale invariant (DSI) processes that is itself a subsidiary discrete-time DSI process. Then, by introducing a simple random measure, we construct a second continuous-time DSI process that provides a proper approximation of the first one. This enables us to establish a bilateral relation between the covariance functions of the subsidiary process and the new continuous-time process. The time-varying spectral representation of such a continuous-time DSI process is characterized, and its spectrum is estimated. In addition, a new method for estimating the time-dependent Hurst parameter of such processes is provided, which gives a more accurate estimation. The performance of this estimation method is studied via simulation. Finally, the method is applied to real data from the S&P 500 and Dow Jones indices for selected periods.

  12. Bayesian Inference for Signal-Based Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.

    2015-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http

  13. Adaptive Neural Network-Based Event-Triggered Control of Single-Input Single-Output Nonlinear Discrete-Time Systems.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-01-01

    This paper presents a novel adaptive neural network (NN) control of single-input and single-output uncertain nonlinear discrete-time systems with event-sampled NN inputs. In this control scheme, the feedback signals are transmitted, and the NN weights are tuned, in an aperiodic manner at the event-sampled instants. After reviewing the NN approximation property with event-sampled inputs, an adaptive state estimator (SE), consisting of linearly parameterized NNs, is utilized to approximate the unknown system dynamics in an event-sampled context. The SE is viewed as a model, and its approximated dynamics and the state vector, during any two events, are utilized for the event-triggered controller design. An adaptive event-trigger condition is derived by using both the estimated NN weights and a dead-zone operator to determine the event-sampling instants. This condition both facilitates the NN approximation and reduces the transmission of feedback signals. The ultimate boundedness of both the NN weight estimation error and the system state vector is demonstrated through the Lyapunov approach. As expected, during an initial online learning phase, events are observed more frequently. Over time, with the convergence of the NN weights, the inter-event times increase, thereby lowering the number of triggered events. These claims are illustrated through the simulation results.
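    The flavor of event-triggered feedback can be seen in a toy example. This sketch replaces the paper's adaptive NN machinery with a known scalar plant and a static trigger with a dead-zone; the plant, gain, and threshold values are invented for illustration:

```python
# Event-triggered feedback for a scalar plant x_{k+1} = a*x_k + u_k.
# The controller only sees the last transmitted state x_hat; a transmission
# (event) occurs when the gap |x - x_hat| exceeds the trigger threshold.
a, gain, sigma, deadzone = 1.2, 0.9, 0.1, 1e-3
x, x_hat, events, steps = 5.0, 5.0, 0, 50

for _ in range(steps):
    if abs(x - x_hat) > sigma * abs(x) + deadzone:  # trigger condition
        x_hat = x                # event: transmit the current state
        events += 1
    u = -gain * x_hat            # control uses the event-sampled state
    x = a * x + u

print(events, round(abs(x), 4))  # fewer events than steps, yet x is regulated
```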

  14. The hippocampus and inferential reasoning: building memories to navigate future decisions

    PubMed Central

    Zeithamova, Dagmar; Schlichting, Margaret L.; Preston, Alison R.

    2012-01-01

    A critical aspect of inferential reasoning is the ability to form relationships between items or events that were not experienced together. This review considers different perspectives on the role of the hippocampus in successful inferential reasoning during both memory encoding and retrieval. Intuitively, inference can be thought of as a logical process by which elements of individual existing memories are retrieved and recombined to answer novel questions. Such flexible retrieval is sub-served by the hippocampus and is thought to require specialized hippocampal encoding mechanisms that discretely code events such that event elements are individually accessible from memory. In addition to retrieval-based inference, recent research has also focused on hippocampal processes that support the combination of information acquired across multiple experiences during encoding. This mechanism suggests that by recalling past events during new experiences, connections can be created between newly formed and existing memories. Such hippocampally mediated memory integration would thus underlie the formation of networks of related memories that extend beyond direct experience to anticipate future judgments about the relationships between items and events. We also discuss integrative encoding in the context of emerging evidence linking the hippocampus to the formation of schemas as well as prospective theories of hippocampal function that suggest memories are actively constructed to anticipate future decisions and actions. PMID:22470333

  15. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.

  16. Near Optimal Event-Triggered Control of Nonlinear Discrete-Time Systems Using Neurodynamic Programming.

    PubMed

    Sahoo, Avimanyu; Xu, Hao; Jagannathan, Sarangapani

    2016-09-01

    This paper presents an event-triggered near optimal control of uncertain nonlinear discrete-time systems. Event-driven neurodynamic programming (NDP) is utilized to design the control policy. A neural network (NN)-based identifier, with event-based state and input vectors, is utilized to learn the system dynamics. An actor-critic framework is used to learn the cost function and the optimal control input. The NN weights of the identifier, the critic, and the actor NNs are tuned aperiodically once every triggered instant. An adaptive event-trigger condition to decide the trigger instants is derived. Thus, a suitable number of events are generated to ensure a desired accuracy of approximation. A near optimal performance is achieved without using value and/or policy iterations. A detailed analysis of nontrivial inter-event times with an explicit formula to show the reduction in computation is also derived. The Lyapunov technique is used in conjunction with the event-trigger condition to guarantee the ultimate boundedness of the closed-loop system. The simulation results are included to verify the performance of the controller. The net result is the development of event-driven NDP.

  17. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources, and their impact on treatment effects and costs, often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. Including queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory, giving analysts and decision-makers an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues such as modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention, including stent placement, serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
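    As a concrete companion to the tutorial's theme, here is a minimal discrete event simulation of a single server with Markovian arrivals and service (M/M/1), for which queuing theory gives the closed-form mean wait rho/(mu - lambda) to check against. The rates and horizon are arbitrary illustration values, not from the tutorial:

```python
import heapq
import random
from collections import deque

random.seed(7)

LAM, MU = 0.8, 1.0           # arrival and service rates (illustrative)
T_END = 10_000.0

events = [(random.expovariate(LAM), "arrival")]
queue, busy, waits = deque(), False, []

while events:
    t, kind = heapq.heappop(events)
    if t > T_END:
        break
    if kind == "arrival":
        heapq.heappush(events, (t + random.expovariate(LAM), "arrival"))
        if busy:
            queue.append(t)          # patient joins the waiting line
        else:
            busy = True
            waits.append(0.0)
            heapq.heappush(events, (t + random.expovariate(MU), "departure"))
    else:                            # departure: start the next patient, if any
        if queue:
            waits.append(t - queue.popleft())
            heapq.heappush(events, (t + random.expovariate(MU), "departure"))
        else:
            busy = False

mean_wait = sum(waits) / len(waits)
print(round(mean_wait, 2))  # theory: rho / (mu - lambda) = 4.0 for these rates
```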

  18. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, design can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  19. Self-Organisation and Intermittent Coherent Oscillations in the EXTRAP T2 Reversed Field Pinch

    NASA Astrophysics Data System (ADS)

    Cecconello, M.; Malmberg, J.-A.; Sallander, E.; Drake, J. R.

    Many reversed-field pinch (RFP) experiments exhibit a coherent oscillatory behaviour that is characteristic of discrete dynamo events and is associated with intermittent current-profile self-organisation phenomena. However, in the vast majority of the discharges in the resistive-shell RFP experiment EXTRAP T2, the dynamo activity does not show global, coherent oscillatory behaviour. The internally resonant tearing modes are phase-aligned and wall-locked, resulting in a large localised magnetic perturbation. Equilibrium and plasma parameters exhibit a level of high-frequency fluctuation, but their average values are quasi-steady. For some discharges, however, the equilibrium parameters exhibit the oscillatory behaviour characteristic of discrete dynamo events. For these discharges, the trend observed in the tearing mode spectra, associated with the onset of the discrete relaxation event behaviour, is a relatively higher amplitude of m = 0 mode activity and a relatively lower amplitude of m = 1 mode activity compared with their average values. Global plasma parameters and model profile calculations for sample discharges representing the two types of relaxation dynamics are presented.

  20. Diagnosis of delay-deadline failures in real time discrete event models.

    PubMed

    Biswas, Santosh; Sarkar, Dipankar; Bhowal, Prodip; Mukhopadhyay, Siddhartha

    2007-10-01

    In this paper, a method for fault detection and diagnosis (FDD) of real time systems has been developed. A modeling framework termed the real time discrete event system (RTDES) model is presented, and a mechanism for FDD of such models has been developed. The use of the RTDES framework for FDD is an extension of the work reported in the discrete event system (DES) literature, which is based on finite state machines (FSM). FDD of RTDES models is suited to real time systems because of their capability of representing timing faults leading to failures, in terms of erroneous delays and deadlines, which FSM-based models cannot address. The concept of measurement restriction of variables is introduced for RTDES, and the consequent equivalence of states and indistinguishability of transitions have been characterized. Faults are modeled in terms of an unmeasurable condition variable in the state map. Diagnosability is defined, and a procedure for constructing a diagnoser is provided. A checkable property of the diagnoser is shown to be a necessary and sufficient condition for diagnosability. The methodology is illustrated with an example of a hydraulic cylinder.

  1. A Computational Model of Event Segmentation from Perceptual Prediction

    ERIC Educational Resources Information Center

    Reynolds, Jeremy R.; Zacks, Jeffrey M.; Braver, Todd S.

    2007-01-01

    People tend to perceive ongoing continuous activity as series of discrete events. This partitioning of continuous activity may occur, in part, because events correspond to dynamic patterns that have recurred across different contexts. Recurring patterns may lead to reliable sequential dependencies in observers' experiences, which then can be used…

  2. Nature of Reduced Carbon in Martian Meteorites

    NASA Technical Reports Server (NTRS)

    Gibson, Everett K., Jr.; McKay, D. S.; Thomas-Keprta, K. L.; Clemett, S. J.; White, L. M.

    2012-01-01

    Martian meteorites provide important information on the nature of reduced carbon components present on Mars throughout its history. The first in situ analyses for carbon on the surface of Mars by the Viking landers yielded disappointing results. With the recognition of Martian meteorites on Earth, investigations have shown that carbon-bearing phases exist on Mars. Studies have shown the presence of reduced carbon, carbonates, and inferred graphitic carbon phases. Samples range in age from the first approximately 4 Ga of Mars history [e.g. ALH84001] to nakhlites with a crystallization age of 1.3 Ga [e.g. Nakhla], with aqueous alteration processes occurring 0.5-0.7 Ga after crystallization. Shergottites have formation ages of around 165-500 Ma, with younger aqueous alteration events. Only a limited number of the Martian meteorites show no evidence of significant terrestrial alteration. Selected areas within ALH84001, Nakhla, Yamato 000593, and possibly Tissint are suitable for study of their indigenous reduced-carbon-bearing phases. Nakhla possesses discrete, well-defined carbonaceous phases present within iddingsite alteration zones. Based upon both isotopic measurements and analysis of Nakhla's organic phases, the presence of pre-terrestrial organics is now recognized. The reduced carbon-bearing phases appear to have been deposited during pre-terrestrial aqueous alteration events that produced clays. In addition, the microcrystalline layers of Nakhla's iddingsite contain discrete units of salt crystals suggestive of evaporation processes. While we can only speculate on the origin of these unique carbonaceous structures, we note that such observations may allow us to understand the role of Martian carbon as seen in the Martian meteorites, with obvious implications for astrobiology and the pre-biotic evolution of Mars. In any case, our observations strongly suggest that reduced organic carbon exists as micrometer-size, discrete structures on Mars associated with clay and salt minerals. The Mars Science Laboratory's investigators should be aware of reduced organic carbon components within clay-bearing phases.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkins, Casey J.; Brigantic, Robert T.; Keating, Douglas H.

    There is a need to develop and demonstrate technical approaches for verifying potential future agreements to limit and reduce total warhead stockpiles. To facilitate this aim, warhead monitoring systems employ both concepts of operations (CONOPS) and technologies. A systems evaluation approach can be used to assess the relative performance of CONOPS and technologies in their ability to achieve monitoring system objectives, which include: 1) confidence that a treaty accountable item (TAI) initialized by the monitoring system is as declared; 2) confidence that there is no undetected diversion from the monitoring system; and 3) confidence that a TAI is dismantled as declared. Although there are many quantitative methods that can be used to assess system performance for the above objectives, this paper focuses on a simulation perspective, primarily for its ability to support analysis of the probabilities that are used to define operating characteristics of CONOPS and technologies. This paper describes a discrete event simulation (DES) model comprised of three major sub-models: TAI lifecycle flow, monitoring activities, and declaration behavior. The DES model seeks to capture all processes and decision points associated with the progression of virtual TAIs, with notional characteristics, through the monitoring system from initialization through dismantlement. The simulation updates TAI progression (i.e., whether the generated test objects are accepted or rejected at the appropriate points) all the way through dismantlement. Evaluation of TAI lifecycles primarily serves to assess how the order, frequency, and combination of functions in the CONOPS affect system performance as a whole. It is important, however, to note that discrete event simulation is also capable (at a basic level) of addressing vulnerabilities in the CONOPS and interdependencies between individual functions as well.
    This approach is beneficial because it does not rely on complex mathematical models, but instead attempts to recreate the real-world system as a decision- and event-driven simulation. Finally, because the simulation addresses warhead confirmation, chain of custody, and warhead dismantlement in a modular fashion, a discrete-event model could be easily adapted to multiple CONOPS for the exploration of a large number of “what if” scenarios.

  4. Computer modeling of lung cancer diagnosis-to-treatment process

    PubMed Central

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick

    2015-01-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure for deriving closed formulas evaluating diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
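    The Markov chain style of analysis can be illustrated with a toy chain. The stage names and weekly transition probabilities below are invented for illustration (not the paper's calibrated values); the closed formula for the expected time to absorption follows from first-step analysis:

```python
# Hypothetical care stages and weekly self/forward transition probabilities.
P = {
    "referral":  {"referral": 0.2, "diagnosis": 0.8},
    "diagnosis": {"diagnosis": 0.3, "staging": 0.7},
    "staging":   {"staging": 0.4, "treatment": 0.6},
    "treatment": {"treatment": 1.0},   # absorbing state
}

def expected_weeks(state: str) -> float:
    """Expected number of transitions until absorption, by first-step analysis:
    E[T] = 1 + stay*E[T] + sum_next p*E[T_next], solved for E[T]."""
    if P[state].get(state) == 1.0:
        return 0.0                     # absorbing state
    stay = P[state].get(state, 0.0)
    onward = sum(p * expected_weeks(s) for s, p in P[state].items() if s != state)
    return (1.0 + onward) / (1.0 - stay)

print(round(expected_weeks("referral"), 2))  # 4.35 weeks for these toy numbers
```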

  5. The discrete Fourier transform algorithm for determining decay constants—Implementation using a field programmable gate array

    NASA Astrophysics Data System (ADS)

    Bostrom, G.; Atkinson, D.; Rice, A.

    2015-04-01

    Cavity ringdown spectroscopy (CRDS) uses the exponential decay constant of light exiting a high-finesse resonance cavity to determine analyte concentration, typically via absorption. We present a high-throughput data acquisition system that determines the decay constant in near real time using the discrete Fourier transform algorithm on a field programmable gate array (FPGA). A commercially available, high-speed, high-resolution, analog-to-digital converter evaluation board system is used as the platform for the system, after minor hardware and software modifications. The system outputs decay constants at a maximum rate of 4.4 kHz using an 8192-point fast Fourier transform by processing the intensity decay signal between ringdown events. We present the details of the system, including the modifications required to adapt the evaluation board to accurately process the exponential waveform. We also demonstrate the performance of the system, both stand-alone and incorporated into our existing CRDS system. Details of FPGA, microcontroller, and circuitry modifications are provided in the Appendix, and computer code is available upon request from the authors.
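    The underlying estimate can be sketched offline. This is not the FPGA implementation; it is a sketch under the continuous-transform approximation Y(w) ≈ A/(1/tau + i·w) for a sampled exponential, from which the first DFT bin alone gives tau = -Im(Y1)/(w1·Re(Y1)), valid when the decay dies out well within the record:

```python
import cmath
import math

def decay_constant_dft(y, dt):
    """Estimate tau of y[n] ~ A*exp(-n*dt/tau) from the first DFT bin,
    using the continuous-transform approximation Y(w) ~ A/(1/tau + i*w)."""
    n = len(y)
    w1 = 2.0 * math.pi / (n * dt)                       # first bin frequency
    y1 = sum(v * cmath.exp(-2j * math.pi * k / n) for k, v in enumerate(y))
    return -y1.imag / (w1 * y1.real)

dt, tau = 1.0, 200.0
ring = [3.0 * math.exp(-k * dt / tau) for k in range(8192)]  # synthetic ringdown
est = decay_constant_dft(ring, dt)
print(round(est, 1))  # close to the true value of 200
```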

  6. 49 CFR Appendix B to Part 242 - Procedures for Submission and Approval of Conductor Certification Programs

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are voluntary or mandatory. Time and circumstances have the capacity to diminish both abstract knowledge and the proper application of that knowledge to discrete events. Time and circumstances also have.... In formulating how it will use the discretion being afforded, each railroad must design its program...

  7. 49 CFR Appendix B to Part 242 - Procedures for Submission and Approval of Conductor Certification Programs

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are voluntary or mandatory. Time and circumstances have the capacity to diminish both abstract knowledge and the proper application of that knowledge to discrete events. Time and circumstances also have.... In formulating how it will use the discretion being afforded, each railroad must design its program...

  8. 49 CFR Appendix B to Part 242 - Procedures for Submission and Approval of Conductor Certification Programs

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are voluntary or mandatory. Time and circumstances have the capacity to diminish both abstract knowledge and the proper application of that knowledge to discrete events. Time and circumstances also have.... In formulating how it will use the discretion being afforded, each railroad must design its program...

  9. Taxometric Investigation of PTSD: Data from Two Nationally Representative Samples

    ERIC Educational Resources Information Center

    Broman-Fulks, Joshua J.; Ruggiero, Kenneth J.; Green, Bradley A.; Kilpatrick, Dean G.; Danielson, Carla Kmett; Resnick, Heidi S.; Saunders, Benjamin E.

    2006-01-01

    Current psychiatric nosology depicts posttraumatic stress disorder (PTSD) as a discrete diagnostic category. However, only one study has examined the latent structure of PTSD, and this study suggested that PTSD may be more accurately conceptualized as an extreme reaction to traumatic life events rather than a discrete clinical syndrome. To build…

  10. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
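    The quantization idea itself is simple to sketch: transmit a state update only when the state has moved a full quantum away from the last transmitted value, rather than at every integration step. The signal and quantum size below are arbitrary illustration choices, not from the DEVS/HLA environment:

```python
import math

def quantized_updates(samples, quantum):
    """Return the (index, value) pairs transmitted under a quantum-crossing
    policy: send only when the state is a full quantum away from the last
    transmitted value."""
    sent = [(0, samples[0])]
    for i, v in enumerate(samples[1:], start=1):
        if abs(v - sent[-1][1]) >= quantum:
            sent.append((i, v))
    return sent

# A slow sine sampled at 1000 points needs far fewer transmitted updates.
sig = [math.sin(2 * math.pi * i / 1000) for i in range(1000)]
msgs = quantized_updates(sig, quantum=0.1)
print(len(msgs))  # roughly 40 messages instead of 1000 state updates
```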

  11. Initial Evaluation of Signal-Based Bayesian Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Russell, S.

    2016-12-01

    We present SIGVISA (Signal-based Vertically Integrated Seismic Analysis), a next-generation system for global seismic monitoring through Bayesian inference on seismic signals. Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a network of stations. We report results from an evaluation of SIGVISA monitoring the western United States for a two-week period following the magnitude 6.0 event in Wells, NV in February 2008. During this period, SIGVISA detects more than twice as many events as NETVISA, and three times as many as SEL3, while operating at the same precision; at lower precisions it detects up to five times as many events as SEL3. At the same time, signal-based monitoring reduces mean location errors by a factor of four relative to detection-based systems. We provide evidence that, given only IMS data, SIGVISA detects events that are missed by regional monitoring networks, indicating that our evaluations may even underestimate its performance. Finally, SIGVISA matches or exceeds the detection rates of existing systems for de novo events - events with no nearby historical seismicity - and detects through automated processing a number of such events missed even by the human analysts generating the LEB.

  12. DISCRETE COMPOUND POISSON PROCESSES AND TABLES OF THE GEOMETRIC POISSON DISTRIBUTION.

    DTIC Science & Technology

    A concise summary of the salient properties of discrete compound Poisson processes, with emphasis on comparing the geometric and logarithmic Poisson processes. Tables of the geometric Poisson distribution are given for 176 sets of parameter values. New discrete compound Poisson processes are also introduced. These processes have properties that are particularly relevant when the summation of several different Poisson processes is to be analyzed.
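    A geometric Poisson (Pólya-Aeppli) variate is a compound Poisson sum of i.i.d. geometric jumps, which is straightforward to simulate and sanity-check against its mean λ/θ. The parameter values below are arbitrary illustration choices, not entries from the report's tables:

```python
import math
import random

random.seed(3)

def geometric_poisson(lam, theta):
    """One draw from the geometric Poisson (Polya-Aeppli) distribution:
    a Poisson(lam) number of i.i.d. geometric jumps on {1, 2, ...}."""
    # Poisson count via Knuth's method
    n, p, threshold = 0, 1.0, math.exp(-lam)
    while True:
        p *= random.random()
        if p <= threshold:
            break
        n += 1
    total = 0
    for _ in range(n):
        jump = 1
        while random.random() > theta:  # geometric with success prob theta
            jump += 1
        total += jump
    return total

lam, theta = 2.0, 0.4
draws = [geometric_poisson(lam, theta) for _ in range(20000)]
mean = sum(draws) / len(draws)
print(round(mean, 2))  # theory: mean = lam / theta = 5.0
```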

  13. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  14. Discrete-Event Simulation Models of Plasmodium falciparum Malaria

    PubMed Central

    McKenzie, F. Ellis; Wong, Roger C.; Bossert, William H.

    2008-01-01

    We develop discrete-event simulation models using a single “timeline” variable to represent the Plasmodium falciparum lifecycle in individual hosts and vectors within interacting host and vector populations. Where they are comparable, our conclusions regarding the relative importance of vector mortality and the durations of host immunity and parasite development are congruent with those of classic differential-equation models of malaria epidemiology. However, our results also imply that in regions with intense perennial transmission, the influence of mosquito mortality on malaria prevalence in humans may be rivaled by that of the duration of host infectivity. PMID:18668185

  15. Graph-theoretic analysis of discrete-phase-space states for condition change detection and quantification of information

    DOEpatents

    Hively, Lee M.

    2014-09-16

    Data collected from devices and from the human body may be used to forewarn of critical events, such as machine or structural failure, or events such as stroke detected from brain or heart wave data. By monitoring the data, and determining what values are indicative of a failure forewarning, one can provide adequate notice of the impending failure in order to take preventive measures. This disclosure teaches a computer-based method to convert dynamical numeric data representing physical objects (unstructured data) into discrete-phase-space states, and hence into a graph (structured data), for extraction of condition change.

  16. Control of discrete event systems modeled as hierarchical state machines

    NASA Technical Reports Server (NTRS)

    Brave, Y.; Heymann, M.

    1991-01-01

    The authors examine a class of discrete event systems (DESs) modeled as asynchronous hierarchical state machines (AHSMs). For this class of DESs, they provide an efficient method for testing reachability, which is an essential step in many control synthesis procedures. This method utilizes the asynchronous nature and hierarchical structure of AHSMs, thereby illustrating the advantage of the AHSM representation as compared with its equivalent (flat) state machine representation. An application of the method is presented where an online minimally restrictive solution is proposed for the problem of maintaining a controlled AHSM within prescribed legal bounds.
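
    Reachability testing of the kind described above reduces, for a flat state machine, to a graph search over the transition relation. A minimal sketch (the AHSM-specific hierarchical shortcuts are not shown, and the machine below is illustrative, not from the paper):

```python
from collections import deque

def reachable(transitions, start):
    """Return the set of states reachable from `start` in a flat state machine.
    transitions: dict mapping state -> iterable of (event, next_state) pairs."""
    seen = {start}
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        for _event, t in transitions.get(s, ()):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

# Illustrative machine: 'bad' is unreachable from 'idle', so a supervisor
# starting in 'idle' need not guard against it.
fsm = {
    "idle":    [("start", "busy")],
    "busy":    [("done", "idle"), ("fault", "error")],
    "error":   [("reset", "idle")],
    "offline": [("power", "bad")],
}
print(reachable(fsm, "idle"))  # {'idle', 'busy', 'error'}
```

    The AHSM representation avoids materializing this flat graph, which is what makes the paper's method more efficient than the equivalent flat-machine search.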

  17. Networked event-triggered control: an introduction and research trends

    NASA Astrophysics Data System (ADS)

    Mahmoud, Magdi S.; Sabih, Muhammad

    2014-11-01

    A physical system can be studied as either a continuous-time or a discrete-time system, depending upon the control objectives. Discrete-time control systems can be further classified into two categories based on sampling: (1) time-triggered control systems and (2) event-triggered control systems. Time-triggered systems sample states and calculate controls at every sampling instant in a periodic fashion, even when the states and the calculated control change little; this wastes computation and, in networked systems, the transmission of measurement and control signals generates unnecessary network traffic. Event-triggered systems, on the other hand, have the potential to reduce the communication burden in addition to reducing the computation of control signals. This paper provides an up-to-date survey of event-triggered methods for control systems and highlights potential research directions.

  18. Non-fragile ℓ2-ℓ∞ control for discrete-time stochastic nonlinear systems under event-triggered protocols

    NASA Astrophysics Data System (ADS)

    Sun, Ying; Ding, Derui; Zhang, Sunjie; Wei, Guoliang; Liu, Hongjian

    2018-07-01

    In this paper, the non-fragile ℓ2-ℓ∞ control problem is investigated for a class of discrete-time stochastic nonlinear systems under event-triggered communication protocols, which determine whether the measurement output should be transmitted to the controller or not. The main purpose of the addressed problem is to design an event-based output feedback controller, subject to gain variations, guaranteeing the prescribed disturbance attenuation level described by the ℓ2-ℓ∞ performance index. By utilizing Lyapunov stability theory combined with the S-procedure, a sufficient condition is established to guarantee both the exponential mean-square stability and the ℓ2-ℓ∞ performance of the closed-loop system. In addition, with the help of orthogonal decomposition, the desired controller parameter is obtained in terms of the solution to certain linear matrix inequalities. Finally, a simulation example is exploited to demonstrate the effectiveness of the proposed event-based controller design scheme.

  19. Hybrid stochastic simplifications for multiscale gene networks

    PubMed Central

    Crudu, Alina; Debussche, Arnaud; Radulescu, Ovidiu

    2009-01-01

    Background Stochastic simulation of gene networks by Markov processes has important applications in molecular biology. The complexity of exact simulation algorithms scales with the number of discrete jumps to be performed. Approximate schemes reduce the computational time by reducing the number of simulated discrete events. Also, answering important questions about the relation between network topology and intrinsic noise generation and propagation should be based on general mathematical results. These general results are difficult to obtain for exact models. Results We propose a unified framework for hybrid simplifications of Markov models of multiscale stochastic gene networks dynamics. We discuss several possible hybrid simplifications, and provide algorithms to obtain them from pure jump processes. In hybrid simplifications, some components are discrete and evolve by jumps, while other components are continuous. Hybrid simplifications are obtained by partial Kramers-Moyal expansion [1-3] which is equivalent to the application of the central limit theorem to a sub-model. By averaging and variable aggregation we drastically reduce simulation time and eliminate non-critical reactions. Hybrid and averaged simplifications can be used for more effective simulation algorithms and for obtaining general design principles relating noise to topology and time scales. The simplified models reproduce with good accuracy the stochastic properties of the gene networks, including waiting times in intermittence phenomena, fluctuation amplitudes and stationary distributions. The methods are illustrated on several gene network examples. Conclusion Hybrid simplifications can be used for onion-like (multi-layered) approaches to multi-scale biochemical systems, in which various descriptions are used at various scales. Sets of discrete and continuous variables are treated with different methods and are coupled together in a physically justified approach. PMID:19735554

  20. 40 CFR 1033.535 - Adjusting emission levels to account for infrequently regenerating aftertreatment devices.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... aftertreatment technology with infrequent regeneration events that occur during testing. See paragraph (e) of... adjust discrete-mode testing. For this section, “regeneration” means an intended event during which... section, “infrequent” refers to regeneration events that are expected to occur on average less than once...

  1. 40 CFR 1033.535 - Adjusting emission levels to account for infrequently regenerating aftertreatment devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... aftertreatment technology with infrequent regeneration events that occur during testing. See paragraph (e) of... adjust discrete-mode testing. For this section, “regeneration” means an intended event during which... section, “infrequent” refers to regeneration events that are expected to occur on average less than once...

  2. 40 CFR 1033.535 - Adjusting emission levels to account for infrequently regenerating aftertreatment devices.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... aftertreatment technology with infrequent regeneration events that occur during testing. See paragraph (e) of... adjust discrete-mode testing. For this section, “regeneration” means an intended event during which... section, “infrequent” refers to regeneration events that are expected to occur on average less than once...

  3. SimEngine v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le, Hai D.

    2017-03-02

    SimEngine provides the core functionalities and components that are key to the development of discrete event simulation tools. These include events, activities, event queues, random number generators, and basic result tracking classes. SimEngine was designed for high performance, integrates seamlessly into any Microsoft .Net development environment, and provides a flexible API for simulation developers.
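
    The components the record enumerates (events, a clock, a time-ordered event queue) form the core of any discrete-event engine. A minimal sketch of that core, assuming nothing about SimEngine's actual .NET API; all names below are illustrative:

```python
import heapq

class Simulator:
    """Minimal discrete-event engine: a clock plus a time-ordered event queue."""
    def __init__(self):
        self.now = 0.0
        self._queue = []   # heap of (time, seq, action)
        self._seq = 0      # tie-breaker so simultaneous events keep FIFO order

    def schedule(self, delay, action):
        """Schedule `action` to fire `delay` time units from now."""
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self, until=float("inf")):
        """Pop events in time order, advancing the clock to each event's time."""
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

# Usage: two jobs finishing at different times fire in time order, not
# in the order they were scheduled.
sim = Simulator()
log = []
def finish(name):
    log.append((round(sim.now, 2), name))
sim.schedule(1.5, lambda: finish("A"))
sim.schedule(0.7, lambda: finish("B"))
sim.run()
print(log)  # [(0.7, 'B'), (1.5, 'A')]
```

    Real engines such as SimEngine add activities, random number streams, and result tracking on top of this loop.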

  4. Growing degree day calculator

    USDA-ARS?s Scientific Manuscript database

    Degree-day benchmarks indicate discrete biological events in the development of insect pests. For the Sparganothis fruitworm, we have isolated all key development events and linked them to degree-day accumulations. These degree-day accumulations can greatly improve treatment timings for cranberry IP...
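
    Degree-day accumulation is conventionally computed with the averaging method, GDD = max(0, (Tmax + Tmin)/2 − Tbase), summed over days. A sketch of that calculation; the base temperature and the week of temperatures below are illustrative, not Sparganothis-specific values:

```python
def daily_gdd(t_max, t_min, t_base):
    """Growing degree days for one day, averaging method:
    GDD = max(0, (Tmax + Tmin)/2 - Tbase)."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def accumulate_gdd(daily_temps, t_base):
    """Running GDD total over a season; development benchmarks (e.g. insect
    life-stage events) are read off this accumulated total."""
    total = 0.0
    out = []
    for t_max, t_min in daily_temps:
        total += daily_gdd(t_max, t_min, t_base)
        out.append(total)
    return out

# Illustrative week of (Tmax, Tmin) pairs in °C with a base of 10 °C:
temps = [(22, 10), (25, 13), (18, 8), (20, 12), (27, 15), (24, 14), (21, 9)]
print(accumulate_gdd(temps, t_base=10.0))  # running totals, ending at 49.0
```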

  5. Event-driven management algorithm of an Engineering documents circulation system

    NASA Astrophysics Data System (ADS)

    Kuzenkov, V.; Zebzeev, A.; Gromakov, E.

    2015-04-01

    A development methodology for an engineering document circulation system in a design company is reviewed. Discrete event-driven automata models for describing project-management algorithms are proposed, along with the use of Petri nets for the dynamic design of projects.

  6. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    PubMed

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a straightforward way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.

  7. Event-Based Control Strategy for Mobile Robots in Wireless Environments

    PubMed Central

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-01-01

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a straightforward way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy. PMID:26633412

  8. The surface of Mars: An unusual laboratory that preserves a record of catastrophic and unusual events

    USGS Publications Warehouse

    Chapman, M.G.

    2009-01-01

    Catastrophic and unusual events on Earth such as bolide impacts, megafloods, supereruptions, flood volcanism, and subice volcanism may have devastating effects when they occur. Although these processes have unique characteristics and form distinctive features and deposits, we have difficulties identifying them and measuring the magnitude of their effects. Our difficulties with interpreting these processes and identifying their consequences are understandable considering their infrequency on Earth, combined with the low preservation potential of their deposits in the terrestrial rock record. Although we know these events do happen, they are infrequent enough that their deposits are poorly preserved on the geologically active face of the Earth, where erosion, volcanism, and tectonism constantly change the surface. Unlike on Earth, catastrophic and unusual features are well preserved on Mars because of the slow modification of its surface. Significant precipitation has not occurred on Mars for billions of years, and there appear to be no discrete crustal plates to have undergone subduction and destruction. Therefore the ancient surface of Mars preserves geologic features and deposits that result from these extraordinary events. Also, unlike the other planets, Mars is the most similar to our own, having an atmosphere, surface ice, volcanism, and evidence of once-flowing water. So although our understanding of the precursors, processes, and possible biological effects of catastrophic and unusual processes is limited on Earth, some of these mysteries may be better understood through investigating the surface of Mars. © 2009 The Geological Society of America.

  9. Implicit Learning of Predictive Relationships in Three-element Visual Sequences by Young and Old Adults

    PubMed Central

    Howard, James H.; Howard, Darlene V.; Dennis, Nancy A.; Kelly, Andrew J.

    2008-01-01

    Knowledge of sequential relationships enables future events to be anticipated and processed efficiently. Research with the serial reaction time task (SRTT) has shown that sequence learning often occurs implicitly without effort or awareness. Here we report four experiments that use a triplet-learning task (TLT) to investigate sequence learning in young and older adults. In the TLT people respond only to the last target event in a series of discrete, three-event sequences or triplets. Target predictability is manipulated by varying the triplet frequency (joint probability) and/or the statistical relationships (conditional probabilities) among events within the triplets. Results revealed that both groups learned, though older adults showed less learning of both joint and conditional probabilities. Young people used the statistical information in both cues, but older adults relied primarily on information in the second cue alone. We conclude that the TLT complements and extends the SRTT and other tasks by offering flexibility in the kinds of sequential statistical regularities that may be studied as well as by controlling event timing and eliminating motor response sequencing. PMID:18763897

  10. A study of discrete control signal fault conditions in the shuttle DPS

    NASA Technical Reports Server (NTRS)

    Reddi, S. S.; Retter, C. T.

    1976-01-01

    An analysis of the effects of discrete failures on the data processing subsystem is presented. A functional description of each discrete together with a list of software modules that use this discrete are included. A qualitative description of the consequences that may ensue due to discrete failures is given followed by a probabilistic reliability analysis of the data processing subsystem. Based on the investigation conducted, recommendations were made to improve the reliability of the subsystem.

  11. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
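
    Random variate generation, one of the techniques mentioned above, is most often done by the inverse-transform method: if U is Uniform(0,1), then −ln(1−U)/λ follows an Exponential(λ) distribution, the usual model for inter-event times in discrete-event simulation. A sketch (the seed and sample count are arbitrary):

```python
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform sampling of an Exponential(rate) variate:
    invert the CDF F(x) = 1 - exp(-rate * x) at a uniform draw u."""
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

# Sanity check: the sample mean should approach 1/rate.
random.seed(42)
samples = [exponential_variate(rate=2.0) for _ in range(100000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))  # close to 1/rate = 0.5
```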

  12. Behavior coordination of mobile robotics using supervisory control of fuzzy discrete event systems.

    PubMed

    Jayasiri, Awantha; Mann, George K I; Gosine, Raymond G

    2011-10-01

    In order to incorporate the uncertainty and impreciseness present in real-world event-driven asynchronous systems, fuzzy discrete event systems (DESs) (FDESs) have been proposed as an extension to crisp DESs. In this paper, first, we propose an extension to the supervisory control theory of FDES by redefining fuzzy controllable and uncontrollable events. The proposed supervisor is capable of enabling feasible uncontrollable and controllable events with different possibilities. Then, the extended supervisory control framework of FDES is employed to model and control several navigational tasks of a mobile robot using the behavior-based approach. The robot has limited sensory capabilities, and the navigations have been performed in several unmodeled environments. The reactive and deliberative behaviors of the mobile robotic system are weighted through fuzzy uncontrollable and controllable events, respectively. By employing the proposed supervisory controller, a command-fusion-type behavior coordination is achieved. The observability of fuzzy events is incorporated to represent the sensory imprecision. As a systematic analysis of the system, a fuzzy-state-based controllability measure is introduced. The approach is implemented in both simulation and real time. A performance evaluation is performed to quantitatively estimate the validity of the proposed approach over its counterparts.

  13. Stochastic simulation tools and continuum models for describing two-dimensional collective cell spreading with universal growth functions

    NASA Astrophysics Data System (ADS)

    Jin, Wang; Penington, Catherine J.; McCue, Scott W.; Simpson, Matthew J.

    2016-10-01

    Two-dimensional collective cell migration assays are used to study cancer and tissue repair. These assays involve combined cell migration and cell proliferation processes, both of which are modulated by cell-to-cell crowding. Previous discrete models of collective cell migration assays involve a nearest-neighbour proliferation mechanism where crowding effects are incorporated by aborting potential proliferation events if the randomly chosen target site is occupied. There are two limitations of this traditional approach: (i) it seems unreasonable to abort a potential proliferation event based on the occupancy of a single, randomly chosen target site; and, (ii) the continuum limit description of this mechanism leads to the standard logistic growth function, but some experimental evidence suggests that cells do not always proliferate logistically. Motivated by these observations, we introduce a generalised proliferation mechanism which allows non-nearest neighbour proliferation events to take place over a template of r ≥ 1 concentric rings of lattice sites. Further, the decision to abort potential proliferation events is made using a crowding function, f(C), which accounts for the density of agents within a group of sites rather than dealing with the occupancy of a single randomly chosen site. Analysing the continuum limit description of the stochastic model shows that the standard logistic source term, λ C(1-C), where λ is the proliferation rate, is generalised to a universal growth function, λ C f(C). Comparing the solution of the continuum description with averaged simulation data indicates that the continuum model performs well for many choices of f(C) and r. For nonlinear f(C), the quality of the continuum-discrete match increases with r.
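
    The continuum limit described above can be sketched by integrating dC/dt = λ C f(C) directly; choosing f(C) = 1 − C recovers the standard logistic source term, while any other crowding function gives a different growth curve. The step size, rate, and alternative f(C) below are illustrative:

```python
def grow(f, lam=1.0, c0=0.01, dt=0.01, steps=1000):
    """Forward-Euler integration of the universal growth law dC/dt = λ C f(C),
    where f is the crowding function and C the (scaled) agent density."""
    c = c0
    traj = [c]
    for _ in range(steps):
        c += lam * c * f(c) * dt
        traj.append(c)
    return traj

logistic = grow(lambda c: 1.0 - c)            # f(C) = 1 - C: classical logistic
nonlinear = grow(lambda c: (1.0 - c) ** 2)    # an illustrative alternative f(C)
print(round(logistic[-1], 3), round(nonlinear[-1], 3))
```

    The nonlinear crowding function saturates more slowly, which is the kind of departure from logistic growth the experimental evidence cited above motivates.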

  14. Stochastic simulation tools and continuum models for describing two-dimensional collective cell spreading with universal growth functions.

    PubMed

    Jin, Wang; Penington, Catherine J; McCue, Scott W; Simpson, Matthew J

    2016-10-07

    Two-dimensional collective cell migration assays are used to study cancer and tissue repair. These assays involve combined cell migration and cell proliferation processes, both of which are modulated by cell-to-cell crowding. Previous discrete models of collective cell migration assays involve a nearest-neighbour proliferation mechanism where crowding effects are incorporated by aborting potential proliferation events if the randomly chosen target site is occupied. There are two limitations of this traditional approach: (i) it seems unreasonable to abort a potential proliferation event based on the occupancy of a single, randomly chosen target site; and, (ii) the continuum limit description of this mechanism leads to the standard logistic growth function, but some experimental evidence suggests that cells do not always proliferate logistically. Motivated by these observations, we introduce a generalised proliferation mechanism which allows non-nearest neighbour proliferation events to take place over a template of r ≥ 1 concentric rings of lattice sites. Further, the decision to abort potential proliferation events is made using a crowding function, f(C), which accounts for the density of agents within a group of sites rather than dealing with the occupancy of a single randomly chosen site. Analysing the continuum limit description of the stochastic model shows that the standard logistic source term, λC(1-C), where λ is the proliferation rate, is generalised to a universal growth function, λC f(C). Comparing the solution of the continuum description with averaged simulation data indicates that the continuum model performs well for many choices of f(C) and r. For nonlinear f(C), the quality of the continuum-discrete match increases with r.

  15. An Intensive Observation of Calving at Helheim Glacier, East Greenland

    NASA Technical Reports Server (NTRS)

    Holland, David M.; Voytenko, Denis; Christianson, Knut; Dixon, Timothy H.; Mei, M. Jeffrey; Parizek, Byron R.; Vankova, Irena; Walker, Ryan T.; Walter, Jacob I.; Nicholls, Keith

    2016-01-01

    Calving of glacial ice into the ocean from the Greenland Ice Sheet is an important component of global sea-level rise. The calving process itself is relatively poorly observed, understood, and modeled; as such, it represents a bottleneck in improving future global sea-level estimates in climate models. We organized a pilot project to observe the calving process at Helheim Glacier in east Greenland in an effort to better understand it. During an intensive one-week survey, we deployed a suite of instrumentation, including a terrestrial radar interferometer, global positioning system (GPS) receivers, seismometers, tsunameters, and an automated weather station. We were fortunate to capture a calving process and to measure various glaciological, oceanographic, and atmospheric parameters before, during, and after the event. One outcome of our observations is evidence that the calving process actually consists of a number of discrete events, spread out over time, in this instance over at least two days. This time span has implications for models of the process. Realistic projections of future global sea level will depend on an accurate parametrization of calving, and we argue that more sustained observations will be required to reach this objective.

  16. Using Discrete-Event Simulation to Promote Quality Improvement and Efficiency in a Radiation Oncology Treatment Center.

    PubMed

    Famiglietti, Robin M; Norboge, Emily C; Boving, Valentine; Langabeer, James R; Buchholz, Thomas A; Mikhail, Osama

    To meet demand for radiation oncology services and ensure patient-centered, safe care, management in an academic radiation oncology department initiated quality improvement efforts using discrete-event simulation (DES). Although the long-term goal was testing and deploying solutions, the primary aim at the outset was characterizing and validating a computer simulation model of existing operations to identify targets for improvement. The adoption and validation of a DES model of processes and procedures affecting patient flow and satisfaction, employee experience, and efficiency were undertaken in 2012-2013. Multiple sources were tapped for data, including direct observation, equipment logs, timekeeping, and electronic health records. During their treatment visits, patients averaged 50.4 minutes in the treatment center, of which 38% was spent in the treatment room. Patients with appointments between 10 AM and 2 PM experienced the longest delays before entering the treatment room, and those in the clinic in the day's first and last hours, the shortest (<5 minutes). Despite being staffed for 14.5 hours daily, the clinic registered only 20% of patients after 2:30 PM. Utilization of equipment averaged 58%, and utilization of staff, 56%. The DES modeling quantified operations, identifying evidence-based targets for next-phase remediation and providing data to justify initiatives.

  17. Modeling using discrete event simulation: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--4.

    PubMed

    Karnon, Jonathan; Stahl, James; Brennan, Alan; Caro, J Jaime; Mar, Javier; Möller, Jörgen

    2012-01-01

    Discrete event simulation (DES) is a form of computer-based modeling that provides an intuitive and flexible approach to representing complex systems. It has been used in a wide range of health care applications. Most early applications involved analyses of systems with constrained resources, where the general aim was to improve the organization of delivered services. More recently, DES has increasingly been applied to evaluate specific technologies in the context of health technology assessment. The aim of this article was to provide consensus-based guidelines on the application of DES in a health care setting, covering the range of issues to which DES can be applied. The article works through the different stages of the modeling process: structural development, parameter estimation, model implementation, model analysis, and representation and reporting. For each stage, a brief description is provided, followed by consideration of issues that are of particular relevance to the application of DES in a health care setting. Each section contains a number of best practice recommendations that were iterated among the authors, as well as among the wider modeling task force. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  18. Implementing system simulation of C3 systems using autonomous objects

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1987-01-01

    The basis of all conflict recognition in simulation is a common frame of reference. Synchronous discrete-event simulation relies on fixed points in time as the basic frame of reference. Asynchronous discrete-event simulation relies on fixed points in the model space as the basic frame of reference. Neither approach provides sufficient support for autonomous objects. The use of a spatial template as a frame of reference is proposed to address these insufficiencies. The concept of a spatial template is defined and an implementation approach offered. Also discussed is the use of this approach to analyze the integration of sensor data associated with Command, Control, and Communication systems.

  19. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain, with its queues (storages, shelves) and mechanisms for ordering products, is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
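
    The dependence of storage times on queueing can be sketched with a toy FIFO shelf model. Everything below (supply, demand distribution, horizon) is an illustrative assumption, not taken from the article; the point is that the storage-time distribution, including its tail, emerges from the ordering mechanism rather than being specified directly:

```python
import random
from collections import deque

def storage_times(n_days, daily_supply, mean_daily_demand, seed=0):
    """Toy FIFO shelf: each day `daily_supply` items arrive and a random
    number of items is sold oldest-first. Returns each sold item's storage
    time in days; the tail of this distribution drives microbial risk."""
    rng = random.Random(seed)
    shelf = deque()   # arrival-day stamps, oldest at the left
    times = []
    for day in range(n_days):
        shelf.extend([day] * daily_supply)          # restock, stamped with day
        demand = rng.randint(0, 2 * mean_daily_demand)
        for _ in range(min(demand, len(shelf))):
            times.append(day - shelf.popleft())     # FIFO: oldest item sold first
    return times

ts = storage_times(n_days=365, daily_supply=10, mean_daily_demand=10)
print(max(ts), round(sum(ts) / len(ts), 2))  # tail (max) vs mean storage time
```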

  20. Distinct effect of actin cytoskeleton disassembly on exo- and endocytic events in a membrane patch of rat melanotrophs.

    PubMed

    Chowdhury, Helena H; Kreft, Marko; Zorec, Robert

    2002-12-15

    We used the cell-attached mode of the patch-clamp technique to measure discrete attofarad steps in membrane capacitance (C(m)), reporting area changes in the plasma membrane due to unitary exocytic and endocytic events. To investigate the role of the actin cytoskeleton in elementary exocytic and endocytic events, neuroendocrine rat melanotrophs were treated with Clostridium spiroforme toxin (CST), which specifically depolymerises F-actin. The average amplitude of exocytic events was not significantly different in control and in CST-treated cells. However, the amplitude of endocytic events was significantly smaller in CST-treated cells as compared to controls. The frequency of exocytic events increased 2-fold in CST-treated cells relative to controls. In control cells the average frequency of exocytic events (νexo) was lower than the frequency of endocytic events (νendo), with a ratio νexo/νendo < 1. In the toxin-treated cells, the predominant process was exocytosis, with a ratio νexo/νendo > 1. To study the coupling between the two processes, the slopes of regression lines relating νexo and νendo in a given patch of membrane were studied. The slopes of the regression lines were similar, whereas the line intercepts with the y-axis were significantly different. The increased frequency of unitary exocytic events in CST-treated cells is consistent with the view that the actin cytoskeleton acts as a barrier for exocytosis. While the disassembly of the actin cytoskeleton diminishes the size of unitary endocytic events, suggesting an important role of the actin cytoskeleton in determining the size of endocytic vesicles, the coupling between exocytosis and endocytosis in a given patch of membrane was independent of the state of the actin cytoskeleton.

  1. Distinct effect of actin cytoskeleton disassembly on exo- and endocytic events in a membrane patch of rat melanotrophs

    PubMed Central

    Chowdhury, Helena H; Kreft, Marko; Zorec, Robert

    2002-01-01

    We used the cell-attached mode of the patch-clamp technique to measure discrete attofarad steps in membrane capacitance (Cm), reporting area changes in the plasma membrane due to unitary exocytic and endocytic events. To investigate the role of the actin cytoskeleton in elementary exocytic and endocytic events, neuroendocrine rat melanotrophs were treated with Clostridium spiroforme toxin (CST), which specifically depolymerises F-actin. The average amplitude of exocytic events was not significantly different in control and CST-treated cells. However, the amplitude of endocytic events was significantly smaller in CST-treated cells than in controls. The frequency of exocytic events increased 2-fold in CST-treated cells relative to controls. In control cells the average frequency of exocytic events (νexo) was lower than the frequency of endocytic events (νendo), with a ratio νexo/νendo < 1. In the toxin-treated cells, the predominant process was exocytosis, with a ratio νexo/νendo > 1. To study the coupling between the two processes, the slopes of regression lines relating νexo and νendo in a given patch of membrane were examined. The slopes of the regression lines were similar, whereas the line intercepts with the y-axis were significantly different. The increased frequency of unitary exocytic events in CST-treated cells is consistent with the view that the actin cytoskeleton acts as a barrier to exocytosis. The disassembly of the actin cytoskeleton diminished the size of unitary endocytic events, suggesting an important role of the actin cytoskeleton in determining the size of endocytic vesicles, whereas the coupling between exocytosis and endocytosis in a given patch of membrane was independent of the state of the actin cytoskeleton. PMID:12482893

  2. Estimating Multi-Level Discrete-Time Hazard Models Using Cross-Sectional Data: Neighborhood Effects on the Onset of Adolescent Cigarette Use.

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Brennan, Robert T.; Buka, Stephen L.

    2002-01-01

    Developed procedures for constructing a retrospective person-period data set from cross-sectional data and discussed modeling strategies for estimating multilevel discrete-time event history models. Applied the methods to the analysis of cigarette use by 1,979 urban adolescents. Results show the influence of the racial composition of the…
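The core move described here, expanding each cross-sectional record into a retrospective person-period data set, can be sketched as follows (a minimal illustration with hypothetical field names, not the authors' procedure):

```python
def person_period(records, max_age):
    """Expand one row per person into one row per person-year at risk.

    records: list of (person_id, onset_age), with onset_age=None if onset
    never occurred. Returns rows (person_id, age, event) suitable for
    discrete-time (logistic) hazard modelling.
    """
    rows = []
    for pid, onset_age in records:
        last = onset_age if onset_age is not None else max_age
        for age in range(1, last + 1):
            event = 1 if (onset_age is not None and age == onset_age) else 0
            rows.append((pid, age, event))
            if event:
                break  # person leaves the risk set at onset
    return rows

# Person 1 starts smoking at age 14; person 2 is censored at max_age.
rows = person_period([(1, 14), (2, None)], max_age=16)
```

Each resulting row is a person-period observation whose binary `event` outcome can then be fed to a discrete-time hazard regression, with neighborhood-level predictors entering at a higher level of a multilevel model.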

  3. Discretized Streams: A Fault-Tolerant Model for Scalable Stream Processing

    DTIC Science & Technology

    2012-12-14

    Discretized Streams: A Fault-Tolerant Model for Scalable Stream Processing. Matei Zaharia, Tathagata Das, Haoyuan Li, Timothy Hunter, Scott Shenker, Ion… However, current programming models for distributed stream processing are relatively low-level, often leaving the user to worry about consistency of…

  4. Event-triggered fault detection for a class of discrete-time linear systems using interval observers.

    PubMed

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-05-01

    This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the fault sensitivity are improved by introducing l1 and H∞ performances. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegative conditions for the estimation error variables are presented with the aid of the slack matrix variables. This technique allows considering a more general Lyapunov function. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
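The residual-interval decision rule can be illustrated on a scalar system (a toy sketch under stated assumptions: x⁺ = a·x + u + w with |w| ≤ wbar, measurement noise |v| ≤ vbar, a ≥ 0; this is not the paper's LMI-based design):

```python
def step_interval(lo, hi, a, u, wbar):
    # Propagate the state interval for x+ = a*x + u + w, |w| <= wbar (a >= 0)
    return a * lo + u - wbar, a * hi + u + wbar

def fault_detected(y, lo, hi, vbar):
    # Residual interval [y - hi - vbar, y - lo + vbar]; flag a fault
    # exactly when zero falls outside it
    r_lo, r_hi = y - hi - vbar, y - lo + vbar
    return not (r_lo <= 0.0 <= r_hi)

lo, hi = step_interval(0.0, 0.0, 0.5, 1.0, 0.1)   # -> (0.9, 1.1)
ok = fault_detected(1.0, lo, hi, 0.05)             # measurement consistent
bad = fault_detected(2.0, lo, hi, 0.05)            # measurement inconsistent
```

An event-triggering rule would additionally gate when y is transmitted, widening the interval by the event error bound between transmissions; that is the communication saving the paper quantifies.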

  5. Seasonally active frost-dust avalanches on a north polar scarp of Mars captured by HiRISE

    USGS Publications Warehouse

    Russell, P.; Thomas, N.; Byrne, S.; Herkenhoff, K.; Fishbaugh, K.; Bridges, N.; Okubo, C.; Milazzo, M.; Daubar, I.; Hansen, C.; McEwen, A.

    2008-01-01

    North-polar temporal monitoring by the High Resolution Imaging Science Experiment (HiRISE) orbiting Mars has discovered new, dramatic examples that Mars' CO2-dominated seasonal volatile cycle is not limited to quiet deposition and sublimation of frost. In early northern martian spring, 2008, HiRISE captured several cases of CO2 frost and dust cascading down a steep, polar scarp in discrete clouds. Analysis of morphology and process reveals these events to be similar to terrestrial powder avalanches, sluffs, and falls of loose, dry snow. Potential material sources and initiating mechanisms are discussed in the context of the Martian polar spring environment and of additional, active, aeolian processes observed on the plateau above the scarp. The scarp events are identified as a trigger for mass wasting of bright, fractured layers within the basal unit, and may indirectly influence the retreat rate of steep polar scarps in competing ways. Copyright 2008 by the American Geophysical Union.

  6. Discrete Analysis of Damage and Shear Banding in Argillaceous Rocks

    NASA Astrophysics Data System (ADS)

    Dinç, Özge; Scholtès, Luc

    2018-05-01

    A discrete approach is proposed to study damage and failure processes taking place in argillaceous rocks which present a transversely isotropic behavior. More precisely, a dedicated discrete element method is utilized to provide a micromechanical description of the mechanisms involved. The purpose of the study is twofold: (1) presenting a three-dimensional discrete element model able to simulate the anisotropic macro-mechanical behavior of the Callovo-Oxfordian claystone as a particular case of argillaceous rocks; (2) studying how progressive failure develops in such material. Material anisotropy is explicitly taken into account in the numerical model through the introduction of weakness planes distributed at the interparticle scale following predefined orientation and intensity. Simulations of compression tests under plane-strain and triaxial conditions are performed to clarify the development of damage and the appearance of shear bands through micromechanical analyses. The overall mechanical behavior and shear banding patterns predicted by the numerical model are in good agreement with respect to experimental observations. Both tensile and shear microcracks emerging from the modeling also present characteristics compatible with microstructural observations. The numerical results confirm that the global failure of argillaceous rocks is well correlated with the mechanisms taking place at the local scale. Specifically, strain localization is shown to directly result from shear microcracking developing with a preferential orientation distribution related to the orientation of the shear band. In addition, localization events presenting characteristics similar to shear bands are observed from the early stages of the loading and might thus be considered as precursors of strain localization.

  7. Bayesian selection of Markov models for symbol sequences: application to microsaccadic eye movements.

    PubMed

    Bettenbühl, Mario; Rusconi, Marco; Engbert, Ralf; Holschneider, Matthias

    2012-01-01

    Complex biological dynamics often generate sequences of discrete events which can be described as a Markov process. The order of the underlying Markovian stochastic process is fundamental for characterizing statistical dependencies within sequences. As an example for this class of biological systems, we investigate the Markov order of sequences of microsaccadic eye movements from human observers. We calculate the integrated likelihood of a given sequence for various orders of the Markov process and use this in a Bayesian framework for statistical inference on the Markov order. Our analysis shows that data from most participants are best explained by a first-order Markov process. This is compatible with recent findings of a statistical coupling of subsequent microsaccade orientations. Our method might prove to be useful for a broad class of biological systems.
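The integrated (marginal) likelihood for a given Markov order has a closed form under a Dirichlet prior on the transition probabilities; a compact sketch (uniform Dirichlet prior assumed, not necessarily the authors' choice):

```python
import math
from collections import defaultdict

def log_marginal_likelihood(seq, order, alphabet, alpha=1.0):
    """Integrated likelihood of a symbol sequence under a Markov model of
    the given order, with a Dirichlet(alpha,...,alpha) prior per context."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(order, len(seq)):
        counts[tuple(seq[i - order:i])][seq[i]] += 1
    K = len(alphabet)
    logml = 0.0
    for ctx_counts in counts.values():
        n = sum(ctx_counts.values())
        logml += math.lgamma(K * alpha) - math.lgamma(K * alpha + n)
        for s in alphabet:
            logml += math.lgamma(alpha + ctx_counts[s]) - math.lgamma(alpha)
    return logml

seq = "ababababab"
ml0 = log_marginal_likelihood(seq, 0, "ab")   # zeroth order: i.i.d. symbols
ml1 = log_marginal_likelihood(seq, 1, "ab")   # first order: one-step memory
```

Selecting the order with the highest marginal likelihood (or posterior, given a prior over orders) is the Bayesian inference step; here the alternating sequence is far better explained by the first-order model, so `ml1 > ml0`.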

  8. When to use discrete event simulation (DES) for the economic evaluation of health technologies? A review and critique of the costs and benefits of DES.

    PubMed

    Karnon, Jonathan; Haji Ali Afzali, Hossein

    2014-06-01

    Modelling in economic evaluation is an unavoidable fact of life. Cohort-based state transition models are most common, though discrete event simulation (DES) is increasingly being used to implement more complex model structures. The benefits of DES relate to the greater flexibility around the implementation and population of complex models, which may provide more accurate or valid estimates of the incremental costs and benefits of alternative health technologies. The costs of DES relate to the time and expertise required to implement and review complex models, when perhaps a simpler model would suffice. The costs are not borne solely by the analyst, but also by reviewers. In particular, modelled economic evaluations are often submitted to support reimbursement decisions for new technologies, for which detailed model reviews are generally undertaken on behalf of the funding body. This paper reports the results from a review of published DES-based economic evaluations. Factors underlying the use of DES were defined, and the characteristics of applied models were considered, to inform options for assessing the potential benefits of DES in relation to each factor. Four broad factors underlying the use of DES were identified: baseline heterogeneity, continuous disease markers, time varying event rates, and the influence of prior events on subsequent event rates. If relevant, individual-level data are available, representation of the four factors is likely to improve model validity, and it is possible to assess the importance of their representation in individual cases. A thorough model performance evaluation is required to overcome the costs of DES from the users' perspective, but few of the reviewed DES models reported such a process. More generally, further direct, empirical comparisons of complex models with simpler models would better inform the benefits of DES to implement more complex models, and the circumstances in which such benefits are most likely.
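Three of the four factors, baseline heterogeneity, time-varying event rates, and the influence of prior events on subsequent rates, already force an individual-level event queue rather than a cohort state-transition model. A minimal sketch of such a DES (the rates and the rate-doubling rule are illustrative assumptions, not a calibrated model):

```python
import heapq, random

def simulate_cohort(n=1000, horizon=10.0, seed=1):
    """Individual-level DES: heterogeneous baseline rates, and each
    event raises that individual's subsequent event rate (capped)."""
    rng = random.Random(seed)
    q, events = [], 0
    for pid in range(n):
        rate = rng.uniform(0.05, 0.2)            # baseline heterogeneity
        heapq.heappush(q, (rng.expovariate(rate), pid, rate))
    while q:
        t, pid, rate = heapq.heappop(q)
        if t > horizon:
            continue                             # beyond the time horizon
        events += 1
        rate = min(rate * 2.0, 1.0)              # prior events raise future risk
        heapq.heappush(q, (t + rng.expovariate(rate), pid, rate))
    return events

n_events = simulate_cohort(200)
```

Costs and QALYs would accrue per simulated event and person-time in a real evaluation; the point of the sketch is that none of the four factors is expressible in a memoryless cohort transition matrix without state explosion.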

  9. First X-ray Statistical Tests for Clumpy-Torus Models: Constraints from RXTE monitoring of Seyfert AGN

    NASA Astrophysics Data System (ADS)

    Markowitz, Alex; Krumpe, Mirko; Nikutta, R.

    2016-06-01

    In two papers (Markowitz, Krumpe, & Nikutta 2014, and Nikutta et al., in prep.), we derive the first X-ray statistical constraints for clumpy-torus models in Seyfert AGN by quantifying multi-timescale variability in line-of-sight X-ray absorbing gas as a function of optical classification. We systematically search for discrete absorption events in the vast archive of RXTE monitoring of 55 nearby type Is and Compton-thin type IIs. We are sensitive to discrete absorption events due to clouds of full-covering, neutral/mildly ionized gas transiting the line of sight. Our results apply to both dusty and non-dusty clumpy media, and probe model parameter space complementary to that for eclipses observed with XMM-Newton, Suzaku, and Chandra. We detect twelve eclipse events in eight Seyferts, roughly tripling the number previously published from this archive. Event durations span hours to years. Most of our detected clouds are Compton-thin, and most clouds' distances from the black hole are inferred to be commensurate with the outer portions of the BLR or the inner regions of infrared-emitting dusty tori. We present the density profiles of the highest-quality eclipse events; the column density profile for an eclipsing cloud in NGC 3783 is doubly spiked, possibly indicating a cloud that is being tidally sheared. We discuss implications for cloud distributions in the context of clumpy-torus models. We calculate eclipse probabilities for orientation-dependent Type I/II unification schemes. We present constraints on cloud sizes, stability, and radial distribution. We infer that clouds' small angular sizes as seen from the SMBH imply 10^7 clouds required across the BLR + torus. Cloud size is roughly proportional to distance from the black hole, hinting at the formation processes (e.g., disk fragmentation). All observed clouds are sub-critical with respect to tidal disruption; self-gravity alone cannot contain them. External forces, such as magnetic fields or ambient pressure, are needed to contain them; otherwise, clouds must be short-lived.

  10. Using the Statecharts paradigm for simulation of patient flow in surgical care.

    PubMed

    Sobolev, Boris; Harel, David; Vasilakis, Christos; Levy, Adrian

    2008-03-01

    Computer simulation of patient flow has been used extensively to assess the impacts of changes in the management of surgical care. However, little research is available on the utility of existing modeling techniques. The purpose of this paper is to examine the capacity of Statecharts, a system of graphical specification, for constructing a discrete-event simulation model of the perioperative process. The Statecharts specification paradigm was originally developed for representing reactive systems by extending the formalism of finite-state machines through notions of hierarchy, parallelism, and event broadcasting. Hierarchy permits subordination between states so that one state may contain other states. Parallelism permits more than one state to be active at any given time. Broadcasting of events allows one state to detect changes in another state. In the context of the peri-operative process, hierarchy provides the means to describe steps within activities and to cluster related activities, parallelism provides the means to specify concurrent activities, and event broadcasting provides the means to trigger a series of actions in one activity according to transitions that occur in another activity. Combined with hierarchy and parallelism, event broadcasting offers a convenient way to describe the interaction of concurrent activities. We applied the Statecharts formalism to describe the progress of individual patients through surgical care as a series of asynchronous updates in patient records generated in reaction to events produced by parallel finite-state machines representing concurrent clinical and managerial activities. We conclude that Statecharts capture successfully the behavioral aspects of surgical care delivery by specifying permissible chronology of events, conditions, and actions.
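The interplay of parallelism and event broadcasting that the authors exploit can be sketched with two parallel finite-state regions, where a transition in one region emits an event consumed by the other (hypothetical states and event names, not the paper's model):

```python
class Region:
    """One parallel finite-state region of a statechart."""
    def __init__(self, name, transitions, state):
        # transitions: (state, event) -> (next_state, broadcast_event_or_None)
        self.name, self.transitions, self.state = name, transitions, state

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state, emitted = self.transitions[key]
            return emitted
        return None

def broadcast(regions, event):
    """Deliver an event to every region; anything a transition emits is
    itself broadcast, letting one region react to changes in another."""
    queue = [event]
    while queue:
        ev = queue.pop(0)
        for r in regions:
            out = r.handle(ev)
            if out is not None:
                queue.append(out)

# Hypothetical example: the clinical region finishing an operation emits
# 'op_done', which the managerial region consumes to free the room.
surgery = Region("surgery", {("operating", "finish"): ("recovery", "op_done")}, "operating")
rooms = Region("rooms", {("busy", "op_done"): ("free", None)}, "busy")
broadcast([surgery, rooms], "finish")
```

Hierarchy would add nested sub-states within each region; the broadcast loop is what couples concurrent clinical and managerial activities without hard-wiring them together.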

  11. Eigenforms, Discrete Processes and Quantum Processes

    NASA Astrophysics Data System (ADS)

    Kauffman, Louis H.

    2012-05-01

    This essay is a discussion of the concept of eigenform, due to Heinz von Foerster, and its relationship with discrete physics and quantum mechanics. We interpret the square root of minus one as a simple oscillatory process - a clock, and as an eigenform. By taking a generalization of this identification of i as a clock and eigenform, we show how quantum mechanics emerges from discrete physics.
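The identification of i with a clock is easy to make concrete: repeated multiplication by i is a discrete oscillation of period four (a minimal numerical illustration, not the paper's eigenform formalism):

```python
z = 1 + 0j
orbit = []
for _ in range(8):
    orbit.append(z)
    z *= 1j            # one clock tick: multiply by i
# orbit cycles 1, i, -1, -i, 1, i, -1, -i: i generates a period-4 clock,
# and two ticks (i*i = -1) realize the square root of minus one
```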

  12. Airlift Operation Modeling Using Discrete Event Simulation (DES)

    DTIC Science & Technology

    2009-12-01

    [Extraction fragments: table of contents and acronym list — Java, Simkit; JRE: Java Runtime Environment; JVM: Java Virtual Machine; lbs: Pounds; LAM: Load Allocation Mode; LRM: Landing Spot Reassignment Mode; LEGO: Listener Event… SOFTWARE DEVELOPMENT ENVIRONMENT: The following are the software tools and development environment used for constructing the models: Java.]

  13. Sparganothis fruitworm degree-day benchmarks provide key treatment timings for cranberry IPM

    USDA-ARS?s Scientific Manuscript database

    Degree-day benchmarks indicate discrete biological events in the development of insect pests. For the Sparganothis fruitworm, we have isolated all key development events and linked them to degree-day accumulations. These degree-day accumulations can greatly improve treatment timings for cranberry ...

  14. Optimization and quantization in gradient symbol systems: a framework for integrating the continuous and the discrete in cognition.

    PubMed

    Smolensky, Paul; Goldrick, Matthew; Mathis, Donald

    2014-08-01

    Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.

  15. Nonparametric Bayesian Segmentation of a Multivariate Inhomogeneous Space-Time Poisson Process.

    PubMed

    Ding, Mingtao; He, Lihan; Dunson, David; Carin, Lawrence

    2012-12-01

    A nonparametric Bayesian model is proposed for segmenting time-evolving multivariate spatial point process data. An inhomogeneous Poisson process is assumed, with a logistic stick-breaking process (LSBP) used to encourage piecewise-constant spatial Poisson intensities. The LSBP explicitly favors spatially contiguous segments, and infers the number of segments based on the observed data. The temporal dynamics of the segmentation and of the Poisson intensities are modeled with exponential correlation in time, implemented in the form of a first-order autoregressive model for uniformly sampled discrete data, and via a Gaussian process with an exponential kernel for general temporal sampling. We consider and compare two different inference techniques: a Markov chain Monte Carlo sampler, which has relatively high computational complexity; and an approximate and efficient variational Bayesian analysis. The model is demonstrated with a simulated example and a real example of space-time crime events in Cincinnati, Ohio, USA.

  16. Optimizing the availability of a buffered industrial process

    DOEpatents

    Martz, Jr., Harry F.; Hamada, Michael S.; Koehler, Arthur J.; Berg, Eric C.

    2004-08-24

    A computer-implemented process determines optimum configuration parameters for a buffered industrial process. A population size is initialized by randomly selecting a first set of design and operation values associated with subsystems and buffers of the buffered industrial process to form a set of operating parameters for each member of the population. An availability discrete event simulation (ADES) is performed on each member of the population to determine the product-based availability of each member. A new population is formed having members with a second set of design and operation values related to the first set of design and operation values through a genetic algorithm and the product-based availability determined by the ADES. Subsequent population members are then determined by iterating the genetic algorithm with product-based availability determined by ADES to form improved design and operation values from which the configuration parameters are selected for the buffered industrial process.
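The outer loop of the claimed method, a genetic algorithm whose fitness comes from an availability simulation, can be sketched as follows; the real ADES is replaced here by a toy scoring function (an illustrative stand-in, not the patented simulation):

```python
import random

def availability(params):
    """Stand-in for the patent's ADES: a toy score rewarding buffer sizes
    up to 10 and penalizing total size as a cost (illustrative only)."""
    return sum(min(p, 10) for p in params) - 0.1 * sum(params)

def genetic_search(n_params=4, pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(1, 50) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=availability, reverse=True)
        parents = pop[:pop_size // 2]              # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_params)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # mutation
                child[rng.randrange(n_params)] = rng.randint(1, 50)
            children.append(child)
        pop = parents + children
    return max(pop, key=availability)

best = genetic_search()
```

In the patent, each fitness evaluation is itself a product-based availability estimate from a discrete event simulation of the buffered process, so the simulation sits in the GA's inner loop.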

  17. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
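The chance-constraint idea can be sketched as: choose the smallest resource level whose simulated output meets a probabilistic service requirement, with the probability estimated from replications of a terminating simulation (the simulation below is a hypothetical stub, not the launch-vehicle model):

```python
import random

def replications(resource_level, n=200, seed=42):
    """Stand-in terminating simulation: cycle time falls with resources,
    plus Gaussian noise (illustrative stub only)."""
    rng = random.Random(seed)
    return [100.0 / resource_level + rng.gauss(0, 2) for _ in range(n)]

def satisfies_chance_constraint(samples, limit, prob=0.95):
    """Estimate P(cycle time <= limit) from replications; require >= prob."""
    return sum(s <= limit for s in samples) / len(samples) >= prob

def minimize_resources(limit=15.0, prob=0.95):
    # Smallest resource level whose replicated output meets the constraint
    for level in range(1, 51):
        if satisfies_chance_constraint(replications(level), limit, prob):
            return level
    return None

level = minimize_resources()
```

A fuller treatment would attach a confidence interval to the estimated probability itself, which is the statistical-estimation half of the framework the abstract describes.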

  18. Advances in Discrete-Event Simulation for MSL Command Validation

    NASA Technical Reports Server (NTRS)

    Patrikalakis, Alexander; O'Reilly, Taifun

    2013-01-01

    In the last five years, the discrete event simulator, SEQuence GENerator (SEQGEN), developed at the Jet Propulsion Laboratory to plan deep-space missions, has greatly increased uplink operations capacity to deal with increasingly complicated missions. In this paper, we describe how the Mars Science Laboratory (MSL) project makes full use of an interpreted environment to simulate change in more than fifty thousand flight software parameters and conditional command sequences to predict the result of executing a conditional branch in a command sequence, and enable the ability to warn users whenever one or more simulated spacecraft states change in an unexpected manner. Using these new SEQGEN features, operators plan more activities in one sol than ever before.

  19. A computational approach to extinction events in chemical reaction networks with discrete state spaces.

    PubMed

    Johnston, Matthew D

    2017-12-01

    Recent work of Johnston et al. has produced sufficient conditions on the structure of a chemical reaction network which guarantee that the corresponding discrete state space system exhibits an extinction event. The conditions consist of a series of systems of equalities and inequalities on the edges of a modified reaction network called a domination-expanded reaction network. In this paper, we present a computational implementation of these conditions written in Python and apply the program on examples drawn from the biochemical literature. We also run the program on 458 models from the European Bioinformatics Institute's BioModels Database and report our results. Copyright © 2017 Elsevier Inc. All rights reserved.
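The notion of an extinction event on a discrete state space, i.e. the network reaching a state from which no reaction can fire, can be illustrated with a direct stochastic simulation (a generic sketch, not the paper's condition-checking program; reactions are chosen uniformly rather than by Gillespie propensities, for simplicity):

```python
import random

def simulate_to_extinction(state, reactions, rng, max_steps=10000):
    """Simulate a reaction network on a discrete (integer-count) state space
    until no reaction is enabled — an extinction event — or steps run out.
    reactions: list of (consumed, produced) dicts keyed by species name."""
    state = dict(state)
    for step in range(max_steps):
        enabled = [r for r in reactions
                   if all(state.get(s, 0) >= n for s, n in r[0].items())]
        if not enabled:
            return state, step            # extinction: nothing can fire
        consumed, produced = rng.choice(enabled)
        for s, n in consumed.items():
            state[s] -= n
        for s, n in produced.items():
            state[s] = state.get(s, 0) + n
    return state, max_steps

# X + Y -> 2Y plus degradation Y -> 0: X only decreases, so Y's supply
# eventually dries up and the network is guaranteed to go extinct.
rng = random.Random(3)
final, steps = simulate_to_extinction(
    {"X": 20, "Y": 5},
    [({"X": 1, "Y": 1}, {"Y": 2}), ({"Y": 1}, {})],
    rng)
```

The paper's contribution is structural: sufficient conditions on the domination-expanded network that certify such extinction without simulating trajectories at all.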

  20. Identification of safety-critical events using kinematic vehicle data and the discrete fourier transform.

    PubMed

    Kluger, Robert; Smith, Brian L; Park, Hyungjun; Dailey, Daniel J

    2016-11-01

    Recent technological advances have made it both feasible and practical to identify unsafe driving behaviors using second-by-second trajectory data. Presented in this paper is a unique approach to detecting safety-critical events using vehicles' longitudinal accelerations. A Discrete Fourier Transform is used in combination with K-means clustering to flag patterns in the vehicles' accelerations in time-series that are likely to be crashes or near-crashes. The algorithm was able to detect roughly 78% of crashes and near-crashes (71 out of 91 validated events in the Naturalistic Driving Study data used), while generating about 1 false positive every 2.7 h. In addition to presenting the promising results, an implementation strategy is discussed and further research topics that can improve this method are suggested in the paper. Copyright © 2016 Elsevier Ltd. All rights reserved.
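The pipeline, frequency-domain features from short acceleration windows followed by unsupervised clustering to separate ordinary driving from candidate safety-critical events, can be sketched with a stdlib-only DFT and a tiny two-means step (the specific feature is a hypothetical stand-in for the paper's):

```python
import cmath

def dft_mag(window):
    """Magnitude spectrum of one acceleration window (naive DFT, stdlib only)."""
    N = len(window)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n, x in enumerate(window))) for k in range(N)]

def kmeans_1d(values, iters=20):
    """Tiny 2-means on scalar features; centroids seeded at the min and max,
    so label 1 marks the high-feature (candidate event) cluster here."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        labels = [0 if abs(v - c0) <= abs(v - c1) else 1 for v in values]
        for lab in (0, 1):
            members = [v for v, l in zip(values, labels) if l == lab]
            if members:
                mean = sum(members) / len(members)
                if lab == 0:
                    c0 = mean
                else:
                    c1 = mean
    return labels

# Feature per window: spectral energy outside the DC bin (hypothetical choice)
windows = [
    [0.0] * 8,                                      # smooth driving
    [0.1, -0.1] * 4,                                # mild vibration
    [0.0, 0.0, -6.0, -7.0, -6.5, 0.0, 0.0, 0.0],    # hard-braking spike
]
feats = [sum(dft_mag(w)[1:]) for w in windows]
labels = kmeans_1d(feats)                           # third window flagged
```

In practice one would use an FFT, a richer feature vector, and validate the flagged cluster against annotated crash/near-crash events, as the paper does with the Naturalistic Driving Study data.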

  1. Event-triggered H∞ state estimation for semi-Markov jumping discrete-time neural networks with quantization.

    PubMed

    Rakkiyappan, R; Maheswari, K; Velmurugan, G; Park, Ju H

    2018-05-17

    This paper investigates the H∞ state estimation problem for a class of semi-Markovian jumping discrete-time neural networks with an event-triggered scheme and quantization. First, a new event-triggered communication scheme is introduced to determine whether or not the current sampled sensor data should be broadcast and transmitted to the quantizer, which can save the limited communication resource. Second, a novel communication framework is employed by the logarithmic quantizer that quantifies and reduces the data transmission rate in the network, which apparently improves the communication efficiency of networks. Third, a stabilization criterion is derived based on the sufficient condition which guarantees a prescribed H∞ performance level in the estimation error system in terms of linear matrix inequalities. Finally, numerical simulations are given to illustrate the correctness of the proposed scheme. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Asynchronous discrete event schemes for PDEs

    NASA Astrophysics Data System (ADS)

    Stone, D.; Geiger, S.; Lord, G. J.

    2017-08-01

    A new class of asynchronous discrete-event simulation schemes for advection-diffusion-reaction equations is introduced, based on the principle of allowing quanta of mass to pass through faces of a (regular, structured) Cartesian finite volume grid. The timescales of these events are linked to the flux on the face. The resulting schemes are self-adaptive, and local in both time and space. Experiments are performed on realistic physical systems related to porous media flow applications, including a large 3D advection diffusion equation and advection diffusion reaction systems. The results are compared to highly accurate reference solutions where the temporal evolution is computed with exponential integrator schemes using the same finite volume discretisation. This allows a reliable estimation of the solution error. Our results indicate a first order convergence of the error as a control parameter is decreased, and we outline a framework for analysis.
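The core mechanism, each grid face scheduling its own mass-transfer event with a waiting time set by the current face flux, can be sketched in one dimension (a schematic of the event mechanics only, not the authors' scheme; unit grid spacing, with stale events invalidated by a version counter):

```python
import heapq

def async_diffusion(masses, quantum=1.0, D=1.0, t_end=5.0):
    """Asynchronous event scheme for 1-D diffusion on a regular grid:
    each face schedules the transfer of one quantum of mass, with the
    waiting time inversely proportional to the current face flux."""
    n = len(masses)
    masses = list(masses)
    heap, version = [], [0] * (n - 1)

    def schedule(face, now):
        version[face] += 1                 # invalidate any pending event
        flux = D * (masses[face] - masses[face + 1])
        if abs(flux) > 1e-12:
            heapq.heappush(heap, (now + quantum / abs(flux), face, version[face]))

    for f in range(n - 1):
        schedule(f, 0.0)
    while heap:
        t, f, ver = heapq.heappop(heap)
        if ver != version[f]:
            continue                       # stale event, superseded
        if t > t_end:
            break
        flux = D * (masses[f] - masses[f + 1])
        if flux == 0.0:
            continue
        moved = quantum if flux > 0 else -quantum
        masses[f] -= moved
        masses[f + 1] += moved
        for g in (f - 1, f, f + 1):        # fluxes on the touched faces changed
            if 0 <= g < n - 1:
                schedule(g, t)
    return masses

out = async_diffusion([10.0, 0.0, 0.0, 0.0])
```

The self-adaptivity the abstract describes is visible here: faces with steep gradients fire events frequently while quiescent faces schedule nothing, so effort is local in both space and time.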

  3. Order of events matter: comparing discrete models for optimal control of species augmentation.

    PubMed

    Bodine, Erin N; Gross, Louis J; Lenhart, Suzanne

    2012-01-01

    We investigate optimal timing of augmentation of an endangered/threatened species population in a target region by moving individuals from a reserve or captive population. This is formulated as a discrete-time optimal control problem in which augmentation occurs once per time period over a fixed number of time periods. The population model assumes the Allee effect growth functions in both target and reserve populations and the control objective is to maximize the target and reserve population sizes over the time horizon while accounting for costs of augmentation. Two possible orders of events are considered for different life histories of the species relative to augmentation time: move individuals either before or after population growth occurs. The control variable is the proportion of the reserve population to be moved to the target population. We develop solutions and illustrate numerical results which indicate circumstances for which optimal augmentation strategies depend upon the order of events.
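The sensitivity to event ordering is easy to reproduce in a toy version of the setup: an Allee growth map applied before or after the augmentation move yields different trajectories for the same control (parameter values here are arbitrary illustrations, not the authors' model):

```python
def allee_growth(p, r=0.5, a=0.2, k=1.0):
    """Discrete-time Allee map: populations below threshold a decline,
    populations between a and carrying capacity k grow."""
    return max(0.0, p + r * p * (p / a - 1.0) * (1.0 - p / k))

def augment_then_grow(target, reserve, u, steps):
    # Order 1: move a fraction u of the reserve, then both populations grow
    for _ in range(steps):
        moved = u * reserve
        target, reserve = allee_growth(target + moved), allee_growth(reserve - moved)
    return target, reserve

def grow_then_augment(target, reserve, u, steps):
    # Order 2: both populations grow first, then the move happens
    for _ in range(steps):
        target, reserve = allee_growth(target), allee_growth(reserve)
        moved = u * reserve
        target, reserve = target + moved, reserve - moved
    return target, reserve

# Target starts below the Allee threshold; the two event orderings yield
# different end states for the same control u
a1 = augment_then_grow(0.15, 0.8, 0.1, 10)
a2 = grow_then_augment(0.15, 0.8, 0.1, 10)
```

The optimal-control version makes u time-varying and maximizes total population net of augmentation cost; the point illustrated is simply that the two orderings define different dynamical systems, so their optimal strategies can differ.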

  4. Discrete event simulation model of sudden cardiac death predicts high impact of preventive interventions.

    PubMed

    Andreev, Victor P; Head, Trajen; Johnson, Neil; Deo, Sapna K; Daunert, Sylvia; Goldschmidt-Clermont, Pascal J

    2013-01-01

    Sudden Cardiac Death (SCD) is responsible for at least 180,000 deaths a year and incurs an average cost of $286 billion annually in the United States alone. Herein, we present a novel discrete event simulation model of SCD, which quantifies the chains of events associated with the formation, growth, and rupture of atheroma plaques, and the subsequent formation of clots, thrombosis, and onset of arrhythmias within a population. The predictions generated by the model are in good agreement both with results obtained from pathological examinations on the frequencies of three major types of atheroma, and with epidemiological data on the prevalence and risk of SCD. These model predictions allow for identification of interventions and importantly for the optimal time of intervention leading to high potential impact on SCD risk reduction (up to 8-fold reduction in the number of SCDs in the population) as well as the increase in life expectancy.

  5. Photon strength and the low-energy enhancement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiedeking, M.; Bernstein, L. A.; Bleuel, D. L.

    2014-08-14

    Several measurements in medium mass nuclei have reported a low-energy enhancement in the photon strength function. Although much effort has been invested in unraveling the mysteries of this effect, its physical origin is still not conclusively understood. Here, a completely model-independent experimental approach to investigate the existence of this enhancement is presented. The experiment was designed to study statistical feeding from the quasi-continuum (below the neutron separation energy) to individual low-lying discrete levels in {sup 95}Mo produced in the (d, p) reaction. A key aspect to successfully study gamma decay from the region of high-level density is the detection and extraction of correlated particle-gamma-gamma events, which was accomplished using an array of Clover HPGe detectors and large area annular silicon detectors. The entrance channel excitation energy into the residual nucleus produced in the reaction was inferred from the detected proton energies in the silicon detectors. Gating on gamma-transitions originating from low-lying discrete levels specifies the state fed by statistical gamma-rays. Any particle-gamma-gamma event in combination with specific energy sum requirements ensures a clean and unambiguous determination of the initial and final state of the observed gamma rays. With these requirements the statistical feeding to individual discrete levels is extracted on an event-by-event basis. The results are presented and compared to {sup 95}Mo photon strength function data measured at the University of Oslo.

  6. Causal Networks or Causal Islands? The Representation of Mechanisms and the Transitivity of Causal Judgment

    ERIC Educational Resources Information Center

    Johnson, Samuel G. B.; Ahn, Woo-kyoung

    2015-01-01

    Knowledge of mechanisms is critical for causal reasoning. We contrasted two possible organizations of causal knowledge--an interconnected causal "network," where events are causally connected without any boundaries delineating discrete mechanisms; or a set of disparate mechanisms--causal "islands"--such that events in different…

  7. DEVELOPMENT, EVALUATION AND APPLICATION OF AN AUTOMATED EVENT PRECIPITATION SAMPLER FOR NETWORK OPERATION

    EPA Science Inventory

    In 1993, the University of Michigan Air Quality Laboratory (UMAQL) designed a new wet-only precipitation collection system that was utilized in the Lake Michigan Loading Study. The collection system was designed to collect discrete mercury and trace element samples on an event b...

  8. Using Movement and Intentions to Understand Human Activity

    ERIC Educational Resources Information Center

    Zacks, Jeffrey M.; Kumar, Shawn; Abrams, Richard A.; Mehta, Ritesh

    2009-01-01

    During perception, people segment continuous activity into discrete events. They do so in part by monitoring changes in features of an ongoing activity. Characterizing these features is important for theories of event perception and may be helpful for designing information systems. The three experiments reported here asked whether the body…

  9. Sensitivity of diabetic retinopathy associated vision loss to screening interval in an agent-based/discrete event simulation model.

    PubMed

    Day, T Eugene; Ravi, Nathan; Xian, Hong; Brugh, Ann

    2014-04-01

    To examine the effect of changes to screening interval on the incidence of vision loss in a simulated cohort of Veterans with diabetic retinopathy (DR). This simulation allows us to examine potential interventions without putting patients at risk. Simulated randomized controlled trial. We develop a hybrid agent-based/discrete event simulation which incorporates a population of simulated Veterans--using abstracted data from a retrospective cohort of real-world diabetic Veterans--with a discrete event simulation (DES) eye clinic at which they seek treatment for DR. We compare vision loss under varying screening policies in a simulated population of 5000 Veterans over 50 independent ten-year simulation runs for each group. Diabetic retinopathy-associated vision loss increased as the screening interval was extended from one to five years (p<0.0001). This increase was concentrated in the third year of the screening interval (p<0.01). There was no increase in vision loss associated with increasing the screening interval from one year to two years (p=0.98). Increasing the screening interval for diabetic patients who have not yet developed diabetic retinopathy from 1 to 2 years appears safe, while increasing the interval to 3 years heightens risk for vision loss. Published by Elsevier Ltd.

  10. Planning and supervision of reactor defueling using discrete event techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, H.E.; Imel, G.R.; Houshyar, A.

    1995-12-31

    New fuel handling and conditioning activities for the defueling of the Experimental Breeder Reactor II are being performed at Argonne National Laboratory. Research is being conducted to investigate the use of discrete event simulation, analysis, and optimization techniques to plan, supervise, and perform these activities in such a way that productivity can be improved. The central idea is to characterize this defueling operation as a collection of interconnected serving cells, and then apply operational research techniques to identify appropriate planning schedules for given scenarios. In addition, a supervisory system is being developed to provide personnel with on-line information on the progress of fueling tasks and to suggest courses of action to accommodate changing operational conditions. This paper provides an introduction to the research in progress at ANL. In particular, it briefly describes the fuel handling configuration for reactor defueling at ANL, presenting the flow of material from the reactor grid to the interim storage location, and the expected contributions of this work. As an example of the studies being conducted for planning and supervision of fuel handling activities at ANL, an application of discrete event simulation techniques to evaluate different fuel cask transfer strategies is given at the end of the paper.

  11. Effects of intermediate-scale wind disturbance on composition, structure, and succession in Quercus stands: Implications for natural disturbance-based silviculture

    Treesearch

    M.M. Cowden; J.L. Hart; C.J. Schweitzer; D.C. Dey

    2014-01-01

    Forest disturbances are discrete events in space and time that disrupt the biophysical environment and impart lasting legacies on forest composition and structure. Disturbances are often classified along a gradient of spatial extent and magnitude that ranges from catastrophic events where most of the overstory is removed to gap-scale events that modify local...

  12. Extreme hydroclimatic events and their socio-economic consequences

    NASA Astrophysics Data System (ADS)

    Ghil, Michael

    2017-04-01

    This talk will quickly summarize some earlier work reported in [1,2] and then focus on recent work in progress. The former will include two complementary views on the classical, 1300-year long Nile River records. The latter will cover studies of damage propagation in production-and-supply networks [3,4]. Here we use Boolean delay equations (BDEs), a semi-discrete type of dynamical systems [5], to explore the effect of network topology and of the delays in the supply on network resilience. [1] M. Ghil et al., Nonlin. Processes Geophys. (2011) [2] M. Chavez, M. Ghil & J. Urrutia Fucugauchi, Extreme Events: Observations, Modeling and Economics, Geophys. Monograph 214, AGU & Wiley (2015) [3] B. Coluzzi et al., Intl. J. Bifurcation Chaos (2011) [4] C. Colon & M. Ghil, Chaos, submitted (2017) [5] M. Ghil et al., Physica D (2008)
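    The Boolean delay equation (BDE) framework referenced here updates binary state variables as Boolean functions of delayed states. A minimal sketch, assuming a hypothetical two-node supplier/buyer pair with unit delays (not a system taken from the cited papers):

```python
# Toy Boolean delay equation (BDE) system on a unit time grid.
# The two-variable "supplier/buyer" reading is an illustrative assumption.

def simulate_bde(steps, tau1=1, tau2=1, x1_0=1, x2_0=0):
    """Integrate:
        x1(t) = NOT x2(t - tau2)
        x2(t) = x1(t - tau1)
    and return the list of (x1, x2) states for t = 0..steps."""
    hist = [(x1_0, x2_0)]
    for t in range(1, steps + 1):
        x1 = 1 - hist[max(t - tau2, 0)][1]  # delayed NOT of x2
        x2 = hist[max(t - tau1, 0)][0]      # delayed copy of x1
        hist.append((x1, x2))
    return hist

states = simulate_bde(12)
```

    With unit delays this toy orbit is periodic with period 4; in damage-propagation studies the interest is in how such delayed feedbacks shape cascades across a whole network.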

  13. Fluctuation reduction and enhanced confinement in the MST reversed-field pinch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Brett Edward

    1997-10-01

    Plasmas with a factor of ≥3 improvement in energy confinement have been achieved in the MST reversed-field pinch (RFP). These plasmas occur spontaneously, following sawtooth crashes, subject to constraints on, e.g., toroidal magnetic field reversal and wall conditioning. Possible contributors to the improved confinement include a reduction of core-resonant, global magnetic fluctuations and a reduction of electrostatic fluctuations over the entire plasma edge. One feature of these plasmas is a region of strong E×B flow shear in the edge. Never before observed in conjunction with enhanced confinement in the RFP, such shear is common in enhanced confinement discharges in tokamaks and stellarators. Another feature of these plasmas is a new type of discrete dynamo event. Like sawtooth crashes, a common form of discrete dynamo, these events correspond to bursts of edge parallel current. The reduction of electrostatic fluctuations in these plasmas occurs within and beyond the region of strong E×B flow shear, similar to what is observed in tokamaks and stellarators. However, the reductions in the MST include fluctuations whose correlation lengths are larger than the width of the shear region. The reduction of the global magnetic fluctuations is most likely due to flattening of the μ = μ₀J·B/B² profile. Flattening can occur, e.g., due to the new type of discrete dynamo event and reduced edge resistivity. Enhanced confinement plasmas are also achieved in the MST when auxiliary current is applied to flatten the μ profile and reduce magnetic fluctuations. Unexpectedly, these plasmas also exhibit a region (broader than in the case above) of strong E×B flow shear in the edge, an edge-wide reduction of electrostatic fluctuations, and the new type of discrete dynamo event. Auxiliary current drive has historically been viewed as the principal route to fusion reactor viability for the RFP.

  14. Using discrete event computer simulation to improve patient flow in a Ghanaian acute care hospital.

    PubMed

    Best, Allyson M; Dixon, Cinnamon A; Kelton, W David; Lindsell, Christopher J; Ward, Michael J

    2014-08-01

    Crowding and limited resources have increased the strain on acute care facilities and emergency departments worldwide. These problems are particularly prevalent in developing countries. Discrete event simulation is a computer-based tool that can be used to estimate how changes to complex health care delivery systems such as emergency departments will affect operational performance. Using this modality, our objective was to identify operational interventions that could potentially improve patient throughput of one acute care setting in a developing country. We developed a simulation model of acute care at a district level hospital in Ghana to test the effects of resource-neutral (eg, modified staff start times and roles) and resource-additional (eg, increased staff) operational interventions on patient throughput. Previously captured deidentified time-and-motion data from 487 acute care patients were used to develop and test the model. The primary outcome was the modeled effect of interventions on patient length of stay (LOS). The base-case (no change) scenario had a mean LOS of 292 minutes (95% confidence interval [CI], 291-293). In isolation, adding staffing, changing staff roles, and varying shift times did not affect overall patient LOS. Specifically, adding 2 registration workers, history takers, and physicians resulted in a 23.8-minute (95% CI, 22.3-25.3) LOS decrease. However, when shift start times were coordinated with patient arrival patterns, potential mean LOS was decreased by 96 minutes (95% CI, 94-98), and with the simultaneous combination of staff roles (registration and history taking), there was an overall mean LOS reduction of 152 minutes (95% CI, 150-154). Resource-neutral interventions identified through discrete event simulation modeling have the potential to improve acute care throughput in this Ghanaian municipal hospital. 
Discrete event simulation offers another approach to identifying potentially effective interventions to improve patient flow in emergency and acute care in resource-limited settings. Copyright © 2014 Elsevier Inc. All rights reserved.
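    The kind of operational question asked above can be illustrated with a few lines of discrete event simulation. The sketch below models a single-stage queue with exponential arrivals and service; all parameter values are hypothetical, not the calibrated inputs of the Ghanaian hospital model:

```python
import heapq
import random

def simulate_clinic(n_patients, n_servers, mean_interarrival, mean_service, seed=1):
    """Toy discrete event simulation of a single-stage clinic: patients
    arrive in a Poisson stream and wait for the first free server.
    Returns the mean length of stay (waiting plus service)."""
    rng = random.Random(seed)
    free_at = [0.0] * n_servers            # next idle time of each server
    heapq.heapify(free_at)
    t, total_los = 0.0, 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)   # next arrival
        server_free = heapq.heappop(free_at)
        start = max(t, server_free)        # wait if every server is busy
        finish = start + rng.expovariate(1.0 / mean_service)
        heapq.heappush(free_at, finish)
        total_los += finish - t
    return total_los / n_patients

# A resource-additional intervention: double the number of servers.
los_2 = simulate_clinic(2000, 2, mean_interarrival=5.0, mean_service=8.0)
los_4 = simulate_clinic(2000, 4, mean_interarrival=5.0, mean_service=8.0)
```

    Here adding servers reduces the simulated mean LOS because the two-server system runs at high utilization; the published model extends this logic to multiple stages, staff roles, and shift schedules.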

  15. Core discrete event simulation model for the evaluation of health care technologies in major depressive disorder.

    PubMed

    Vataire, Anne-Lise; Aballéa, Samuel; Antonanzas, Fernando; Roijen, Leona Hakkaart-van; Lam, Raymond W; McCrone, Paul; Persson, Ulf; Toumi, Mondher

    2014-03-01

    A review of existing economic models in major depressive disorder (MDD) highlighted the need for models with longer time horizons that also account for heterogeneity in treatment pathways between patients. A core discrete event simulation model was developed to estimate health and cost outcomes associated with alternative treatment strategies. This model simulated short- and long-term clinical events (partial response, remission, relapse, recovery, and recurrence), adverse events, and treatment changes (titration, switch, addition, and discontinuation) over up to 5 years. Several treatment pathways were defined on the basis of fictitious antidepressants with three levels of efficacy, tolerability, and price (low, medium, and high) from first line to third line. The model was populated with input data from the literature for the UK setting. Model outputs include time in different health states, quality-adjusted life-years (QALYs), and costs from National Health Service and societal perspectives. The codes are open source. Predicted costs and QALYs from this model are within the range of results from previous economic evaluations. The largest cost components from the payer perspective were physician visits and hospitalizations. Key parameters driving the predicted costs and QALYs were utility values, effectiveness, and frequency of physician visits. Differences in QALYs and costs between two strategies with different effectiveness increased approximately twofold when the time horizon increased from 1 to 5 years. The discrete event simulation model can provide a more comprehensive evaluation of different therapeutic options in MDD, compared with existing Markov models, and can be used to compare a wide range of health care technologies in various groups of patients with MDD. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  16. Petri nets as a modeling tool for discrete concurrent tasks of the human operator. [describing sequential and parallel demands on human operators

    NASA Technical Reports Server (NTRS)

    Schumacher, W.; Geiser, G.

    1978-01-01

    The basic concepts of Petri nets are reviewed as well as their application as the fundamental model of technical systems with concurrent discrete events such as hardware systems and software models of computers. The use of Petri nets is proposed for modeling the human operator dealing with concurrent discrete tasks. Their properties useful in modeling the human operator are discussed and practical examples are given. By means of an experimental investigation of binary concurrent tasks which are presented in a serial manner, the representation of human behavior by Petri nets is demonstrated.

  17. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service business is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  18. The Effect of Haptic Guidance on Learning a Hybrid Rhythmic-Discrete Motor Task.

    PubMed

    Marchal-Crespo, Laura; Bannwart, Mathias; Riener, Robert; Vallery, Heike

    2015-01-01

    Bouncing a ball with a racket is a hybrid rhythmic-discrete motor task, combining continuous rhythmic racket movements with discrete impact events. Rhythmicity is exceptionally important in motor learning, because it underlies fundamental movements such as walking. Studies suggested that rhythmic and discrete movements are governed by different control mechanisms at different levels of the Central Nervous System. The aim of this study is to evaluate the effect of fixed/fading haptic guidance on learning to bounce a ball to a desired apex in virtual reality with varying gravity. Changing gravity changes dominance of rhythmic versus discrete control: The higher the value of gravity, the more rhythmic the task; lower values reduce the bouncing frequency and increase dwell times, eventually leading to a repetitive discrete task that requires initiation and termination, resembling target-oriented reaching. Although motor learning in the ball-bouncing task with varying gravity has been studied, the effect of haptic guidance on learning such a hybrid rhythmic-discrete motor task has not been addressed. We performed an experiment with thirty healthy subjects and found that the most effective training condition depended on the degree of rhythmicity: Haptic guidance seems to hamper learning of continuous rhythmic tasks, but it seems to promote learning for repetitive tasks that resemble discrete movements.

  19. DeMO: An Ontology for Discrete-event Modeling and Simulation.

    PubMed

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-09-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community.

  20. DeMO: An Ontology for Discrete-event Modeling and Simulation

    PubMed Central

    Silver, Gregory A; Miller, John A; Hybinette, Maria; Baramidze, Gregory; York, William S

    2011-01-01

    Several fields have created ontologies for their subdomains. For example, the biological sciences have developed extensive ontologies such as the Gene Ontology, which is considered a great success. Ontologies could provide similar advantages to the Modeling and Simulation community. They provide a way to establish common vocabularies and capture knowledge about a particular domain with community-wide agreement. Ontologies can support significantly improved (semantic) search and browsing, integration of heterogeneous information sources, and improved knowledge discovery capabilities. This paper discusses the design and development of an ontology for Modeling and Simulation called the Discrete-event Modeling Ontology (DeMO), and it presents prototype applications that demonstrate various uses and benefits that such an ontology may provide to the Modeling and Simulation community. PMID:22919114

  1. Regularity of a renewal process estimated from binary data.

    PubMed

    Rice, John D; Strawderman, Robert L; Johnson, Brent A

    2017-10-09

    Assessment of the regularity of a sequence of events over time is important for clinical decision-making as well as informing public health policy. Our motivating example involves determining the effect of an intervention on the regularity of HIV self-testing behavior among high-risk individuals when exact self-testing times are not recorded. Assuming that these unobserved testing times follow a renewal process, the goals of this work are to develop suitable methods for estimating its distributional parameters when only the presence or absence of at least one event per subject in each of several observation windows is recorded. We propose two approaches to estimation and inference: a likelihood-based discrete survival model using only time to first event; and a potentially more efficient quasi-likelihood approach based on the forward recurrence time distribution using all available data. Regularity is quantified and estimated by the coefficient of variation (CV) of the interevent time distribution. Focusing on the gamma renewal process, where the shape parameter of the corresponding interevent time distribution has a monotone relationship with its CV, we conduct simulation studies to evaluate the performance of the proposed methods. We then apply them to our motivating example, concluding that the use of text message reminders significantly improves the regularity of self-testing, but not its frequency. A discussion on interesting directions for further research is provided. © 2017, The International Biometric Society.
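    As a quick illustration of the regularity measure used here: for a gamma interevent distribution with shape k, the theoretical CV is 1/sqrt(k), independent of scale, so a larger shape means more clock-like events. A simulation sketch with arbitrary example parameters:

```python
import math
import random

def gamma_interevent_cv(shape, scale, n_events, seed=0):
    """Empirical coefficient of variation of simulated gamma
    interevent times (theory: CV = 1/sqrt(shape))."""
    rng = random.Random(seed)
    gaps = [rng.gammavariate(shape, scale) for _ in range(n_events)]
    mean = sum(gaps) / n_events
    var = sum((g - mean) ** 2 for g in gaps) / (n_events - 1)
    return math.sqrt(var) / mean

cv_poisson_like = gamma_interevent_cv(shape=1.0, scale=7.0, n_events=20000)
cv_regular = gamma_interevent_cv(shape=9.0, scale=7.0 / 9.0, n_events=20000)
```

    The estimation problem in the paper is harder than this, because only the presence or absence of events in each observation window is recorded, not the interevent times themselves.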

  2. Spiking neural network simulation: memory-optimal synaptic event scheduling.

    PubMed

    Stewart, Robert D; Gurney, Kevin N

    2011-06-01

    Spiking neural network simulations incorporating variable transmission delays require synaptic events to be scheduled prior to delivery. Conventional methods have memory requirements that scale with the total number of synapses in a network. We introduce novel scheduling algorithms for both discrete and continuous event delivery, where the memory requirement scales instead with the number of neurons. Superior algorithmic performance is demonstrated using large-scale, benchmarking network simulations.
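    The memory argument can be sketched with a circular buffer of delivery slots, one per time step up to the maximum delay, so that storage scales with neurons times the maximum delay rather than with synapses. This is a generic illustration of the idea, not the specific algorithms introduced in the paper:

```python
from collections import defaultdict

class RingBufferScheduler:
    """Circular-buffer spike scheduler: memory is max_delay + 1 slots of
    per-neuron accumulators, independent of how many synapses exist."""
    def __init__(self, max_delay):
        self.t = 0
        self.slots = [defaultdict(float) for _ in range(max_delay + 1)]

    def schedule(self, target, weight, delay):
        """Queue a synaptic input for delivery `delay` steps from now."""
        assert 0 < delay < len(self.slots)
        self.slots[(self.t + delay) % len(self.slots)][target] += weight

    def advance(self):
        """Deliver and clear the inputs due at the current step."""
        slot = self.slots[self.t % len(self.slots)]
        delivered = dict(slot)
        slot.clear()
        self.t += 1
        return delivered
```

    Because inputs arriving at the same step are summed into one accumulator per target neuron, the per-step delivery cost is also bounded by the number of neurons rather than by the number of pending events.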

  3. Discrete post-processing of total cloud cover ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian

    2017-04-01

    This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review 144, 2565-2577.

  4. Extreme events and event size fluctuations in biased random walks on networks.

    PubMed

    Kishore, Vimal; Santhanam, M S; Amritkar, R E

    2012-05-01

    Random walk on discrete lattice models is important to understand various types of transport processes. The extreme events, defined as exceedences of the flux of walkers above a prescribed threshold, have been studied recently in the context of complex networks. This was motivated by the occurrence of rare events such as traffic jams, floods, and power blackouts which take place on networks. In this work, we study extreme events in a generalized random walk model in which the walk is preferentially biased by the network topology. The walkers preferentially choose to hop toward the hubs or small degree nodes. In this setting, we show that extremely large fluctuations in event sizes are possible on small degree nodes when the walkers are biased toward the hubs. In particular, we obtain the distribution of event sizes on the network. Further, the probability for the occurrence of extreme events on any node in the network depends on its "generalized strength," a measure of the ability of a node to attract walkers. The generalized strength is a function of the degree of the node and that of its nearest neighbors. We obtain analytical and simulation results for the probability of occurrence of extreme events on the nodes of a network using a generalized random walk model. The result reveals that the nodes with a larger value of generalized strength, on average, display lower probability for the occurrence of extreme events compared to the nodes with lower values of generalized strength.
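    The topological bias can be sketched directly: a walker at node i hops to neighbour j with probability proportional to degree(j)**alpha, so alpha > 0 attracts walkers to hubs and alpha < 0 to small-degree nodes. The small example graph below is an illustrative assumption, not a network from the paper:

```python
import random

def biased_walk_visits(adj, alpha, steps, start, seed=0):
    """Count node visits of a degree-biased random walk: from node i the
    walker hops to neighbour j with probability ~ degree(j)**alpha."""
    rng = random.Random(seed)
    deg = {v: len(ns) for v, ns in adj.items()}
    visits = {v: 0 for v in adj}
    node = start
    for _ in range(steps):
        nbrs = adj[node]
        node = rng.choices(nbrs, weights=[deg[j] ** alpha for j in nbrs])[0]
        visits[node] += 1
    return visits

# Hub-and-leaf example graph (an assumption for illustration only).
adj = {
    "hub": ["a", "b", "c", "d", "e"],
    "a": ["hub", "b"], "b": ["hub", "a"],
    "c": ["hub"], "d": ["hub"], "e": ["hub"],
}
visits = biased_walk_visits(adj, alpha=1.0, steps=20000, start="hub")
```

    The flux on a node here is its visit count; thresholding such counts per time window is what defines an extreme event in the cited work.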

  5. Cortical Neural Computation by Discrete Results Hypothesis

    PubMed Central

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of the cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called “Discrete Results” (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information processing is computed in the cortex. Furthermore, we propose that precise dynamic sequences of “Discrete Results” is the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel “Discrete Results” concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. 
We propose that fast-spiking (FS) interneurons may be a key element in our hypothesis, providing the basis for this computation. PMID:27807408

  6. Cortical Neural Computation by Discrete Results Hypothesis.

    PubMed

    Castejon, Carlos; Nuñez, Angel

    2016-01-01

    One of the most challenging problems we face in neuroscience is to understand how the cortex performs computations. There is increasing evidence that the power of the cortical processing is produced by populations of neurons forming dynamic neuronal ensembles. Theoretical proposals and multineuronal experimental studies have revealed that ensembles of neurons can form emergent functional units. However, how these ensembles are implicated in cortical computations is still a mystery. Although cell ensembles have been associated with brain rhythms, the functional interaction remains largely unclear. It is still unknown how spatially distributed neuronal activity can be temporally integrated to contribute to cortical computations. A theoretical explanation integrating spatial and temporal aspects of cortical processing is still lacking. In this Hypothesis and Theory article, we propose a new functional theoretical framework to explain the computational roles of these ensembles in cortical processing. We suggest that complex neural computations underlying cortical processing could be temporally discrete and that sensory information would need to be quantized to be computed by the cerebral cortex. Accordingly, we propose that cortical processing is produced by the computation of discrete spatio-temporal functional units that we have called "Discrete Results" (Discrete Results Hypothesis). This hypothesis represents a novel functional mechanism by which information processing is computed in the cortex. Furthermore, we propose that precise dynamic sequences of "Discrete Results" is the mechanism used by the cortex to extract, code, memorize and transmit neural information. The novel "Discrete Results" concept has the ability to match the spatial and temporal aspects of cortical processing. We discuss the possible neural underpinnings of these functional computational units and describe the empirical evidence supporting our hypothesis. 
We propose that fast-spiking (FS) interneurons may be a key element in our hypothesis, providing the basis for this computation.

  7. The Foggy EUV Corona and Coronal Heating by MHD Waves from Explosive Reconnection Events

    NASA Technical Reports Server (NTRS)

    Moore, Ron L.; Cirtain, Jonathan W.; Falconer, David A.

    2008-01-01

    In 0.5 arcsec/pixel TRACE coronal EUV images, the corona rooted in active regions that are at the limb and are not flaring is seen to consist of (1) a complex array of discrete loops and plumes embedded in (2) a diffuse ambient component that shows no fine structure and gradually fades with height. For each of two not-flaring active regions, we found that the diffuse component (1) is approximately isothermal and hydrostatic and (2) emits well over half of the total EUV luminosity of the active-region corona. Here, from a TRACE Fe XII coronal image of another not-flaring active region, the large sunspot active region AR 10652 when it was at the west limb on 30 July 2004, we separate the diffuse component from the discrete loop component by spatial filtering, and find that the diffuse component has about 60% of the total luminosity. If, under much higher spatial resolution than that of TRACE (e.g., the 0.1 arcsec/pixel resolution of the Hi-C sounding-rocket experiment proposed by J. W. Cirtain et al.), most of the diffuse component remains diffuse rather than being resolved into very narrow loops and plumes, this will raise the possibility that the EUV corona in active regions consists of two basically different but comparably luminous components: one being the set of discrete bright loops and plumes and the other being a truly diffuse component filling the space between the discrete loops and plumes. This dichotomy would imply that there are two different but comparably powerful coronal heating mechanisms operating in active regions, one for the distinct loops and plumes and another for the diffuse component.
We present a scenario in which (1) each discrete bright loop or plume is a flux tube that was recently reconnected in a burst of reconnection, and (2) the diffuse component is heated by MHD waves that are generated by these reconnection events and by other fine-scale explosive reconnection events, most of which occur in and below the base of the corona where they are seen as UV explosive events, EUV blinkers, and type II spicules. These MHD waves propagate across field lines and dissipate, heating the plasma in the field between the bright loops and plumes.

  8. Assessing the Utility of an Event-Step ASMD Model by Analysis of Surface Combatant Shared Self-Defense

    DTIC Science & Technology

    2001-09-01

    Oriented Discrete Event Simulation,” Master’s Thesis in Operations Research, Naval Postgraduate School Monterey, CA, 1996. 12. Arntzen , A., “Software...Dependent Hit Probabilities”, Naval Research Logistics, Vol. 31, pp. 363-371, 1984. 3 Arntzen , A., “Software Components for Air Defense Planning

  9. DynamO: a free O(N) general event-driven molecular dynamics simulator.

    PubMed

    Bannerman, M N; Sargant, R; Lue, L

    2011-11-30

    Molecular dynamics algorithms for systems of particles interacting through discrete or "hard" potentials are fundamentally different from the methods for continuous or "soft" potential systems. Although many software packages have been developed for continuous potential systems, software for discrete potential systems based on event-driven algorithms is relatively scarce and specialized. We present DynamO, a general event-driven simulation package, which displays the optimal O(N) asymptotic scaling of the computational cost with the number of particles N, rather than the O(N log N) scaling found in most standard algorithms. DynamO provides reference implementations of the best available event-driven algorithms. These techniques allow the rapid simulation of both complex and large (>10^6 particles) systems for long times. The performance of the program is benchmarked for elastic hard sphere systems, homogeneous cooling and sheared inelastic hard spheres, and equilibrium Lennard-Jones fluids. This software and its documentation are distributed under the GNU General Public License and can be freely downloaded from http://marcusbannerman.co.uk/dynamo. Copyright © 2011 Wiley Periodicals, Inc.
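    The event-driven idea, advancing the system exactly from collision to collision instead of taking fixed time steps, can be sketched in one dimension for equal-mass hard rods. This is a toy illustration, unrelated to DynamO's actual algorithms or data structures:

```python
def next_collision(pos, vel, radius):
    """Earliest pair collision time for 1-D hard rods, assuming the
    positions are sorted; returns (time, i, j) or None."""
    best = None
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            dv = vel[i] - vel[j]
            gap = pos[j] - pos[i] - 2 * radius
            if dv > 0:                      # the pair is closing in
                t = gap / dv
                if best is None or t < best[0]:
                    best = (t, i, j)
    return best

def simulate(pos, vel, radius, t_end):
    """Event-driven dynamics: advance exactly to each collision, apply
    the elastic equal-mass rule (velocities swap), repeat to t_end."""
    t = 0.0
    while True:
        ev = next_collision(pos, vel, radius)
        if ev is None or t + ev[0] > t_end:
            dt = t_end - t                  # free flight to the end
            pos[:] = [x + v * dt for x, v in zip(pos, vel)]
            return pos, vel
        dt, i, j = ev
        pos[:] = [x + v * dt for x, v in zip(pos, vel)]
        t += dt
        vel[i], vel[j] = vel[j], vel[i]     # equal masses: swap velocities

pos, vel = simulate([0.0, 1.0], [1.0, -1.0], radius=0.1, t_end=1.0)
```

    Production event-driven codes replace the all-pairs search with neighbour lists and a priority queue of future events; the choice of event queue is central to the package's scaling behaviour.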

  10. Discrete Element Method and its application to materials failure problem on the example of Brazilian Test

    NASA Astrophysics Data System (ADS)

    Klejment, Piotr; Kosmala, Alicja; Foltyn, Natalia; Dębski, Wojciech

    2017-04-01

The earthquake focus is the point where a rock under external stress starts to fracture. Understanding earthquake nucleation and earthquake dynamics thus requires understanding the fracturing of brittle materials. This, however, is a continuing problem and an enduring challenge for geoscience. In spite of significant progress, we still do not fully understand the failure of rock materials under the extreme stress concentrations found in natural conditions. One reason for this situation is that the available information about natural or induced seismic events is still not sufficient for a precise description of the physical processes in seismic foci. One possibility for improving this situation is to use numerical simulations, a powerful tool of contemporary physics. For this reason we used an advanced implementation of the Discrete Element Method (DEM). DEM's main task is to calculate the physical properties of materials represented as an assembly of a great number of particles interacting with each other. We analyze the possibility of using DEM to describe materials during the so-called Brazilian Test, a testing method used to obtain the tensile strength of brittle material. One of the primary reasons for conducting such simulations is to measure macroscopic parameters of the rock sample. We report our efforts to describe the fracturing process during the Brazilian Test from the microscopic point of view and give an insight into the physical processes preceding material failure.

  11. Ensemble-type numerical uncertainty information from single model integrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rauser, Florian, E-mail: florian.rauser@mpimet.mpg.de; Marotzke, Jochem; Korn, Peter

    2015-07-01

We suggest an algorithm that quantifies the discretization error of time-dependent physical quantities of interest (goals) for numerical models of geophysical fluid dynamics. The goal discretization error is estimated using a sum of weighted local discretization errors. The key feature of our algorithm is that these local discretization errors are interpreted as realizations of a random process. The random process is determined by the model and the flow state. From a class of local error random processes we select a suitable specific random process by integrating the model over a short time interval at different resolutions. The weights of the influences of the local discretization errors on the goal are modeled as goal sensitivities, which are calculated via automatic differentiation. The integration of the weighted realizations of local error random processes yields a posterior ensemble of goal approximations from a single run of the numerical model. From the posterior ensemble we derive the uncertainty information of the goal discretization error. This algorithm bypasses the requirement of detailed knowledge about the model's discretization to generate numerical error estimates. The algorithm is evaluated for the spherical shallow-water equations. For two standard test cases we successfully estimate the error of regional potential energy, track its evolution, and compare it to standard ensemble techniques. The posterior ensemble shares linear-error-growth properties with ensembles of multiple model integrations when comparably perturbed. The posterior ensemble numerical error estimates are comparable in size to those of a stochastic physics ensemble.
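The idea of probing local discretization error by integrating the same state over a short interval at different resolutions can be sketched as follows (an illustrative Richardson-style probe, not the authors' algorithm; `step_fn` and the forward-Euler example are assumptions for the sketch):

```python
def local_error_estimate(step_fn, state, dt, t_span):
    """Crude discretization-error probe: integrate the same initial state
    with step sizes dt and dt/2 over a short interval and take the
    difference as one sample of the local error."""
    def integrate(h):
        s = state
        for _ in range(round(t_span / h)):  # fixed number of steps of size h
            s = step_fn(s, h)
        return s
    return abs(integrate(dt) - integrate(dt / 2))

# Forward Euler on ds/dt = -s: the probe shrinks with dt, like the
# scheme's own truncation error.
euler = lambda s, h: s * (1 - h)
err_sample = local_error_estimate(euler, 1.0, 0.1, 0.4)
```

In the paper such samples are treated as realizations of a random process and weighted by goal sensitivities; this fragment only shows where a single local error sample could come from.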

  12. How Does the Sparse Memory “Engram” Neurons Encode the Memory of a Spatial–Temporal Event?

    PubMed Central

    Guan, Ji-Song; Jiang, Jun; Xie, Hong; Liu, Kai-Yuan

    2016-01-01

Episodic memory in the human brain is not a fixed 2-D picture but a highly dynamic movie series, integrating information in both the temporal and the spatial domains. Recent studies in neuroscience reveal that memory storage and recall are closely related to the activities of discrete memory engram (trace) neurons within the dentate gyrus region of the hippocampus and layer 2/3 of the neocortex. More strikingly, optogenetic reactivation of those memory trace neurons is able to trigger the recall of naturally encoded memory. It is still unknown how the discrete memory traces encode and reactivate the memory. Considering that a particular memory normally represents a natural event, which consists of information in both the temporal and spatial domains, it is unknown how the discrete trace neurons could reconstitute such enriched information in the brain. Furthermore, as the optogenetically induced recall of memory did not depend on the firing pattern of the memory traces, it is most likely the spatial activation pattern, not the temporal activation pattern, of the discrete memory trace neurons that encodes the memory in the brain. How does the neural circuit convert activities in the spatial domain into the temporal domain to reconstitute the memory of a natural event? By reviewing the literature, we present here how the memory engram (trace) neurons are selected and consolidated in the brain. We then discuss the main challenges to the memory trace theory. Finally, we provide a plausible model of the memory trace cell network underlying the conversion of neural activities between the spatial and temporal domains, and discuss how the activation of sparse memory trace neurons might trigger the replay of neural activities in specific temporal patterns. PMID:27601979

  14. Evaluation of NASA's end-to-end data systems using DSDS+

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Davenport, William; Message, Philip

    1994-01-01

The Data Systems Dynamic Simulator (DSDS+) is a software tool being developed by the authors to evaluate candidate architectures for NASA's end-to-end data systems. Via modeling and simulation, we are able to quickly predict the performance characteristics of each architecture, to evaluate 'what-if' scenarios, and to perform sensitivity analyses. As such, we are using modeling and simulation to help NASA select the optimal system configuration, and to quantify the performance characteristics of this system prior to its delivery. This paper is divided into the following six sections: (1) The role of modeling and simulation in the systems engineering process. In this section, we briefly describe the different types of results obtained by modeling each phase of the systems engineering life cycle, from concept definition through operations and maintenance; (2) Recent applications of DSDS+. In this section, we describe ongoing applications of DSDS+ in support of the Earth Observing System (EOS), and we present some of the simulation results generated for candidate system designs. So far, we have modeled individual EOS subsystems (e.g. the Solid State Recorders used onboard the spacecraft), and we have also developed an integrated model of the EOS end-to-end data processing and data communications systems (from the payloads onboard to the principal investigator facilities on the ground); (3) Overview of DSDS+. In this section we define what a discrete-event model is, and how it works. The discussion is presented relative to the DSDS+ simulation tool that we have developed, including its run-time optimization algorithms, which enable DSDS+ to execute substantially faster than comparable discrete-event simulation tools; (4) Summary. In this section, we summarize our findings and 'lessons learned' during the development and application of DSDS+ to model NASA's data systems; (5) Further Information; and (6) Acknowledgements.
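The core mechanism behind any discrete-event model of the kind section (3) describes, a clock that jumps between scheduled events rather than ticking at fixed intervals, can be sketched in a few lines (illustrative Python, unrelated to DSDS+'s actual implementation; the single-server workload is an invented example):

```python
import heapq

class DES:
    """Minimal discrete-event engine: the simulation clock jumps from one
    scheduled event to the next instead of ticking at fixed intervals."""
    def __init__(self):
        self.now, self._heap, self._seq = 0.0, [], 0
    def schedule(self, delay, action):
        self._seq += 1               # tie-breaker keeps heap entries orderable
        heapq.heappush(self._heap, (self.now + delay, self._seq, action))
    def run(self):
        while self._heap:
            self.now, _, action = heapq.heappop(self._heap)
            action()

# Toy workload: a single server with 4 s service time, arrivals every 3 s.
sim, backlog, busy, done = DES(), [], [False], []

def finish():
    done.append(sim.now)             # record the departure time
    if backlog:
        backlog.pop(0)
        sim.schedule(4.0, finish)
    else:
        busy[0] = False

def arrive(job):
    def action():
        if busy[0]:
            backlog.append(job)      # server busy: job waits in queue
        else:
            busy[0] = True
            sim.schedule(4.0, finish)
    return action

for job in range(3):
    sim.schedule(3.0 * job, arrive(job))
sim.run()                            # done == [4.0, 8.0, 12.0]
```

Because nothing happens between events, simulated time is decoupled from wall-clock time, which is what makes performance prediction of large data systems tractable.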

  15. Modelling machine ensembles with discrete event dynamical system theory

    NASA Technical Reports Server (NTRS)

    Hunter, Dan

    1990-01-01

Discrete Event Dynamical System (DEDS) theory can be utilized as a control strategy for future complex machine ensembles that will be required for in-space construction. The control strategy involves orchestrating a set of interactive submachines to perform a set of tasks for a given set of constraints such as minimum time, minimum energy, or maximum machine utilization. Machine ensembles can be hierarchically modeled as a global model that combines the operations of the individual submachines. These submachines are represented in the global model as local models. Local models, from the perspective of DEDS theory, are described by the following: a set of system and transition states, an event alphabet that portrays actions that take a submachine from one state to another, an initial system state, a partial function that maps the current state and event alphabet to the next state, and the time required for the event to occur. Each submachine in the machine ensemble is represented by a unique local model. The global model combines the local models such that the local models can operate in parallel under the additional logistic and physical constraints due to submachine interactions. The global model is constructed from the states, events, event functions, and timing requirements of the local models. Supervisory control can be implemented in the global model by various methods such as task scheduling (open-loop control) or implementing a feedback DEDS controller (closed-loop control).
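The local-model ingredients listed here (states, event alphabet, partial transition function, initial state, event durations) map directly onto a small data structure. The sketch below (hypothetical Python; the `arm`/`cart` submachines and their events are invented for illustration) also shows a simple synchronous composition in which shared events advance both local models at once:

```python
# A local model: an initial state, a partial transition map over
# (state, event) pairs, and a duration for each event.
def make_local(init, delta, durations):
    return {"state": init, "delta": delta, "dur": durations}

def step(model, event):
    """Apply one event if it is defined in the current state; return its duration."""
    key = (model["state"], event)
    if key not in model["delta"]:
        raise ValueError(f"event {event!r} not enabled in state {model['state']!r}")
    model["state"] = model["delta"][key]
    return model["dur"][event]

# Two submachines sharing the event 'handoff': the global model advances
# both on shared events and one on private events (synchronous composition).
arm = make_local("idle", {("idle", "grasp"): "holding",
                          ("holding", "handoff"): "idle"},
                 {"grasp": 2.0, "handoff": 1.0})
cart = make_local("empty", {("empty", "handoff"): "loaded",
                            ("loaded", "unload"): "empty"},
                  {"handoff": 1.0, "unload": 3.0})

def global_step(event):
    # A submachine participates if the event is in its alphabet.
    parts = [m for m in (arm, cart) if any(e == event for (_, e) in m["delta"])]
    return max(step(m, event) for m in parts)   # shared events synchronise

elapsed = sum(global_step(e) for e in ["grasp", "handoff", "unload"])
```

A supervisory controller would sit on top of `global_step`, disabling controllable events to keep the composed behavior inside a prescribed language.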

  16. Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models

    PubMed Central

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.

    2015-01-01

Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405

  17. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    NASA Astrophysics Data System (ADS)

    Deng, Xinyi

    2016-08-01

A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such a physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn about the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables, and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients, with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decisions in real time (for example, whether or not to stimulate the neurons) based on various sources of information present in population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, which is a step towards closed-loop therapies for psychiatric diseases using real-time neural stimulation. These methods are suitable for real-time implementation in content-based feedback experiments.
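The point-process framework the thesis builds on rests on a conditional intensity function: in discrete time bins of width Δ, the log-likelihood of a binned spike train y_k under intensity λ_k is Σ_k [y_k log(λ_k Δ) − λ_k Δ]. A minimal sketch (illustrative Python; the baseline rate, gain, and leaky-count history term are assumptions, not the thesis's models):

```python
import math

def point_process_loglik(spikes, lam, dt):
    """Discrete-time point-process log-likelihood of a binned spike train:
    sum_k [ y_k * log(lam_k * dt) - lam_k * dt ],
    where y_k is the spike count and lam_k the conditional intensity."""
    return sum(y * math.log(l * dt) - l * dt for y, l in zip(spikes, lam))

def history_intensity(spikes, base=10.0, gain=0.5):
    """History-dependent intensity: a baseline rate modulated by a leaky
    count of recent spikes (baseline, gain, and decay are illustrative)."""
    lam, recent = [], 0.0
    for y in spikes:
        lam.append(base * math.exp(gain * recent))
        recent = 0.9 * recent + y   # decaying memory of past spiking
    return lam
```

Fitting such a model amounts to maximizing this log-likelihood over the intensity's parameters; state-space versions let λ_k depend on latent states as well as spiking history.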

  18. FPGA-Based Front-End Electronics for Positron Emission Tomography

    PubMed Central

    Haselman, Michael; DeWitt, Don; McDougald, Wendy; Lewellen, Thomas K.; Miyaoka, Robert; Hauck, Scott

    2010-01-01

Modern Field Programmable Gate Arrays (FPGAs) are capable of performing complex discrete signal processing algorithms with clock rates above 100 MHz. This, combined with FPGAs' low cost, ease of use, and selected dedicated hardware, makes them an ideal technology for a data acquisition system for positron emission tomography (PET) scanners. Our laboratory is producing a high-resolution, small-animal PET scanner that utilizes FPGAs as the core of the front-end electronics. For this next generation scanner, functions that are typically performed in dedicated circuits, or offline, are being migrated to the FPGA. This will not only simplify the electronics, but the features of modern FPGAs can be utilized to add significant signal processing power to produce higher resolution images. In this paper two such processes, sub-clock rate pulse timing and event localization, will be discussed in detail. We show that timing performed in the FPGA can achieve a resolution that is suitable for small-animal scanners, and will outperform the analog version given a low enough sampling period for the ADC. We will also show that the position of events in the scanner can be determined in real time using a statistical positioning algorithm. PMID:21961085

  19. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  20. Hemolytic potential of hydrodynamic cavitation.

    PubMed

    Chambers, S D; Bartlett, R H; Ceccio, S L

    2000-08-01

The purpose of this study was to determine the hemolytic potentials of discrete bubble cavitation and attached cavitation. To generate controlled cavitation events, a venturi-geometry hydrodynamic device, called a Cavitation Susceptibility Meter (CSM), was constructed. A comparison between the hemolytic potential of discrete bubble cavitation and attached cavitation was investigated with a single-pass flow apparatus and a recirculating flow apparatus, both utilizing the CSM. An analytical model, based on spherical bubble dynamics, was developed for predicting the hemolysis caused by discrete bubble cavitation. Experimentally, discrete bubble cavitation did not correlate with a measurable increase in plasma-free hemoglobin (PFHb), as predicted by the analytical model. However, attached cavitation did result in significant PFHb generation. The rate of PFHb generation scaled inversely with the cavitation number at a constant flow rate, suggesting that the size of the attached cavity was the dominant hemolytic factor.

  1. Fracture process zone in granite

    USGS Publications Warehouse

    Zang, A.; Wagner, F.C.; Stanchits, S.; Janssen, C.; Dresen, G.

    2000-01-01

In uniaxial compression tests performed on Aue granite cores (diameter 50 mm, length 100 mm), a steel loading plate was used to induce the formation of a discrete shear fracture. A zone of distributed microcracks surrounds the tip of the propagating fracture. This process zone is imaged by locating acoustic emission events using 12 piezoceramic sensors attached to the samples. Propagation velocity of the process zone is varied by using the rate of acoustic emissions to control the applied axial force. The resulting velocities range from 2 mm/s in displacement-controlled tests to 2 μm/s in tests controlled by acoustic emission rate. Wave velocities and amplitudes are monitored during fault formation. P waves transmitted through the approaching process zone show a drop in amplitude of 26 dB, and ultrasonic velocities are reduced by 10%. The width of the process zone is ~9 times the grain diameter inferred from acoustic data but is only 2 times the grain size from optical crack inspection. The process zone of fast propagating fractures is wider than for slow ones. The density of microcracks and acoustic emissions increases approaching the main fracture. Shear displacement scales linearly with fracture length. Fault plane solutions from acoustic events show similar orientation of nodal planes on both sides of the shear fracture. The ratio of the process zone width to the fault length in Aue granite ranges from 0.01 to 0.1 inferred from crack data and acoustic emissions, respectively. The fracture surface energy is estimated from microstructure analysis to be ~2 J. A lower bound estimate for the energy dissipated by acoustic events is 0.1 J. Copyright 2000 by the American Geophysical Union.

  2. Comparing performance in discrete and continuous comparison tasks.

    PubMed

    Leibovich, Tali; Henik, Avishai

    2014-05-01

    The approximate number system (ANS) theory suggests that all magnitudes, discrete (i.e., number of items) or continuous (i.e., size, density, etc.), are processed by a shared system and comply with Weber's law. The current study reexamined this notion by comparing performance in discrete (comparing numerosities of dot arrays) and continuous (comparisons of area of squares) tasks. We found that: (a) threshold of discrimination was higher for continuous than for discrete comparisons; (b) while performance in the discrete task complied with Weber's law, performance in the continuous task violated it; and (c) performance in the discrete task was influenced by continuous properties (e.g., dot density, dot cumulative area) of the dot array that were not predictive of numerosities or task relevant. Therefore, we propose that the magnitude processing system (MPS) is actually divided into separate (yet interactive) systems for discrete and continuous magnitude processing. Further subdivisions are discussed. We argue that cooperation between these systems results in a holistic comparison of magnitudes, one that takes into account continuous properties in addition to numerosities. Considering the MPS as two systems opens the door to new and important questions that shed light on both normal and impaired development of the numerical system.
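Weber's law, which the ANS account invokes, states that the just-noticeable difference grows in proportion to magnitude, so discriminability depends on the ratio of two magnitudes rather than their absolute difference. A toy illustration (the Weber fraction 0.15 is an arbitrary choice, not a value from this study):

```python
def discriminable(a, b, w=0.15):
    """Weber's law: two magnitudes are discriminable when their relative
    difference exceeds a constant Weber fraction w (0.15 is illustrative)."""
    return abs(a - b) / min(a, b) > w

# 10 vs 12 items differ by 20% and are discriminable; 100 vs 110 differ
# by only 10% and, under the same Weber fraction, are not.
```

The study's finding that continuous comparisons violate this law while discrete ones comply with it is one argument for separate processing systems.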

  3. Donders revisited: Discrete or continuous temporal processing underlying reaction time distributions?

    PubMed

    Bao, Yan; Yang, Taoxi; Lin, Xiaoxiong; Pöppel, Ernst

    2016-09-01

    Differences of reaction times to specific stimulus configurations are used as indicators of cognitive processing stages. In this classical experimental paradigm, continuous temporal processing is implicitly assumed. Multimodal response distributions indicate, however, discrete time sampling, which is often masked by experimental conditions. Differences in reaction times reflect discrete temporal mechanisms that are pre-semantically implemented and suggested to be based on entrained neural oscillations. © 2016 The Institute of Psychology, Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  4. Effects of sequential and discrete rapid naming on reading in Japanese children with reading difficulty.

    PubMed

    Wakamiya, Eiji; Okumura, Tomohito; Nakanishi, Makoto; Takeshita, Takashi; Mizuta, Mekumi; Kurimoto, Naoko; Tamai, Hiroshi

    2011-06-01

To clarify whether rapid naming ability itself is a main underpinning factor of rapid automatized naming tests (RAN) and how deep an influence the discrete decoding process has on reading, we performed discrete naming tasks and discrete hiragana reading tasks as well as sequential naming tasks and sequential hiragana reading tasks with 38 Japanese schoolchildren with reading difficulty. There were high correlations between both discrete and sequential hiragana reading and sentence reading, suggesting that some mechanism which automatizes hiragana reading makes sentence reading fluent. In object and color tasks, there were moderate correlations between sentence reading and sequential naming, and between sequential naming and discrete naming. But no correlation was found between reading tasks and discrete naming tasks. The influence of the rapid naming ability of objects and colors upon reading seemed relatively small, and multi-item processing may work in relation to these. In contrast, in the digit naming task there was moderate correlation between sentence reading and discrete naming, while no correlation was seen between sequential naming and discrete naming. There was moderate correlation between reading tasks and sequential digit naming tasks. Digit rapid naming ability has a more direct effect on reading, while its effect on RAN is relatively limited. The degree to which rapid naming ability influences RAN and reading seems to vary according to the kind of stimuli used. An assumption about the components of RAN which influence reading is discussed in the context of both sequential processing and discrete naming speed. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  5. A methodology for risk analysis based on hybrid Bayesian networks: application to the regasification system of liquefied natural gas onboard a floating storage and regasification unit.

    PubMed

    Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López

    2014-12-01

    This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.

  6. A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.

    PubMed

    Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L

    2016-10-01

    Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with that produced using surrogate methods. Unlike continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying fine-scale dynamics of the observed signal, while maintaining macro structural characteristics. Comparison of entropy estimates indicated observed signals had greater regularity than surrogates and were not only the result of stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
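The surrogate logic, comparing an observed statistic against data with the same amplitude distribution but destroyed temporal structure, can be illustrated with a plain shuffle surrogate (a much simpler scheme than the article's method, shown here only to convey the idea; the sine signal and lag-1 statistic are invented examples):

```python
import math
import random
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation, a simple measure of temporal structure."""
    mu = statistics.fmean(x)
    num = sum((a - mu) * (b - mu) for a, b in zip(x, x[1:]))
    den = sum((a - mu) ** 2 for a in x)
    return num / den

def shuffle_surrogates(x, n, seed=0):
    """Surrogates that keep the amplitude distribution of x but destroy
    its temporal ordering (a plain shuffle, unlike the article's method,
    which preserves macro structure of discrete movements)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        s = list(x)
        rng.shuffle(s)
        out.append(s)
    return out

# A strongly ordered signal: its statistic should fall far outside the
# distribution of the same statistic over its surrogates, indicating
# determinism rather than purely stochastic structure.
signal = [math.sin(0.3 * k) for k in range(200)]
obs = lag1_autocorr(signal)
sur_stats = [lag1_autocorr(s) for s in shuffle_surrogates(signal, 19)]
```

With 19 surrogates, an observed statistic outside their range corresponds to a two-sided rank test at roughly the 10% level; the article's contribution is a surrogate generator appropriate for discrete movement data, where a plain shuffle would destroy too much.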

  7. A novel approach to leveraging electronic health record data to enhance pediatric surgical quality improvement bundle process compliance.

    PubMed

    Fisher, Jason C; Godfried, David H; Lighter-Fisher, Jennifer; Pratko, Joseph; Sheldon, Mary Ellen; Diago, Thelma; Kuenzler, Keith A; Tomita, Sandra S; Ginsburg, Howard B

    2016-06-01

Quality improvement (QI) bundles have been widely adopted to reduce surgical site infections (SSI). Improvement science suggests that when organizations achieve high reliability in QI processes, outcomes dramatically improve. However, measuring QI process compliance is poorly supported by electronic health record (EHR) systems. We developed a custom EHR tool to facilitate capture of process data for SSI prevention with the aim of increasing bundle compliance and reducing adverse events. Ten SSI prevention bundle processes were linked to EHR data elements that were then aggregated into a snapshot display superimposed on weekly case-log reports. The data aggregation and user interface facilitated efficient review of all SSI bundle elements, providing an exact bundle compliance rate without random sampling or chart review. Nine months after implementation of our custom EHR tool, we observed centerline shifts in median SSI bundle compliance (46% to 72%). Additionally, as predicted by high reliability principles, we began to see a trend toward improvement in SSI rates (1.68 to 0.87 per 100 operations), but a discrete centerline shift was not detected. Simple informatics solutions can facilitate extraction of QI process data from the EHR without relying on adjunctive systems. Analyses of these data may drive reductions in adverse events. Pediatric surgical departments should consider leveraging the EHR to enhance bundle compliance as they implement QI strategies. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. The impact of interoperability of electronic health records on ambulatory physician practices: a discrete-event simulation study.

    PubMed

    Zhou, Yuan; Ancker, Jessica S; Upadhye, Mandar; McGeorge, Nicolette M; Guarrera, Theresa K; Hegde, Sudeep; Crane, Peter W; Fairbanks, Rollin J; Bisantz, Ann M; Kaushal, Rainu; Lin, Li

    2013-01-01

The effect of health information technology (HIT) on efficiency and workload among clinical and nonclinical staff has been debated, with conflicting evidence about whether electronic health records (EHRs) increase or decrease effort. No work to date, however, has examined the effect of interoperability quantitatively using discrete-event simulation techniques. To estimate the impact of EHR systems with various levels of interoperability on day-to-day tasks and operations of ambulatory physician offices. Interviews and observations were used to collect workflow data from 12 adult primary and specialty practices. A discrete event simulation model was constructed to represent patient flows and clinical and administrative tasks of physicians and staff members. High levels of EHR interoperability were associated with reduced time spent by providers on four tasks: preparing lab reports, requesting lab orders, prescribing medications, and writing referrals. The implementation of an EHR was associated with less time spent by administrators but more time spent by physicians, compared with time spent at paper-based practices. In addition, the presence of EHRs and of interoperability did not significantly affect the time usage of registered nurses or the total visit time and waiting time of patients. This paper suggests that the impact of using HIT on clinical and nonclinical staff work efficiency varies; overall, however, it appears to improve time efficiency more for administrators than for physicians and nurses.

  9. Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.

    PubMed

    Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime

    2017-10-01

    Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, that enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess if a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.

  10. Species survival emerge from rare events of individual migration

    NASA Astrophysics Data System (ADS)

    Zelnik, Yuval R.; Solomon, Sorin; Yaari, Gur

    2015-01-01

    Ecosystems greatly vary in their species composition and interactions, yet they all show remarkable resilience to external influences. Recent experiments have highlighted the significant effects of spatial structure and connectivity on the extinction and survival of species. It has also been emphasized lately that in order to study extinction dynamics reliably, it is essential to incorporate stochasticity, and in particular the discrete nature of populations, into the model. Accordingly, we applied a bottom-up modeling approach that includes both spatial features and stochastic interactions to study survival mechanisms of species. Using the simplest spatial extension of the Lotka-Volterra predator-prey model with competition, subject to demographic and environmental noise, we were able to systematically study emergent properties of this rich system. By scanning the relevant parameter space, we show that both survival and extinction processes often result from a combination of habitat fragmentation and individual rare events of recolonization.

  11. Species survival emerge from rare events of individual migration.

    PubMed

    Zelnik, Yuval R; Solomon, Sorin; Yaari, Gur

    2015-01-19

    Ecosystems greatly vary in their species composition and interactions, yet they all show remarkable resilience to external influences. Recent experiments have highlighted the significant effects of spatial structure and connectivity on the extinction and survival of species. It has also been emphasized lately that in order to study extinction dynamics reliably, it is essential to incorporate stochasticity, and in particular the discrete nature of populations, into the model. Accordingly, we applied a bottom-up modeling approach that includes both spatial features and stochastic interactions to study survival mechanisms of species. Using the simplest spatial extension of the Lotka-Volterra predator-prey model with competition, subject to demographic and environmental noise, we were able to systematically study emergent properties of this rich system. By scanning the relevant parameter space, we show that both survival and extinction processes often result from a combination of habitat fragmentation and individual rare events of recolonization.

  12. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  13. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    NASA Technical Reports Server (NTRS)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large data bases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, novel algorithms, and also discuss results on real-world data sets. Our algorithm uncovers operationally significant events in high dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.

  14. Effects of Discrete Emotions on Young Children's Ability to Discern Fantasy and Reality

    ERIC Educational Resources Information Center

    Carrick, Nathalie; Quas, Jodi A.

    2006-01-01

    This study examined 3- to 5-year-olds' (N = 128; 54% girls) ability to discriminate emotional fantasy and reality. Children viewed images depicting fantastic or real events that elicited several emotions, reported whether each event could occur, and rated their emotional reaction to the image. Children were also administered the Play Behavior…

  15. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary Kyle; Denman, Matthew R.

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  16. Generation Algorithm of Discrete Line in Multi-Dimensional Grids

    NASA Astrophysics Data System (ADS)

    Du, L.; Ben, J.; Li, Y.; Wang, R.

    2017-09-01

    A Discrete Global Grid System (DGGS) is a digital multi-resolution earth reference model whose structure is well suited to the integration and mining of geospatial big data. Vector data are an important type of spatial data, but they can be processed and analyzed in a grid system only after discretization. Starting from a set of constraint conditions, this paper puts forward a strict definition of discrete lines and builds a mathematical model of them using a base-vector combination method. The problem of mesh discrete lines in n-dimensional grids is transformed, via a hyperplane, into the problem of an optimal deviated path in n-1 dimensions, thereby reducing the dimensionality of the discrete-line representation. On this basis, we design a simple and efficient dimension-reduction algorithm for generating the discrete lines. The experimental results show that our algorithm applies not only to the two-dimensional rectangular grid but also to the two-dimensional hexagonal grid and the three-dimensional cubic grid. Moreover, when applied to the two-dimensional rectangular grid, it produces a discrete line that is closer to the corresponding line in Euclidean space.
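    For background, the best-known instance of rasterizing a line onto a grid is the classic Bresenham algorithm for the two-dimensional rectangular case, sketched below. The paper's base-vector method, which generalizes to hexagonal and higher-dimensional grids, is not reproduced here:

```python
def discrete_line(x0, y0, x1, y1):
    """Bresenham-style rasterization of a segment on a square grid.

    Returns the grid cells visited from (x0, y0) to (x1, y1) inclusive.
    """
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 >= x0 else -1           # step direction along x
    sy = 1 if y1 >= y0 else -1           # step direction along y
    err = dx - dy                        # accumulated deviation from the ideal line
    x, y = x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:                     # deviation favors a horizontal step
            err -= dy
            x += sx
        if e2 < dx:                      # deviation favors a vertical step
            err += dx
            y += sy
    return cells
```

    The integer error term plays the role of the "deviation" that the paper's optimal deviated path minimizes in the general n-dimensional setting.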

  17. Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.

    PubMed

    Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa

    2011-06-01

    We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicate that waiting time measures were significantly improved and overall patient time in the clinic was reduced.

  18. An application of different dioids in public key cryptography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com

    2014-11-18

    Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems, such as the design and analysis of bus and railway timetables, the scheduling of high-throughput industrial processes, the solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics, such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.
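    The dioid operations referred to can be illustrated with the max-plus semiring, where "addition" is max and "multiplication" is +; matrix products over it are generally non-commutative, a property such key-exchange constructions typically exploit. This sketch is generic algebra, not the paper's protocol:

```python
NEG_INF = float("-inf")  # additive identity of the max-plus dioid

def maxplus_matmul(A, B):
    """Matrix 'product' in the (max, +) dioid: sum becomes max, product becomes +."""
    n, m, p = len(A), len(B), len(B[0])
    return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]
```

    In a timetable or scheduling reading, entry (i, j) of such a product is the earliest completion time over all routes from i to j.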

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tagestad, Jerry; Brooks, Matthew; Cullinan, Valerie

    Mojave Desert ecosystem processes are dependent upon the amount and seasonality of precipitation. Multi-decadal periods of drought or above-average rainfall affect landscape vegetation condition, biomass and susceptibility to fire. The seasonality of precipitation events can also affect the likelihood of lightning, a key ignition source for fires. To develop an understanding of precipitation regimes and fire patterns we used monthly average precipitation data and GIS data representing burned areas from 1971-2010. We applied a K-means cluster analysis to the monthly precipitation data, identifying three distinct precipitation seasons: winter (October-March), spring (April-June) and summer (July-September), and four discrete precipitation regimes within the Mojave ecoregion.

  20. Safety analysis of discrete event systems using a simplified Petri net controller.

    PubMed

    Zareiee, Meysam; Dideban, Abbas; Asghar Orouji, Ali

    2014-01-01

    This paper deals with the problem of forbidden states in discrete event systems based on Petri net models. A method is presented to prevent the system from entering these states by constructing a small number of generalized mutual exclusion constraints. This goal is achieved by solving three types of Integer Linear Programming problems, designed to verify the constraints: some relate to verifying authorized states, and the others to avoiding forbidden states. The obtained constraints can be enforced on the system using a small number of control places. Moreover, the number of arcs related to these places is small, and the controller obtained after connecting them is maximally permissive. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
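    The token-game semantics such supervisory controllers build on can be sketched directly. The place and transition names below are hypothetical, not taken from the paper:

```python
def enabled(marking, pre):
    """A transition is enabled iff every input place holds enough tokens."""
    return all(marking.get(p, 0) >= w for p, w in pre.items())

def fire(marking, pre, post):
    """Fire an enabled transition: consume pre-arc tokens, add post-arc tokens."""
    assert enabled(marking, pre)
    m = dict(marking)
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] = m.get(p, 0) + w
    return m

# A control place 'c' enforcing the mutual exclusion m(p1) + m(p2) <= 1:
# transitions entering p1 or p2 must also consume the single token in c.
m0 = {"p0": 2, "p1": 0, "p2": 0, "c": 1}
m1 = fire(m0, pre={"p0": 1, "c": 1}, post={"p1": 1})   # enter p1
blocked = not enabled(m1, {"p0": 1, "c": 1})           # entering p2 is now disabled
```

    A generalized mutual exclusion constraint is enforced in exactly this style, by one control place whose arcs debit and credit it as the constrained places gain and lose tokens; the paper's contribution is keeping the number of such places and arcs small while remaining maximally permissive.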

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barr, G.E.; Borns, D.J.; Fridrich, C.

    A comprehensive collection of scenarios is presented that connect initiating tectonic events with radionuclide releases through logical and physically possible combinations or sequences of features, events and processes. The initiating tectonic events include both discrete faulting and distributed rock deformation developed through the repository and adjacent to it, as well as earthquake-induced ground motion and changes in tectonic stress at the site. The effects of these tectonic events include impacts on the engineered-barrier system, such as container rupture and failure of repository tunnels. These effects also include a wide range of hydrologic effects, such as changes in pathways and flow rates in the unsaturated and saturated zones, changes in the water-table configuration, and the development of perched-water systems. These scenarios are intended to guide performance-assessment analyses and to assist principal investigators in determining how essential field, laboratory, and calculational studies are used. This suite of scenarios will help ensure that all important aspects of system disturbance related to a tectonic scenario are captured in numerical analyses. It also provides a record of all options considered by project analysts, supplying documentation required for licensing. The final portion of this report discusses issues remaining to be addressed with respect to tectonic activity. 105 refs.

  2. A networks-based discrete dynamic systems approach to volcanic seismicity

    NASA Astrophysics Data System (ADS)

    Suteanu, Mirela

    2013-04-01

    The detection and relevant description of pattern change concerning earthquake events is an important, but challenging task. In this paper, earthquake events related to volcanic activity are considered manifestations of a dynamic system evolving over time. The system dynamics is seen as a succession of events with point-like appearance both in time and in space. Each event is characterized by a position in three-dimensional space, a moment of occurrence, and an event size (magnitude). A weighted directed network is constructed to capture the effects of earthquakes on subsequent events. Each seismic event represents a node. Relations among events represent edges. Edge directions are given by the temporal succession of the events. Edges are also characterized by weights reflecting the strengths of the relation between the nodes. Weights are calculated as a function of (i) the time interval separating the two events, (ii) the spatial distance between the events, (iii) the magnitude of the earliest event among the two. Different ways of addressing weight components are explored, and their implications for the properties of the produced networks are analyzed. The resulting networks are then characterized in terms of degree- and weight distributions. Subsequently, the distribution of system transitions is determined for all the edges connecting related events in the network. Two- and three-dimensional diagrams are constructed to reflect transition distributions for each set of events. Networks are thus generated for successive temporal windows of different size, and the evolution of (a) network properties and (b) system transition distributions are followed over time and compared to the timeline of documented geologic processes. Applications concerning volcanic seismicity on the Big Island of Hawaii show that this approach is capable of revealing novel aspects of change occurring in the volcanic system on different scales in time and in space.
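    A minimal construction of such a weighted directed event network might look like the following; the decay form of the weight is one illustrative choice among the several weighting schemes the paper explores, and the constants c_t and c_d are hypothetical:

```python
import math

def build_event_network(events, c_t=1.0, c_d=1.0):
    """Build a weighted directed network from point-like seismic events.

    events: time-ordered list of (t, x, y, z, magnitude).
    Edges run from each earlier event to every later one; the weight grows
    with the magnitude of the earlier event and decays with the time gap
    and the spatial distance between the pair.
    """
    edges = {}
    for i, (ti, xi, yi, zi, mi) in enumerate(events):
        for j in range(i + 1, len(events)):
            tj, xj, yj, zj, _ = events[j]
            dt = tj - ti
            dist = math.dist((xi, yi, zi), (xj, yj, zj))
            edges[(i, j)] = mi / ((1.0 + c_t * dt) * (1.0 + c_d * dist))
    return edges
```

    Degree and weight distributions can then be read off the returned edge map, and windowed reconstruction over successive time intervals gives the temporal evolution the paper tracks.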

  3. Suboptimal distributed control and estimation: application to a four coupled tanks system

    NASA Astrophysics Data System (ADS)

    Orihuela, Luis; Millán, Pablo; Vivas, Carlos; Rubio, Francisco R.

    2016-06-01

    The paper proposes an innovative estimation and control scheme that enables the distributed monitoring and control of large-scale processes. The proposed approach considers a discrete linear time-invariant process controlled by a network of agents that may both collect information about the evolution of the plant and apply control actions to drive its behaviour. The problem makes full sense when local observability/controllability is not assumed and the communication between agents can be exploited to reach system-wide goals. Additionally, to reduce the agents' bandwidth requirements and power consumption, an event-based communication policy is studied. The design procedure guarantees system stability, allowing the designer to trade-off performance, control effort and communication requirements. The obtained controllers and observers are implemented in a fully distributed fashion. To illustrate the performance of the proposed technique, experimental results on a quadruple-tank process are provided.

  4. Attitudes to the right- and left: frontal ERP asymmetries associated with stimulus valence and processing goals.

    PubMed

    Cunningham, William A; Espinet, Stacey D; DeYoung, Colin G; Zelazo, Philip David

    2005-12-01

    We used dense-array event-related potentials (ERP) to examine the time course and neural bases of evaluative processing. Participants made good vs. bad (evaluative) and abstract vs. concrete (nonevaluative) judgments of socially relevant concepts (e.g., "murder," "welfare"), and then rated all concepts for goodness and badness. Results revealed a late positive potential (LPP) beginning at about 475 ms post-stimulus and maximal over anterior sites. The LPP was lateralized (higher amplitude and shorter latency) on the right for concepts later rated bad, and on the left for concepts later rated good. Moreover, the degree of lateralization for the amplitude but not the latency was larger when participants were making evaluative judgments than when they were making nonevaluative judgments. These data are consistent with a model in which discrete regions of prefrontal cortex (PFC) are specialized for the evaluative processing of positive and negative stimuli.

  5. Complex effusive events at Kilauea as documented by the GOES satellite and remote video cameras

    USGS Publications Warehouse

    Harris, A.J.L.; Thornber, C.R.

    1999-01-01

    GOES provides thermal data for all of the Hawaiian volcanoes once every 15 min. We show how volcanic radiance time series produced from this data stream can be used as a simple measure of effusive activity. Two types of radiance trends in these time series can be used to monitor effusive activity: (a) Gradual variations in radiance reveal steady flow-field extension and tube development. (b) Discrete spikes correlate with short bursts of activity, such as lava fountaining or lava-lake overflows. We are confident that any effusive event covering more than 10,000 m2 of ground in less than 60 min will be unambiguously detectable using this approach. We demonstrate this capability using GOES, video camera and ground-based observational data for the current eruption of Kilauea volcano (Hawai'i). A GOES radiance time series was constructed from 3987 images between 19 June and 12 August 1997. This time series displayed 24 radiance spikes elevated more than two standard deviations above the mean; 19 of these are correlated with video-recorded short-burst effusive events. Less ambiguous events are interpreted, assessed and related to specific volcanic events by simultaneous use of permanently recording video camera data and ground-observer reports. The GOES radiance time series are automatically processed on data reception and made available in near-real-time, so such time series can contribute to three main monitoring functions: (a) automatic alerting of major effusive events; (b) event confirmation and assessment; and (c) establishing effusive event chronology.
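    The spike criterion described for the GOES radiance time series, samples elevated more than two standard deviations above the series mean, can be sketched as:

```python
def radiance_spikes(series, k=2.0):
    """Indices of samples exceeding the series mean by more than k standard deviations."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5  # population std
    return [i for i, x in enumerate(series) if x > mean + k * std]
```

    An operational detector would work on a rolling or cloud-screened baseline rather than the global mean, but the thresholding logic is the same.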

  6. Is processing of symbols and words influenced by writing system? Evidence from Chinese, Korean, English, and Greek.

    PubMed

    Altani, Angeliki; Georgiou, George K; Deng, Ciping; Cho, Jeung-Ryeul; Katopodi, Katerina; Wei, Wei; Protopapas, Athanassios

    2017-12-01

    We examined cross-linguistic effects in the relationship between serial and discrete versions of digit naming and word reading. In total, 113 Mandarin-speaking Chinese children, 100 Korean children, 112 English-speaking Canadian children, and 108 Greek children in Grade 3 were administered tasks of serial and discrete naming of words and digits. Interrelations among tasks indicated that the link between rapid naming and reading is largely determined by the format of the tasks across orthographies. Multigroup path analyses with discrete and serial word reading as dependent variables revealed commonalities as well as significant differences between writing systems. The path coefficient from discrete digits to discrete words was greater for the more transparent orthographies, consistent with more efficient sight-word processing. The effect of discrete word reading on serial word reading was stronger in alphabetic languages, where there was also a suppressive effect of discrete digit naming. However, the effect of serial digit naming on serial word reading did not differ among the four language groups. This pattern of relationships challenges a universal account of reading fluency acquisition while upholding a universal role of rapid serial naming, further distinguishing between multi-element interword and intraword processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Modelling road accident blackspots data with the discrete generalized Pareto distribution.

    PubMed

    Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María

    2014-10-01

    This study shows how road traffic networks events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods of their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: Chi-square test and discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed. Copyright © 2014 Elsevier Ltd. All rights reserved.
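    One common way to obtain a discrete Pareto-type pmf of this kind is to difference the survival function of the continuous generalized Pareto at integer points. The sketch below makes that assumption; the paper's three-parameter model may include a location term and differ in detail:

```python
def dgpd_pmf(k, shape, scale):
    """Pmf at k = 0, 1, 2, ... of a discretized generalized Pareto.

    Defined as S(k) - S(k+1), where S is the survival function of the
    continuous GPD, S(x) = (1 + shape*x/scale)**(-1/shape), shape > 0.
    """
    def surv(x):
        return (1.0 + shape * x / scale) ** (-1.0 / shape)
    return surv(k) - surv(k + 1)
```

    With shape = scale = 1 this reduces to the discrete distribution with pmf 1/(k+1) - 1/(k+2), whose heavy tail is the kind of behavior crash-count data on blackspots can exhibit.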

  8. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior. Copyright © 2015 John Wiley & Sons, Ltd.
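    The key reduction the method rests on, that a discrete survival likelihood equals a binary-regression likelihood, is usually operationalized by expanding each subject into person-period records. This is a standard construction, sketched here with hypothetical inputs rather than the trial data:

```python
def person_period(subjects):
    """Expand discrete survival data into binary person-period rows.

    subjects: list of (observed_time, event_indicator) with time in 1..T.
    Each subject contributes one row per period at risk; the binary target
    is 1 only in the terminal period of a subject who experienced the event.
    Fitting any binary classifier or tree to these rows estimates the
    discrete hazard h(t) = P(T = t | T >= t).
    """
    rows = []
    for sid, (time, event) in enumerate(subjects):
        for t in range(1, time + 1):
            y = 1 if (event and t == time) else 0
            rows.append((sid, t, y))
    return rows
```

    Censored subjects simply contribute all-zero rows up to their last observed period, which is what makes the binary-outcome likelihood coincide with the discrete survival likelihood.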

  9. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE PAGES

    Hruschka, Daniel  J.; Branford, Simon; Smith, Eric  D.; ...

    2014-12-18

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  10. Detecting regular sound changes in linguistics as events of concerted evolution.

    PubMed

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Detecting regular sound changes in linguistics as events of concerted evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hruschka, Daniel  J.; Branford, Simon; Smith, Eric  D.

    Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.

  12. Observability of discretized partial differential equations

    NASA Technical Reports Server (NTRS)

    Cohn, Stephen E.; Dee, Dick P.

    1988-01-01

    It is shown that complete observability of the discrete model used to assimilate data from a linear partial differential equation (PDE) system is necessary and sufficient for asymptotic stability of the data assimilation process. The observability theory for discrete systems is reviewed and applied to obtain simple observability tests for discretized constant-coefficient PDEs. Examples are used to show how numerical dispersion can result in discrete dynamics with multiple eigenvalues, thereby detracting from observability.
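    For a linear discrete model x_{k+1} = A x_k with observations y_k = H x_k, the simple observability tests the abstract refers to reduce to the standard Kalman rank condition. A minimal sketch with hypothetical matrices (note that a dynamics matrix with repeated eigenvalues, as produced by numerical dispersion, can fail the test):

```python
import numpy as np

def is_observable(A, H):
    """Kalman rank test: the pair (A, H) is completely observable
    iff the observability matrix [H; HA; ...; HA^(n-1)] has rank n."""
    n = A.shape[0]
    blocks = [H @ np.linalg.matrix_power(A, k) for k in range(n)]
    O = np.vstack(blocks)
    return np.linalg.matrix_rank(O) == n

# Hypothetical 2-state discrete dynamics, observed at one point only.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
H = np.array([[1.0, 0.0]])
print(is_observable(A, H))   # True: the row HA = [0, 1] recovers state 2

# With repeated eigenvalues (here A = I), the same observation fails.
print(is_observable(np.eye(2), H))   # False
```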

  13. GXNOR-Net: Training deep neural networks with ternary weights and activations without full-precision memory under a unified discretization framework.

    PubMed

    Deng, Lei; Jiao, Peng; Pei, Jing; Wu, Zhenzhi; Li, Guoqi

    2018-04-01

    Although deep neural networks (DNNs) have become a revolutionary power in opening up the AI era, their notoriously large hardware overhead has challenged their applications. Recently, several binary and ternary networks, in which the costly multiply-accumulate operations can be replaced by accumulations or even binary logic operations, have made the on-chip training of DNNs quite promising. There is therefore a pressing need for an architecture that subsumes these networks under a unified framework achieving both higher performance and lower overhead. To this end, two fundamental issues must be addressed. The first is how to implement back propagation when neuronal activations are discrete. The second is how to remove the full-precision hidden weights in the training phase to break the memory/computation bottlenecks. To address the first issue, we present a multi-step neuronal activation discretization method and a derivative approximation technique that enable the implementation of the back propagation algorithm on discrete DNNs. For the second issue, we propose a discrete state transition (DST) methodology to constrain the weights in a discrete space without saving the hidden weights. In this way, we build a unified framework that subsumes the binary and ternary networks as special cases, and under which a heuristic algorithm is provided at https://github.com/AcrossV/Gated-XNOR. More particularly, we find that when both the weights and activations become ternary values, the DNNs can be reduced to sparse binary networks, termed gated XNOR networks (GXNOR-Nets), since only the event of non-zero weight and non-zero activation enables the control gate to start the XNOR logic operations in the original binary networks. This promises event-driven hardware design for efficient mobile intelligence. We achieve advanced performance compared with state-of-the-art algorithms. Furthermore, the computational sparsity and the number of states in the discrete space can be flexibly modified to suit various hardware platforms. Copyright © 2018 Elsevier Ltd. All rights reserved.
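    The flavor of the discretization described above can be sketched in a few lines. This is a simplified deterministic illustration, not the paper's DST rule (which is probabilistic); the threshold 0.5 and clip range are illustrative choices, not values from the paper.

```python
import numpy as np

def ternarize(w, delta=0.5):
    """Map weights into the discrete space {-1, 0, +1}."""
    return np.where(w > delta, 1.0, np.where(w < -delta, -1.0, 0.0))

def ste_grad(upstream, w, clip=1.0):
    """Derivative approximation (straight-through style): pass the upstream
    gradient through the non-differentiable quantizer unchanged, zeroing it
    where |w| exceeds the clip range."""
    return upstream * (np.abs(w) <= clip)

w = np.array([-1.3, -0.2, 0.4, 0.9])
print(ternarize(w))             # [-1.  0.  0.  1.]
print(ste_grad(np.ones(4), w))  # [0. 1. 1. 1.]
```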

  14. THYME: Toolkit for Hybrid Modeling of Electric Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nutaro, James Joseph; Perumalla, Kalyan

    2011-01-01

    THYME is an object-oriented library for building models of wide-area control and communications in electric power systems. This software is designed as a module to be used with existing open source simulators for discrete event systems in general and communication systems in particular. THYME consists of a typical model for simulating electro-mechanical transients (e.g., as are used in dynamic stability studies), data handling objects to work with CDF and PTI formatted power flow data, and sample models of discrete sensors and controllers.

  15. Testing for Independence between Evolutionary Processes.

    PubMed

    Behdenna, Abdelkader; Pothier, Joël; Abby, Sophie S; Lambert, Amaury; Achaz, Guillaume

    2016-09-01

    Evolutionary events co-occurring along phylogenetic trees usually point to complex adaptive phenomena, sometimes implicating epistasis. While a number of methods have been developed to account for co-occurrence of events on the same internal or external branch of an evolutionary tree, there is a need to account for the larger diversity of possible relative positions of events in a tree. Here we propose a method to quantify to what extent two or more evolutionary events are associated on a phylogenetic tree. The method is applicable to any discrete character, like substitutions within a coding sequence or gains/losses of a biological function. Our method uses a general approach to statistically test for significant associations between events along the tree, which encompasses both events inseparable on the same branch, and events genealogically ordered on different branches. It assumes that the phylogeny and the mapping of branches is known without errors. We address this problem from the statistical viewpoint by a linear algebra representation of the localization of the evolutionary events on the tree. We compute the full probability distribution of the number of paired events occurring in the same branch or in different branches of the tree, under a null model of independence where each type of event occurs at a constant rate uniformly in the phylogenetic tree. The strengths and weaknesses of the method are assessed via simulations; we then apply the method to explore the loss of cell motility in intracellular pathogens. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
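    The paper computes the full distribution of paired-event counts; one simple corner of that null model can be stated directly. If each of two independent events lands on branch i with probability proportional to its length l_i (total tree length L), the probability that both fall on the same branch is the sum over branches of (l_i/L)^2. A minimal sketch with hypothetical branch lengths:

```python
def same_branch_prob(branch_lengths):
    """Null-model probability that two independent events, each placed on a
    branch with probability proportional to its length, co-occur on the same
    branch (one simple piece of the full paired-event distribution)."""
    total = sum(branch_lengths)
    return sum((l / total) ** 2 for l in branch_lengths)

# Hypothetical four-branch trees: one long branch inflates co-occurrence.
print(same_branch_prob([1.0, 1.0, 1.0, 1.0]))  # 0.25
print(same_branch_prob([7.0, 1.0, 1.0, 1.0]))  # ≈ 0.52
```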

  16. Software engineering and simulation

    NASA Technical Reports Server (NTRS)

    Zhang, Shou X.; Schroer, Bernard J.; Messimer, Sherri L.; Tseng, Fan T.

    1990-01-01

    This paper summarizes the development of several automatic programming systems for discrete event simulation. Emphasis is given on the model development, or problem definition, and the model writing phases of the modeling life cycle.

  17. Modeling the rate of HIV testing from repeated binary data amidst potential never-testers.

    PubMed

    Rice, John D; Johnson, Brent A; Strawderman, Robert L

    2018-01-04

    Many longitudinal studies with a binary outcome measure involve a fraction of subjects with a homogeneous response profile. In our motivating data set, a study on the rate of human immunodeficiency virus (HIV) self-testing in a population of men who have sex with men (MSM), a substantial proportion of the subjects did not self-test during the follow-up study. The observed data in this context consist of a binary sequence for each subject indicating whether or not that subject experienced any events between consecutive observation time points, so subjects who never self-tested were observed to have a response vector consisting entirely of zeros. Conventional longitudinal analysis is not equipped to handle questions regarding the rate of events (as opposed to the odds, as in the classical logistic regression model). With the exception of discrete mixture models, such methods are also not equipped to handle settings in which there may exist a group of subjects for whom no events will ever occur, i.e. a so-called "never-responder" group. In this article, we model the observed data assuming that events occur according to some unobserved continuous-time stochastic process. In particular, we consider the underlying subject-specific processes to be Poisson conditional on some unobserved frailty, leading to a natural focus on modeling event rates. Specifically, we propose to use the power variance function (PVF) family of frailty distributions, which contains both the gamma and inverse Gaussian distributions as special cases and allows for the existence of a class of subjects having zero frailty. We generalize a computational algorithm developed for a log-gamma random intercept model (Conaway, 1990. A random effects model for binary data. Biometrics 46, 317–328) to compute the exact marginal likelihood, which is then maximized to obtain estimates of model parameters. 
We conduct simulation studies, exploring the performance of the proposed method in comparison with competitors. Applying the PVF as well as a Gaussian random intercept model and a corresponding discrete mixture model to our motivating data set, we conclude that the group assigned to receive follow-up messages via SMS was self-testing at a significantly lower rate than the control group, but that there is no evidence to support the existence of a group of never-testers. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. StratBAM: A Discrete-Event Simulation Model to Support Strategic Hospital Bed Capacity Decisions.

    PubMed

    Devapriya, Priyantha; Strömblad, Christopher T B; Bailey, Matthew D; Frazier, Seth; Bulger, John; Kemberling, Sharon T; Wood, Kenneth E

    2015-10-01

    The ability to accurately measure and assess current and potential health care system capacities is an issue of local and national significance. Recent joint statements by the Institute of Medicine and the Agency for Healthcare Research and Quality have emphasized the need to apply industrial and systems engineering principles to improving health care quality and patient safety outcomes. To address this need, a decision support tool was developed for planning and budgeting of current and future bed capacity, and evaluating potential process improvement efforts. The Strategic Bed Analysis Model (StratBAM) is a discrete-event simulation model created after a thorough analysis of patient flow and data from Geisinger Health System's (GHS) electronic health records. Key inputs include: timing, quantity and category of patient arrivals and discharges; unit-level length of care; patient paths; and projected patient volume and length of stay. Key outputs include: admission wait time by arrival source and receiving unit, and occupancy rates. Electronic health records were used to estimate parameters for probability distributions and to build empirical distributions for unit-level length of care and for patient paths. Validation of the simulation model against GHS operational data confirmed its ability to model real-world data consistently and accurately. StratBAM was successfully used to evaluate the system impact of forecasted patient volumes and length of stay in terms of patient wait times, occupancy rates, and cost. The model is generalizable and can be appropriately scaled for larger and smaller health care settings.
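    The event-calendar mechanics underlying a model like StratBAM can be sketched compactly. The following is a minimal illustrative sketch, not StratBAM itself: Poisson arrivals, exponential lengths of stay, and a wait queue when all beds are occupied. Every parameter value here is hypothetical, and a real model would use the empirical distributions described above.

```python
import heapq
import random

def simulate_occupancy(arrival_rate, mean_los, beds, horizon, seed=1):
    """Minimal bed-capacity discrete-event simulation: returns mean
    admission wait and time-averaged occupancy over the horizon."""
    random.seed(seed)
    events = []  # event calendar of (time, kind): 0 = arrival, 1 = discharge
    heapq.heappush(events, (random.expovariate(arrival_rate), 0))
    occupied, queue, waits, area, last = 0, [], [], 0.0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        area += occupied * (t - last)  # accumulate occupancy-time
        last = t
        if kind == 0:  # arrival: schedule the next one, then seek a bed
            heapq.heappush(events, (t + random.expovariate(arrival_rate), 0))
            if occupied < beds:
                occupied += 1
                heapq.heappush(events, (t + random.expovariate(1 / mean_los), 1))
                waits.append(0.0)
            else:
                queue.append(t)  # all beds full: join the wait queue
        else:  # discharge: hand the bed to the longest-waiting patient
            if queue:
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + random.expovariate(1 / mean_los), 1))
            else:
                occupied -= 1
    mean_wait = sum(waits) / len(waits) if waits else 0.0
    return mean_wait, (area / last if last else 0.0)

wait, occ = simulate_occupancy(arrival_rate=2.0, mean_los=3.0, beds=8, horizon=1000)
print(f"mean wait {wait:.2f} days, mean occupancy {occ:.1f} beds")
```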

  19. The influence of context boundaries on memory for the sequential order of events.

    PubMed

    DuBrow, Sarah; Davachi, Lila

    2013-11-01

    Episodic memory allows people to reexperience the past by recovering the sequences of events that characterize those prior experiences. Although experience is continuous, people are able to selectively retrieve and reexperience more discrete episodes from their past, raising the possibility that some elements become tightly related to each other in memory, whereas others do not. The current series of experiments was designed to ask how shifts in context during an experience influence how people remember the past. Specifically, we asked how context shifts influence the ability to remember the relative order of past events, a hallmark of episodic memory. We found that memory for the order of events was enhanced within, rather than across, context shifts, or boundaries (Experiment 1). Next, we showed that this relative enhancement in order memory was eliminated when across-item associative processing was disrupted (Experiment 2), suggesting that context shifts have a selective effect on sequential binding. Finally, we provide evidence that the act of making order memory judgments involves the reactivation of representations that bridged the tested items (Experiment 3). Together, these data suggest that boundaries may serve to parse continuous experience into sequences of contextually related events and that this organization facilitates remembering the temporal order of events that share the same context. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  20. Causal Set Phenomenology

    NASA Astrophysics Data System (ADS)

    Philpott, Lydia

    2010-09-01

    Central to the development of any new theory is the investigation of the observable consequences of the theory. In the search for quantum gravity, research in phenomenology has been dominated by models violating Lorentz invariance (LI) -- despite there being, at present, no evidence that LI is violated. Causal set theory is a LI candidate theory of QG that seeks not to quantise gravity as such, but rather to develop a new understanding of the universe from which both GR and QM could arise separately. The key hypothesis is that spacetime is a discrete partial order: a set of events where the partial ordering is the physical causal ordering between the events. This thesis investigates Lorentz invariant QG phenomenology motivated by the causal set approach. Massive particles propagating in a discrete spacetime will experience diffusion in both position and momentum in proper time. This thesis considers this idea in more depth, providing a rigorous derivation of the diffusion equation in terms of observable cosmic time. The diffusion behaviour does not depend on any particular underlying particle model. Simulations of three different models are conducted, revealing behaviour that matches the diffusion equation despite limitations on the size of causal set simulated. The effect of spacetime discreteness on the behaviour of massless particles is also investigated. Diffusion equations in both affine time and cosmic time are derived, and it is found that massless particles undergo diffusion and drift in energy. Constraints are placed on the magnitudes of the drift and diffusion parameters by considering the blackbody nature of the CMB. Spacetime discreteness also has a potentially observable effect on photon polarisation. For linearly polarised photons, underlying discreteness is found to cause a rotation in polarisation angle and a suppression in overall polarisation.

  1. Studies of discrete symmetries in a purely leptonic system using the Jagiellonian Positron Emission Tomograph

    NASA Astrophysics Data System (ADS)

    Moskal, P.; Alfs, D.; Bednarski, T.; Białas, P.; Curceanu, C.; Czerwiński, E.; Dulski, K.; Gajos, A.; Głowacz, B.; Gupta-Sharma, N.; Gorgol, M.; Hiesmayr, B. C.; Jasińska, B.; Kamińska, D.; Khreptak, O.; Korcyl, G.; Kowalski, P.; Krzemień, W.; Krawczyk, N.; Kubicz, E.; Mohammed, M.; Niedźwiecki, Sz.; Pawlik-Niedźwiecka, M.; Raczyński, L.; Rudy, Z.; Silarski, M.; Smyrski, J.; Wieczorek, A.; Wiślicki, W.; Zgardzińska, B.; Zieliński, M.

    2016-11-01

    Discrete symmetries such as parity (P), charge conjugation (C) and time reversal (T) are of fundamental importance in physics and cosmology. Breaking of charge conjugation symmetry (C) and of its combination with parity (CP) constitutes a necessary condition for the existence of the asymmetry between matter and antimatter in the observed Universe. The presently known sources of discrete symmetry violations can account for only a tiny fraction of the excess of matter over antimatter. So far, CP and T symmetry violations have been observed only in systems involving quarks and have never been reported for purely leptonic objects. In this article we briefly describe an experimental proposal for the test of discrete symmetries in the decays of the positronium atom, which is made exclusively of leptons. The experiments are conducted by means of the Jagiellonian Positron Emission Tomograph (J-PET), which is constructed from strips of plastic scintillators enabling registration of photons from positronium annihilation. The J-PET tomograph, together with the positronium target system, enables measurement of expectation values for discrete-symmetry-odd operators constructed from (i) the spin vector of the ortho-positronium atom, (ii) the momentum vectors of photons originating from the decay of positronium, and (iii) the linear polarization direction of annihilation photons. Linearly polarized positronium will be produced in highly porous aerogel or polymer targets, exploiting longitudinally polarized positrons emitted by the sodium isotope 22Na. Information about the polarization vector of ortho-positronium will be available on an event-by-event basis and will be reconstructed from the known position of the positron source and the reconstructed position of the ortho-positronium annihilation. In 2016 the first tests and calibration runs are planned, and data collection with high statistics will commence in the year 2017.

  2. RELATIONSHIP BETWEEN LINGUISTIC UNITS AND MOTOR COMMANDS.

    ERIC Educational Resources Information Center

    FROMKIN, VICTORIA A.

    ASSUMING THAT SPEECH IS THE RESULT OF A NUMBER OF DISCRETE NEUROMUSCULAR EVENTS AND THAT THE BRAIN CAN STORE ONLY A LIMITED NUMBER OF MOTOR COMMANDS WITH WHICH TO CONTROL THESE EVENTS, THE RESEARCH REPORTED IN THIS PAPER WAS DIRECTED TO A DETERMINATION OF THE SIZE AND NATURE OF THE STORED ITEMS AND AN EXPLANATION OF HOW SPEAKERS ENCODE A SEQUENCE…

  3. Using Institutional Data to Identify Students at Risk for Leaving Community College: An Event History Approach

    ERIC Educational Resources Information Center

    Bachler, Paul T.

    2013-01-01

    Community colleges have been criticized for having lower graduation rates than four year colleges, but few studies have looked at non-graduation transfer, in which a student leaves the community college for a four-year college without taking an associate degree. The current study utilizes institutional data and a discrete-time event history model…

  4. Pathways to the Principalship: An Event History Analysis of the Careers of Teachers with Principal Certification

    ERIC Educational Resources Information Center

    Davis, Bradley W.; Gooden, Mark A.; Bowers, Alex J.

    2017-01-01

    Utilizing rich data on nearly 11,000 educators over 17 academic years in a highly diverse context, we examine the career paths of teachers to determine whether and when they transition into the principalship. We utilize a variety of event history analyses, including discrete-time hazard modeling, to determine how an individual's race, gender, and…

  5. Modelling approaches: the case of schizophrenia.

    PubMed

    Heeg, Bart M S; Damen, Joep; Buskens, Erik; Caleo, Sue; de Charro, Frank; van Hout, Ben A

    2008-01-01

    Schizophrenia is a chronic disease characterized by periods of relative stability interrupted by acute episodes (or relapses). The course of the disease may vary considerably between patients. Patient histories show considerable inter- and even intra-individual variability. We provide a critical assessment of the advantages and disadvantages of three modelling techniques that have been used in schizophrenia: decision trees, (cohort and micro-simulation) Markov models and discrete event simulation models. These modelling techniques are compared in terms of building time, data requirements, medico-scientific experience, simulation time, clinical representation, and their ability to deal with patient heterogeneity, the timing of events, prior events, patient interaction, interaction between co-variates and variability (first-order uncertainty). We note that, depending on the research question, the optimal modelling approach should be selected based on the expected differences between the comparators, the number of co-variates, the number of patient subgroups, the interactions between co-variates, and simulation time. Finally, it is argued that in case micro-simulation is required for the cost-effectiveness analysis of schizophrenia treatments, a discrete event simulation model is best suited to accurately capture all of the relevant interdependencies in this chronic, highly heterogeneous disease with limited long-term follow-up data.

  6. Perception of binary acoustic events associated with the first heart sound

    NASA Technical Reports Server (NTRS)

    Spodick, D. H.

    1977-01-01

    The resolving power of the auditory apparatus permits discrete vibrations associated with cardiac activity to be perceived as one or more events. Irrespective of the vibratory combinations recorded by conventional phonocardiography, in normal adults and in most adult patients auscultators tend to discriminate only two discrete events associated with the first heart sound S1. It is stressed that the heart sound S4 may be present when a binary acoustic event associated with S1 occurs in the sequence 'low pitched sound preceding high pitched sound', i.e., its components are perceived by auscultation as 'dull-sharp'. The question of S4 audibility arises in those individuals, normal and diseased, in whom the major components of S1 ought to be, at least clinically, at their customary high pitch and indeed on the PCG appear as high frequency oscillations. It is revealed that the apparent audibility of recorded S4 is not related to P-R interval, P-S4 interval, or relative amplitude of S4. The significant S4-LFC (low frequency component of S1) differences can be related to acoustic modification of the early component of S1.

  7. The Pivotal Role of Semantic Memory in Remembering the Past and Imagining the Future

    PubMed Central

    Irish, Muireann; Piguet, Olivier

    2013-01-01

    Episodic memory refers to a complex and multifaceted process which enables the retrieval of richly detailed evocative memories from the past. In contrast, semantic memory is conceptualized as the retrieval of general conceptual knowledge divested of a specific spatiotemporal context. The neural substrates of the episodic and semantic memory systems have been dissociated in healthy individuals during functional imaging studies, and in clinical cohorts, leading to the prevailing view that episodic and semantic memory represent functionally distinct systems subtended by discrete neurobiological substrates. Importantly, however, converging evidence focusing on widespread neural networks now points to significant overlap between those regions essential for retrieval of autobiographical memories, episodic learning, and semantic processing. Here we review recent advances in episodic memory research focusing on neurodegenerative populations which has proved revelatory for our understanding of the complex interplay between episodic and semantic memory. Whereas episodic memory research has traditionally focused on retrieval of autobiographical events from the past, we also include evidence from the recent paradigm shift in which episodic memory is viewed as an adaptive and constructive process which facilitates the imagining of possible events in the future. We examine the available evidence which converges to highlight the pivotal role of semantic memory in providing schemas and meaning whether one is engaged in autobiographical retrieval for the past, or indeed, is endeavoring to construct a plausible scenario of an event in the future. It therefore seems plausible to contend that semantic processing may underlie most, if not all, forms of episodic memory, irrespective of temporal condition. PMID:23565081

  8. Preslip and cascade processes initiating laboratory stick slip

    USGS Publications Warehouse

    McLaskey, Gregory C.; Lockner, David A.

    2014-01-01

    Recent modeling studies have explored whether earthquakes begin with a large aseismic nucleation process or initiate dynamically from the rapid growth of a smaller instability in a “cascade-up” process. To explore such a case in the laboratory, we study the initiation of dynamic rupture (stick slip) of a smooth saw-cut fault in a 76 mm diameter cylindrical granite laboratory sample at 40–120 MPa confining pressure. We use a high dynamic range recording system to directly compare the seismic waves radiated during the stick-slip event to those radiated from tiny (M −6) discrete seismic events, commonly known as acoustic emissions (AEs), that occur in the seconds prior to each large stick slip. The seismic moments, focal mechanisms, locations, and timing of the AEs all contribute to our understanding of their mechanics and provide us with information about the stick-slip nucleation process. In a sequence of 10 stick slips, the first few microseconds of the signals recorded from stick-slip instabilities are nearly indistinguishable from those of premonitory AEs. In this sense, it appears that each stick slip begins as an AE event that rapidly (~20 μs) grows about 2 orders of magnitude in linear dimension and ruptures the entire 150 mm length of the simulated fault. We also measure accelerating fault slip in the final seconds before stick slip. We estimate that this slip is at least 98% aseismic and that it both weakens the fault and produces AEs that will eventually cascade up to initiate the larger dynamic rupture.

  9. Template-free synthesis and structural evolution of discrete hydroxycancrinite zeolite nanorods from high-concentration hydrogels.

    PubMed

    Chen, Shaojiang; Sorge, Lukas P; Seo, Dong-Kyun

    2017-12-07

    We report the synthesis and characterization of hydroxycancrinite zeolite nanorods by a simple hydrothermal treatment of aluminosilicate hydrogels at high concentrations of precursors without the use of structure-directing agents. Transmission electron microscopy (TEM) analysis reveals that cancrinite nanorods, with lengths of 200-800 nm and diameters of 30-50 nm, exhibit a hexagonal morphology and are elongated along the crystallographic c direction. The powder X-ray diffraction (PXRD), Fourier transform infrared (FT-IR) and TEM studies revealed sequential events of hydrogel formation, the formation of aggregated sodalite nuclei, the conversion of sodalite to cancrinite and finally the growth of cancrinite nanorods into discrete particles. The aqueous dispersion of the discrete nanorods displays good stability between pH 6 and 12, with zeta potentials no greater than -30 mV. The synthesis is unique in that the initial aggregated nanocrystals do not grow into microsized particles (aggregative growth) but into discrete nanorods. Our findings demonstrate an unconventional possibility that discrete zeolite nanocrystals could be produced from a concentrated hydrogel.

  10. Updating older forest inventory data with a growth model and satellite records to improve the responsiveness and currency of national carbon monitoring

    NASA Astrophysics Data System (ADS)

    Healey, S. P.; Zhao, F. R.; McCarter, J. B.; Frescino, T.; Goeking, S.

    2017-12-01

    International reporting of American forest carbon trends depends upon the Forest Service's nationally consistent network of inventory plots. Plots are measured on a rolling basis over a 5- to 10-year cycle, so estimates related to any variable, including carbon storage, reflect conditions over a 5- to 10-year window. This makes it difficult to identify the carbon impact of discrete events (e.g., a bad fire year; extraction rates related to home-building trends), particularly if the events are recent. We report an approach to make inventory estimates more sensitive to discrete and recent events. We use a growth model (the Forest Vegetation Simulator - FVS) that is maintained by the Forest Service to annually update the tree list for every plot, allowing all plots to contribute to a series of single-year estimates. Satellite imagery from the Landsat platform guides the FVS simulations by providing information about which plots have been disturbed, which are recovering from disturbance, and which are undergoing undisturbed growth. The FVS model is only used to "update" plot tree lists until the next field measurement is made (maximum of 9 years). As a result, predicted changes are usually small and error rates are low. We present a pilot study of this system in Idaho, which has experienced several major fire events in the last decade. Empirical estimates of uncertainty, accounting for both plot sampling error and FVS model error, suggest that this approach greatly increases temporal specificity and sensitivity to discrete events without sacrificing much estimate precision at the level of a US state. This approach has the potential to take better advantage of the Forest Service's rolling plot measurement schedule to report carbon storage in the US, and it offers the basis of a system that might allow near-term, forward-looking analysis of the effects of hypothetical forest disturbance patterns.

  11. The impact of negative emotions on self-concept abstraction depends on accessible information processing styles.

    PubMed

    Isbell, Linda M; Rovenpor, Daniel R; Lair, Elicia C

    2016-10-01

    Research suggests that anger promotes global, abstract processing whereas sadness and fear promote local, concrete processing (see Schwarz & Clore, 2007 for a review). Contrary to a large and influential body of work suggesting that specific affective experiences are tethered to specific cognitive outcomes, the affect-as-cognitive-feedback account maintains that affective experiences confer positive or negative value on currently dominant processing styles, and thus can lead to either global or local processing (Huntsinger, Isbell, & Clore, 2014). The current work extends this theoretical perspective by investigating the impact of discrete negative emotions on the self-concept. By experimentally manipulating information processing styles and discrete negative emotions that vary in appraisals of certainty, we demonstrate that the impact of discrete negative emotions on the spontaneous self-concept depends on accessible processing styles. When global processing was accessible, individuals in angry (negative, high certainty) states generated more abstract statements about themselves than individuals in either sad (Experiment 1) or fearful (Experiment 2; negative, low certainty) states. When local processing was made accessible, however, the opposite pattern emerged, whereby individuals in angry states generated fewer abstract statements than individuals in sad or fearful states. Together these studies provide new insights into the mechanisms through which discrete emotions influence cognition. In contrast to theories assuming a dedicated link between emotions and processing styles, these results suggest that discrete emotions provide feedback about accessible ways of thinking, and are consistent with recent evidence suggesting that the impact of affect on cognition is highly context-dependent. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Distribution of breakage events in random packings of rodlike particles.

    PubMed

    Grof, Zdeněk; Štěpánek, František

    2013-07-01

    Uniaxial compaction and breakage of rodlike particle packing has been studied using a discrete element method simulation. A scaling relationship between the applied stress, the number of breakage events, and the number-mean particle length has been derived and compared with computational experiments. Based on results for a wide range of intrinsic particle strengths and initial particle lengths, it seems that a single universal relation can be used to describe the incidence of breakage events during compaction of rodlike particle layers.

  13. Discrete event simulation for exploring strategies: an urban water management case.

    PubMed

    Huang, Dong-Bin; Scholz, Roland W; Gujer, Willi; Chitwood, Derek E; Loukopoulos, Peter; Schertenleib, Roland; Siegrist, Hansruedi

    2007-02-01

    This paper presents a model structure aimed at offering an overview of the various elements of a strategy and exploring their multidimensional effects through time in an efficient way. It treats a strategy as a set of discrete events planned to achieve a certain strategic goal and develops a new form of causal networks as an interfacing component between decision makers and environment models, e.g., life cycle inventory and material flow models. The causal network receives a strategic plan as input in a discrete manner and then outputs the updated parameter sets to the subsequent environmental models. Accordingly, the potential dynamic evolution of environmental systems caused by various strategies can be stepwise simulated. It provides a way to incorporate discontinuous change in models for environmental strategy analysis, and its cellular constructs enhance the interpretability and extendibility of a complex model. It is exemplified using an urban water management case in Kunming, a major city in Southwest China. By utilizing the presented method, the case study modeled the cross-scale interdependencies of the urban drainage system and regional water balance systems, and evaluated the effectiveness of various strategies for improving the situation of Dianchi Lake.

  14. VME rollback hardware for time warp multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Robb, Michael J.; Buzzell, Calvin A.

    1992-01-01

    The purpose of the research effort is to develop and demonstrate innovative hardware to implement specific rollback and timing functions required for efficient queue management and precision timekeeping in multiprocessor discrete event simulations. The previously completed phase 1 effort demonstrated the technical feasibility of building hardware modules which eliminate the state saving overhead of the Time Warp paradigm used in distributed simulations on multiprocessor systems. The current phase 2 effort will build multiple pre-production rollback hardware modules integrated with a network of Sun workstations, and the integrated system will be tested by executing a Time Warp simulation. The rollback hardware will be designed to interface with the greatest number of multiprocessor systems possible. The authors believe that the rollback hardware will provide for significant speedup of large scale discrete event simulation problems and allow multiprocessors using Time Warp to dramatically increase performance.

  15. Discrete event simulation as a tool in optimization of a professional complex adaptive system.

    PubMed

    Nielsen, Anders Lassen; Hilwig, Helmer; Kissoon, Niranjan; Teelucksingh, Surujpal

    2008-01-01

    Similar urgent needs for improvement of health care systems exist in the developed and developing world. The culture and organization of an emergency department in developing countries can best be described as a professional complex adaptive system, in which each agent (employee) is ignorant of the behavior of the system as a whole; no one understands the entire system. Each agent's actions are based on the state of the system at the moment (i.e., lack of medicine, unavailable laboratory investigations, lack of beds and lack of staff in certain functions). An important question is how the emergency service can be improved within the given constraints. Simulation is one new approach to studying issues amenable to improvement. Discrete event simulation was used to simulate part of the patient flow in an emergency department. A simple model was built using a prototyping approach. The simulation showed that a minor rotation among the nurses could reduce the mean number of visitors who had to be referred to alternative flows within the hospital from 87 to 37 per day, with a mean staff utilization between 95.8% (nurses) and 87.4% (doctors). We conclude that, even when faced with resource constraints and a lack of accessible data, discrete event simulation is a tool that can be used successfully to study the consequences of changes in very complex, self-organizing professional complex adaptive systems.

  16. Plasma plume oscillations monitoring during laser welding of stainless steel by discrete wavelet transform application.

    PubMed

    Sibillano, Teresa; Ancona, Antonio; Rizzi, Domenico; Lupo, Valentina; Tricarico, Luigi; Lugarà, Pietro Mario

    2010-01-01

    The plasma optical radiation emitted during CO2 laser welding of stainless steel samples has been detected with a Si-PIN photodiode and analyzed under different process conditions. The discrete wavelet transform (DWT) has been used to decompose the optical signal into various discrete series of sequences over different frequency bands. The results show that changes of the process settings may yield different signal features in the range of frequencies between 200 Hz and 30 kHz. Potential applications of this method to monitor in real time the laser welding processes are also discussed.
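    For illustration, a single level of the Haar DWT (the simplest discrete wavelet, not necessarily the basis used by the authors) splits a sampled signal into low- and high-frequency bands; the monitoring scheme repeats this kind of decomposition over several scales:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: split an
    even-length signal into a low-frequency approximation band and a
    high-frequency detail band, each half the input length."""
    x = np.asarray(signal, dtype=float).reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2.0)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one Haar level (perfect reconstruction)."""
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    return np.stack([even, odd], axis=1).ravel()

sig = np.array([4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 2.0, 0.0])
a, d = haar_dwt(sig)
assert np.allclose(haar_idwt(a, d), sig)  # invertible decomposition
```

    Applying `haar_dwt` again to the approximation band yields the next, coarser frequency band, and so on down to the lowest frequencies of interest.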

  17. [How many patient transfer rooms are necessary for my OR suite? : Effect of the number of OR transfer rooms on waiting times and patient throughput in the OR - analysis by simulation].

    PubMed

    Messer, C; Zander, A; Arnolds, I V; Nickel, S; Schuster, M

    2015-12-01

    In most hospitals the operating rooms (OR) are separated from the rest of the hospital by transfer rooms through which patients have to pass for reasons of hygiene. In the OR transfer room patients are placed on the OR table before surgery and returned to the hospital bed after surgery. The number of patients who need to pass through a transfer room at a certain point in time can exceed the number of available transfer rooms. As a result, the transfer rooms become a bottleneck where patients have to wait, which, in turn, may lead to delays in the OR suite. In this study the ability of a discrete event simulation to analyze the effect of the duration of surgery and the number of ORs on the number of OR transfer rooms needed was investigated. This study was based on a discrete event simulation model developed with the simulation software AnyLogic®. The model studied the effects of the number of OR transfer rooms on the processes in an OR suite of a community hospital by varying the number of ORs from one to eight and using different surgical portfolios. Probability distributions for the durations of the induction, surgery, recovery and transfer room processes were calculated on the basis of real data from the community hospital studied. Furthermore, using a generic simulation model the effect of the average duration of surgery on the number of OR transfer rooms needed was examined. The discrete event simulation model enabled the analysis of both quantitative and qualitative changes in the OR process and setting. Key performance indicators of the simulation model were the patient throughput per day, the probability of waiting and the waiting time in front of the OR transfer rooms. In the case of a community hospital with 1 transfer room, the average proportion of patients waiting before entering the OR was 17.9 % ± 9.7 % with 3 ORs, 37.6 % ± 9.7 % with 5 ORs and 62.9 % ± 9.1 % with 8 ORs.
The average waiting time of patients in the setting with 3 ORs was 3.1 ± 2.7 min, with 5 ORs 5.0 ± 5.8 min and with 8 ORs 11.5 ± 12.5 min. Based on this study, the community hospital needs a second transfer room starting from 4 ORs so that there is no bottleneck for the subsequent OR processes. The average patient throughput in a setting with 4 ORs increased significantly, by 0.3 patients per day, when a second transfer room was available. The generic model showed a strong effect of the average duration of surgery on the number of transfer rooms needed. There was no linear correlation between the number of transfer rooms and the number of ORs. The shorter the average duration of surgery, the earlier an additional transfer room is required. Thus, hospitals with shorter durations of surgery and fewer ORs may need at least as many transfer rooms as a hospital with longer durations of surgery and more ORs. However, with respect to an economic analysis, the costs and benefits of installing additional OR transfer rooms need to be calculated using the profit margins of the specific hospital.
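    The transfer-room bottleneck can be illustrated with a deterministic toy queue (fixed service times and evenly spaced arrivals, unlike the fitted probability distributions in the AnyLogic® model of the study):

```python
import heapq

def waiting_stats(arrivals, service_time, n_rooms):
    """Fraction of patients who wait, and their mean wait, when patients
    arriving at the given times all pass through n_rooms transfer rooms
    for a fixed service_time each (a deterministic toy model)."""
    free_at = [0.0] * n_rooms  # next time each room becomes free
    heapq.heapify(free_at)
    waits = []
    for t in sorted(arrivals):
        room_free = heapq.heappop(free_at)  # earliest-free room
        start = max(t, room_free)
        waits.append(start - t)
        heapq.heappush(free_at, start + service_time)
    frac_waiting = sum(1 for w in waits if w > 0) / len(waits)
    mean_wait = sum(waits) / len(waits)
    return frac_waiting, mean_wait

# Eight patients arriving 5 min apart, 8 min in the transfer room:
print(waiting_stats(range(0, 40, 5), service_time=8.0, n_rooms=1))
# → (0.875, 10.5)
print(waiting_stats(range(0, 40, 5), service_time=8.0, n_rooms=2))
# → (0.0, 0.0)
```

    Even in this crude sketch, a second room eliminates waiting once arrivals outpace a single room's throughput, mirroring the qualitative finding above.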

  18. Characteristics of dayside auroral displays in relation to magnetospheric processes

    NASA Astrophysics Data System (ADS)

    Minow, Joseph I.

    1997-09-01

    The use of dayside aurorae as a ground based monitor of magnetopause activity is explored in this thesis. The origin of diffuse (OI) 630.0 nm emissions in the midday auroral oval is considered first. Analysis of low altitude satellite records of precipitating charged particles within the cusp shows an unstructured electron component that will produce a 0.5-1 kR 630.0 nm emission throughout the cusp. Distribution of the electrons is controlled by the requirement of charge neutrality in the cusp, predicting a diffuse 630.0 nm background even if the magnetosheath plasma is introduced into the magnetosphere in discrete merging events. Cusp electron fluxes also contain a structured component characterized by enhancements in the electron energy and energy flux over background values in narrow regions a few tens of kilometers in width. These structured features are identified as the source of the transient midday arcs. An auroral model is developed to study the morphology of (OI) 630.0 nm auroral emissions produced by the transient arcs. The model demonstrates that a diffuse 630.0 nm background emission is produced by transient arcs due to the long lifetime of the O(1D) state. Two sources of diffuse 630.0 nm background emissions exist in the cusp which may originate in discrete merging events. The conclusion is that persistent 630.0 nm emissions cannot be interpreted as prima facie evidence for continuous particle transport from the magnetosheath across the magnetopause boundary and into the polar cusp. The second subject considered is the analysis of temporal and spatial variations of the diffuse 557.7 nm pulsating aurora in relation to the 630.0 nm dominated transient aurora. Temporal variations at the poleward boundary of the diffuse 557.7 nm aurora correlate with the formation of the 630.0 nm transient aurorae, suggesting that the two events are related.
The character of the auroral variations is consistent with the behavior of particle populations reported during satellite observations of flux transfer events near the dayside magnetopause. An interpretation of the events in terms of impulsive magnetic reconnection yields a new observation that relates the poleward moving transient auroral arcs in the midday sector to the flux transfer events.

  19. Models for discrete-time self-similar vector processes with application to network traffic

    NASA Astrophysics Data System (ADS)

    Lee, Seungsin; Rao, Raghuveer M.; Narasimha, Rajesh

    2003-07-01

    The paper defines self-similarity for vector processes by employing the discrete-time continuous-dilation operation which has successfully been used previously by the authors to define 1-D discrete-time stochastic self-similar processes. To define self-similarity of vector processes, it is required to consider the cross-correlation functions between different 1-D processes as well as the autocorrelation function of each constituent 1-D process in it. System models to synthesize self-similar vector processes are constructed based on the definition. With these systems, it is possible to generate self-similar vector processes from white noise inputs. An important aspect of the proposed models is that they can be used to synthesize various types of self-similar vector processes by choosing proper parameters. Additionally, the paper presents evidence of vector self-similarity in two-channel wireless LAN data and applies the aforementioned systems to simulate the corresponding network traffic traces.

  20. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.

  1. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing.

    PubMed

    Xu, Jason; Minin, Vladimir N

    2015-07-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
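    As a generic sketch of the sparse-recovery idea underlying compressed sensing (orthogonal matching pursuit with a random Gaussian sensing matrix, not the authors' specific solver or their generating-function machinery), a sparse vector can be recovered from far fewer linear measurements than its dimension:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x with y ≈ A x."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Least-squares fit on the selected columns, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)  # wide sensing matrix
x_true = np.zeros(100)
x_true[[7, 42, 91]] = [1.5, -2.0, 0.8]            # 3-sparse signal
x_hat = omp(A, A @ x_true, k=3)
assert np.allclose(x_hat, x_true, atol=1e-6)
```

    The 100-dimensional signal is recovered from only 40 measurements because its support is sparse, the same structural assumption the paper places on the transition probability mass.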

  2. Automated terrestrial laser scanning with near-real-time change detection - monitoring of the Séchilienne landslide

    NASA Astrophysics Data System (ADS)

    Kromer, Ryan A.; Abellán, Antonio; Hutchinson, D. Jean; Lato, Matt; Chanut, Marie-Aurelie; Dubois, Laurent; Jaboyedoff, Michel

    2017-05-01

    We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap in high-temporal-resolution TLS monitoring of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool for monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.

  3. Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing

    PubMed Central

    Xu, Jason; Minin, Vladimir N.

    2016-01-01

    Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes. PMID:26949377

  4. National Launch System: Structures and materials

    NASA Technical Reports Server (NTRS)

    Bunting, Jack O.

    1993-01-01

    The National Launch System provides an opportunity to realize the potential of Al-Li. Advanced structures can reduce weights by 5-40 percent as well as relax propulsion system performance specifications and reduce requirements for labor and materials. The effect on costs will be substantial. Advanced assembly and process control technologies also offer the potential for greatly reduced labor during the manufacturing and inspection processes. Current practices are very labor-intensive and, as a result, labor costs far outweigh material costs for operational space transportation systems. The technological readiness of new structural materials depends on their commercial availability, producibility and materials properties. Martin Marietta is vigorously pursuing the development of its Weldalite 049 Al-Li alloys in each of these areas. Martin Marietta is also preparing to test an automated work cell concept that it has developed using discrete event simulation.

  5. Multi-Interval Discretization of Continuous-Valued Attributes for Classification Learning

    NASA Technical Reports Server (NTRS)

    Fayyad, U.; Irani, K.

    1993-01-01

    Since most real-world applications of classification learning involve continuous-valued attributes, properly addressing the discretization process is an important problem. This paper addresses the use of the entropy minimization heuristic for discretizing the range of a continuous-valued attribute into multiple intervals.
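    A minimal sketch of the entropy-minimization heuristic, reduced to a single binary cut (Fayyad and Irani's full method applies this recursively with an MDL-based stopping criterion to obtain multiple intervals):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a class-label multiset."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Cut point on a continuous attribute minimizing the weighted
    class entropy of the two resulting intervals."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (float("inf"), None)
    for i in range(1, n):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # cannot cut between equal attribute values
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / n
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        best = min(best, (weighted, cut))
    return best[1]

# A clean class boundary between 4 and 6 should be found:
print(best_cut([1, 2, 3, 4, 6, 7, 8, 9], list("aaaabbbb")))  # → 5.0
```

    Recursing on each side of the returned cut, and stopping when the MDL criterion says a further split is not worth its description cost, yields the multi-interval discretization of the paper.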

  6. The Spectrum of Mathematical Models.

    ERIC Educational Resources Information Center

    Karplus, Walter J.

    1983-01-01

    Mathematical modeling problems encountered in many disciplines are discussed in terms of the modeling process and applications of models. The models are classified according to three types of abstraction: continuous-space-continuous-time, discrete-space-continuous-time, and discrete-space-discrete-time. Limitations in different kinds of modeling…

  7. On the Grand Challenges in Physical Petrology: the Multiphase Crossroads

    NASA Astrophysics Data System (ADS)

    Bergantz, G. W.

    2014-12-01

    Rapid progress in experimental, micro-analytical and textural analysis at the crystal scale has produced an unprecedented record of magmatic processes. However, an obstacle to further progress is the lack of understanding of how the mass, energy and momentum flux associated with crystal-rich, open-system events produces identifiable outcomes. Hence developing a physically-based understanding of magmatic systems linking micro-scale petrological observations with a physical template operating at the macro-scale presents a so-called "Grand Challenge." The essence of this challenge is that magmatic systems have characteristic length and feedback scales between those accessible by classical continuum and discrete methods. It has become increasingly obvious that old-school continuum methods have limited resolution and power of explanation for multiphase (real) magma dynamics. This is, in part, because in crystal-rich systems the deformation is non-affine, and so the concept of constitutive behavior is less applicable and likely not even relevant, especially if one is interested in the emergent character of micro-scale processes. One expression of this is the cottage industry of proposing viscosity laws for magmas, which serve as "blunt force" de facto corrections for what is intrinsically multiphase behavior. Even in more fluid-rich systems many of these laws are not suitable for use in the very transport theories they aim to support. The alternative approach is the discrete method, where multiphase interactions are explicitly resolved. This is a daunting prospect given the numbers of crystals in magmas. But perhaps not all crystals need to be modeled.
I will demonstrate how discrete methods can recover critical state behavior, resolve crystal migration and the onset of visco-elastic behavior such as melt-present shear bands (which set the large-scale mixing volumes), capture some of the general morpho-dynamics that underlie purported rheological models, and expose transient controls on the emergence and dissipation of distinct thermodynamic states. As simulations with 10^6-10^7 crystals are now possible, both the local, micro-scale crystal processes and the larger-scale processes controlled by particle-particle-fluid interactions can be simultaneously resolved.

  8. Simulating Mission Command for Planning and Analysis

    DTIC Science & Technology

    2015-06-01

    Subject terms: mission planning, CPM, PERT, simulation, DES, Simkit, triangle distribution, critical path. Glossary fragments: Battalion Task Force; CO, Company; CPM, Critical Path Method; DES, Discrete Event Simulation; FA BAT, Field Artillery Battalion; FEL, Future Event List; FIST... The report discusses project management tools that can be utilized to find the critical path in military projects: the Critical Path Method (CPM) and the Program Evaluation and Review Technique (PERT).
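    The Critical Path Method named in the record can be sketched generically; the task names and durations below are invented for illustration and are not from the report:

```python
def critical_path(tasks):
    """Critical Path Method over a task dict: name -> (duration, [predecessors]).
    Returns (project duration, critical task sequence)."""
    finish = {}

    def earliest_finish(name):
        if name not in finish:
            dur, preds = tasks[name]
            finish[name] = dur + max((earliest_finish(p) for p in preds), default=0)
        return finish[name]

    for name in tasks:
        earliest_finish(name)
    # Walk back from the latest-finishing task along binding predecessors.
    path = [max(finish, key=finish.get)]
    while True:
        dur, preds = tasks[path[-1]]
        binding = [p for p in preds if finish[p] == finish[path[-1]] - dur]
        if not binding:
            break
        path.append(binding[0])
    return finish[path[0]], path[::-1]

plan = {
    "recon":    (2, []),
    "orders":   (1, ["recon"]),
    "move":     (4, ["orders"]),
    "rehearse": (2, ["orders"]),
    "attack":   (1, ["move", "rehearse"]),
}
print(critical_path(plan))  # → (8, ['recon', 'orders', 'move', 'attack'])
```

    Any delay on a critical-path task delays the whole plan, whereas "rehearse" here has slack; PERT replaces the fixed durations with distributions (e.g., the triangle distribution in the subject terms).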

  9. The development of a simulation model of the treatment of coronary heart disease.

    PubMed

    Cooper, Keith; Davies, Ruth; Roderick, Paul; Chase, Debbie; Raftery, James

    2002-11-01

    A discrete event simulation models the progress of patients who have had a coronary event through their treatment pathways and subsequent coronary events. The main risk factors in the model are age, sex, history of previous events and the extent of the coronary vessel disease. The model parameters are based on data collected from epidemiological studies of incidence and prognosis, efficacy studies, national surveys and treatment audits. The simulation results were validated against different sources of data. The initial results show that increasing revascularisation has considerable implications for resource use but little impact on patient mortality.

  10. On the performance of voltage stepping for the simulation of adaptive, nonlinear integrate-and-fire neuronal networks.

    PubMed

    Kaabi, Mohamed Ghaith; Tonnelier, Arnaud; Martinez, Dominique

    2011-05-01

    In traditional event-driven strategies, spike timings are analytically given or calculated with arbitrary precision (up to machine precision). Exact computation is possible only for simplified neuron models, mainly the leaky integrate-and-fire model. In a recent paper, Zheng, Tonnelier, and Martinez (2009) introduced an approximate event-driven strategy, named voltage stepping, that allows the generic simulation of nonlinear spiking neurons. Promising results were achieved in the simulation of single quadratic integrate-and-fire neurons. Here, we assess the performance of voltage stepping in network simulations by considering more complex neurons (quadratic integrate-and-fire neurons with adaptation) coupled with multiple synapses. To handle the discrete nature of synaptic interactions, we recast voltage stepping in a general framework, the discrete event system specification. The efficiency of the method is assessed through simulations and comparisons with a modified time-stepping scheme of the Runge-Kutta type. We demonstrated numerically that the original order of voltage stepping is preserved when simulating connected spiking neurons, independent of the network activity and connectivity.
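    The leaky integrate-and-fire model mentioned above is the case where event-driven simulation is exact: the next spike time has a closed form, which voltage stepping then approximates for nonlinear models where no such form exists. A sketch of the exactly solvable case:

```python
import math

def lif_spike_time(v0, v_inf, v_th, tau):
    """Exact next-spike time of a leaky integrate-and-fire neuron under
    constant input: V(t) = v_inf + (v0 - v_inf) * exp(-t / tau), spiking
    when V reaches v_th. Returns None if threshold is never reached."""
    if v_inf <= v_th:
        return None  # membrane saturates below threshold
    return tau * math.log((v_inf - v0) / (v_inf - v_th))

t = lif_spike_time(v0=0.0, v_inf=2.0, v_th=1.0, tau=10.0)
# Verify against the closed-form trajectory at the computed time:
v_at_t = 2.0 + (0.0 - 2.0) * math.exp(-t / 10.0)
assert abs(v_at_t - 1.0) < 1e-12
```

    In an event-driven simulator, such spike times go onto the future event list directly; voltage stepping generalizes this by advancing the state in fixed voltage (rather than time) increments when the inversion cannot be done analytically.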

  11. Kennedy Space Center ITC-1 Internship Overview

    NASA Technical Reports Server (NTRS)

    Ni, Marcus

    2011-01-01

    As an intern for Priscilla Elfrey in the ITC-1 department, I was involved in many activities that have helped me to develop many new skills. I supported four different projects during my internship, which included the Center for Life Cycle Design (CfLCD), SISO Space Interoperability Smackdown, RTI Teacher Mentor Program, and the Discrete Event Simulation Integrated Visualization Environment Team (DIVE). I provided the CfLCD with web based research on cyber security initiatives involving simulation, education for young children, cloud computing, Otronicon, and Science, Technology, Engineering, and Mathematics (STEM) education initiatives. I also attended STEM meetings regarding simulation courses, and educational course enhancements. To further improve the SISO Simulation event, I provided observation feedback to the technical advisory board. I also helped to set up a chat federation for HLA. The third project involved the RTI Teacher Mentor program, which I helped to organize. Last, but not least, I worked with the DIVE team to develop new software to help visualize discrete event simulations. All of these projects have provided experience on an interdisciplinary level ranging from speech and communication to solving complex problems using math and science.

  12. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. Firstly, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on capture and fusion features. The significance value of Hotelling's T² statistic is calculated and used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
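    The PCA/Hotelling's T² stage can be sketched generically (random data standing in for UV spectra; the wavelet preprocessing and the sequential Bayesian alarm are omitted):

```python
import numpy as np

def fit_t2(train, n_components=2):
    """Fit a PCA model to baseline spectra (rows = observations) and return
    a scorer computing Hotelling's T² for a new spectrum: large values flag
    observations far from the baseline cloud along the principal components."""
    mean = train.mean(axis=0)
    _, s, vt = np.linalg.svd(train - mean, full_matrices=False)
    comps = vt[:n_components]                       # principal directions
    var = s[:n_components] ** 2 / (len(train) - 1)  # variance per component
    def t2(obs):
        scores = comps @ (np.asarray(obs) - mean)
        return float(np.sum(scores ** 2 / var))
    return t2, mean, comps

rng = np.random.default_rng(1)
baseline = rng.normal(size=(200, 10))        # stand-in for baseline spectra
t2, mean, comps = fit_t2(baseline)
contaminated = mean + 6.0 * comps[0]         # strong shift along PC1
clean = baseline[0]
assert t2(contaminated) > t2(clean)          # outlier scores higher
```

    In the detection procedure, an observation whose T² exceeds a significance threshold counts as an outlier, and a run of consecutive outliers triggers the contamination alarm.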

  13. Use of DES Modeling for Determining Launch Availability for SLS

    NASA Technical Reports Server (NTRS)

    Watson, Michael; Staton, Eric; Cates, Grant; Finn, Ronald; Altino, Karen M.; Burns, K. Lee

    2014-01-01

    (1) NASA is developing a new heavy-lift launch system for human and scientific exploration beyond Earth orbit, comprising the Space Launch System (SLS), the Orion Multi-Purpose Crew Vehicle (MPCV), and Ground Systems Development and Operations (GSDO); (2) the goal is to ensure a high confidence of successfully launching the exploration missions, especially those that require multiple launches, have a narrow Earth departure window, and carry high investment costs; and (3) this presentation discusses the process used by a Cross-Program team to develop the Exploration Systems Development (ESD) Launch Availability (LA) Technical Performance Measure (TPM) and allocate it to each of the Programs through the use of Discrete Event Simulations (DES).

  14. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle by enabling isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.

  15. Recent Advances in Composite Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Reifsnider, Ken; Case, Scott; Iyengar, Nirmal

    1996-01-01

    The state of the art and recent developments in the field of composite material damage mechanics are reviewed, with emphasis on damage accumulation. The kinetics of damage accumulation are considered with emphasis on the general accumulation of discrete local damage events such as single or multiple fiber fractures or microcrack formation. The issues addressed include: how to define strength in the presence of widely distributed damage, and how to combine mechanical representations in order to predict the damage tolerance and life of engineering components. It is shown that a damage mechanics approach can be related to the thermodynamics of the damage accumulation processes in composite laminates subjected to mechanical loading and environmental conditions over long periods of time.

  16. A continuous analog of run length distributions reflecting accumulated fractionation events.

    PubMed

    Yu, Zhe; Sankoff, David

    2016-11-11

    We propose a new, continuous model of the fractionation process (duplicate gene deletion after polyploidization) on the real line. The aim is to infer how much DNA is deleted at a time, based on segment lengths for alternating deleted (invisible) and undeleted (visible) regions. After deriving a number of analytical results for "one-sided" fractionation, we undertake a series of simulations that help us identify the distribution of segment lengths as a gamma with shape and rate parameters evolving over time. This leads to an inference procedure based on observed length distributions for visible and invisible segments. We suggest extensions of this mathematical and simulation work to biologically realistic discrete models, including two-sided fractionation.
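    A discrete toy version of the deletion process (not the authors' continuous real-line model) shows how alternating visible/invisible run lengths arise from accumulated events; the geometric run-length choice is an assumption for illustration:

```python
import random

def fractionate(n_genes, n_events, mean_run, seed=0):
    """Toy one-sided fractionation: each deletion event starts at a random
    surviving gene and removes a geometric-length run of consecutive
    surviving genes (mean run length mean_run)."""
    rng = random.Random(seed)
    visible = [True] * n_genes
    for _ in range(n_events):
        survivors = [i for i, v in enumerate(visible) if v]
        if not survivors:
            break
        start = rng.randrange(len(survivors))
        run = 1
        while rng.random() < 1.0 - 1.0 / mean_run:  # geometric run length
            run += 1
        for i in survivors[start:start + run]:
            visible[i] = False
    return visible

def run_lengths(visible):
    """Alternating [state, length] runs of visible/deleted regions."""
    runs = []
    for v in visible:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

visible = fractionate(n_genes=1000, n_events=120, mean_run=4.0)
runs = run_lengths(visible)
assert sum(length for _, length in runs) == 1000  # runs partition the genome
```

    Inference in the paper runs in the opposite direction: from the observed visible and invisible segment-length distributions back to how much DNA is deleted per event.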

  17. Single Versus Multiple Events Error Potential Detection in a BCI-Controlled Car Game With Continuous and Discrete Feedback.

    PubMed

    Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R

    2016-03-01

    This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
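    The benefit of combining multiple events can be seen in a simple independence model: even a weak single-event detector becomes reliable when several events are majority-voted (an idealization; real ErrP classifications are not independent):

```python
from math import comb

def majority_accuracy(p, n):
    """Probability that a majority vote over n independent single-event
    classifications (each correct with probability p) is correct; ties
    are broken in favor of 'correct' with probability 0.5."""
    def pmf(k):  # exactly k of n events classified correctly
        return comb(n, k) * p ** k * (1 - p) ** (n - k)
    win = sum(pmf(k) for k in range(n // 2 + 1, n + 1))
    tie = pmf(n // 2) if n % 2 == 0 else 0.0
    return win + 0.5 * tie

# A weak 65% single-event ErrP detector, combined over event sequences:
for n in (1, 3, 7):
    print(n, majority_accuracy(0.65, n))
```

    Accuracy rises from 0.65 for a single event to roughly 0.72 over three events and 0.80 over seven, at the cost of waiting for longer event sequences, the same trade-off reported in the online experiment.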

  18. Near Earth Objects and Cascading Effects from the Policy Perspective: Implications from Problem and Solution Definition

    NASA Astrophysics Data System (ADS)

    Lindquist, Eric

    2016-04-01

The characterization of near-Earth objects (NEOs) in regard to physical attributes and potential risk and impact factors presents a complex and complicated scientific and engineering challenge. The societal and policy risks and impacts are no less complex, yet are rarely considered in the same context as material properties or related factors. Further, NEO impacts are typically considered as discrete events, not as initial events in a dynamic cascading system. The objective of this contribution is to position the characterization of NEOs within the public policy process domain as a means to reflect on the science-policy nexus in regard to risks and multi-hazard impacts associated with these hazards. This will be accomplished through, first, a brief overview of the science-policy nexus, followed by a discussion of policy process frameworks, such as agenda setting and the multiple streams model, focusing events, and punctuated equilibrium, and their application and appropriateness to the problem of NEOs. How, for example, does the NEO hazard compare with other low-probability, high-risk hazards in regard to public policy? Finally, we will reflect on the implications of alternative NEO "solutions" and the characterization of the NEO "problem," and the political and public acceptance of policy alternatives as a way to link NEO science and policy in the context of the overall NH9.12 panel.

  19. Statistics of EMIC Rising Tones Observed by the Van Allen Probes

    NASA Astrophysics Data System (ADS)

    Sigsbee, K. M.; Kletzing, C.; Smith, C. W.; Santolik, O.

    2017-12-01

    We will present results from an ongoing statistical study of electromagnetic ion cyclotron (EMIC) wave rising tones observed by the Van Allen Probes. Using data from the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) fluxgate magnetometer, we have identified orbits by both Van Allen Probes with EMIC wave events from the start of the mission in fall 2012 through fall 2016. Orbits with EMIC wave events were further examined for evidence of rising tones. Most EMIC wave rising tones were found during H+ band EMIC wave events. In Fourier time-frequency power spectrograms of the fluxgate magnetometer data, H+ band rising tones generally took the form of triggered emission type events, where the discrete rising tone structures rapidly rise in frequency out of the main band of observed H+ EMIC waves. A smaller percentage of EMIC wave rising tone events were found in the He+ band, where rising tones may appear as discrete structures with a positive slope embedded within the main band of observed He+ EMIC waves, similar in appearance to whistler-mode chorus elements. Understanding the occurrence rate and properties of rising tone EMIC waves will provide observational context for theoretical studies indicating that EMIC waves exhibiting non-linear behavior, such as rising tones, may be more effective at scattering radiation belt electrons than ordinary EMIC waves.

  20. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
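RPST itself is not publicly available, but the discrete-event process simulation at its core can be sketched minimally. The toy below assumes a FIFO queue feeding a pool of identical stations, with a future-event list driving the clock; it is a stand-in for the idea, not RPST's engine:

```python
import heapq

def simulate(jobs, n_servers=1):
    """Toy discrete-event simulation: a FIFO queue feeding n_servers stations.

    jobs: iterable of (arrival_time, service_time) pairs.
    Returns a list of (job, finish_time) in departure order.
    """
    # Future-event list, ordered by (time, kind); "arrive" sorts before "depart".
    events = [(t, "arrive", (t, s)) for t, s in jobs]
    heapq.heapify(events)
    queue, busy, finished = [], 0, []
    while events:
        time, kind, job = heapq.heappop(events)
        if kind == "arrive":
            queue.append(job)
        else:  # "depart": a station frees up
            busy -= 1
            finished.append((job, time))
        # Start service for waiting jobs while stations are free.
        while queue and busy < n_servers:
            nxt = queue.pop(0)
            busy += 1
            heapq.heappush(events, (time + nxt[1], "depart", nxt))
    return finished
```

With one station, three jobs of 2 time units arriving at 0, 1, 2 finish at 2, 4, 6; adding a second station lets the second job start immediately. Capacity questions of the kind RPST answers reduce to running such models under varying `n_servers` and demand.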

  1. 40 CFR 464.31 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... discrete list of toxic organic pollutants for each process segment where it is regulated, as follows: (1... discrete wet scrubbing devices are employed in series in a single melting furnace exhaust gas stream. The ferrous melting furnace scrubber mass allowance shall be given to each discrete wet scrubbing device that...

  2. Wheat mill stream properties for discrete element method modeling

    USDA-ARS?s Scientific Manuscript database

    A discrete phase approach based on individual wheat kernel characteristics is needed to overcome the limitations of previous statistical models and accurately predict the milling behavior of wheat. As a first step to develop a discrete element method (DEM) model for the wheat milling process, this s...

  3. The role of flooding in the occurrence of sinkholes in mantled karst setting, Orléans area (France)

    NASA Astrophysics Data System (ADS)

    Noury, Gildas; Perrin, Jerome; Luu, Li-Hua; Philippe, Pierre

    2017-04-01

The Loire River basin is regularly impacted by sinkholes because of its specific geological context: karstic limestone overlain by soft materials. Intense rainfall and the associated flooding that occurred in this area in May and June 2016 triggered the collapse of tens of sinkholes. At least 20 houses, one high-traffic road, one levee of the Loire River and one highway were directly threatened. This event highlights not only the vulnerability of the area, especially in the case of a disastrous flood of the Loire River, but also unexpectedly fast kinetics of the process. Two different types of sinkholes occurred in flooded areas: on the plateau, spectacular drop-out of former natural caves is suspected; in the Loire valley, flooding is thought to have accelerated the suffosion of alluvium by a factor of 10 000 to 20 000. This feedback brings new insights into the process dynamics, which are currently being analysed in more detail using an innovative internal-erosion numerical modelling approach based on the Discrete Element Method (DEM) and the Lattice Boltzmann Method (LBM). A better understanding of sinkhole formation is crucial for adequate risk management, especially in the case of a large flooding event.

  4. Role of stochastic processes in maintaining discrete strain structure in antigenically diverse pathogen populations.

    PubMed

    Buckee, Caroline O; Recker, Mario; Watkins, Eleanor R; Gupta, Sunetra

    2011-09-13

    Many highly diverse pathogen populations appear to exist stably as discrete antigenic types despite evidence of genetic exchange. It has been shown that this may arise as a consequence of immune selection on pathogen populations, causing them to segregate permanently into discrete nonoverlapping subsets of antigenic variants to minimize competition for available hosts. However, discrete antigenic strain structure tends to break down under conditions where there are unequal numbers of allelic variants at each locus. Here, we show that the inclusion of stochastic processes can lead to the stable recovery of discrete strain structure through loss of certain alleles. This explains how pathogen populations may continue to behave as independently transmitted strains despite inevitable asymmetries in allelic diversity of major antigens. We present evidence for this type of structuring across global meningococcal isolates in three diverse antigens that are currently being developed as vaccine components.

  5. Apparatus and process for determining the susceptibility of microorganisms to antibiotics

    NASA Technical Reports Server (NTRS)

    Gibson, Sandra F. (Inventor); Fadler, Norman L. (Inventor)

    1976-01-01

    A process for determining the susceptibility of microorganisms to antibiotics involves introducing a diluted specimen into discrete quantities of a selective culture medium which favors a specific microorganism in that the microorganism is sustained by the medium and when so sustained will change the optical characteristics of the medium. Only the specific microorganism will alter the optical characteristics. Some of the discrete quantities are blended with known antibiotics, while at least one is not. If the specimen contains the microorganisms favored by the selective medium, the optical characteristics of the discrete quantity of pure selective medium, that is the one without antibiotics, will change. If the antibiotics in any of the other discrete quantities are ineffective against the favored microorganisms, the optical characteristics of those quantities will likewise change. No change in the optical characteristics of a discrete quantity indicates that the favored microorganism is susceptible to the antibiotic in the quantity.

  6. fixedTimeEvents: An R package for the distribution of distances between discrete events in fixed time

    NASA Astrophysics Data System (ADS)

    Liland, Kristian Hovde; Snipen, Lars

    When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
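The package is implemented in R; a hypothetical Python analogue of its test for over-representation of short distances might look like the sketch below, which replaces the paper's exact distribution with a Monte-Carlo permutation of the fixed number of successes over the fixed-length series (function names and the simulation approach are this sketch's, not the package's):

```python
import random

def distance_counts(outcomes):
    """Distances (gaps) between consecutive successes in a 0/1 Bernoulli series."""
    positions = [i for i, x in enumerate(outcomes) if x]
    return [b - a for a, b in zip(positions, positions[1:])]

def short_distance_pvalue(outcomes, max_gap, n_sim=2000, seed=7):
    """Monte-Carlo p-value for over-representation of gaps <= max_gap,
    conditioning on the number of successes and the series length."""
    rng = random.Random(seed)
    n, k = len(outcomes), sum(outcomes)
    observed = sum(d <= max_gap for d in distance_counts(outcomes))
    hits = 0
    for _ in range(n_sim):
        perm = [1] * k + [0] * (n - k)
        rng.shuffle(perm)
        if sum(d <= max_gap for d in distance_counts(perm)) >= observed:
            hits += 1
    return (hits + 1) / (n_sim + 1)  # add-one correction to avoid p = 0
```

Five successes packed at the start of a 100-trial series give a tiny p-value, as expected for strong clustering; the R package computes the exact distribution instead of simulating it.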

  7. A software bus for thread objects

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Li, Dehuai

    1995-01-01

    The authors have implemented a software bus for lightweight threads in an object-oriented programming environment that allows for rapid reconfiguration and reuse of thread objects in discrete-event simulation experiments. While previous research in object-oriented, parallel programming environments has focused on direct communication between threads, our lightweight software bus, called the MiniBus, provides a means to isolate threads from their contexts of execution by restricting communications between threads to message-passing via their local ports only. The software bus maintains a topology of connections between these ports. It routes, queues, and delivers messages according to this topology. This approach allows for rapid reconfiguration and reuse of thread objects in other systems without making changes to the specifications or source code. A layered approach that provides the needed transparency to developers is presented. Examples of using the MiniBus are given, and the value of bus architectures in building and conducting simulations of discrete-event systems is discussed.
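The port-and-topology idea is the essence of the design: threads never address each other directly, only their own local ports, and the bus routes along configured connections. A minimal sketch, using synchronous queues rather than real threads (class and method names are invented for illustration, not the MiniBus API):

```python
from collections import deque

class MiniBusSketch:
    """Toy software bus: components communicate only via named ports;
    the bus routes messages along a reconfigurable port topology."""

    def __init__(self):
        self.links = {}   # source port -> list of destination ports
        self.queues = {}  # destination port -> pending messages

    def connect(self, src, dst):
        """Add a topology edge from src port to dst port."""
        self.links.setdefault(src, []).append(dst)
        self.queues.setdefault(dst, deque())

    def send(self, src, message):
        """Route a message from src to every connected destination port."""
        for dst in self.links.get(src, []):
            self.queues[dst].append(message)

    def receive(self, port):
        """Pop the next queued message for a port, or None if empty."""
        q = self.queues.get(port)
        return q.popleft() if q else None

bus = MiniBusSketch()
bus.connect("clock.out", "sim.in")
bus.send("clock.out", "tick")
```

Because components only ever see their own ports, rewiring a simulation is a matter of changing `connect` calls, with no edits to the components themselves, which is the reuse property the paper emphasizes.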

  8. Discrete event simulation of patient admissions to a neurovascular unit.

    PubMed

    Hahn-Goldberg, S; Chow, E; Appel, E; Ko, F T F; Tan, P; Gavin, M B; Ng, T; Abrams, H B; Casaubon, L K; Carter, M W

    2014-01-01

    Evidence exists that clinical outcomes improve for stroke patients admitted to specialized Stroke Units. The Toronto Western Hospital created a Neurovascular Unit (NVU) using beds from general internal medicine, Neurology and Neurosurgery to care for patients with stroke and acute neurovascular conditions. Using patient-level data for NVU-eligible patients, a discrete event simulation was created to study changes in patient flow and length of stay pre- and post-NVU implementation. Varying patient volumes and resources were tested to determine the ideal number of beds under various conditions. In the first year of operation, the NVU admitted 507 patients, over 66% of NVU-eligible patient volumes. With the introduction of the NVU, length of stay decreased by around 8%. Scenario testing showed that the current level of 20 beds is sufficient for accommodating the current demand and would continue to be sufficient with an increase in demand of up to 20%.

  9. A generic discrete-event simulation model for outpatient clinics in a large public hospital.

    PubMed

    Weerawat, Waressara; Pichitlamken, Juta; Subsombat, Peerapong

    2013-01-01

    The orthopedic outpatient department (OPD) ward in a large Thai public hospital is modeled using Discrete-Event Stochastic (DES) simulation. Key Performance Indicators (KPIs) are used to measure effects across various clinical operations during different shifts throughout the day. By considering various KPIs such as wait times to see doctors, percentage of patients who can see a doctor within a target time frame, and the time that the last patient completes their doctor consultation, bottlenecks are identified and resource-critical clinics can be prioritized. The simulation model quantifies the chronic, high patient congestion that is prevalent amongst Thai public hospitals with very high patient-to-doctor ratios. Our model can be applied across five different OPD wards by modifying the model parameters. Throughout this work, we show how DES models can be used as decision-support tools for hospital management.

  10. A DISCRETE-EVENT SIMULATION APPROACH TO IDENTIFY RULES THAT GOVERN ARBOR REMODELING FOR BRANCHING CUTANEOUS AFFERENTS IN HAIRY SKIN.

    PubMed

    Kang, Hyojung; Orlowsky, Rachel L; Gerling, Gregory J

    2017-12-01

In mammals, touch is encoded by sensory receptors embedded in the skin. For one class of receptors in the mouse, the architecture of its Merkel cells, unmyelinated neurites, and heminodes follows particular renewal and remodeling trends over hair cycle stages from ages 4 to 10 weeks. As it is currently impossible to observe such trends across a single animal's hair cycle, this work employs discrete event simulation to identify and evaluate policies of Merkel cell and heminode dynamics. Matching the observed data well, the results show that the baseline model replicates dynamic remodeling behaviors between stages of the hair cycle, based on particular addition and removal policies and estimated probabilities tied to the constituent parts: Merkel cells, terminal branch neurites, and heminodes. The analysis further shows that certain policies hold greater influence than others. This use of computation is a novel approach to understanding neuronal development.

  11. Short, large amplitude speed enhancements in the near-Sun fast solar wind

    NASA Astrophysics Data System (ADS)

    Horbury, T. S.; Matteini, L.; Stansby, D.

    2018-04-01

    We report the presence of intermittent, short discrete enhancements in plasma speed in the near-Sun high speed solar wind. Lasting tens of seconds to minutes in spacecraft measurements at 0.3 AU, speeds inside these enhancements can reach 1000 km/s, corresponding to a kinetic energy up to twice that of the bulk high speed solar wind. These events, which occur around 5% of the time, are Alfvénic in nature with large magnetic field deflections and are the same temperature as the surrounding plasma, in contrast to the bulk fast wind which has a well-established positive speed-temperature correlation. The origin of these speed enhancements is unclear but they may be signatures of discrete jets associated with transient events in the chromosphere or corona. Such large short velocity changes represent a measurement and analysis challenge for the upcoming Parker Solar Probe and Solar Orbiter missions.

  12. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  13. Resource-Constrained Spatial Hot Spot Identification

    DTIC Science & Technology

    2011-01-01

[Extraction fragment] Hot-spot identification techniques fall into three categories (Cameron and Leitner, 2005): thematic mapping, in which concentrations of events are color-coded in discrete geographic areas; kernel density interpolation; and hierarchical clustering. The techniques are illustrated with maps of Boston burglary events in 1999, showing burglary rates per 100,000 residents, provided by Cameron and Leitner (2005).

  14. Multiple hydrothermal and metamorphic events in the Kidd Creek volcanogenic massive sulphide deposit, Timmins, Ontario: evidence from tourmalines and chlorites

    USGS Publications Warehouse

    Slack, J.F.; Coad, P.R.

    1989-01-01

The tourmalines and chlorites record a series of multiple hydrothermal and metamorphic events. Paragenetic studies suggest that tourmaline was deposited during several discrete stages of mineralization, as evidenced by brecciation and cross-cutting relationships. Most of the tourmalines have two concentric growth zones defined by different colours (green, brown, blue, yellow). Some tourmalines also display pale discordant rims that cross-cut and embay the inner growth zones and polycrystalline, multiple-extinction domains. Late sulphide veinlets (chalcopyrite, pyrrhotite) transect the inner growth zones and pale discordant rims of many crystals. The concentric growth zones are interpreted as primary features developed by the main ore-forming hydrothermal system, whereas the discordant rims, polycrystalline domains, and cross-cutting sulphide veinlets reflect post-ore metamorphic processes. Variations in mineral proportions and mineral chemistry within the deposit mainly depend on fluctuations in temperature, pH, water/rock ratios, and amounts of entrained seawater. -from Authors

  15. MicroRNA29a regulates IL-33-mediated tissue remodelling in tendon disease

    PubMed Central

    Millar, Neal L.; Gilchrist, Derek S.; Akbar, Moeed; Reilly, James H.; Kerr, Shauna C.; Campbell, Abigail L.; Murrell, George A. C.; Liew, Foo Y.; Kurowska-Stolarska, Mariola; McInnes, Iain B.

    2015-01-01

MicroRNA (miRNA) has the potential for cross-regulation and functional integration of discrete biological processes during complex physiological events. Utilizing the common human condition tendinopathy as a model system to explore the cross-regulation of immediate inflammation and matrix synthesis by miRNA, we observed that elevated IL-33 expression is characteristic of early tendinopathy. Using in vitro tenocyte cultures and in vivo models of tendon damage, we demonstrate that such IL-33 expression plays a pivotal role in the transition from type 1 to type 3 collagen (Col3) synthesis and thus early tendon remodelling. Both IL-33 effector function, via its decoy receptor sST2, and Col3 synthesis are regulated by miRNA29a. Downregulation of miRNA29a in human tenocytes is sufficient to induce an increase in Col3 expression. These data provide a molecular mechanism of miRNA-mediated integration of the early pathophysiologic events that facilitate tissue remodelling in human tendon after injury. PMID:25857925

  16. Pore invasion dynamics during fluid front displacement in porous media determine functional pore size distribution and phase entrapment

    NASA Astrophysics Data System (ADS)

    Moebius, F.; Or, D.

    2012-12-01

Dynamics of fluid fronts in porous media shape transport properties of the unsaturated zone and affect management of petroleum reservoirs and their storage properties. What appears macroscopically as smooth and continuous motion of a displacement fluid front may involve numerous rapid interfacial jumps, often resembling avalanches of invasion events. Direct observations using a high-speed camera and pressure sensors in sintered glass micro-models provide new insights into the influence of flow rates, pore size, and gravity on invasion events and on burst size distribution. Fundamental differences emerge between geometrically-defined pores and "functional" pores invaded during a single burst (invasion event). The waiting-time distribution of individual invasion events and the decay times of inertial oscillations (following a rapid interfacial jump) are characteristic of different displacement regimes. An invasion percolation model with gradients, including the role of inertia, provides a framework for linking flow regimes with invasion sequences and phase entrapment. Model results were compared with measurements and with early studies on invasion burst sizes and waiting-time distributions during slow drainage processes by Måløy et al. [1992]. The study provides new insights into discrete invasion events and their weak links with geometrically-deduced pore geometry. Results highlight factors controlling pore invasion events that exert strong influence on macroscopic phenomena such as front morphology and residual phase entrapment, shaping hydraulic properties after the passage of a fluid front.

  17. Finite Volume Element (FVE) discretization and multilevel solution of the axisymmetric heat equation

    NASA Astrophysics Data System (ADS)

    Litaker, Eric T.

    1994-12-01

The axisymmetric heat equation, resulting from a point-source of heat applied to a metal block, is solved numerically; both iterative and multilevel solutions are computed in order to compare the two processes. The continuum problem is discretized in two stages: finite differences are used to discretize the time derivatives, resulting in a fully implicit backward time-stepping scheme, and the Finite Volume Element (FVE) method is used to discretize the spatial derivatives. The application of the FVE method to a problem in cylindrical coordinates is new, and results in stencils which are analyzed extensively. Several iteration schemes are considered, including both Jacobi and Gauss-Seidel; a thorough analysis of these schemes is done, using both the spectral radii of the iteration matrices and local mode analysis. Using this discretization, a Gauss-Seidel relaxation scheme is used to solve the heat equation iteratively. A multilevel solution process is then constructed, including the development of intergrid transfer and coarse grid operators. Local mode analysis is performed on the components of the amplification matrix, resulting in the two-level convergence factors for various combinations of the operators. A multilevel solution process is implemented by using multigrid V-cycles; the iterative and multilevel results are compared and discussed in detail. The computational savings resulting from the multilevel process are then discussed.
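Setting the multilevel machinery aside, the Gauss-Seidel relaxation the work builds on can be shown on the simplest possible case: the 1-D discrete Laplace equation with fixed boundary values, whose exact solution is linear. This is a sketch of the smoother only, not the paper's axisymmetric FVE discretization:

```python
def gauss_seidel_laplace(n=32, left=0.0, right=1.0, tol=1e-10, max_iter=10_000):
    """Gauss-Seidel relaxation for the 1-D discrete Laplace equation
    u[i] = (u[i-1] + u[i+1]) / 2 with fixed endpoint values.

    Returns the converged grid values and the iteration count.
    """
    u = [0.0] * n
    u[0], u[-1] = left, right
    for it in range(max_iter):
        delta = 0.0
        for i in range(1, n - 1):
            # Gauss-Seidel: use already-updated u[i-1] within the same sweep.
            new = 0.5 * (u[i - 1] + u[i + 1])
            delta = max(delta, abs(new - u[i]))
            u[i] = new
        if delta < tol:
            break
    return u, it + 1

u, iterations = gauss_seidel_laplace(n=16)
```

The hundreds of sweeps this takes even on a 16-point grid, versus the grid-size-independent work of a multigrid V-cycle, is precisely the computational saving the paper quantifies.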

  18. Hybrid High-Fidelity Modeling of Radar Scenarios Using Atemporal, Discrete-Event, and Time-Step Simulation

    DTIC Science & Technology

    2016-12-01

[Extraction fragment] ... the time T1 for the mover to travel from the current position to the next waypoint is calculated as T1 = Distance / MaxSpeed. The "EndMove" event will... speed of light in a real atmosphere. The factor of 1/2 results from the round-trip travel time of the signal. The maximum detection range (Rmax) is... When the "EnterRange" event is triggered by the referee, the time tm for the target to travel to the midpoint towards its waypoint is calculated and applied

  19. 15 CFR 301.5 - Processing of applications by the Department of Commerce.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the information contained in the prior submission, shall not be considered in making the decision on... his discretion, may take into account factual information contained in untimely comments. (6... discretion at any stage of processing to insert into the record and consider in making his decision any...

  20. Mortar and artillery variants classification by exploiting characteristics of the acoustic signature

    NASA Astrophysics Data System (ADS)

    Hohil, Myron E.; Grasing, David; Desai, Sachi; Morcos, Amir

    2007-10-01

Feature extraction methods based on the discrete wavelet transform (DWT) and multiresolution analysis facilitate the development of a robust classification algorithm that reliably discriminates mortar and artillery variants via the acoustic signals produced during launch/impact events. Acoustic sensors are used to exploit the sound waveform generated by the blast for the identification of mortar and artillery variants. Distinct characteristics arise among the different mortar variants because varying HE mortar payloads and related charges emphasize concussive and shrapnel effects upon impact, employing explosions of varying magnitude. The different mortar variants are characterized by variations in the resulting waveform of the event. The waveform holds various harmonic properties distinct to a given mortar/artillery variant that, through advanced signal processing techniques, can be employed to classify a given set. The DWT and other readily available signal processing techniques will be used to extract the predominant components of these characteristics from the acoustic signatures at ranges exceeding 2 km. Exploiting these techniques will help develop a feature set highly independent of range, providing discrimination based on acoustic elements of the blast wave. Highly reliable discrimination will be achieved with a feed-forward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients, the frequency spectrum, and higher-frequency details found within different levels of the multiresolution decomposition. The process described herein extends current technologies, which emphasize multi-modal sensor fusion suites to provide such situational awareness. A twofold problem of energy consumption and line of sight arises with multi-modal sensor suites. The process described within will exploit the acoustic properties of the event to provide variant classification as added situational awareness to the soldier.
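A minimal sketch of DWT-based energy features of the kind described above, using a hand-rolled one-level Haar transform applied recursively; the Haar wavelet, three-level depth, and relative-energy feature are this sketch's choices, not necessarily the paper's:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists, energy-preserving."""
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def wavelet_energy_features(signal, levels=3):
    """Relative detail energy per decomposition level: a simple feature set
    that depends on the energy distribution across scales, not total amplitude."""
    feats, current = [], list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        feats.append(sum(d * d for d in detail))
    total = sum(feats) + sum(c * c for c in current)
    return [f / total for f in feats] if total else feats
```

Because the features are normalized by total energy, they are insensitive to overall amplitude, which is one simple route to the range independence the paper targets.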

  1. Reduced discretization error in HZETRN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaba, Tony C., E-mail: Tony.C.Slaba@nasa.gov; Blattnig, Steve R., E-mail: Steve.R.Blattnig@nasa.gov; Tweed, John, E-mail: jtweed@odu.edu

    2013-02-01

The deterministic particle transport code HZETRN is an efficient analysis tool for studying the effects of space radiation on humans, electronics, and shielding materials. In a previous work, numerical methods in the code were reviewed, and new methods were developed that further improved efficiency and reduced overall discretization error. It was also shown that the remaining discretization error could be attributed to low energy light ions (A < 4) with residual ranges smaller than the physical step-size taken by the code. Accurately resolving the spectrum of low energy light particles is important in assessing risk associated with astronaut radiation exposure. In this work, modifications to the light particle transport formalism are presented that accurately resolve the spectrum of low energy light ion target fragments. The modified formalism is shown to significantly reduce overall discretization error and allows a physical approximation to be removed. For typical step-sizes and energy grids used in HZETRN, discretization errors for the revised light particle transport algorithms are shown to be less than 4% for aluminum and water shielding thicknesses as large as 100 g/cm² exposed to both solar particle event and galactic cosmic ray environments.

  2. Discrete wavelet-aided delineation of PCG signal events via analysis of an area curve length-based decision statistic.

    PubMed

    Homaeinezhad, M R; Atyabi, S A; Daneshvar, E; Ghaffari, A; Tahmasebi, M

    2010-12-01

The aim of this study is to describe a robust unified framework for segmentation of phonocardiogram (PCG) signal sounds based on false-alarm-probability (FAP)-bounded segmentation of a properly calculated detection measure. To this end, the original PCG signal is first appropriately pre-processed, and then a fixed-sample-size sliding window is moved over the pre-processed signal. In each slide, the area under the excerpted segment is multiplied by its curve length to generate the Area Curve Length (ACL) metric, which is used as the segmentation decision statistic (DS). Afterwards, histogram parameters of the nonlinearly enhanced DS metric are used to regulate the α-level Neyman-Pearson classifier for FAP-bounded delineation of the PCG events. The proposed method was applied to all 85 records of the Nursing Student Heart Sounds database (NSHSDB), including stenosis, insufficiency, regurgitation, gallop, septal defect, split sound, rumble, murmur, clicks, friction rub and snap disorders, with different sampling frequencies. The method was also applied to records obtained from an electronic stethoscope board designed for this study, in the presence of high-level power-line noise and external disturbing sounds, and as a result, no false-positive (FP) or false-negative (FN) errors were detected. High noise robustness, acceptable detection-segmentation accuracy of PCG events in various cardiac conditions, and no dependency of parameters on the acquisition sampling frequency can be mentioned as the principal virtues of the proposed ACL-based PCG event detection-segmentation algorithm.
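The ACL statistic itself is simple to compute. The sketch below assumes a unit sampling interval and reads "area under the segment" as rectified area; both are assumptions of this sketch, and the paper's pre-processing and nonlinear enhancement steps are omitted:

```python
def area_curve_length(segment, dt=1.0):
    """ACL decision statistic for one window:
    (rectified area under the segment) x (curve length of the segment)."""
    area = sum(abs(x) for x in segment) * dt
    curve = sum(((segment[i + 1] - segment[i]) ** 2 + dt ** 2) ** 0.5
                for i in range(len(segment) - 1))
    return area * curve

def acl_series(signal, window=16, step=4):
    """Slide a fixed-size window over the signal, emitting one ACL value per slide."""
    return [area_curve_length(signal[i:i + window])
            for i in range(0, len(signal) - window + 1, step)]
```

Quiet windows score near zero while windows containing a heart-sound burst score high on both factors at once, which is why the product makes a sharper detection measure than area or curve length alone.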

  3. A fuzzy Petri-net-based mode identification algorithm for fault diagnosis of complex systems

    NASA Astrophysics Data System (ADS)

    Propes, Nicholas C.; Vachtsevanos, George

    2003-08-01

    Complex dynamical systems such as aircraft, manufacturing systems, chillers, motor vehicles, submarines, etc. exhibit continuous and event-driven dynamics. These systems undergo several discrete operating modes from startup to shutdown. For example, a certain shipboard system may be operating at half load or full load or may be at start-up or shutdown. Of particular interest are extreme or "shock" operating conditions, which tend to severely impact fault diagnosis or the progression of a fault leading to a failure. Fault conditions are strongly dependent on the operating mode. Therefore, it is essential that in any diagnostic/prognostic architecture, the operating mode be identified as accurately as possible so that such functions as feature extraction, diagnostics, prognostics, etc. can be correlated with the predominant operating conditions. This paper introduces a mode identification methodology that incorporates both time- and event-driven information about the process. A fuzzy Petri net is used to represent the possible successive mode transitions and to detect events from processed sensor signals signifying a mode change. The operating mode is initialized and verified by analysis of the time-driven dynamics through a fuzzy logic classifier. An evidence combiner module is used to combine the results from both the fuzzy Petri net and the fuzzy logic classifier to determine the mode. Unlike most event-driven mode identifiers, this architecture will provide automatic mode initialization through the fuzzy logic classifier and robustness through the combining of evidence of the two algorithms. The mode identification methodology is applied to an AC Plant typically found as a component of a shipboard system.

  4. Using Discrete Event Simulation to Model the Economic Value of Shorter Procedure Times on EP Lab Efficiency in the VALUE PVI Study.

    PubMed

    Kowalski, Marcin; DeVille, J Brian; Svinarich, J Thomas; Dan, Dan; Wickliffe, Andrew; Kantipudi, Charan; Foell, Jason D; Filardo, Giovanni; Holbrook, Reece; Baker, James; Baydoun, Hassan; Jenkins, Mark; Chang-Sing, Peter

    2016-05-01

The VALUE PVI study demonstrated that atrial fibrillation (AF) ablation procedures and electrophysiology laboratory (EP lab) occupancy times were reduced for the cryoballoon compared with focal radiofrequency (RF) ablation. However, the economic impact associated with the cryoballoon procedure for hospitals has not been determined. Assess the economic value associated with shorter AF ablation procedure times based on VALUE PVI data. A model was formulated from data from the VALUE PVI study. This model used a discrete event simulation to translate procedural efficiencies into metrics utilized by hospital administrators. A 1000-day period was simulated to determine the accrued impact of procedure time on an institution's EP lab when considering staff and hospital resources. The simulation demonstrated that procedures performed with the cryoballoon catheter resulted in several efficiencies, including: (1) a reduction of 36.2 percentage points in the share of days with overtime (422 days RF vs 60 days cryoballoon); (2) 92.7% fewer cumulative overtime hours (370 hours RF vs 27 hours cryoballoon); and (3) an increase of 46.7 percentage points in the share of days with time for additional EP lab usage (186 days RF vs 653 days cryoballoon). Importantly, the added EP lab utilization could not support the time required for an additional AF ablation procedure. The discrete event simulation of the VALUE PVI data demonstrates the potential positive economic value of AF ablation procedures using the cryoballoon. These benefits include more days where overtime is avoided, fewer cumulative overtime hours, and more days with time left for additional usage of EP lab resources.
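The kind of simulation used here can be approximated by a simple Monte Carlo over scheduled lab days. This is a hypothetical sketch: the 8-hour shift, three procedures per day, and the Gaussian procedure-time model are our assumptions, not the VALUE PVI model:

```python
import random

def simulate_lab_days(n_days, procedures_per_day, proc_time_mean, proc_time_sd,
                      shift_hours=8.0, seed=0):
    """Count days with overtime and cumulative overtime hours when a
    fixed number of procedures is scheduled into a daily shift."""
    rng = random.Random(seed)
    overtime_days = 0
    overtime_hours = 0.0
    for _ in range(n_days):
        # Total occupancy for the day: sum of randomly drawn procedure times.
        total = sum(max(0.5, rng.gauss(proc_time_mean, proc_time_sd))
                    for _ in range(procedures_per_day))
        if total > shift_hours:
            overtime_days += 1
            overtime_hours += total - shift_hours
    return overtime_days, overtime_hours
```

Feeding in a shorter mean procedure time, as reported for the cryoballoon, reproduces the qualitative result: fewer overtime days and fewer cumulative overtime hours.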

  5. Improving outpatient phlebotomy service efficiency and patient experience using discrete-event simulation.

    PubMed

    Yip, Kenneth; Pang, Suk-King; Chan, Kui-Tim; Chan, Chi-Kuen; Lee, Tsz-Leung

    2016-08-08

Purpose - The purpose of this paper is to present a simulation modeling application to reconfigure the outpatient phlebotomy service of an acute regional and teaching hospital in Hong Kong, with an aim to improve service efficiency, shorten patient queuing time and enhance workforce utilization. Design/methodology/approach - The system was modeled as an inhomogeneous Poisson process and a discrete-event simulation model was developed to simulate the current setting, and to evaluate how various performance metrics would change if switched from a decentralized to a centralized model. Variations were then made to the model to test different workforce arrangements for the centralized service, so that managers could decide on the service's final configuration via an evidence-based and data-driven approach. Findings - This paper provides empirical insights about the relationship between staffing arrangement and system performance via a detailed scenario analysis. One particular staffing scenario was chosen by managers as it was considered to strike the best balance between performance and workforce scheduling. The resulting centralized phlebotomy service was successfully commissioned. Practical implications - This paper demonstrates how analytics can be used for operational planning at the hospital level. The authors show that a transparent and evidence-based scenario analysis, made available through analytics and simulation, greatly helps management and clinical stakeholders arrive at the ideal service configuration. Originality/value - The authors provide a robust method for evaluating the relationship between workforce investment, queuing reduction and workforce utilization, which is crucial for managers when deciding the delivery model for any outpatient-related service.
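An inhomogeneous Poisson arrival process of the kind used for the demand model is commonly sampled by Lewis-Shedler thinning; a minimal sketch (the piecewise rate function in the usage example is hypothetical, not the hospital's arrival profile):

```python
import random

def thinned_arrivals(rate_fn, t_end, rate_max, seed=0):
    """Sample arrival times from an inhomogeneous Poisson process by
    Lewis-Shedler thinning: draw candidate arrivals from a homogeneous
    process at rate_max, then keep each candidate at time t with
    probability rate_fn(t) / rate_max."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)   # next homogeneous candidate
        if t > t_end:
            return arrivals
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)
```

For example, a morning-peaked clinic could use `rate_fn = lambda t: 10.0 if t < 4 else 2.0` (patients per hour) over an eight-hour day, with `rate_max = 10.0` bounding the rate from above.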

  6. High Fidelity Simulations of Large-Scale Wireless Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma; Benz, Zachary

The worldwide proliferation of wireless connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with a further explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications, but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES to poor scaling (e.g., in the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating communication overhead associated with synchronizations. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly-regarded capabilities in large-scale emulations have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static and (b) the nodes have fixed locations.
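The DES core that PDES techniques parallelize is a time-ordered event loop; a minimal sequential sketch (unrelated to any specific simulator's API; the event/handler shape is an assumption of ours):

```python
import heapq

def run_des(initial_events, horizon):
    """Minimal sequential discrete-event loop.

    Events are (time, seq, handler) tuples; `seq` is a unique
    tie-breaker so the heap never compares handler functions.
    Each handler receives the current time and returns a list of
    (delay, handler) pairs for follow-up events to schedule.
    """
    queue = list(initial_events)
    heapq.heapify(queue)
    processed = []            # timestamps of executed events
    seq = len(queue)
    while queue:
        t, _, handler = heapq.heappop(queue)
        if t > horizon:
            break
        processed.append(t)
        for dt, h in handler(t):
            heapq.heappush(queue, (t + dt, seq, h))
            seq += 1
    return processed
```

A PDES engine distributes this loop across processors, which is exactly where mobility-driven connectivity hurts: events on one node can spawn events on almost any other, forcing frequent synchronization.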

  7. Modeling and control of operator functional state in a unified framework of fuzzy inference petri nets.

    PubMed

    Zhang, Jian-Hua; Xia, Jia-Jun; Garibaldi, Jonathan M; Groumpos, Petros P; Wang, Ru-Bin

    2017-06-01

In human-machine (HM) hybrid control systems, the human operator and the machine cooperate to achieve the control objectives. To enhance overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the operator's psychophysiological functional status, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the co-existence of a discrete task-load (control) variable and a continuous operator performance (system output) variable. The Petri net is an effective tool for modeling discrete event systems, but for hybrid systems involving continuous dynamics the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposed a fuzzy inference Petri net (FIPN) method to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller, in a unified framework in which the task-load level is dynamically reallocated between the operator and the machine based on the model-predicted OFS. Furthermore, this paper used a multi-model approach to predict operator performance based on three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate the overall modeling accuracy. Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) yields a lower breakdown rate (from 14.8% to 3.27%) and higher human performance (from 90.30% to 91.99%).
The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of continuous-time OFS model and discrete-event switching controller. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Hybrid Discrete-Continuous Markov Decision Processes

    NASA Technical Reports Server (NTRS)

    Feng, Zhengzhu; Dearden, Richard; Meuleau, Nicholas; Washington, Rich

    2003-01-01

This paper proposes a Markov decision process (MDP) model that features both discrete and continuous state variables. We extend previous work by Boyan and Littman on the one-dimensional time-dependent MDP to multiple dimensions. We present the principle of lazy discretization, and piecewise constant and linear approximations of the model. Having to deal with several continuous dimensions raises several new problems that require new solutions. In the (piecewise) linear case, we use techniques from partially observable MDPs (POMDPs) to represent value functions as sets of linear functions attached to different partitions of the state space.
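The POMDP-style representation mentioned above stores a convex piecewise linear value function as the upper envelope of a set of linear pieces; a minimal sketch over one continuous dimension (the grid-based pruning rule is a simple illustration of ours, not the paper's algorithm):

```python
def pwl_value(alpha_vectors, x):
    """Evaluate a convex piecewise linear value function represented,
    POMDP-style, as the upper envelope of linear pieces a + b*x."""
    return max(a + b * x for a, b in alpha_vectors)

def prune(alpha_vectors, xs):
    """Keep only the linear pieces that attain the maximum somewhere
    on a grid of test points (a crude, illustrative pruning rule;
    exact pruning would use linear programming)."""
    keep = set()
    for x in xs:
        vals = [a + b * x for a, b in alpha_vectors]
        keep.add(vals.index(max(vals)))
    return [alpha_vectors[i] for i in sorted(keep)]
```

Pieces that are dominated everywhere never contribute to the envelope and can be dropped without changing the represented value function.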

  9. Amplifying (Im)perfection: The Impact of Crystallinity in Discrete and Disperse Block Co-oligomers

    PubMed Central

    2017-01-01

Crystallinity is seldom utilized as part of the microphase segregation process in ultralow-molecular-weight block copolymers. Here, we show the preparation of two types of discrete, semicrystalline block co-oligomers, comprising an amorphous oligodimethylsiloxane block and a crystalline oligo-l-lactic acid or oligomethylene block. The self-assembly of these discrete materials results in lamellar structures with unforeseen uniformity in the domain spacing. A systematic introduction of dispersity reveals the extreme sensitivity of the microphase segregation process toward chain length dispersity in the crystalline block. PMID:28994585

  10. Amplifying (Im)perfection: The Impact of Crystallinity in Discrete and Disperse Block Co-oligomers.

    PubMed

    van Genabeek, Bas; Lamers, Brigitte A G; de Waal, Bas F M; van Son, Martin H C; Palmans, Anja R A; Meijer, E W

    2017-10-25

Crystallinity is seldom utilized as part of the microphase segregation process in ultralow-molecular-weight block copolymers. Here, we show the preparation of two types of discrete, semicrystalline block co-oligomers, comprising an amorphous oligodimethylsiloxane block and a crystalline oligo-l-lactic acid or oligomethylene block. The self-assembly of these discrete materials results in lamellar structures with unforeseen uniformity in the domain spacing. A systematic introduction of dispersity reveals the extreme sensitivity of the microphase segregation process toward chain length dispersity in the crystalline block.

  11. 78 FR 61113 - Acquisition Process: Task and Delivery Order Contracts, Bundling, Consolidation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... that a total set-aside is not appropriate but the procurement can be broken up into smaller discrete... discrete components to support a partial set-aside and market research shows that either: at least two... could divide a multiple award contract for divergent goods and services into discrete categories (which...

  12. Discrete Photodetection and Susskind-Glogower Phase Operators

    NASA Technical Reports Server (NTRS)

    Ben-Aryeh, Y.

    1996-01-01

State reduction processes in different types of photodetection experiments are described by using different kinds of ladder operators. A special model of discrete photodetection is developed by the use of superoperators which are based on the Susskind-Glogower raising and lowering operators. The possibility of experimentally realizing the discrete photodetection scheme in a micromaser is discussed.

  13. Hydrological Simulation of Flood Events At Large Basins Using Distributed Modelling

    NASA Astrophysics Data System (ADS)

    Vélez, J.; Vélez, I.; Puricelli, M.; Francés, F.

Recent advances in technology allow the scientific community to develop new procedures to reduce the risk associated with flood events. A conceptual distributed model, named TETIS, has been implemented to simulate the hydrological processes involved during floods. The basin is divided into rectangular cells, all connected according to the drainage network. The rainfall-runoff process is modelled using four linked tanks at each cell, with different outflow relationships at each tank representing ET, direct runoff, interflow and base flow, respectively. Routing along the channel network is based on basin geomorphologic characteristics coupled to the kinematic wave procedure. Vertical movement within each cell is described by simple relationships based on soil properties such as field capacity and saturated hydraulic conductivity, previously obtained from land use, lithology, edaphology and basin property maps. The vertical processes included at each cell are: capillary storage, infiltration, percolation and underground losses. Finally, snowmelt and reservoir routing have been included. TETIS has been implemented in the flood warning system of the Tagus River, with a basin of 59 200 km2. The time discretization of the input data is 15 minutes, and the cell size is 500x500 m. The basic parameter maps were estimated for the entire basin, and calibration and validation were performed using recorded events in the upper part of the basin. Calibration confirmed the initial parameter estimation. Additionally, the validation in time and space showed the robustness of this type of model.
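The four-tank cell structure described above can be sketched as a chain of conceptual reservoirs (illustrative only; the parameter names, the fixed split fractions, and the linear-reservoir outflows are our simplifications, not the TETIS formulation, and ET, snowmelt and routing are omitted):

```python
def step_cell(state, rain, cap_max, k_direct, k_inter, k_base,
              infil_frac, perc_frac):
    """One time step of a simplified four-tank cell: the static
    (capillary) storage is filled first; the excess is split into
    direct-runoff, interflow, and base-flow reservoirs, each drained
    as a linear reservoir. Returns (new state, total outflow)."""
    static, direct, inter, base = state
    # Fill capillary (static) storage first; only the excess moves on.
    to_static = min(rain, cap_max - static)
    static += to_static
    excess = rain - to_static
    infiltration = infil_frac * excess
    direct += excess - infiltration          # surface (direct runoff) tank
    percolation = perc_frac * infiltration
    inter += infiltration - percolation      # interflow tank
    base += percolation                      # base-flow tank
    # Linear-reservoir outflows from each tank.
    q_d, q_i, q_b = k_direct * direct, k_inter * inter, k_base * base
    direct -= q_d
    inter -= q_i
    base -= q_b
    return (static, direct, inter, base), q_d + q_i + q_b
```

By construction, each step conserves mass: cumulative rainfall equals the change in total storage plus cumulative outflow (ET and losses being omitted here).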

  14. Acoustic emission techniques applied to conventionally reinforced concrete bridge girders.

    DOT National Transportation Integrated Search

    2008-09-01

    Reinforced concrete (RC) bridges generally operate at service-level loads except during discrete overload events that can reduce the integrity of the structure by initiating concrete cracks, widening or extending of existing concrete cracks, as well ...

  15. Evolution of damage during deformation in porous granular materials (Louis Néel Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Main, Ian

    2014-05-01

    'Crackling noise' occurs in a wide variety of systems that respond to external forcing in an intermittent way, leading to sudden bursts of energy release similar to those heard when crunching up a piece of paper or listening to a fire. In mineral magnetism ('Barkhausen') crackling noise occurs due to sudden changes in the size and orientation of microscopic ferromagnetic domains when the external magnetic field is changed. In rock physics sudden changes in internal stress associated with microscopically brittle failure events lead to acoustic emissions that can be recorded on the sample boundary, and used to infer the state of internal damage. Crackling noise is inherently stochastic, but the population of events often exhibits remarkably robust scaling properties, in terms of the source area, duration, energy, and in the waiting time between events. Here I describe how these scaling properties emerge and evolve spontaneously in a fully-dynamic discrete element model of sedimentary rocks subject to uniaxial compression at a constant strain rate. The discrete elements have structural disorder similar to that of a real rock, and this is the only source of heterogeneity. Despite the stationary loading and the lack of any time-dependent weakening processes, the results are all characterized by emergent power law distributions over a broad range of scales, in agreement with experimental observation. As deformation evolves, the scaling exponents change systematically in a way that is similar to the evolution of damage in experiments on real sedimentary rocks. The potential for real-time failure forecasting is examined by using synthetic and real data from laboratory tests and prior to volcanic eruptions. 
The combination of non-linearity and an irreducible stochastic component leads to significant variations in the precision and accuracy of the forecast failure time, leading to a significant proportion of 'false alarms' (forecast too early) and 'missed events' (forecast too late), as well as over-optimistic assessments of forecasting power and quality when the failure time is known (the 'benefit of hindsight'). The evolution becomes progressively more complex, and the forecasting power diminishes, in going from ideal synthetics to controlled laboratory tests to open natural systems at larger scales in space and time.

  16. Non-verbal numerical cognition: from reals to integers.

    PubMed

    Gallistel; Gelman

    2000-02-01

    Data on numerical processing by verbal (human) and non-verbal (animal and human) subjects are integrated by the hypothesis that a non-verbal counting process represents discrete (countable) quantities by means of magnitudes with scalar variability. These appear to be identical to the magnitudes that represent continuous (uncountable) quantities such as duration. The magnitudes representing countable quantity are generated by a discrete incrementing process, which defines next magnitudes and yields a discrete ordering. In the case of continuous quantities, the continuous accumulation process does not define next magnitudes, so the ordering is also continuous ('dense'). The magnitudes representing both countable and uncountable quantity are arithmetically combined in, for example, the computation of the income to be expected from a foraging patch. Thus, on the hypothesis presented here, the primitive machinery for arithmetic processing works with real numbers (magnitudes).

  17. Optical recording of action potentials and other discrete physiological events: a perspective from signal detection theory.

    PubMed

    Sjulson, Lucas; Miesenböck, Gero

    2007-02-01

    Optical imaging of physiological events in real time can yield insights into biological function that would be difficult to obtain by other experimental means. However, the detection of all-or-none events, such as action potentials or vesicle fusion events, in noisy single-trial data often requires a careful balance of tradeoffs. The analysis of such experiments, as well as the design of optical reporters and instrumentation for them, is aided by an understanding of the principles of signal detection. This review illustrates these principles, using as an example action potential recording with optical voltage reporters.
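The signal detection principles this review builds on can be made concrete with the equal-variance Gaussian model (a textbook sketch, not code from the paper): an all-or-none event is detected when the single-trial amplitude crosses a threshold, and moving that threshold trades hits against false alarms while d' fixes the attainable trade-off.

```python
import math

def dprime(signal_mean, noise_mean, sd):
    """Detectability index d' for discriminating single-trial event
    responses from noise with a common standard deviation."""
    return (signal_mean - noise_mean) / sd

def hit_and_fa_rates(threshold, signal_mean, noise_mean, sd):
    """Hit and false-alarm probabilities for a threshold detector
    under Gaussian signal and noise amplitude distributions."""
    def tail(mu):
        # P(X > threshold) for X ~ N(mu, sd), via the complementary
        # error function.
        return 0.5 * math.erfc((threshold - mu) / (sd * math.sqrt(2.0)))
    return tail(signal_mean), tail(noise_mean)
```

Sweeping the threshold traces out the ROC curve; a brighter or faster reporter raises d' and pushes the whole curve toward perfect detection.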

  18. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  19. Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.

    PubMed

    Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh

    2011-01-01

We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE. Published by the IEEE Computer Society.

  20. A discrete in continuous mathematical model of cardiac progenitor cells formation and growth as spheroid clusters (Cardiospheres).

    PubMed

    Di Costanzo, Ezio; Giacomello, Alessandro; Messina, Elisa; Natalini, Roberto; Pontrelli, Giuseppe; Rossi, Fabrizio; Smits, Robert; Twarogowska, Monika

    2018-03-14

We propose a discrete in continuous mathematical model describing the in vitro growth process of biopsy-derived mammalian cardiac progenitor cells growing as clusters in the form of spheres (Cardiospheres). The approach is hybrid: discrete at the cellular scale and continuous at the molecular level. In the present model, cells are subject to a self-organizing collective dynamics mechanism and, additionally, they can proliferate and differentiate, also depending on stochastic processes. The two latter processes are triggered and regulated by chemical signals present in the environment. Numerical simulations show the structure and the development of the clustered progenitors and are in good agreement with the results obtained from in vitro experiments.

  1. High-resolution space-time characterization of convective rain cells: implications on spatial aggregation and temporal sampling operated by coarser resolution instruments

    NASA Astrophysics Data System (ADS)

    Marra, Francesco; Morin, Efrat

    2017-04-01

Forecasting the occurrence of flash floods and debris flows is fundamental to save lives and protect infrastructure and property. These natural hazards are generated by high-intensity convective storms, on space-time scales that cannot be properly monitored by conventional instrumentation. Consequently, a number of early-warning systems are nowadays based on remote sensing precipitation observations, e.g. from weather radars or satellites, which have proved effective in a wide range of situations. However, the uncertainty affecting rainfall estimates represents an important issue undermining the operational use of early-warning systems. The uncertainty related to remote sensing estimates results from (a) an instrumental component, intrinsic to the measurement operation, and (b) a discretization component, caused by the discretization of the continuous rainfall process. Improved understanding of these sources of uncertainty will provide crucial information to modelers and decision makers. This study aims at advancing knowledge on the (b) discretization component. To do so, we take advantage of an extremely-high-resolution X-Band weather radar (60 m, 1 min) recently installed in the Eastern Mediterranean. The instrument monitors a semiarid to arid transition area also covered by an accurate C-Band weather radar and by a relatively sparse rain gauge network (about 1 gauge per 450 km2). Radar quantitative precipitation estimation includes corrections reducing the errors due to ground echoes, orographic beam blockage and attenuation of the signal in heavy rain. Intense, convection-rich flooding events that recently occurred in the area serve as study cases. We (i) describe in very high detail the spatiotemporal characteristics of the convective cores, and (ii) quantify the uncertainty due to spatial aggregation (spatial discretization) and temporal sampling (temporal discretization) operated by coarser resolution remote sensing instruments.
We show that instantaneous rain intensity decreases very steeply with distance from the convective core: the intensity observed at 1 km (2 km) is 10-40% (1-20%) of the core value. The use of coarser temporal resolutions leads to gaps in the observed rainfall, and even relatively high resolutions (5 min) can be affected. We conclude by providing the end user with indications about the effects of the discretization component of estimation uncertainty and by suggesting viable ways to reduce them.

  2. A geodetic matched-filter search for slow slip with application to the Mexico subduction zone

    NASA Astrophysics Data System (ADS)

    Rousset, B.; Campillo, M.; Lasserre, C.; Frank, W.; Cotte, N.; Walpersdorf, A.; Socquet, A.; Kostoglodov, V.

    2017-12-01

Since the discovery of slow slip events, many methods have been successfully applied to model obvious transient events in geodetic time series, such as the widely used network strain filter. Independent seismological observations of tremors or low-frequency earthquakes and repeating earthquakes provide evidence of low-amplitude slow deformation but do not always coincide with clear occurrences of transient signals in geodetic time series. Here, we aim to extract the signal corresponding to slow slip hidden in the noise of GPS time series, without using information from independent datasets. We first build a library of synthetic slow slip event templates by assembling a source function with Green's functions for a discretized fault. We then correlate the templates with post-processed GPS time series. Once the events have been detected in time, we estimate their duration T and magnitude Mw by modelling a weighted stack of GPS time series. An analysis of synthetic time series shows that this method is able to resolve the correct timing, location, T and Mw of events larger than Mw 6.0 in the context of the Mexico subduction zone. Applied to a real data set of 29 GPS time series in the Guerrero area from 2005 to 2014, this technique allows us to detect 28 transient events from Mw 6.3 to 7.2 with durations ranging from 3 to 39 days. These events have a dominant recurrence time of 40 days and are mainly located at the down-dip edges of the Mw > 7.5 SSEs.
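A matched-filter search of this kind reduces, per station and component, to normalized cross-correlation of a template with the time series; a minimal sketch (single noiseless series and a simple ramp template, for illustration only; real use stacks many GPS components and templates):

```python
import numpy as np

def matched_filter_detect(series, template, threshold):
    """Slide a zero-mean, unit-norm template along the series and
    compute the normalized correlation at each offset; offsets where
    the correlation exceeds the threshold are candidate detections."""
    t = template - template.mean()
    t /= np.linalg.norm(t)
    n = len(template)
    corr = np.empty(len(series) - n + 1)
    for i in range(len(corr)):
        w = series[i:i + n] - series[i:i + n].mean()
        norm = np.linalg.norm(w)
        corr[i] = float(w @ t / norm) if norm > 0 else 0.0
    return corr, np.where(corr >= threshold)[0]
```

Once an offset is detected, the event's duration and magnitude would be estimated separately, e.g. by modelling a weighted stack of the series, as the abstract describes.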

  3. Mars atmospheric loss to space: Observations of present-day loss and implications for long-term volatile evolution

    NASA Astrophysics Data System (ADS)

    Jakosky, Bruce; Brain, David; Luhmann, Janet; Grebowsky, Joe

    2017-04-01

    MAVEN completed its first Mars year of science mapping in October 2016. Results show loss of gas to space by multiple processes, including solar-wind pick-up, sputtering, photochemical loss, and thermal escape, along with their responses to changing solar and solar-wind boundary conditions and to discrete solar events. By understanding the current loss rates and the processes controlling them, we are able to examine the long-term loss to space, including the effects of different solar conditions early in history; in addition, we are able to use stable-isotope ratios to derive the integrated loss to space through time. Preliminary results suggest that loss to space was a dominant, if not the dominant, mechanism that drove the changing climate through time. We will present a framework for analyzing and interpreting the results, along with preliminary results on the extrapolation to long timescales.

  4. Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engelmann, Christian; Lauer, Frank

This work focuses on tools for investigating algorithm performance at extreme scale with millions of concurrent threads and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using a parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale, and its performance properties can be evaluated. The results of an initial prototype are encouraging, as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.

  5. The struggle of giving up personal goals: affective, physiological, and cognitive consequences of an action crisis.

    PubMed

    Brandstätter, Veronika; Herrmann, Marcel; Schüler, Julia

    2013-12-01

    A critical phase in goal striving occurs when setbacks accumulate and goal disengagement becomes an issue. This critical phase is conceptualized as an action crisis and assumed to be characterized by an intrapsychic conflict in which the individual becomes torn between further goal pursuit and goal disengagement. Our theorizing converges with Klinger's conceptualization of goal disengagement as a process, rather than a discrete event. Two longitudinal field studies tested and found support for the hypothesis that an action crisis not only compromises an individual's psychological and physiological well-being, but also dampens the cognitive evaluation of the respective goal. In Study 3, marathon runners experiencing an action crisis in their goal of running marathons showed a stronger cortisol secretion and a lower performance in the race 2 weeks later. Results are interpreted in terms of action-phase-specific mindsets with a focus on self-regulatory processes in goal disengagement.

  6. Towards an integrated numerical simulator for crack-seal vein microstructure: Coupling phase-field with the Discrete Element Method

    NASA Astrophysics Data System (ADS)

    Virgo, Simon; Ankit, Kumar; Nestler, Britta; Urai, Janos L.

    2016-04-01

Crack-seal veins form in a complex interplay of coupled thermal, hydraulic, mechanical and chemical processes. Their formation and cyclic growth involve brittle fracturing and dilatancy, phases of increased fluid flow, and the growth of crystals that fill the voids and reestablish the mechanical strength. Existing numerical models of vein formation focus on selected aspects of the coupled process. To date, no model exists that realistically represents both the fracturing and the sealing processes simultaneously. To address this challenge, we propose the bidirectional coupling of two numerical methods that have proven very powerful for modeling the fundamental processes acting in crack-seal systems: the phase-field method and the Discrete Element Method (DEM). The phase-field method was recently extended to model the precipitation of quartz crystals from an aqueous solution and applied to model the sealing of a vein over multiple opening events (Ankit et al., 2013; Ankit et al., 2015a; Ankit et al., 2015b). The advantage over former, purely kinematic approaches is that in phase-field the crystal growth is modeled on thermodynamic and kinetic principles. Different driving forces for microstructure evolution, such as chemical bulk free energy, interfacial energy and elastic strain energy, and different transport processes, such as mass diffusion and advection, can be coupled, and their effect on the evolution process can be studied in 3D. The Discrete Element Method has already been used in several studies to model the fracturing of rocks and the incremental growth of veins by repeated fracturing (Virgo et al., 2013; Virgo et al., 2014). Materials in DEM are represented by volumes of packed spherical particles, and the response of the material to stress is modeled by the interaction of each particle with its nearest neighbours. In 3D, the method provides realistic brittle failure behaviour for rocks.
Exchange routines are being developed that translate the spatial domain of the model from the DEM to the phase-field representation and vice versa. This will allow the fracturing process to be modeled with DEM and the sealing process to be modeled with the phase-field approach. With this bidirectional coupling, the strengths of the two numerical methods will be combined into a unified model of iterative crack-sealing that can capture the complex feedback mechanisms between fracturing and sealing and assess the influence of thermal, mechanical, chemical and hydraulic parameters on the evolution of vein microstructures. References: Ankit, K., Nestler, B., Selzer, M., and Reichardt, M., 2013, Phase-field study of grain boundary tracking behavior in crack-seal microstructures: Contributions to Mineralogy and Petrology, v. 166, no. 6, p. 1709-1723. Ankit, K., Selzer, M., Hilgers, C., and Nestler, B., 2015a, Phase-field modeling of fracture cementation processes in 3-D: Journal of Petroleum Science Research, v. 4, no. 2, p. 79-96. Ankit, K., Urai, J.L., and Nestler, B., 2015b, Microstructural evolution in bitaxial crack-seal veins: A phase-field study: Journal of Geophysical Research: Solid Earth, v. 120, no. 5, p. 3096-3118. Virgo, S., Abe, S., and Urai, J.L., 2013, Extension fracture propagation in rocks with veins: Insight into the crack-seal process using Discrete Element Method modeling: Journal of Geophysical Research: Solid Earth, v. 118, no. 10. Virgo, S., Abe, S., and Urai, J.L., 2014, The evolution of crack seal vein and fracture networks in an evolving stress field: Insights from Discrete Element Models of fracture sealing: Journal of Geophysical Research: Solid Earth, p. 2014JB011520.
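The bonded-particle idea behind DEM can be caricatured in a few lines: particles are linked to nearest neighbours by elastic bonds that fail once the force on them exceeds a (slightly disordered) strength, which is what produces the brittle behaviour described above. A minimal 1-D sketch, with purely illustrative parameter values not taken from the cited models:

```python
import random

def stretch_chain(n_bonds=9, stiffness=1.0, mean_strength=0.5,
                  strain=0.45, seed=1):
    """Uniformly stretch a chain of unit-length bonds and return the
    indices of bonds whose elastic force exceeds their strength."""
    rng = random.Random(seed)
    # a little disorder in bond strength, as in real bonded-particle packings
    strengths = [mean_strength * rng.uniform(0.8, 1.2) for _ in range(n_bonds)]
    force = stiffness * strain          # elastic force per unit-length bond
    return [i for i, s in enumerate(strengths) if force > s]

print(stretch_chain(strain=0.0))   # -> [] (no load, no fracture)
broken = stretch_chain()           # near-critical load: only the weakest bonds fail
```

In a full DEM model the same break-when-overloaded rule is applied bond by bond in a 3-D packing, which is where localized fractures and realistic crack geometries emerge.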

  7. Is Accessing of Words Affected by Affective Valence Only? A Discrete Emotion View on the Emotional Congruency Effect

    PubMed Central

    Chen, Xuqian; Liu, Bo; Lin, Shouwen

    2016-01-01

    This paper advances the discussion of which kinds of emotion information affect word access. Emotion information, which is formed through repeated experiences, is primary and necessary in learning and representing word meanings. Previous findings suggested that the valence (i.e., positive or negative) denoted by words can be activated automatically and plays a role in many significant cognitive processes. However, there has been little discussion of whether discrete emotion information (i.e., happiness, anger, sadness, and fear) is also involved in these processes. According to the hierarchy model, emotions are organized within an abstract-to-concrete hierarchy in which emotion prototypes are organized by affective valence. By controlling different congruencies of emotion relations (i.e., matches or mismatches between valences and prototypes of emotion), the present study showed both an evaluative congruency effect (Experiment 1) and a discrete emotional congruency effect (Experiment 2). These findings indicate that not only affective valences but also discrete emotions can be activated in the present primed lexical decision task. However, they also suggest that discrete emotions might be activated at a later priming stage than valences. The present work provides evidence that information about discrete emotion can be involved in word processing, possibly as a result of subjects' embodied experiences. PMID:27379000

  9. Using Simulation to Interpret a Discrete Time Survival Model in a Complex Biological System: Fertility and Lameness in Dairy Cows

    PubMed Central

    Hudson, Christopher D.; Huxley, Jonathan N.; Green, Martin J.

    2014-01-01

    The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness, in a specific clinical context, of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two-day period of risk, with the presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate the wider clinical implications of the model results (i.e. the potential for a herd's incidence rate of lameness to influence its overall reproductive performance) using PSA. Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd's lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step to aid contextualisation of the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level. PMID:25101997
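The PSA idea described above — draw the model's parameters from distributions, simulate the herd under each draw, and look at the spread of herd-level outcomes — can be sketched as follows. The baseline risk, lameness probability, and distributions below are illustrative assumptions, not the fitted values of Hudson et al.; only the 25% risk reduction echoes the abstract.

```python
import random

def periods_to_pregnancy(rng, base_risk, lame_prob, lame_effect=0.75,
                         max_periods=200):
    """Simulate one cow; return the number of two-day risk periods
    until pregnancy (capped at max_periods)."""
    for period in range(1, max_periods + 1):
        # a recent lameness case multiplies the per-period risk by 0.75
        risk = base_risk * (lame_effect if rng.random() < lame_prob else 1.0)
        if rng.random() < risk:
            return period
    return max_periods

def psa(n_draws=200, n_cows=100, seed=42):
    """Probabilistic sensitivity analysis: sample parameters from
    distributions and record the herd-mean outcome for each draw."""
    rng = random.Random(seed)
    herd_means = []
    for _ in range(n_draws):
        base_risk = rng.uniform(0.03, 0.07)   # per-period pregnancy risk
        lame_prob = rng.uniform(0.0, 0.3)     # chance of a recent lameness case
        cows = [periods_to_pregnancy(rng, base_risk, lame_prob)
                for _ in range(n_cows)]
        herd_means.append(sum(cows) / n_cows)
    return herd_means

results = psa()
```

Inspecting the distribution of `results` across parameter draws is what lets one judge whether the lameness effect matters at herd level, rather than reading the hazard ratio in isolation.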

  11. Discrete-Time Demodulator Architectures for Free-Space Broadband Optical Pulse-Position Modulation

    NASA Technical Reports Server (NTRS)

    Gray, A. A.; Lee, C.

    2004-01-01

    The objective of this work is to develop discrete-time demodulator architectures for broadband optical pulse-position modulation (PPM) that are capable of processing Nyquist or near-Nyquist data rates. These architectures are motivated by the numerous advantages of realizing communications demodulators in digital very large scale integrated (VLSI) circuits. The architectures are developed within a framework that encompasses a large body of work in optical communications, synchronization, and multirate discrete-time signal processing, and are constrained by the limitations of the state of the art in digital hardware. This work attempts to create a bridge between theoretical communication algorithms and analysis for deep-space optical PPM and modern digital VLSI. The primary focus is the synthesis of discrete-time processing architectures for the most fundamental functions required in PPM demodulators: post-detection filtering, synchronization, and decision processing. The architectures derived closely approximate the theoretical performance of the continuous-time algorithms from which they are derived. The work concludes with an outline of the development path that leads to hardware.
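The decision-processing stage mentioned above reduces, for M-ary PPM, to finding which of the M slots of a symbol contains the pulse energy. A minimal discrete-time sketch (the sample counts and the noiseless test signal are illustrative; a real demodulator would operate after post-detection filtering and slot synchronization):

```python
def ppm_decide(samples, n_slots, samples_per_slot):
    """Sum the samples falling in each PPM slot and return the index
    of the slot with the largest summed energy (the symbol decision)."""
    sums = []
    for slot in range(n_slots):
        start = slot * samples_per_slot
        sums.append(sum(samples[start:start + samples_per_slot]))
    return max(range(n_slots), key=sums.__getitem__)

# 4-PPM symbol with the pulse in slot 2, two samples per slot:
symbol = [0.1, 0.0, 0.2, 0.1, 1.1, 0.9, 0.0, 0.2]
print(ppm_decide(symbol, n_slots=4, samples_per_slot=2))  # -> 2
```

The same argmax structure carries over to hardware, where the per-slot sums become accumulate-and-compare registers clocked at the sample rate.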

  12. Sm-Nd Mineral Isochron Age Patterns from Garnet-bearing Peridotite of the Western Gneiss Region, Norwegian Caledonides: Discrete Mantle Events or Continuous Re-equilibration?

    NASA Astrophysics Data System (ADS)

    Brueckner, H. K.

    2007-12-01

    The garnet peridotites (and pyroxenites) of the UHP Western Gneiss Region (WGR) of Norway give Sm-Nd garnet, clinopyroxene, whole-rock, orthopyroxene, and amphibole ages that range from ca. 1.7 Ga to 424 Ma. Most of these twenty-seven ages are much older than the continent-continent collision that transferred these peridotites from the mantle into the crust (i.e., the 400 Ma Scandian Orogeny), suggesting that the garnet peridotites of the WGR are unique relative to those in other UHP terranes, which invariably give ages that overlap the time of UHP metamorphism of the enclosing country rocks. All but the youngest ages given by WGR peridotites reflect processes that occurred deep in the mantle beneath the Baltic Shield, but it is unclear whether they date a series of discrete events related to the tectonic evolution of the Baltic Shield or reflect continuous, but variable, re-equilibration of the Sm-Nd system between phases during the residence of the peridotites in the mantle. Three ages overlap the 1.75 to 1.55 Ga Gothian Orogeny, while twelve ages are within error of the 1.2 to 0.9 Ga Sveconorwegian Orogeny. The three youngest ages (438 to 424 Ma) are associated with a younger generation of garnets and may mark the beginning of eclogite-facies metamorphism of Baltica as it was subducted beneath Laurentia during the Scandian Orogeny. However, the remaining nine ages spread more or less continuously between these three major events. The overall pattern on a histogram is a range of ages with a pronounced peak at and near the Sveconorwegian Orogeny. The ages therefore appear to date continuous diffusion between minerals from garnet-bearing assemblages that formed originally during or, less likely, before the Gothian Orogeny, interrupted by a pronounced thermal event during the Sveconorwegian Orogeny and a recrystallization event during the early stages of the Scandian Orogeny.
The degree of re-equilibration was probably controlled by the ambient temperature of each peridotite body in the mantle, which was controlled, in turn, by its depth in the mantle and its proximity to hot upwelling mantle during the Sveconorwegian Orogeny.

  13. Activity Diagrams for DEVS Models: A Case Study Modeling Health Care Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J

    Discrete Event System Specification (DEVS) is a widely used formalism for modeling and simulation of discrete and continuous systems. While DEVS provides a sound mathematical representation of discrete systems, its practical use can suffer when models become complex. Five main functions, which constitute the core of atomic models in DEVS, realize the behaviors that modelers want to represent. The integration of these functions is handled by the simulation routine; however, modelers can implement each function in various ways. Therefore, there is a need for graphical representations of complex models to simplify their implementation and facilitate their reproduction. In this work, we illustrate the use of activity diagrams for this purpose in the context of a health care behavior model developed with an agent-based modeling paradigm.
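The five atomic-model functions referred to above — internal, external, and confluent transitions, output, and time advance — can be sketched for a toy single-server "processor". The class layout below is an illustrative assumption, not the interface of any particular DEVS library:

```python
INFINITY = float("inf")

class Processor:
    """Toy (Parallel) DEVS atomic model: jobs queue up and each takes
    a fixed service time."""

    def __init__(self, service_time=2.0):
        self.service_time = service_time
        self.queue = []          # jobs waiting or in service

    def ta(self):
        """Time advance: how long until the next internal event."""
        return self.service_time if self.queue else INFINITY

    def output(self):
        """Lambda: what is emitted just before an internal transition."""
        return self.queue[0] if self.queue else None

    def delta_int(self):
        """Internal transition: the job in service departs."""
        self.queue.pop(0)

    def delta_ext(self, elapsed, jobs):
        """External transition: new jobs arrive after `elapsed` time units."""
        self.queue.extend(jobs)

    def delta_conf(self, jobs):
        """Confluent transition: internal and external events coincide."""
        self.delta_int()
        self.delta_ext(0.0, jobs)

p = Processor()
p.delta_ext(0.0, ["job-1", "job-2"])
done = p.output()   # "job-1" is emitted...
p.delta_int()       # ...and leaves the queue
```

The simulation routine the abstract mentions is what calls `ta`, `output`, and the transition functions in the right order; an activity diagram documents how a modeler's implementations of these five functions fit together.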

  14. Extracting, Tracking, and Visualizing Magnetic Flux Vortices in 3D Complex-Valued Superconductor Simulation Data.

    PubMed

    Guo, Hanqi; Phillips, Carolyn L; Peterka, Tom; Karpeyev, Dmitry; Glatz, Andreas

    2016-01-01

    We propose a method for the vortex extraction and tracking of superconducting magnetic flux vortices for both structured and unstructured mesh data. In the Ginzburg-Landau theory, magnetic flux vortices are well-defined features in a complex-valued order parameter field, and their dynamics determine electromagnetic properties in type-II superconductors. Our method represents each vortex line (a 1D curve embedded in 3D space) as a connected graph extracted from the discretized field in both space and time. For a time-varying discrete dataset, our vortex extraction and tracking method is as accurate as the data discretization. We then apply 3D visualization and 2D event diagrams to the extraction and tracking results to help scientists understand vortex dynamics and macroscale superconductor behavior in greater detail than previously possible.
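The core local test behind this kind of vortex extraction is that a vortex line pierces a mesh face where the phase of the complex order parameter winds by ±2π around the face. The sketch below shows that winding-number test on a single quadrilateral face with illustrative corner values; the paper's full method (unstructured meshes, graph construction, tracking in time) is omitted:

```python
import cmath
import math

def winding_number(corner_values):
    """Sum the phase differences, wrapped into (-pi, pi], around a closed
    loop of complex order-parameter values; returns an integer winding."""
    phases = [cmath.phase(v) for v in corner_values]
    total = 0.0
    for i in range(len(phases)):
        d = phases[(i + 1) % len(phases)] - phases[i]
        while d <= -math.pi:     # wrap into (-pi, pi]
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        total += d
    return round(total / (2 * math.pi))

# A face whose corners advance a quarter turn each encloses one vortex:
face = [1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j]
print(winding_number(face))          # -> 1
print(winding_number(face[::-1]))    # -> -1 (opposite circulation)
```

Marking every face with nonzero winding and connecting the pierced faces through each cell is what yields the 1-D vortex curves embedded in 3-D space.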

  15. Measuring agreement of multivariate discrete survival times using a modified weighted kappa coefficient.

    PubMed

    Guo, Ying; Manatunga, Amita K

    2009-03-01

    Assessing agreement is often of interest in clinical studies to evaluate the similarity of measurements produced by different raters or methods on the same subjects. We present a modified weighted kappa coefficient to measure agreement between bivariate discrete survival times. The proposed kappa coefficient accommodates censoring by redistributing the mass of censored observations within the grid where the unobserved events may potentially happen. A generalized modified weighted kappa is proposed for multivariate discrete survival times. We estimate the modified kappa coefficients nonparametrically through a multivariate survival function estimator. The asymptotic properties of the kappa estimators are established and their performance is examined through simulation studies of bivariate and trivariate survival times. We illustrate the application of the modified kappa coefficient in the presence of censored observations with data from a prostate cancer study.
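For orientation, here is the plain (uncensored) linear-weighted kappa that the paper modifies: observed weighted disagreement between two sets of ordinal ratings compared against the disagreement expected under independence. The censoring-redistribution step that is the paper's contribution is deliberately omitted, and the example ratings are illustrative:

```python
def weighted_kappa(ratings_a, ratings_b, n_categories):
    """Linear-weighted kappa: 1 - (observed weighted disagreement /
    expected weighted disagreement under independent marginals)."""
    n = len(ratings_a)
    max_diff = n_categories - 1
    # observed disagreement, weights proportional to |a - b|
    obs = sum(abs(a - b) / max_diff for a, b in zip(ratings_a, ratings_b)) / n
    # expected disagreement from the two marginal distributions
    exp = 0.0
    for a in ratings_a:
        for b in ratings_b:
            exp += abs(a - b) / max_diff
    exp /= n * n
    return 1.0 - obs / exp

a = [0, 1, 2, 2, 3]
b = [0, 1, 2, 3, 3]
print(round(weighted_kappa(a, b, n_categories=4), 3))  # -> 0.839
```

With discrete survival times, the "categories" are the discrete event-time grid cells, and censored observations have their mass spread over the cells where the unobserved event could fall before the same comparison is made.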

  16. A statistical study on the occurrence of discrete frequencies in the high velocity solar wind and in the magnetosphere

    NASA Astrophysics Data System (ADS)

    Di Matteo, Simone; Villante, Umberto

    2016-04-01

    The possible occurrence of oscillations at discrete frequencies in the solar wind, and their possible correspondence with magnetospheric field oscillations, represents an interesting aspect of solar wind/magnetospheric research. We analyze a large set of high velocity streams following interplanetary shocks in order to ascertain the possible occurrence of preferential sets of discrete frequencies in the oscillations of the solar wind pressure in such structures. We evaluate, for each event, the power spectrum of the dynamic pressure by means of two methods (Welch and multitaper windowing) and accept the common spectral peaks that also pass a harmonic F-test at the 95% confidence level. We compare these frequencies with those detected at geosynchronous orbit in the magnetospheric field components soon after the manifestation of the corresponding Sudden Impulses.
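The Welch half of the analysis above amounts to averaging the periodograms of data segments and then picking spectral peaks. A dependency-free sketch using a plain O(n²) DFT (no windowing or overlap, and no multitaper spectra or harmonic F-test, all of which the study adds); the test signal and segment length are illustrative:

```python
import cmath
import math

def welch_psd(x, seg_len):
    """Average the periodograms of non-overlapping segments of x
    (rectangular window); returns power per one-sided frequency bin."""
    n_segs = len(x) // seg_len
    psd = [0.0] * (seg_len // 2 + 1)
    for s in range(n_segs):
        seg = x[s * seg_len:(s + 1) * seg_len]
        for k in range(len(psd)):
            coeff = sum(seg[t] * cmath.exp(-2j * math.pi * k * t / seg_len)
                        for t in range(seg_len))
            psd[k] += abs(coeff) ** 2 / seg_len
    return [p / n_segs for p in psd]

# 256 samples of a sinusoid completing 8 cycles per 64-sample segment:
x = [math.sin(2 * math.pi * 8 * t / 64) for t in range(256)]
spectrum = welch_psd(x, seg_len=64)
peak_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak_bin)  # -> 8
```

Requiring a candidate peak to appear in both the Welch and multitaper spectra, and to pass the F-test, is what separates genuine discrete frequencies from broadband fluctuations.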

  17. Denoising embolic Doppler ultrasound signals using Dual Tree Complex Discrete Wavelet Transform.

    PubMed

    Serbes, Gorkem; Aydin, Nizamettin

    2010-01-01

    Early and accurate detection of asymptomatic emboli is important for monitoring preventive therapy in stroke-prone patients. One of the problems in the detection of emboli is the identification of embolic signals caused by very small emboli. The amplitude of such a signal may be so small that advanced processing methods are required to distinguish it from the Doppler signals arising from red blood cells. In this study, the Dual Tree Complex Discrete Wavelet Transform was used instead of the conventional discrete wavelet transform for denoising embolic signals, and the performance of the two approaches was compared. Unlike the conventional discrete wavelet transform, the dual tree complex discrete wavelet transform is a shift-invariant transform with limited redundancy. The results demonstrate that denoising based on the Dual Tree Complex Discrete Wavelet Transform outperforms conventional discrete wavelet denoising: approximately 8 dB of improvement is obtained, compared with less than 5 dB for the conventional discrete wavelet transform.
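Both methods compared above follow the same denoising recipe: transform, soft-threshold the detail coefficients, invert. The sketch below uses a one-level real Haar DWT to stand in for the dual-tree complex transform (whose shift invariance is precisely the paper's point); the signal and threshold are illustrative:

```python
import math

def haar_denoise(x, threshold):
    """One-level Haar DWT, soft-threshold the details, inverse transform.
    Assumes len(x) is even."""
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    # soft thresholding: shrink toward zero, zero out small coefficients
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

clean = [0.0, 0.0, 4.0, 4.0, 0.0, 0.0]
noisy = [0.1, -0.1, 4.1, 3.9, -0.1, 0.1]
print([round(v, 2) for v in haar_denoise(noisy, threshold=0.2)])
# -> [0.0, 0.0, 4.0, 4.0, 0.0, 0.0]
```

A real DWT like this one is shift-variant, so a small embolic transient can be attenuated differently depending on where it falls relative to the sampling grid; the dual-tree complex transform avoids that at the cost of limited redundancy.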

  18. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    DOT National Transportation Integrated Search

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  19. Acoustic emission techniques applied to conventionally reinforced concrete bridge girders : final report.

    DOT National Transportation Integrated Search

    2008-09-01

    Reinforced concrete (RC) bridges generally operate at service-level loads except during discrete overload events that can reduce the integrity of the structure by initiating concrete cracks, widening or extending of existing concrete cracks, as well ...

  20. Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment

    DTIC Science & Technology

    2007-03-01

    [Indexed text consists only of figure and table-of-contents fragments, including the caption of Figure 31, "Particle Disqualification via Sanitization", and chapter headings such as "Research Approach", "Thesis Organization", "Detection Distribution Sampling", and "Estimated Position Calculation".]
